Your suggestion to "study AI as an anthropologist would in studying an alien intelligence" really resonated with my experience. After a few months of working intensively with AI, I noticed I was tapping into the skills I had acquired as an anthropologist in the field: inferring human motivation from observable actions and utterances. AI is indeed "alien," but the problem, I think, is that LLMs can communicate so naturally with us in our own language. Many people find it difficult to keep AI's alienness in mind because it sounds so "human."
I think that’s a really good point. I also think the apparent ease of use tends to cause many who haven’t used AI much to think there isn’t much skill involved.
Excellent analysis; it's really insightful how you highlight the need for cognitive empathy with AI. But I'm curious: do you think teaching students to model an AI's thinking without any emotional engagement might make them less empathetic toward humans overall? That concern feels important for us educators.
Only if we ignore the similar skill we need with people. It's usually best to explain AI by comparison to humans, so I don't see it as an either-or. Emotional empathy isn't unimportant, of course, but it isn't as useful, even in dealing with people, as most think.