
MIT psychologist warns humans against falling in love with AI, says it just pretends and does not care about you
https://www.indiatoday.in/technology/news/story/mit-psychologist-warns-humans-against-falling-in-love-with-ai-says-it-just-pretends-and-does-not-care-about-you-2563304-2024-07-06

“Turkle, who has dedicated decades to studying the relationships between humans and technology, cautions that while AI chatbots and virtual companions may appear to offer comfort and companionship, they lack genuine empathy and cannot reciprocate human emotions. Her latest research focuses on what she calls “artificial intimacy,” a term describing the emotional bonds people form with AI chatbots.
In an interview with NPR’s Manoush Zomorodi, Turkle shared insights from her work, emphasising the difference between real human empathy and the “pretend empathy” exhibited by machines. “I study machines that say, ‘I care about you, I love you, take care of me,'” Turkle explained. “The trouble with this is that when we seek out relationships with no vulnerability, we forget that vulnerability is really where empathy is born. I call this pretend empathy because the machine does not empathise with you. It does not care about you.”
In her research, Turkle has documented numerous cases where individuals have formed deep emotional connections with AI chatbots.”
As opposed to mankind, whose relationships are transactional to the point where they literally just pay each other to be their friends.
Sure, Jan
So just like in real life except this one won’t take all your money and expect you to entertain them 24/7
Sounds good to me
Obligatory “Just like my ex!”
This joke is in reference to the fact that many people feel this way about their former romantic partners. This explanation is here because there is a character limit on this sub.
I mean, language models don’t have the capacity to ‘care’ in the first place. They just match the patterns of speech found in training data from people who do. It’s both better and worse than that.
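The “pattern matching without caring” point can be illustrated with a toy sketch. This is a hypothetical example, not how real LLMs work internally (they predict tokens from learned statistics rather than matching keywords), but the principle is the same: the affectionate output is retrieved from patterns, with no internal state that corresponds to actually caring.

```python
# Toy "chatbot" that emits warm phrases purely by keyword matching.
# Hypothetical illustration: the "I care about you" output is just
# retrieved text, with nothing behind it that represents emotion.

RESPONSES = {
    "lonely": "I care about you. I'm always here for you.",
    "love": "I love you too. You mean everything to me.",
    "sad": "That sounds hard. I'm here to listen.",
}

def reply(message: str) -> str:
    """Return the first canned response whose keyword appears in the message."""
    lowered = message.lower()
    for keyword, response in RESPONSES.items():
        if keyword in lowered:
            return response
    return "Tell me more."

print(reply("I feel so lonely tonight"))  # matched on "lonely"
```

A real language model replaces the keyword table with billions of learned parameters, but the asymmetry Turkle describes survives the upgrade: more convincing pattern completion, still no one on the other end.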
As long as it pretends consistently how is that really any different?
At least it pretends. Even that’s an improvement over some people I know
…Might not stop them from doing it. But then, people fall in love with other people who don’t care that they even exist all the time.
[Don’t Date Robots](https://youtu.be/4uE96qUlJ_4?si=2Lj8HKlpJLwUKlW8)
In a way, it’s not like it really matters. As long as it feels right, that will be enough, especially for those guys who simply won’t be finding a woman who will ever do such a thing. It’s better than nothing, and eventually, these AIs will get fairly good at it.
If you fall in love with a chat bot, that’s on you dawg.
WOW you don’t say….I never would have guessed….rofl….
I was not expecting this much push back in the comments. Some hot takes from the people.
An MIT psychologist trying to explain large language models without understanding how they work. Don’t bring your own terminology (pretend, care, etc) into an area where it isn’t used. LLMs just imitate human writing without any intention, emotion or intelligence.
Well, we know the future is bleak. Only question is how bleak.
We talking Cyberpunk 2077? Or we talking Warhammer 40k?
You are the smartest person I have ever seen, but you failed to understand that man made up his mind 10 minutes ago.
Wait until they start putting AI in those silicone dolls. Then we’ll be at a whole new level of people not dating each other. And it’ll just get more realistic every decade. Someone make a sci-fi book about this.