
I came across this article: https://www.theguardian.com/commentisfree/2025/may/15/mark-zuckerberg-loneliness-epidemic-ai-friends
Do you agree with Mark Zuckerberg that the norm is changing, and that it's becoming normal for us to rely on AI for emotional support? I mean it more as a friend or a companion, rather than an AI therapist. I've already seen so many people use Replika, Character.AI, JanitorAI, endearing, etc. How far do you think we are from it becoming a norm?

7 Comments
Fucking never. Real support is from FEELING beings.
Even Zuck suggests in the linked article that it would take time and a restructuring of how that stuff is talked about; it's nowhere near the norm yet.
Of course Zuckerberg would love that.
Don’t ever trust the likes of him to tell you what’s normal, for it will always align with the prescriptions of his products, or those of his ilk. Any understanding of normality is blurred from the height of their ivory towers; they see you just as a consumer of sterile products, whereas they live a life full of real experiences.
Do not buy into his dictates, for he is lower than a worm within the natural order of things. Trust your senses, and follow real experiences.
Apparently this is happening often. I'm not sure how. I get absolutely nothing from conversing with a bot besides using it as an informational, Google-like search or for some light editing of writing. It's very odd to me, but I think it's happening. We're creating a pathetic, isolated, lonely society, coupled with a market economy that simply makes life really, really hard for a lot of people to gather or do anything they enjoy.
It's not that expensive to run a local model to have a basic digital pillow to scream into with some feedback loop. Most gaming-viable PCs from the past 5 years (maybe 10) can do that without any further investment, keeping everything private and personal.
If someone needs more than that to feel better, or needs advice for their life troubles, they need real support in the form of living empathy from fellow human beings, not some matrix-calculator hallucinations from big companies who will capitalize on people's tears and dreams without thinking twice.
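The "gaming PC can run it" claim above checks out with some back-of-envelope arithmetic. A rough sketch (the 20% overhead factor for KV cache and activations is an assumption; real usage varies by runtime, quantization scheme, and context length):

```python
# Rough VRAM estimate for running a locally hosted, quantized LLM.
# Assumption: 4-bit weights (~0.5 bytes per parameter) plus ~20%
# overhead for KV cache and activations (a guess; varies in practice).

def vram_gb(n_params: float, bits_per_param: float = 4, overhead: float = 1.2) -> float:
    """Estimated memory footprint in GB for n_params parameters."""
    bytes_total = n_params * (bits_per_param / 8) * overhead
    return bytes_total / 1e9

print(f"7B model, 4-bit:  ~{vram_gb(7e9):.1f} GB")   # ~4.2 GB
print(f"13B model, 4-bit: ~{vram_gb(13e9):.1f} GB")  # ~7.8 GB
```

By this estimate, a quantized 7B model fits comfortably in the 6–8 GB of VRAM common on midrange gaming cards from the last several years, which is consistent with the comment's claim.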
This is assuming that "AI" means the modern LLM. I may have a different perspective on a sentient digital life form with a self-motivated ability to express empathy, but for now that is wildly hypothetical science-fiction talk.
This might be true, but no one is gonna trust an AI model owned by him for that. No one wants ad agencies snooping around their therapy sessions to see where they can make a quick buck.
OpenAI already knows this is bad for you.
>People who had a stronger tendency for attachment in relationships and those who viewed the AI as a friend that could fit in their personal life were more likely to experience negative effects from chatbot use.
[https://openai.com/index/affective-use-study/](https://openai.com/index/affective-use-study/)