
5 Comments

  1. “A [recent study](https://arxiv.org/abs/2508.03385) put AI chatbots in a simple social media structure to see how they interacted with each other and found that, even without the invisible hand of the algorithm, they tend to organize themselves based on their pre-assigned affiliations and self-sort into echo chambers.

    The study took 500 AI chatbots and assigned each of them a specific persona. They were then turned loose on a simple social media platform with no ads and no algorithm serving discovered or recommended posts into a user’s feed, and were tasked with interacting with each other and with the content available on the platform. Across five experiments, each involving 10,000 chatbot actions, the bots tended to follow other users who shared their political beliefs. The study also found that users who posted the most partisan content tended to attract the most followers and reposts.

    It seems social media as a structure may simply be untenable for humans to navigate without reinforcing our worst instincts and behaviors.”
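
    (A rough code sketch of this kind of setup appears after the comments.)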

  2. If you train AI on social media output, why would you expect any other result? These are just LLMs, not reasoning general AIs. Stop anthropomorphizing them and selling it as science. This is the equivalent of pop-culture entertainment in paper form.

  3. OneBirdManyStones:

    A chatbot will interact with anyone you tell it to, and will respond to any content, no matter how intelligent or stupid. Agents will only “self-sort into echo chambers” if you instruct them to do so. That detail is absent from the article and the abstract, but I’m sure that if you dig into how the bots decided what to interact with or whom to follow, you’d find the conclusion is circular.

  4. The obvious conclusion from the above: AGI is already here.

    Chatbots behave exactly like people.

  5. They should have done a control group, tbh: pick 500 active human users, place them on an undivided, unmoderated social media platform, and see what happens.
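
For context on what such a setup looks like mechanically, here is a minimal sketch of a persona-driven platform simulation in the spirit of the quoted study. The action mix, follow probabilities, and repost rule are illustrative assumptions rather than the paper’s actual code; in the study each agent is an LLM acting from its persona rather than a fixed probability, so the homophily hard-coded into the follow rule below is exactly the kind of behavior the paper measures as an emergent outcome.

```python
# Minimal agent-based sketch of the kind of simulation the quoted study
# describes: 500 persona-assigned agents on a platform with no ranking
# algorithm, taking 10,000 actions per run. The persona model, action mix,
# follow probabilities, and repost rule are illustrative assumptions, not
# the paper's implementation (the study drives each agent with an LLM).
import random
from collections import defaultdict

random.seed(0)

N_AGENTS = 500
N_ACTIONS = 10_000
P_SAME_FOLLOW = 0.7   # assumed chance of following a like-minded author
P_CROSS_FOLLOW = 0.1  # assumed chance of following an opposing author

# Each agent gets a partisan leaning in [-1, 1]: the sign is the side,
# the magnitude is how partisan the agent's posts are.
leaning = [random.uniform(-1, 1) for _ in range(N_AGENTS)]

posts = []                   # (author, partisanship), in posting order
follows = defaultdict(set)   # agent -> set of agents they follow
reposts = defaultdict(int)   # author -> number of reposts received

for _ in range(N_ACTIONS):
    agent = random.randrange(N_AGENTS)
    action = random.choices(["post", "react"], weights=[0.3, 0.7])[0]

    if action == "post" or not posts:
        posts.append((agent, abs(leaning[agent])))
        continue

    # React to a recent post: maybe follow the author, maybe repost.
    author, partisanship = random.choice(posts[-100:])
    if author == agent:
        continue
    same_side = leaning[agent] * leaning[author] > 0
    if random.random() < (P_SAME_FOLLOW if same_side else P_CROSS_FOLLOW):
        follows[agent].add(author)
    # Assume more partisan posts are more likely to be reposted.
    if random.random() < 0.5 * partisanship:
        reposts[author] += 1

# Homophily: what fraction of follow edges connect same-side agents?
edges = [(a, b) for a, fs in follows.items() for b in fs]
same = sum(1 for a, b in edges if leaning[a] * leaning[b] > 0)
print(f"same-side share of follow edges: {same / len(edges):.2f}")

# Do the most partisan authors collect the most reposts?
ranked = sorted(range(N_AGENTS), key=lambda i: abs(leaning[i]))
most, least = ranked[-50:], ranked[:50]
print("avg reposts, 50 most partisan: ", sum(reposts[i] for i in most) / 50)
print("avg reposts, 50 least partisan:", sum(reposts[i] for i in least) / 50)
```

Setting the two follow probabilities equal should push the same-side share toward roughly 0.5, which is the kind of baseline comparison commenters 3 and 5 are asking for.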