25 Comments

  1. Basic-Focus2164 on

    This is a rehash of the post-truth concept.

    Social media algorithms creating silos of people who all agree with one another.

    Dystopian and beyond correction because of legalized corruption.

  2. ExpressLaneCharlie on

    I will read the article but it just doesn’t work the same on the left as it does on the right. That’s why there’s never been successes like Rush Limbaugh, Alex Jones, or Fox News on the left. Liberals typically want to learn new information while conservatives want to hear they’re right over and over and over again. That’s why there’s Wikipedia and Conservapedia. That’s why right wingers created think tanks when research and academia didn’t suit them. That’s why Christians are creating their own “scientific” journals because they can’t pass peer review in reputable scientific journals. 

  3. The algorithms are a problem. People are tied up in self-reinforcing media spheres. However, I think society's beliefs about information and behavior are the bigger problem.

    Freedom of speech has long been championed as virtuous. In the social media era that respect for free speech has distorted itself into a sort of disrespect for being careful or thoughtful.

    Legacy media (print, TV, radio) is regulated. Journalists can be sued for stating falsehoods, and there are decency standards for what's presented. NBC can't just air nude photos of Melania Trump or Hunter Biden.

    As a result of being regulated, legacy media is edited, and reporters tend to read scripts that have typically been reviewed for some degree of accuracy. In this media environment, scripted and edited speech doesn't come across as free speech and thus doesn't come across as honest speech. People are conditioned to distrust legacy media.

    Meanwhile, podcasters and social media influencers who are routinely giving hot takes and speaking without full knowledge or constraint get credit for their straightforwardness. Joe Rogan & Theo Von don't have the FCC to contend with. On their pods they can say anything and don't have to differentiate between what's an ad vs. a real thought. They can intermix paid promotions with hot takes and "just asking questions" speech, and audiences accept it all as honest.

    The algorithms are bad. Our understanding of moderation and free speech is bad too. Being unencumbered by any rules doesn’t make one more prone to being authentic or honest. People who can take millions from advertisers without any requirement to tell their audience are NOT folks more inclined to tell the truth.

  4. Total_Brick_2416 on

    The way algorithms are being manipulated and preyed on by propaganda is a modern human rights violation, imo. It's a legitimate disaster for society.

    It’s incredibly alarming how our government was able to be hijacked in the way it has been, and algorithms/bots are a significant part of influencing public opinion.

  5. Yes, I agree with this concern. But let me present a counterargument as devil's advocate: if people THINK they are happy, does it really matter if their life is objectively worse? Are people allowed to make their own choices, and if those choices lead to some impairment of their life, why should I worry about their self-induced misery?

    The most logical rebuttal is: okay, but I don't want to be dragged down with them. Another answer is: that argument from freedom of choice presupposes some free will, and in a world of very sophisticated algorithms and behavioral-psychology manipulation, it's not a level playing field.

    On an individual level, I think the main things we can do are to intentionally turn away from algorithms in our daily lives and engage more with curated media, even though this requires that we pay for that media.

  6. Wouldn’t this be true for everyone who hangs out on the internet? I miss the days of private forums, tbh. They might also be prone to opinion bubbles, but at least the users weren’t being manipulated by machines to cause all kinds of worldwide problems…

  7. I, like a lot of security people, have been sounding the alarms since FB; tech is unregulated and we have no privacy laws.

    You willingly walked into giving your data away, and now this has led to where we are.

  8. cut_rate_revolution on

    It’s the concept of the mind prison. What things are you afraid to even think? You should examine why that is the case.

    Is it a good or a bad thing?

  9. Yup, recommendation engines suck and haven’t seen much improvement since they were created. They overfit quickly, which means people are locked into a small subset of content.

    There isn’t much we can do at this point to improve them. What really needs to happen is that we stop using them and start having people curate their own content, like we did before social media.

  10. ScoobiesSnacks on

    This is the social issue that Democrats should focus on. Most people dislike the algorithms and feel trapped by or distrustful of them but don’t really know why. This should be an easy winning issue that most people can get behind, similar to Obama railing against the 24-hour news cycle in his first campaign.

  11. Doctor-TobiasFunke- on

    IIIIIIIIIIII’m the maaaaan in the BOX

    Buuuuuuuuried innnnn my SHIT

    Woooooont youuuu cooooome and save me

    …save me

    FeeeeeeEEeEEeEEd my eyyyyesss

    Can you sew them shut?

    JeeeeeEEeEEeEEsus Chriiiiiiiist

    Deny your maker

    HeeeeeeeEEeEEeEE who triiiiEEes

    Will be wasted

    Ohh FeeeeeeEEeEEeEEd my eyyYYyess

    Now you’ve sewn them shut

  12. When you can convince a massive number of people that legislative action is required to bully 12 transgender athletes, that immigrants are to blame for everything but not the business owners who hire them, that a rapist and felon is a better choice than a black woman, and that people who *told you they were lying to you* are the best choice….

  13. If you understand it well enough, you can train algorithms via your inputs on social media. For example, I’ve selectively trained IG Reels to feed me different content from TikTok. My IG basically exclusively shows me people breaking the law. TT shows me a few regular creators, food content, city content, politics, and other stuff.

    Honestly social media algorithmic training could be very valuable.

  14. Ah yes, people being exposed to political views other than the ones I have is a huge problem in society. Twitter was only just and fair before Musk bought it, when the FBI was still telling them what to censor that could harm Biden’s chances of being elected.

    The enlightened big brains of Reddit and Bluesky are the last bastions of objectivity and reason. The endless circlejerk about killing Nazis and lunatic conspiracy theories is totally different from the deranged crazies on Truth Social calling the Democrats child-murdering communists.

    Normal people don’t engage on these platforms; it’s extremist, terminally online people fighting against imaginary fantasies in order to give their lives meaning.

  15. Trapped? We literally walked into the cage and now we’re mad we’re there. We’ve collectively given just about every huge corporation every bit of our information, willingly, then complain about how it’s used. I just can’t sympathize with (some) people sometimes.

  16. “Impose their version of reality on the public, even as they pursue an agenda that is nothing short of ruinous”

    Sums it up, and it’s so obvious it’s ruinous, a race to the bottom. The broligarchs’ incurious behavior toward the world around them, and what they’ve decided to invest in as pursuits, is leading us all to demise. Capital will kill us all while we stare obediently into screens.

  17. There’s a fantastic book on this called “Filterworld: How Algorithms Flattened Culture” that tells the story of how we got here.

  18. No shit lol. It’s also not just America; it’s the entire world. The point is to keep you engaged, and shoveling more of the content you previously engaged with, or content with properties they think might make you engage, is the easiest low-hanging fruit for that.

    Funny thing is that in my experience, social media algorithms approach the same thing from different angles. TikTok and Reddit find content they think I enjoy, while Instagram Reels show me a mixture of stuff they think I like and things I hate (for example, my Reels suggestions are Nazi propaganda and Islam, and I’m Jewish). Twitter will mostly show me stuff they think I’d hate or argue with.

  19. I wonder what is going to trigger a cognitive-dissonance response in all those who voted for the orange fucker and his loaded minion, expecting all the cream. Is it going to take a disaster like in Nazi Germany?