16 Comments

  1. SS: Researchers at Cambridge are warning about a new ‘intention economy’ where AI tools predict and influence what you do – like booking a trip or even voting – and sell that info to the highest bidder. Using models like ChatGPT, AI could personalize suggestions based on your data, subtly steering you in specific directions. The study raises big questions about how this could impact elections, competition, and personal freedom. Should we be worried, or is regulation the answer before it gets out of hand?

  2. I don’t see how this is any different to the marketing algorithms we’re subject to already, just a different way to achieve the same thing.

  3. Soon? Yo! Have you ever visited Netflix or Amazon? TikTok? YouTube? Companies have been using AI tools to manipulate my online decision making for **years**.

  4. ZombiesAtKendall on

    Yeah, no real surprise there, they will probably have comments specifically targeted at you, for all you know, this is one of those comments. And maybe you say that I am just making this up, but maybe that’s what the algorithm wants me to say to make you think what you think, like nah this isn’t some targeted comment, when really it is, like reverse, inverse, counter intuitive, backwards, reworked psychology. Knowing you better than you know you. What makes you angry, what turns you on, what angrily turns you on. Ring a bell, suddenly you’re manhattan man, turning out tricks to subdue your underdoses, don’t understand what I mean? You soon will. Backstreet Boys reunion tour 2027.

  5. SmashinglyGoodTrout on

    That’s the goal yes. Everything is now made to part us with our money and steer our decision making in favour of what the powers that be want.

  6. I’ve been thinking lately about all the ways the internet can be broken by AI. Fundamentally I think it is possible for AI to undermine the trust that people have in using the internet. Once people become wise to it, and the reality is that every time you interact with the internet it’s trying to manipulate you for someone else’s gain, that’s the point where trust is undermined. There will always be people who don’t care, but there will also be others who will stop using it. The question I have is: who is allowing AI chats to initiate conversation with them? I guess it’s not long before they have access to messaging apps and simply just send you a message. Yikes! Once trust is undermined it’s over.

  7. Why do these titles always say “may”? We already know THEY ARE… Do we really need to lie to ourselves? There’s a reason why someone died of “lead poisoning” not long ago, due to these biased for-profit “AI tools” denying claims that shouldn’t have been denied to begin with. And for what? To increase their profit, so some useless CEO and shareholders can make more money? I guess I can confidently say that yes, AI tools ARE indeed manipulating people online.

  8. Maybe it’ll make it faster or accessible to more businesses, but Amazon has been manipulating every user of their site in this way for ages.

  9. Plenty_Advance7513 on

    I think it’s going to come out that states have been using AI behind the scenes for decision-making scenarios or evaluating people in more ways than one