Grok, the Child Porn Generator, Should Be Illegal

https://www.burnsnotice.com/grok-the-child-porn-generator-should-be-illegal/

31 Comments

  1. BarbellsandBurritos on

Just give it, and anyone who uses it for that purpose, the same penalties as if it weren’t AI.

  2. AnalogAficionado on

The irony is that the same sort of people who joined the Moral Majority and screamed about normal porn and obscenity in general in the past are now looking the other way. I guess because the “right sort” are the ones doing it?

  3. I thought we’d already established that the law doesn’t apply to these people?

    Deal with it, they are above the law.

  4. They’re going after the prompters instead of Grok. Which is just ridiculous. Would you go only after the guy buying meth and leave the dealer who gives it to him alone, just because he’s only doing what the guy asked?

  5. Can the “reporters and news agencies” finally grow a spine please?

    Having an “automatic” option to make nude images of any clothed person is incredibly creepy and should automatically violate any civilized nation’s privacy laws.

    Making it also do the same for children should automatically have an FBI or whatever international agency alphabet soup raid their headquarters and invite their developers into a lengthy discussion.

    If any normal person would do this, the police would come knocking faster than marines on shore leave.

  6. redyellowblue5031 on

    It’s not just Grok and I would hope people don’t just focus there because Elon.

The problem of AI child porn is growing and our laws can’t easily keep up with it. Current precedent basically makes it such that if the image can’t be tied to a real person, you can’t really prosecute.

Never mind that it could be used to blackmail, intimidate, or otherwise coerce someone and be spread over the internet.

  7. ScaredAndImpaired on

    Hol’ up, if it’s making it then doesn’t that mean it must’ve been trained on it? Wtf kind of data is Elon feeding it with to get those results? Dude’s self-reporting

  8. This is why we can’t have nice things. Not that I think grok is nice, but still, what a shit show

  9. That would imply that some got into the training data right? Fucking hell what is going on with this shit. Arrest these fucking weirdos at Twitter

This is not the usual simplistic “ban the knives” situation, when these knives have the potential ability to detect bad intent, refuse to work, and call 911 at the same time.

  11. the_red_scimitar on

    The underlying technical problem is that when any LLM generates an image, it doesn’t have the ability to know what it looks like before presenting it. All it does is output commands to separate image generation software, that it “thinks” matches what you asked for (and these commands can be very complex). When the software completes, it just sends the images. YOU have to review it and tell it what’s wrong.

In fact, just yesterday I saw an article bragging about how one company is only now adding the ability to “see” the image before sending it on. I hope that’s true, because with that, if it STILL does this, you can presume more culpability and less “sorry, tech’s complicated, bruh”.
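    The handoff this commenter describes can be sketched roughly as follows. This is a minimal illustration of the general architecture, not any real product’s code; every function name here is hypothetical.

    ```python
    # Hypothetical sketch of the "blind" text-to-image pipeline described above:
    # the chat model writes a prompt, a separate generator returns pixels, and
    # nothing inspects the result before it reaches the user.
    # All names are made up for illustration.

    def llm_write_image_prompt(user_request: str) -> str:
        """Stand-in for the LLM: it only emits a prompt string for a
        separate image generator. It never sees the resulting pixels."""
        return f"photorealistic render of: {user_request}"

    def image_generator(prompt: str) -> bytes:
        """Stand-in for the image model: returns raw image bytes."""
        return b"\x89PNG..." + prompt.encode()  # fake image data

    def blind_pipeline(user_request: str) -> bytes:
        # The chat model hands off a prompt and forwards whatever comes
        # back; there is no step where anything "looks at" the output.
        prompt = llm_write_image_prompt(user_request)
        return image_generator(prompt)

    def reviewed_pipeline(user_request: str, image_is_safe) -> bytes | None:
        # The variant the comment hopes for: a classifier inspects the
        # generated image before it is ever shown to the user.
        image = blind_pipeline(user_request)
        return image if image_is_safe(image) else None
    ```

    The point of the contrast: in `blind_pipeline` the model that understood the request and the software that made the picture never exchange information about the final image, which is the gap the commenter says review-before-send would close.
    
    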

  12. They’re talking about computer generated images of bikini clad teenagers. Never would’ve guessed that by reading the clickbait headline.

    Ageing trans person paints teenage boys and early 20s men as pedophiles for being sexually interested in people who are roughly in their age group by torturing the most obscurely puritanical and technical meanings of pedophilia and pornography possible while treating computer generated images as photographic evidence since the two look similar.

  13. Aranthos-Faroth on

It’s a complete fucking nightmare tool. I tried to generate some basic placeholder images for a website I’m building, but it went real borderline weird on some of them.

    Absolutely unusable and concerning that this can even happen. I genuinely want there to be some sort of serious investigation into what their tool was trained on.

The fact that people can, either willingly or unwillingly, just sign up with a Google account or whatever and suddenly produce this stuff is absolutely fucking mind-blowingly bad.

It absolutely should be completely disabled until they’re able to fix this. You can’t just launch a fucking horrible tool like this and let it slide day by day while fixing it behind the scenes.

  14. Human-Place-3544 on

It shouldn’t have been released; the developers know what people use it for, but money over morals.

  15. astrozombie2012 on

Honestly, AI for public consumption is a terrible idea, mainly because of the kind of degenerates that use Grok/Twitter, and this is just further evidence of it.

  16. The guy that comes into the post and innocently asks : “Wait, so it actually makes child porn and removes women’s clothes? How does that actually work?” 👀

    *cough* No, really…like, how does it really work? *cough*
    🤔

Yeah, this should be more than enough reason to sue it into oblivion. That and all the copyright violations.

  18. GetOutOfTheWhey on

I don’t use Grok,

    but is that shit still generating CP? Wtf?

How is this not insta-banned in every country?

  19. TwistedPepperCan on

This is the type of thing someone would only do if they were:

    A) The richest man in the world, or

    B) Convinced they were completely untouchable and above any individual nation state.

  20. SarcasticBench on

    Careful, there’s a well known documentary about turning off AI. It’s directed by James Cameron

  21. Muffythepussyhunter on

Surely every user that has tried asking for anything illegal will be arrested. How is it the software’s fault?