Its owner should be prosecuted. Just give it, and anyone who uses it for that reason, all the same penalties as if it wasn’t AI.
AnalogAficionado on
The irony is that the same sort of people who joined the Moral Majority and screamed about normal porn and obscenity in general in the past are now looking the other way. I guess because the “right sort” are the ones doing it?
b_a_t_m_4_n on
I thought we’d already established that the law doesn’t apply to these people?
Deal with it, they are above the law.
Sojum on
They’re going after the prompters instead of Grok. Which is just ridiculous. Would you go only after the guy buying meth and leave the dealer who gives it to him alone, just because he’s only doing what the guy asked?
Tequilla_Sunsett on
Its owner should be illegal too.
EasterEggArt on
Can the “reporters and news agencies” finally grow a spine please?
Having an “automatic” option to make nude images of any clothed person is incredibly creepy and should automatically violate any civilized nation’s privacy laws.
Making it also do the same for children should automatically have an FBI or whatever international agency alphabet soup raid their headquarters and invite their developers into a lengthy discussion.
If any normal person did this, the police would come knocking faster than marines on shore leave.
redyellowblue5031 on
It’s not just Grok, and I would hope people don’t focus only there because of Elon.
The problem of AI child porn is growing and our laws can’t easily keep up with it. Current precedent basically makes it such that if the image can’t be tied to a real person, you can’t really prosecute.
Never mind that it could be used to blackmail, intimidate, or otherwise coerce someone and be spread all over the internet.
ScaredAndImpaired on
Hol’ up, if it’s making it, then doesn’t that mean it must’ve been trained on it? Wtf kind of data is Elon feeding it to get those results? Dude’s self-reporting.
daHaus on
This is why we can’t have nice things. Not that I think grok is nice, but still, what a shit show
Skullfurious on
That would imply that some got into the training data right? Fucking hell what is going on with this shit. Arrest these fucking weirdos at Twitter
-XanderCrews- on
Hey now, it’s also a Nazibot. It can do two things.
CurrentlyLucid on
Why isn’t Musk locked up for illegal porn? He seems to be supplying it.
u/askgrok does this hurt your feelings?
There’s a real chance it might get sanctioned – not in the US, of course, but in places like [Australia, Indonesia, and the EU](https://deadstack.net/cluster/grok-ai-sparks-backlash-over-explicit-image).
This is not the usual simplistic “ban the knives” situation, not when these knives have the potential to detect bad intent and refuse to work while simultaneously calling 911.
the_red_scimitar on
The underlying technical problem is that when an LLM “generates” an image, it has no ability to know what the image looks like before presenting it. All it does is output commands to separate image-generation software, commands it “thinks” match what you asked for (and these commands can be very complex). When that software completes, the images are simply sent along. YOU have to review them and tell it what’s wrong.
In fact, just yesterday I saw an article bragging about how one company is only now adding the ability to “see” the image before sending it on. I hope that’s true, because once it can, if it STILL produces this stuff, you can presume more culpability and less “sorry, tech’s complicated, bruh”.
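In code terms, the division of labor described above looks roughly like this. This is a minimal, hypothetical sketch with made-up names (`chat_model_plan`, `image_generator`, `vision_check`), not any vendor’s actual API:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ImageRequest:
    """The 'commands' the LLM emits for the separate image generator."""
    prompt: str           # what to draw
    negative_prompt: str  # what the generator is told to avoid

def chat_model_plan(user_message: str) -> ImageRequest:
    # Stand-in for the LLM step: it turns the user's ask into a
    # generation command, but it never sees the resulting pixels.
    return ImageRequest(prompt=user_message,
                        negative_prompt="nudity, minors, real people")

def image_generator(request: ImageRequest) -> bytes:
    # Stand-in for the separate image-generation software.
    return b"...rendered image bytes..."

def vision_check(image: bytes) -> bool:
    # The capability described above as only now being added: a model
    # that inspects the rendered image BEFORE it goes to the user.
    return True  # hypothetical classifier verdict

def handle(user_message: str) -> Optional[bytes]:
    request = chat_model_plan(user_message)
    image = image_generator(request)
    # Without this review step, the pipeline ships whatever was rendered;
    # with it, an unsafe render can be blocked instead of delivered.
    if not vision_check(image):
        return None
    return image
```

The point is architectural: until something like `vision_check` sits in the loop, no component ever evaluates the finished image, so the only safeguard is whatever the text prompt happened to say.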
Involution88 on
They’re talking about computer generated images of bikini clad teenagers. Never would’ve guessed that by reading the clickbait headline.
An ageing trans person paints teenage boys and early-20s men as pedophiles for being sexually interested in people roughly in their own age group, torturing the most obscurely puritanical and technical meanings of pedophilia and pornography possible, while treating computer-generated images as photographic evidence because the two look similar.
Aranthos-Faroth on
It’s a complete fucking nightmare tool. I tried to generate some basic placeholder images for a website I’m building, but it went real borderline weird on some of them.
Absolutely unusable and concerning that this can even happen. I genuinely want there to be some sort of serious investigation into what their tool was trained on.
The fact that people can, either willingly or unwillingly, just sign up with a Google account or whatever and suddenly produce this stuff is absolutely mind-blowingly bad.
It absolutely should be completely disabled until they’re able to fix this. You can’t just launch a fucking horrible tool like this, let it slide day by day, and fix it behind the scenes.
j0y0 on
Really good choice of painting for this metaphor.
RosBlush on
The amount of fucked up shit that you’d run into over there is insane
Human-Place-3544 on
It shouldn’t have been released; the developers know what people use it for, but money over morals.
astrozombie2012 on
Honestly, AI for public consumption is a terrible idea, mainly because of the kind of degenerates that use Grok/Twitter, and this is just further evidence of it.
TheDogtor-- on
The guy that comes into the post and innocently asks : “Wait, so it actually makes child porn and removes women’s clothes? How does that actually work?” 👀
*cough* No, really…like, how does it really work? *cough*
🤔
CorgiKnightStudios on
Glad I never used that.
willismthomp on
Yeah, this should be more than enough reason to sue it into oblivion. That and all the copyright violations.
Grand0rk on
The title was 100% AI-generated, lol.
GetOutOfTheWhey on
I don’t use Grok,
but is that shit still generating CP? Wtf?
How is this not insta-banned in every country?
TwistedPepperCan on
This is the type of thing someone would only do if they were
A) the richest man in the world, and
B) convinced they were completely untouchable and above any individual nation state.
SarcasticBench on
Careful, there’s a well-known documentary about turning off AI. It’s directed by James Cameron.
Muffythepussyhunter on
Surely every user that has tried asking for anything illegal will be arrested. How is it the software’s fault?