
On a late August night in 2025, a 19-year-old Missouri State University student allegedly crept into a freshman parking lot and went on a rampage: 17 cars smashed, windows shattered, mirrors ripped off.
Minutes later, he did something millions of us do every day:
he opened ChatGPT.
According to court documents, he typed messages asking the bot whether he was going to jail. Police later recovered that conversation from his phone, and prosecutors in Missouri now say his ChatGPT confession is part of the evidence supporting his arrest and felony property-damage charges.
https://vector-space-ai.ghost.io/is-chatgpt-private-how-a-college-kids-arrest-exposed-the-truth-about-your-ai-chats/

8 Comments
This story raises a big question about the future of AI and privacy. If people use tools like ChatGPT during stressful moments or to think through personal decisions, what expectations of privacy should they actually have? As AI becomes more common, how should laws and platforms handle user data, and what rights should people have over their own chat history?
People are geniuses for expecting privacy online these days. Everyone is keeping logs, and courts pull those logs into hearings all the time. Cops pulling up search history isn't new.
I am more surprised that people are actually surprised by this. Most governments have entire agencies whose sole purpose is to spy on everything their citizens do. What did people think was going to happen? The government suddenly respecting your privacy because a chatbot is involved?
If it’s not your server, there is no reasonable expectation of privacy, chapter eleventy-six.
Of course it isn’t private. I cannot fathom how grown people think anything they do online under a traceable login won’t be used against them by the surveillance state the second they step out of line, especially anything so beholden to federal or private capital funding.
I think the bigger question is…
Was he caught and then this conversation was discovered?
Or was he caught *because* of this conversation?
“Police later recovered that conversation from his phone”
So they opened the app and looked at it. That’s on the user.
Even if the server-side stuff is theoretically kept secure and compartmentalized, anyone holding the phone can still just open the app and read your previous prompts in plain text.