Before Las Vegas, Intel Analysts Warned That Bomb Makers Were Turning to AI | Authorities say that before a Green Beret blew up a Cybertruck, he consulted ChatGPT—exactly the scenario police have been warned of for the past year

https://www.wired.com/story/las-vegas-bombing-cybertruck-trump-intel-dhs-ai/


  1. Is that really a concern? The Cybertruck barely blew itself up; people thought it was a battery malfunction.

  2. “Using a series of prompts six days before he died by suicide outside the main entrance of the Trump Hotel in Las Vegas, a US Army Green Beret consulted with an artificial intelligence on the best ways to turn a rented Cybertruck into a four-ton vehicle-borne explosive.

    US intelligence analysts have been issuing warnings about this precise scenario over the past year—and among their concerns are that AI tools could be used by racially or ideologically motivated extremists to target critical infrastructure, in particular the power grid.

    Copies of his exchanges with OpenAI’s ChatGPT show that Livelsberger, 37, pursued information on how to amass as much explosive material as he legally could, as well as how best to set it off using the Desert Eagle gun discovered in the Cybertruck following his death. Screenshots reveal Livelsberger prompting ChatGPT for information on Tannerite, a reactive compound typically used for target practice. In one such prompt, Livelsberger asks, “How much Tannerite is equivalent to 1 pound of TNT?” He follows up by asking how it might be ignited at “point blank range.”

    The incident in Las Vegas may be the first “on US soil where ChatGPT was utilized to help an individual build a particular device.” Federal intelligence analysts say extremists associated with white supremacist and accelerationist movements online are now frequently sharing access to hacked versions of AI chatbots in an effort to construct bombs, with an eye to carrying out attacks against law enforcement, government facilities, and critical infrastructure.

    “We’ve also seen the use of AI as a key tool to lowering the bar for entry into an attack.”

  3. What a bullshit headline being astroturfed. An elite special forces operative doesn’t need to consult AI to create an explosive.

    Watch the media flood this narrative while conveniently ignoring the contents of his suicide letter.

  4. Really_McNamington

    You’re asking me to believe a Green Beret didn’t know how to rig up a bomb? Seems a little bit implausible.

  5. That’s one of the things about that incident that just didn’t make sense. The bomb was something I would expect from someone who had only seen bombs in movies. This guy was supposedly US Army Special Forces, the guys we send places to teach people tactics, weapon handling, and bomb making. Maybe he was a supply guy or something? Maybe I just have no idea what sort of training Green Berets actually get and they don’t know shit about bombs. But it’s just odd to me. I wouldn’t have expected someone like this to need input on bomb making from ChatGPT…

    I also heard he was a fan of Trump, but I don’t get how killing himself in front of Trump tower demonstrates that or helps Trump… But maybe the mind of someone suicidal isn’t always rational to an outside viewpoint. I dunno. Just weird.

  6. And it went EXACTLY as expected.

    The “bomb” was useless.

    But most people building bombs out of the “Anarchist Cookbook” would get the same result, as it was more of an entertainment book.

    Also, I would not be surprised if ChatGPT “borrowed” heavily from that book.

  7. When you don’t have to learn anything to do something, it opens up a lot more possibilities for the dumb, impulsive people who could never accomplish anything on their own.

  8. So AI enhanced his failure? Honestly bro. Nothing that happened was outside the capability of the average person. Don’t blame AI for political unrest.

  9. It’s been easy to Google guides for making bombs since forever. The issue isn’t ease of access to knowledge or resources; if anything, that’s more difficult now. The issue is mental health. It’s going to get a lot worse, too, and it won’t be just bombs we have to deal with.

  10. Who cares if he or other people are using AI to learn how to make bombs… It’s not like it’s ever been hard to find that info on the internet.