
Nuclear Experts Say Mixing AI and Nuclear Weapons Is Inevitable
https://www.wired.com/story/nuclear-experts-say-mixing-ai-and-nuclear-weapons-is-inevitable/

The people who study nuclear war for a living are certain that artificial intelligence will soon power the deadly weapons. None of them are quite sure what, exactly, that means.
In mid-July, Nobel laureates gathered at the University of Chicago to listen to nuclear war experts talk about the end of the world. In closed sessions over two days, scientists, former government officials, and retired military personnel briefed the laureates on the most devastating weapons ever created. The goal was to educate some of the most respected people in the world about those weapons and, at the end of it, have the laureates make policy recommendations to world leaders about how to avoid nuclear war.
AI was on everyone’s mind. “We’re entering a new world of artificial intelligence and emerging technologies influencing our daily life, but also influencing the nuclear world we live in,” Scott Sagan, a Stanford professor known for his research into nuclear disarmament, said during a press conference at the end of the talks.
It’s a statement that takes as given the inevitability of governments mixing AI and nuclear weapons—something everyone I spoke with in Chicago believed.
“It’s like electricity,” says Bob Latiff, a retired US Air Force major general and a member of the Bulletin of the Atomic Scientists’ Science and Security Board. “It’s going to find its way into everything.” Latiff is one of the people who helps set the Doomsday Clock every year.
“The conversation about AI and nukes is hampered by a couple of major problems. The first is that nobody really knows what AI is,” says Jon Wolfsthal, a nonproliferation expert who’s the director of global risk at the Federation of American Scientists and was formerly a special assistant to Barack Obama.
“What does it mean to give AI control of a nuclear weapon? What does it mean to give a [computer chip] control of a nuclear weapon?” asks Herb Lin, a Stanford professor and Doomsday Clock alum. “Part of the problem is that large language models have taken over the debate.”
We literally have dozens of movies about why this is a bad thing.