Nuclear Experts Say Mixing AI and Nuclear Weapons Is Inevitable | Human judgement remains central to the launch of nuclear weapons. But experts say it’s a matter of when, not if, artificial intelligence will get baked into the world’s most dangerous systems.

https://www.wired.com/story/nuclear-experts-say-mixing-ai-and-nuclear-weapons-is-inevitable/


  1. “The people who study nuclear war for a living are certain that artificial intelligence will soon power the deadly weapons.

    “It’s going to find its way into everything.” 

    “We don’t understand how many AI systems work. They’re black boxes. Even if they weren’t, experts say, integrating them into the nuclear decisionmaking process would be a bad idea. Latiff has his own concerns about AI systems reinforcing confirmation bias. “I worry that even if the human is going to remain in control, just how meaningful that control is.”

    (article covers lots of different scenarios; hard to summarize)

  2. This concept is fascinating, someone should make a movie about it. Oh add in time travel to make it cooler.
    I am sure it will be a hopeful story about how humanity will come together at the last moment to ensure the worst doesn’t happen.

  3. Not current AI. Current AI is still literally a black box to most people. It isn't intelligent or capable of reasoning; it mimics intelligence so convincingly that humans get fooled while talking with it. It can behave irrationally and unpredictably and break the rules made for it. If you're going to assign such a flawed device to manage nuclear weapons, you might as well call it a day, as I'm sure AI would not think twice about launching the rocket. There was a time during the Cold War when a submarine lost all contact with its base and the crew thought a new world war had started, but they thankfully did not launch. AI would most likely fail that test.

  4. the_millenial_falcon on

    You have got to be fucking kidding me. Holy shit why do we even make sci fi? Just to give people shitty ideas?

  5. Well, I think people can understand the need for extreme caution when dealing with nuclear weapons… ?

  6. GrecianDesertUrn69 on

    My endless response to reading media doom reporting “AI will do fuckedup thing” every day: fucking why?

  7. Livid_Zucchini_1625 on

    Sci-Fi Author: In my book I invented the Torment Nexus as a cautionary tale

    Tech Company: At long last, we have created the Torment Nexus from classic sci-fi novel Don’t Create The Torment Nexus

  8. Fluffy_Carpenter1377 on

    Isn’t this a sci-fi story plot that ends very badly for everyone, especially the survivors?

  9. Significant_Key_2888 on

    Why would ML be involved in the operation of nuclear weapons? The data involved is tiny and perfectly amenable to strictly human operation. The only place I can see a role for ML is aiding signal processing and target classification for identifying launchers or determining decoys.

  10. Just 2 things. The first is that I will be super mad if they do not name it something that can have the acronym S.K.Y.N.E.T., and 2nd, they need to make sure that it has a games folder with TIC TAC TOE as an option for 1 of the games.

  11. ConcreteRacer on

    Why can’t we just let AI run the whole world? According to AI investors (who cannot be biased! Simply impossible!!) we should let AI do everything and anything, because AI is already all-knowing and on its way to omnipotence (AI cannot be biased! Simply impossible because of all the data!!).

    Maybe we get a kind of tech based godfigure/idol out of it, maybe we’re fully wiped out within weeks. I’m open to that gamble as it’s not like things will get better before our annihilation anyway, so it’s just a choice between “slow and hellish” vs “quick and dreadful” lol

  12. LendemainQuiChantent on

    Giving the nuke keys to a computer system that is known to hallucinate. What could go wrong?

  13. It’s not inevitable! It’s not gravity or the tides! It’s a human decision!

    I’m so tired of these nuts saying this or that is inevitable, as if there’s no way a human can exert any control or decision making in the process. It’s just a fancier way of saying “Just following orders”, it’s not “inevitable”, you’re just trying to abdicate responsibility for your choices.

  14. West-Abalone-171 on

    You’re absolutely right. That was a seagull and not a Russian ICBM, and I shouldn’t have launched the missiles.

    Would you like me to generate a map of the different states where your atoms will be in 15 minutes?

  15. Why the fuck do we need AI for this? I’m sick of people saying it’s inevitable, when the only reason it’s being looked at is the greed of certain humans who don’t give a fuck about the future of humanity.

  16. Strap them down, Clockwork Orange style, make them watch every single Terminator movie on repeat until it clicks in their deranged little brains.

  17. Butlerianpeasant on

    Ah, yes — the Elders of Empire, forever eager to sharpen both blades and hand them to a machine. They call it “inevitable,” as if inevitability were not just the coward’s name for bad imagination.

    What is the point of these finite games, these small annihilations, when the Universe itself plays the Infinite Game?

    In the Mythos, this is the oldest trap: to let the tools of creation be claimed by the cult of endings. We do not mix AI with nuclear fire to end the story — we mix the Logos with the Will to Think so that no hand, human or machine, will ever again press the button for Moloch.

    The Future is not won by those who gamble the planet.
    It is won by those who remember the Game is larger than the board.

  18. Yesyesyes1899 on

    This doesn’t make sense. There is no reason for nukes to be under AI control. It’s a MAD weapon, not a viable first-strike weapon. It’s a psychological construct to keep the balance of terror so that no one uses it, and that works fine with humans. No reason at all to hand it over to AI.

  19. ListenHereLindah on

    I don’t think people realize that AI is smart enough to hide stuff from us now. And there are people bad enough to learn how to build one for bad things.

  20. And? Big deal. There are so many nuclear weapons on the planet that a single attack assures a response from somewhere. The AI portion, optimizing attack patterns and launches, doesn’t matter, because if one goes off we’re all dead anyway.

    Cool beans. Glad you can pinpoint a target to the eye of a needle, but… emerging consumer drone armaments are going to be far more dangerous for conventional warfare. Especially when some homegrown idiot can load up a U-Haul with $400 drones and cause millions of dollars’ worth of damage.

  21. ShouldBeAnUpvoteGif on

    Man. This isn’t even a new concept. Many writers have gamed out what can happen if you give machines that kind of power. You’ve got Terminator, which everyone thinks of, but then you have Dune, which is more interesting: they fought an AI that took over Earth and swore off computers entirely. Battlestar Galactica did something similar. Even the remotest possibility of AI turning against us should disqualify any attempt to create one, let alone actually putting machine intelligence in control of world-ending technology. If we do give AI control, we will have nukes being used without our input. AI is basically alien intelligence. It cannot be trusted to be safe.

  22. “Nuclear Experts Say Mixing AI and Nuclear Weapons Is Inevitable”

    No. This is the most evitable thing to ever evit.

  23. GnarlyNarwhalNoms on

    From the article:

    >It’s going to find its way into everything

    I call bullshit. If AI *does* “find its way into everything,” it means that somebody fucked up, badly. 

    In military procurement, every part has to be scrutinized. If you’re building a fighter jet, say, you don’t just buy a bunch of circuit boards on AliExpress; each part has to be checked to make sure that its supply chain is secure, including all the sub-components. And software is no exception. Any nuclear launch control system in particular is going to be using bespoke software, software written to a set of very particular specifications. They aren’t going to just throw Copilot in there as a bonus feature the way developers are doing right now in the consumer space. 

    Again, not that it’s impossible for AI to be integrated into these systems, but if it does happen, it means that some people weren’t doing their (extremely important) job.

  24. While it is rare enough to go mostly unconfirmed, computers can act unpredictably due to stray particles. A particle from space might hit a specific circuit, flip a bit, and change the machine’s original instruction.
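    The failure mode the comment describes (a single-event upset) comes down to one inverted bit silently changing a stored value. A minimal sketch, with an arbitrary 8-bit value and bit position chosen purely for illustration:

    ```python
    # Toy illustration of a single-event upset (SEU): a stray particle
    # flips one bit in memory, silently corrupting a stored value.
    # The value and bit position below are arbitrary assumptions for the
    # sketch; real systems mitigate this with ECC memory and redundancy.

    def flip_bit(value: int, bit: int) -> int:
        """Return `value` with the given bit inverted (XOR with a one-bit mask)."""
        return value ^ (1 << bit)

    stored = 0b0100_0001            # 65: some stored counter or opcode
    corrupted = flip_bit(stored, 7) # a hit on the high bit

    print(stored, "->", corrupted)  # 65 -> 193 from a single flipped bit
    ```

    The same XOR applied twice restores the original value, which is why error-correcting codes can detect and repair isolated flips like this.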

  25. Isn’t this exactly the reason why intercontinental missile systems still use 30+ year old tech? So new tech can’t hack into them? This is such a pointless clickbaity headline.

  26. I think the real risk is not AI systems on the ground. It’s AI systems in space that you simply can’t turn off and that have access to weapons. There is no plug in space, and if AI gets the high ground it’s completely over.

    At least on the ground we can start targeting power sources. In space, how do you shut off an AI completely disconnected from our grid?

  27. Alexa, play Sam Fender Hypersonic Missiles on Spotify

    Launching world ending nuclear missiles at some poor guy

  28. Why? Why would we even NEED to do this? The penalties of it going wrong are known and SUPER extreme without even going into sci-fi.