
  1. “I do think there’s still a danger of a Terminator-style apocalypse where you put AI together with weapons systems, even up to the level of nuclear weapon systems, nuclear defence counterstrike, all that stuff,” Cameron said. “Because the theatre of operations is so rapid, the decision windows are so fast, it would take a super-intelligence to be able to process it, and maybe we’ll be smart and keep a human in the loop.

    “But humans are fallible, and there have been a lot of mistakes made that have put us right on the brink of international incidents that could have led to nuclear war.”

  2. LordSblartibartfast on

    >Cameron’s films, Avatar in particular, are actively engaged with AI in their execution, and the director has been positive about how the technology could help reduce production costs. Last September, he joined the board of directors of Stability AI and earlier this year said the future of blockbuster film-making hinges on being able to “cut the cost of [VFX] in half”.

    >He clarified that he hoped such cost-cutting would come not from human layoffs but speed acceleration.

    Having worked in VFX, I can tell you that the main cost for a shop is definitely the wages.

    If someone boasts about cost-reduction measures that could halve the price of a bid, that definitely means employees getting fired or not getting hired.

    Either James Cameron is incredibly gullible when he says he hopes that won’t lead to layoffs, or he is taking us for a ride.

  3. TheRoscoeVine on

    It’s just inevitable that some government or rogue agency will try out weaponized automatons. There’s just no stopping it. Every technology gets abused some kind of way and this will be no different.

    My curiosity is in whether or not it *could* work for good. How about an autonomous robot police force that *doesn’t* shoot unarmed men, because there’s no threat? No more rage beatings of already cuffed suspects? I’m talking about armed robots that can “stop” a perpetrator when necessary, but only as a last resort, and never in self-defense. It’s just an idea.

  4. This is already the case. Look at Gaza and the stealth drones that kill people thanks to these technologies, in the service of the other madman’s fascism.

  5. He’s being adorably naive if he thinks that the fascists that lurk in every government and party aren’t already salivating over an army of autonomous killbots that can beat down all resistance and install them as absolute rulers.

  6. They are already testing autonomous drones working with AI. Send out a drone carrier that can fly for days or weeks and let it decide what to attack. China has a new military branch with an estimated 100,000 drone pilots.

    The Russian invasion showed the world that most military experts were right: the future of warfare lies in drones. The videos also revealed how fragile and unprepared old-school militaries are when it comes to this threat. One failure and you are dead. And AI won't need sleep, will never be distracted and will never feel remorse if it is wrong.

  7. AnonismsPlight on

    If we keep letting heartless CEOs make the AI it will be extremely problematic. The thing that would save us from AI overlords is simply companionship. If we can get AI to care for us on a level as low as them not wanting to be lonely they wouldn’t want to kill us. If CEOs keep controlling how they’re made then they will want to kill us for sure. Name one redeeming quality of someone like Zuckerberg or Musk. I’ll wait.

  8. James Cameron is a filmmaker, not a policy expert or AI researcher. I get that his ideas can be a useful spark for deeper conversations, but we shouldn’t treat his words as expert analysis.

  9. I’ve come to be less afraid of AI destroying us by accident than I am of the current “powers that be” driving the world into ruin on purpose for short-term gain.

  10. Important-Ability-56 on

    I think we have a long way to go before people using technology at their disposal are less of a threat than the technology by itself, which we can presumably just unplug without moral compunction.

  11. There are always unintended consequences. Sometimes they are minor, but sometimes it can’t be bargained with. It can’t be reasoned with. It doesn’t feel pity, or remorse, or fear. And it absolutely will not stop… ever, until you are dead!

  12. There is already an indiscriminate AI kill chain in Gaza and that’s coming for all of us if we don’t stop it.

  13. We’re living in a world where any hobbyist can build an autonomous weapons system in a garage, and host its AI brain in a closet.

    Governments are the least of our worries. I’m honestly surprised we’re not seeing a lot more of it already.

  14. At least a Terminator doesn’t hate, doesn’t discriminate and isn’t racist. They just target all humans, rich and poor.