9 Comments

  1. Submission statement: if AI corporations knowingly release an AI model that can cause mass casualties and then it is used to cause mass casualties, should they be held accountable for that?

    Is AI like any other technology or is it different and should be held to different standards?

    Should AI be treated like Google docs or should it be treated like biological laboratories or nuclear facilities?

Biological laboratories can be used to create cures for diseases, but they can also be used to create diseases, and so we have special safety standards for laboratories.

    But Google docs can also be used to facilitate creating a biological weapon.

    However, it would seem insane to not have special safety standards for biological laboratories and it does not feel the same for Google docs. Why?

2. If a lot of AI bros are complaining, then it’s for sure a good thing. Enough with this whole mentality of profits over anything and everything.

  3. > should the person using the tech be blamed, or the tech itself?

    Great article, and this is truly what it boils down to. People would be very naive to think that while you cripple yourself with self-imposed restrictions, the rest of the world will follow suit. At best you’d just follow in the footsteps of the Amish.

4. Just use AI to watch out for bad stuff… It can do that. Right?

    Unless you don’t trust it to?

  5. fuckthisshitupalread on

    Going after new tech before entrenched industry — can’t have anything shake things up too much. It’s not like people want to work at McDonald’s or as a Walmart greeter forever. Could have pressed on agriculture, energy, oil, law, or medicine first, but I guess that would be too much to ask?

6. And they should be… by god… nobody should be responsible for their children causing nuclear destruction and rounding up the surviving humans for kill camps. WTF did Skynet do with its non-combat prisoners, anyway?

  7. SimplesVacation on

    Companies are always resistant to regulation, but it might be necessary for accountability.