6 Comments

  1. >*They frequently assigned diagnoses that were not present in the vignettes*

    This part is the real problem. If AI is going to call conditions that don’t exist, that’s not help; it’s misinformation that could make people feel worse or delay real care.

  2. Woah, this is pretty wild! It really makes you think about how much we should trust AI with something as serious as mental health, right?

  3. ResilientBiscuit

    Have you been on Reddit? Or the Internet generally? If you have a splinter, you will end up with a cancer diagnosis somehow.

    People have been turning to the Internet for ages to diagnose themselves. I am not sure this is any worse.

  4. AMA_ABOUT_DAN_JUICE

    The researchers primed the model to expect diagnosable conditions. Undertrained people do the same thing: the classic examples are psychology students pathologizing normal behaviour and medical students overdiagnosing rare conditions.