>*They frequently assigned diagnoses that were not present in the vignettes*
This part is the real problem. If AI is going to diagnose conditions that aren't there, that's not help; it's misinformation that could make people feel worse or delay real care.
BottAndPaid on
AI chatbots should not be diagnosing anything.
jakubb_69 on
Woah, this is pretty wild! It really makes you think about how much we should trust AI with something as serious as mental health, right?
ResilientBiscuit on
Have you been on Reddit? Or the Internet generally? If you have a splinter you will end up with a cancer diagnosis somehow.
People have been turning to the Internet for ages to diagnose themselves. I am not sure this is any worse.
AMA_ABOUT_DAN_JUICE on
The researchers primed the model to expect diagnosable conditions. Undertrained people do the same thing – the classic examples are psychology students pathologizing normal behaviour, and medical students overconsidering rare conditions.
Just telling people what they want to hear.