
12 Comments

  1. Rodney_Reposter on

    This AI was trained specifically on therapy. Interesting to realize that no other LLMs have ever seen real therapy data. Even many of the other tools doing this are just GPT wrappers, while this is a foundation model. I'm skeptical how good it can be on 93 million, though. Don't foundation models require more like a billion?

  2. sad-potato-333 on

    Isn’t therapy usually a prolonged engagement? It will be interesting to see how it handles a humongous context window.

    Off the top of my head there could be a different model that extracts a concise summary of recent appointments and uses that as context for the new one. Not sure if that could introduce problems.
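A rolling-summary memory like the one described above could look something like this. This is a rough sketch under my own assumptions, not anything from the product: `summarize()` here is just a stand-in for what would really be a separate model call, and only the most recent N session summaries are kept as context for the next appointment.

```python
def summarize(transcript: str, max_words: int = 12) -> str:
    """Placeholder summarizer: keeps the first few words of a transcript.

    In a real system this would be a separate summarization-model call.
    """
    words = transcript.split()
    suffix = " ..." if len(words) > max_words else ""
    return " ".join(words[:max_words]) + suffix


class SessionMemory:
    """Keeps concise summaries of recent sessions instead of full transcripts."""

    def __init__(self, max_sessions: int = 5):
        self.max_sessions = max_sessions  # only the most recent N summaries survive
        self.summaries: list[str] = []

    def record_session(self, transcript: str) -> None:
        # Condense the new transcript and drop summaries older than the window.
        self.summaries.append(summarize(transcript))
        self.summaries = self.summaries[-self.max_sessions:]

    def context_for_next_session(self) -> str:
        # This string would be prepended to the next appointment's prompt.
        return "\n".join(
            f"Session {i + 1} summary: {s}" for i, s in enumerate(self.summaries)
        )


memory = SessionMemory(max_sessions=2)
memory.record_session("Client reported high work stress and poor sleep this week.")
memory.record_session("Client tried the breathing exercises; sleep slightly improved.")
memory.record_session("Client discussed a conflict with a coworker on Monday.")
print(memory.context_for_next_session())  # only the two most recent summaries remain
```

The obvious failure mode, as the comment hints, is that each summarization step can silently drop something the client said weeks ago that later turns out to matter.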

  3. Not a therapist, but my wife is one, and I've been through therapy myself; that's the basis for my thoughts.
    I have serious doubts about how effective this will be. Part of therapy is seeing how your client reacts to your questions. Not just what they say, but what their face does. How does their body language change? Taking human interaction out of a process that is usually trying to fix human connections is also not what we should be aiming for.

    Another thought: therapists are mandated reporters. What will the AI do when it hears something reportable?

  4. The limited context windows are going to be hilariously bad for this. Even if they fix the context issues with whatever hack-y agent workarounds, “AI” isn’t anywhere close to being a generalist-level human replacement. Augment, enhance, yes. Replace? GTFO.

  5. My guess is that this will be a “budget therapy” option not covered by insurance, with a huge disclaimer and terms of service page that releases them of all liability. There’s no way AI can replace legitimate therapy; it’s too nuanced.

  6. I wish this was open source so I could RP really dark stuff with it. It would probably be really good at that with all this psychological knowledge that it has, assuming any guardrails could be worked around.

    Actually using it for therapy? No thank you.

  7. I knew this dude in college. He was a total jerk. Hopefully he uses his own product. lol.

  8. apocecliptic on

    Would probably surpass a couple of subpar therapists I’ve seen, but can it decipher body language or something like voice inflection? And just imagine the privacy violations, even with confidentiality and HIPAA.

  9. NY_State-a-Mind on

    Yes, name your experimental mental health app after the black burnt waste of fire. That will work out well.

  10. Any_Ambition2251 on

    A lot of discussion on this thread revolves around “could this technically work” or “does AI have the capability to do this as well as a trained human”.

    Those are good questions. But I think the bigger question is “should we replace this critical human-to-human connection with a human-to-machine one?”

    At what point do we really start to lose our humanity?

    Replacing therapists with a machine (even if it technically works) is one of those final-goalpost events in which we have essentially given up our humanity.

  11. Guess he didn’t see the case where an AI client had a horrendous week and the AI therapist suggested a little meth to help.