Please for the Love of God Stop Building AI Therapy Chatbots

blogtherapy.substack.com

14 points by rpastuszak 14 hours ago


reverendsteveii - 12 hours ago

but my meditation app didn't get bought by Google and shut down like I'd planned, and I can only build so many to-do lists

darepublic - 9 hours ago

I think LLMs could be helpful for therapy. I've never tried one for that purpose, but that's the impression I get. Mostly it's learning to talk to yourself and chain-of-thought your way to a better mindset.

quantified - 14 hours ago

> Maybe that sounds like a no-brainer: don’t trust Meta with your mental health.

aaronbaugher - 12 hours ago

Meh. I'll agree that chatbots shouldn't be allowed to claim credentials they don't have, same as anyone else. But a lot of therapists do nothing except sit there and listen to you, tell you what you want to hear, give some standard advice you could have gotten from a self-help book, and hand you a bill. An LLM can replace the first three functions just fine.

If you can get your money's worth from talking to most therapists, you're probably self-aware enough to get it from bouncing your thoughts off an LLM.

BizarroLand - 12 hours ago

Counterargument:

Talking to therapists is difficult, time-consuming, painful, annoying, and expensive. The number of hoops you have to jump through just to get to the starting line is crazy, even if you are willing and able to pay the few hundred dollars an hour out of pocket yourself.

Therapy isn't like confession or church or a religious experience. Its proposed benefits will never be life-changing; it will not build you up, it will not give you anything you didn't already have, and you have to pay for it.

If I can work out a few kinks in my psyche on my own, I'll do that. If that means bouncing some words off a convenient lie bot, then fine.

The only way I can get more f'd in the head would be to start killing, and at this point I just don't have the drive to take on murder as a new hobby.
