Therapy Chatbot Tells Recovering Addict to Have a Little Meth as a Treat

Sue that therapist for malpractice! Wait....oh.
Pretty sure you can sue the AI company.
I mean, in theory... isn't that a company practicing medicine without the proper credentials?
I've worked in IT for medical companies throughout my career, and my wife is a clinical tech.
There is shit we just CANNOT say due to legal liability.
Like, my wife can generally tell what's going on with a patient; however, she does not have the credentials or authority to diagnose.
That includes telling the patient or their family what is going on. That is the doctor's job. That is the doctor's responsibility. That is the doctor's liability.
I assume they do have a license. And that's who you sue.
Pretty sure it's in the ToS that it can't be used for therapy.
It used to be even worse. Older versions of ChatGPT would simply refuse to continue the conversation at the mention of suicide.
What? It's a virtual therapist. That's the whole point.
I don't think you can sell a sandwich and then write on the back "this sandwich is not for eating" to get out of a case of food poisoning.