Therapy Chatbot Tells Recovering Addict to Have a Little Meth as a Treat

futurism.com
Well, it could have been worse; it could have given them a recipe for cooking spaghetti with motor oil.
It's the fact that AIs are too eager to please. You can wear them down until they agree that your ideas are great.
From experience: when the higher-ups start running ideas through ChatGPT and are persistent enough that it starts "agreeing" with their "improvements," it leads to vibe management.
It's to prove you aren't an addict. Everything is great in moderation, drug-wise. There are some real sickos out there.
...you can make LLMs say pretty much anything you want them to.