Men are opening up about mental health to AI instead of humans

A profound relational revolution is underway, not orchestrated by tech developers but driven by users themselves. Many of the 400 million weekly users of ChatGPT are seeking more than just assistance with emails or information on food safety; they are looking for emotional support.

“Therapy and companionship” have emerged as two of the most frequent applications for generative AI globally, according to the Harvard Business Review. This trend marks a significant, unplanned pivot in how people interact with technology.

337 comments
  • Look, if you can afford therapy, really, fantastic for you. But the fact is, it's an extremely expensive luxury, even at poor quality, and sharing or unloading your mental strain with your friends or family, particularly when it is ongoing, is extremely taxing on relationships. Sure, your friends want to be there for you when they can, but it can put a major strain on those relationships depending on how much support you need. If someone can alleviate that pressure and that stress even a little bit by talking to a machine, it's in extremely poor taste and shortsighted to shame them for it. Yes, they're willfully giving up their privacy, and yes, it's awful that they have to do that, but this isn't like sharing memes... in the hierarchy of needs, getting the pressure of those pent-up feelings out is important enough to possibly be worth the trade-off. Is it ideal? Absolutely not. Would it be better if these systems were anonymized? Absolutely. But humans are natural anthropomorphizers. They develop attachments and build relationships with inanimate objects all the time. And a really good therapist is more a reflection for you to work through things yourself anyway, mostly just guiding your thoughts toward better patterns of thinking. There's no reason the machine can't do that, and while it's not as good as a human, it's a HUGE improvement on average over nothing at all.