Men are opening up about mental health to AI instead of humans

A profound relational revolution is underway, not orchestrated by tech developers but driven by users themselves. Many of the 400 million weekly users of ChatGPT are seeking more than just assistance with emails or information on food safety; they are looking for emotional support.

“Therapy and companionship” have emerged as two of the most frequent applications for generative AI globally, according to the Harvard Business Review. This trend marks a significant, unplanned pivot in how people interact with technology.

337 comments
  • Look, if you can afford therapy, really, fantastic for you. But the fact is, it's an extremely expensive luxury, even at poor quality, and sharing or unloading your mental strain with your friends or family, particularly when it is ongoing, is extremely taxing on relationships. Sure, your friends want to be there for you when they can, but it can put a major strain on them depending on how much support you need. If someone can alleviate that pressure and that stress even a little bit by talking to a machine, it's in extremely poor taste and shortsighted to shame them for it. Yes, they're willfully giving up their privacy, and yes, it's awful that they have to do that, but this isn't like sharing memes... in the hierarchy of needs, getting the pressure of those pent-up feelings out is important enough to possibly be worth the trade-off. Is it ideal? Absolutely not. Would it be better if these systems were anonymized? Absolutely. But humans are natural anthropomorphizers. They develop attachments and build relationships with inanimate objects all the time. And a really good therapist is more of a reflection for you to work through things yourself anyway, mostly just guiding your thoughts towards better patterns of thinking. There's no reason the machine can't do that, and while it's not as good as a human, it's a HUGE improvement on average over nothing at all.

  • Sometimes I wonder if I am, in fact, a man, because every time an article like this rolls around I'm like the fuck I do.

  • Of course men will go to an AI for their problems, they can't fathom going to a woman for honest advice.

    And as a result, they gaslight themselves with a worse version of ELIZA.

    • Of course men will go to an AI for their problems, they can't fathom going to a woman for honest advice.

      This seems a bit far-fetched, don’t you think? There could be so many reasons why someone would rather use AI than go to another person for advice (this is not just about women).

      Honestly, as someone who actually went to therapy (and yes, my therapist was a woman), it was quite tough to open up and be vulnerable.

      I think some people using AI might feel as if they’re not that vulnerable because it is not a person. However, they don’t realize that their data is being gathered.

      And as a result, they gaslight themselves with a worse version of ELIZA.

      With this, I can’t figure out whether you’re serious, trolling, or just writing randomly.

    • they can’t fathom going to a woman for honest advice.

      Honest advice may not be good advice. I could tell a person "go kill yourself", and be VERY honest about it. Yet it's not good advice, now is it?
