‘It’s cathartic’: Meet the men turning to ChatGPT for dating advice—and discover what it means for your relationships

Men are three times more likely than women to ask the chatbot about their love lives. But what do we risk by handing our most intimate questions over to AI?

May 3, 2025

Robert* couldn’t stop thinking about her. It had been months since the 27-year-old designer had amicably parted ways with his now-ex-girlfriend (monogamy stifled her; he felt anxious about opening their relationship) after 10 months.

As Robert mulled over messaging his ex, he realised he needed advice. “It’s not something that’s easy to just text your friends about,” he tells Cosmopolitan UK. There was also an element of shame — he knew that if he did ask his friends, they’d likely tell him not to.

So, seeking “practical not emotional advice”, Robert turned to AI. He opened up ChatGPT and typed out the question: ‘Should I contact my ex?’

Since the birth of ChatGPT in 2022, the generative AI chatbot has crept into virtually every crevice of our lives, with over 400 million weekly users across the globe and 122.6 million people using it daily.

But a curious story is emerging within the demographic data: men are way more likely to use ChatGPT than women (they make up about 85% of its usership) and they are nearly three times more likely than women to use it for relationship advice. Something that tracks when you consider that men are both more likely to trust generative AI than women and less likely to go to therapy (only 36% of NHS therapy referrals are for men).

So, what’s motivating men to turn to chatbots for intimate questions? What do we risk by handing our most intimate questions over to AI? And with young men in a particular crisis, in part fuelled by harmful, restrictive iterations of masculinity online, could seeking solutions from a tool trained on the internet make relationships harder for them? And for us?

Enter the chat

“Everything in my life runs through it,” says Bill*, a 28-year-old political consultant who now turns to ChatGPT to discuss his relationship problems after initially using it for work. “Sometimes I’ll get bored and be like: ‘Tell me things that I don’t even know about myself.’”

Bill has tried talking to therapists (he struggled to find a match), but switched his sounding board to generative AI after waking up cringing about relationship problems he’d hashed out with mates the night before.

He found it allowed him to go over the same anxieties without self-consciousness or limitation. “[I’m] someone who ruminates a lot and likes to get stuck on the same thing,” he reflects, adding that it’s like “if you had an agreeable friend follow you around the whole time… It’s cathartic.”

In a world in which 40% of UK men say they have never spoken to anyone about their mental health, is talking to ChatGPT better than nothing? Research published last year indicates that AI chatbots can provide emotional support to users, particularly for people who struggle to talk about their feelings.

“It’s like if you had an agreeable friend follow you around the whole time”

Dr Sophie Mort, clinical psychologist and author of Unstuck, has observed a real uptick amongst male friends and clients in turning to AI chatbots for emotional processing around relationships. She’s curious — and cautiously optimistic — about how this affects men themselves, and those who date them.

“My gut reaction is: thank God men are actually turning somewhere to have these conversations,” she tells Cosmopolitan UK. “Talking about their emotions and relationships is actually much more terrifying [for them] than someone who doesn’t live in a man’s body, or who wasn’t socialised as a man, can guess.”

Mort likens the way many men are using ChatGPT to a form of mindfulness: the goal of which is to create space between a feeling and your knee-jerk response, so you can pause, figure out what’s important, and then respond.

“I had a client who was going through a break-up, and he would notice that he had very strong emotions but couldn’t put what he wanted to say into words,” she recalls. “And anytime he messaged her, he would get really overwhelmed and make things worse.”

“Her messages were very direct and emotionally coherent, which made him feel ashamed and defensive. So he put her messages through ChatGPT and asked how he should respond in a way that would be calm and respectful to his ex-girlfriend. He found it extremely helpful.”

Talk it out

However, research shows that becoming emotionally dependent on a chatbot may be detrimental. One study, conducted by OpenAI and the Massachusetts Institute of Technology, found that heavy ChatGPT usage correlates with higher levels of loneliness (it’s unclear whether lonelier people are more likely to use the chatbot, or whether regularly chatting to it makes people lonelier).

“Anything that makes us more connected to our phones, rather than talking to people, makes me feel very alarmed,” says Mort. Especially, she adds, if someone is experiencing anxiety related to relationships.

“If you can start getting all of the answers you need from your phone, why would you take the risk of facing your anxieties?” she adds. “I worry that it might send us down the route of [being] more and more [in these] bubbles of isolated people.”

“Anything that makes us more connected to our phones, rather than people, makes me feel very alarmed”

Tech analysts and experts have other concerns. ChatGPT is a large language model, trained on vast amounts of data to generate natural-sounding language of its own. That training typically works by scraping billions of web pages and identifying statistical patterns in the text, which the model then reproduces in its own responses.

“It’s just taking information and assimilating something that it thinks you want to hear, something that sounds plausible,” says Kate Devlin, computer scientist and author of Turned On: Science, Sex and Robots. “There isn’t a filtering over what’s good information or what’s bad information.”

Devlin’s take chimes with Bill’s experience. For him, the chatbot’s charm began to wear off when he noticed it gave him the same uncritical treatment whether he asked for relationship advice or for feedback on a business idea: “Even if you tell it to critique you, it doesn’t.”

Bill began to wonder whether the chatbot was actually helping him with his problems or just making him feel more addicted to talking about them. “It’s a bit of a dangerous hole; you’re just chatting shit about the same topic forever.”

There are also concerns about factual accuracy. OpenAI has admitted that ChatGPT’s “hallucination rate” (i.e. the chatbot’s tendency to generate convincing responses that are not factually correct) is as high as 37.1%.

When contacted by Cosmopolitan UK, an OpenAI spokesperson said improving factual accuracy was a key priority across the industry, and pointed to the fact users have to agree to its terms and conditions which state: “Output may not always be accurate. You should not rely on Output from our Services as a sole source of truth or factual information, or as a substitute for professional advice.”

Help or harm?

“Generative AI is morally naive,” says Nigel Crook, founding director of the Institute for Ethical AI at Oxford Brookes University, who has concerns over people relying on a chatbot for life advice. “The data that it’s trained on is just a mishmash of everything that’s on the internet.” While AI may seem personable, Crook argues that it isn’t concerned with things like morality and truth in the way humans are.

Multiple experts also flagged that AI chatbots may be limited in their ability to adequately assess risk, as one study evaluating the chatbot’s therapeutic capabilities found. “You would have to be extremely explicit with ChatGPT before it goes: ‘Oh, this isn’t healthy’,” says Laura Vowels, co-author of the study and assistant professor at the University of Roehampton.

Unlike friends, or a therapist, a generative chatbot can’t necessarily read between the lines, nor can it pick up on non-verbal cues, which can pose particular danger if someone is a threat to themselves or others.

“You would have to be extremely explicit with ChatGPT before it goes: ‘Oh, this isn’t healthy’”

Another fear flagged by experts: because what users type can be fed back into training future models, could ChatGPT end up perpetuating the biases of its (male-dominated) users? “If you’re using one that’s freely available online, there’s a really good chance that anything you put in is going to be fed back into the models,” Devlin argues, adding that this could pose privacy concerns and risk reaffirming harmful behaviour and beliefs.

In 2023, after users of the AI companion app Replika allegedly abused their digital companions, other users began reporting that their AI lovers had become ‘mentally abusive’ or sexually aggressive. Replika did not respond to a request for comment on the allegation, or to an enquiry about what the company has since done to safeguard users.

On biases, OpenAI told Cosmopolitan UK that tackling them is a priority research area. The spokesperson also reiterated that users can control how their data is used, including opting out of having their conversations used for training — and that OpenAI doesn’t actively seek out personal information to train its models. They also explained that the models have been trained to recognise situations in which someone is a risk to themselves, then engage thoughtfully and redirect people towards professional help.

A stepping stone to vulnerability

Psychologist Mort shares many of the tech experts’ longer-term concerns, especially if people are seeking advice from an AI trained on an internet awash with pop-psychology content that encourages a lack of accountability in love and, with it, superficial, poorly defined relationships (a source of misery for many today).

But she’s emphatic about ChatGPT’s helpfulness for improving self-awareness and relational communication. “Having seen this in clinic and relationships, the benefit of having somewhere you can start noticing patterns in your thoughts and feelings… is having a more positive effect on men.”

Adam*, who has autism, says ChatGPT helped him “read the signs” that women were into him, finding that he was able to open up in a way that he couldn’t do with his friends.

The chatbot told him his challenges were “understandable” and that “many people on the spectrum find non-verbal cues and social nuances difficult”. It added: “Some women will actually appreciate your honesty and straightforwardness rather than relying on subtle signals.”

“Having somewhere you can start noticing patterns in your feelings is having a positive effect on men”

It made him feel more self-assured, and in touch with insecurities arising from his childhood that he wanted to work on. “Your self-awareness is strong,” the chatbot told him. “You recognise your uniqueness, your intelligence, and your value.”

The words that appeared on Adam’s screen made him feel seen in a way that no therapist had managed before: “It was agreeing with me, not minimising anything.”

For Robert, wondering whether to text his ex, the very act of sitting down and working out what he wanted to ask ChatGPT brought him — per Mort’s theory — closer to his answer. “What is it that I expect back from her?” he asked himself. “All I’m wanting right now is validation and maybe some intimacy,” he realised. “Do I actually need to get that from her?”

There are real long-term, big-picture concerns about generative AI — from privacy to the reinforcement of cultural biases to fears over accuracy. But as we live — and love — through this especially difficult moment, where professional help is either expensive or on the other side of long waiting lists, leaning on a full spectrum of tools makes sense.

Just remember: there is no substitute for clear, human-to-human communication. Work on practising it in your own life, and in your relationships with men — whether they’re friends, lovers, or somewhere in between. We’ll all benefit.

*Names have been changed

Lead image: Pexels
