No couch. No clock. No warm silence between words. Just a glowing screen and an algorithm that always knows what to say.
Therapy, once defined by delicate human connection, is shedding its skin. In its place stands something cleaner, quicker, and eerily agreeable: artificial intelligence. It remembers everything, responds instantly, and never flinches at your darkest truths.
In a randomised trial published in NEJM AI in March 2025, researchers tested Therabot, a generative AI chatbot built for mental health treatment. The study, which enrolled 210 adults with major depressive disorder, anxiety, or elevated risk of eating disorders, found that Therabot significantly reduced symptoms across all groups compared with a waitlist control. Over four weeks, users reported high engagement and rated their therapeutic alliance with the bot as comparable to what people typically report with human therapists.
But what does it mean when the messy, unpredictable process of healing is flattened into an optimised exchange? If discomfort is the crucible of growth, what do we lose in a world where therapy never pushes back?
When the algorithm becomes the ally
Praveen Kumar Tangella, a Hyderabad-based cybersecurity expert, points to the stigma still attached to mental health in India. Conversations around topics like sexual identity, substance use, or personal trauma are often laced with shame or secrecy, and even traditional therapy sessions can be constrained by cultural taboos and social fear.
“With AI, people feel freer. There’s a certain detachment—no fear of judgment, no risk of gossip. That emotional distance can be liberating,” he says.
For a growing number of Indians, the most intimate relationship is not with another person, but with their smartphone. These devices know what users watch, what they search for, and what they skip. “If you’re watching a string of breakup videos or interacting with emotionally charged content, AI can pick up on that pattern—sometimes better than a trained therapist,” Tangella adds.
According to the Indian Journal of Psychiatry, India has just 0.75 psychiatrists per 100,000 people, while the WHO recommends at least 3 per 100,000. “Apps like Woebot, Wysa, Youper, and Replika use conversational AI rooted in cognitive behavioural therapy to help users manage anxiety, depression, and stress,” says Shishir Gupta, CEO of smart tech firm Oakter. “AI can be a safe and stable entry point into mental health care—but it’s not a replacement. It can track behaviour, flag patterns of distress, and even predict emotional dips. But healing still needs a human touch.”
AI tools like chatbots and machine learning algorithms can democratise access to mental health care, reduce administrative burdens, and enhance psychological research. At the University of Notre Dame, psychologists Ross Jacobucci and Brooke Ammerman are developing an algorithm that captures screenshots of patients’ online activity to flag language linked to suicide or self-harm. Combined with real-time data from smartwatches and ecological momentary assessments, the system aims to alert clinicians to potential suicide risks before they escalate.
Gupta adds, “Researchers are increasingly turning to natural language processing tools to aid in medical diagnostics. At Drexel University in Philadelphia, a team has demonstrated that GPT-3—the powerful language model developed by OpenAI—can help identify early signs of dementia by examining subtle changes in a person’s speech patterns.”
Therapy by code, but not without consequences
Yet even as AI proves its utility, it raises deeper questions about what therapy truly is, and what it requires. A cornerstone of psychotherapy is the therapeutic alliance: the working bond between client and therapist, sustained by qualities such as what Carl Rogers called "unconditional positive regard". At its heart, it is a human relationship.
“Therapists offer more than technique,” says Dr Kshama Dwivedi, counselling psychologist and director of the Swami Vivekanand Group of Institutes. “They bring attunement, unpredictability, and sometimes even discomfort—elements essential for deep transformation. AI, by design, avoids discomfort. It may affirm us, but it cannot truly challenge us.”
Algorithms may recognise patterns in text, but they miss what isn’t said—the body language, the silences, the metaphors that often hold the key to emotional truth. “AI doesn’t know you,” Dwivedi says. “And sometimes, healing begins with what’s left unsaid.”
There are also serious concerns around privacy. Many AI apps request access to photo galleries, social media accounts, and browsing history in order to offer personalised support. But that intimacy comes at a cost. “Imagine if one of these platforms gets hacked,” warns Tangella. “Suddenly, the most vulnerable details of someone’s life are exposed. It’s not just dangerous—it’s devastating.”
In January 2023, the mental health nonprofit Koko drew widespread criticism after providing support to 4,000 people without disclosing that the responses had been drafted with the help of GPT-3.
AI can also misinterpret data. Tangella offers an example:
“Say I had an argument with a colleague and I’m upset for a few minutes—an AI might misread that temporary dip in mood as a depressive episode. It’s not always accurate.”
He worries that as AI therapy becomes more accessible, it could normalise emotional dependency on screens.
“Today, everyone seems to identify with some mental health label. The question is—are we diagnosing ourselves, or just following a trend?”
This overidentification, he argues, could spiral if AI begins validating these self-perceptions without clinical rigour.
“When chatbots start affirming every expression of sadness as depression, we might be looking at a wave of misdiagnosis, not awareness.”
In a nutshell, AI is not the end of human healing, but it is transforming it. These tools are best viewed as supplements, not replacements.
“They can assist with access, consistency, and even early interventions—but they cannot conduct all forms of psychotherapy, nor replace the full spectrum of human clinical care. At least not yet—and perhaps never fully,” concludes Dwivedi.