
FRIENDS HAVE ALWAYS been Arul J’s therapists. The 34-year-old marketing consultant’s logic, ever since he was 17 and needed to vent about a bad breakup, has always been: ‘That’s what friends are for.’ “I’ve been in the habit of going to my two closest friends for advice since I was a teenager,” he tells me. “I’m not that close with my family, and I’ve just never really believed my problems are big enough to warrant a therapist. So, my friends have always been my sounding board and advisory board.”
That started to change, however, as they grew up and found themselves on very different life trajectories. For one friend, an engagement to his long-term girlfriend two years ago was quickly followed by a wedding and a child. His priorities shifted, which meant a lot less time for a beer and a catch-up. The other friend, though just as single as Arul (names have been changed on request), moved to Chicago for a finance job, and the combination of conflicting schedules and a criminal time difference made it impossible to have the long, deep conversations they used to. It was when Arul was reeling from a bad workday and a second video call catch-up fell through that he found himself turning to ChatGPT for advice.
“It was kind of strange at first. I’m used to talking to my friends over a drink and meandering through a million subjects before we actually get to the meat and potatoes of things,” he says. “But with Chat it was very direct. To be honest, the experience felt more like a general sounding board than talking to a friend or a therapist. I still do it every once in a while if I have a pressing issue, but it’s definitely not my favourite way to deal with something.”
The use of AI as a therapy tool is nascent at best. A study conducted by OpenAI in collaboration with Harvard and Duke universities analysed over 1.5 million ChatGPT interactions over a one-year period and found that only 1.9 per cent of users sought emotional support or personal advice. Another report, by Plan International, found that 18 per cent of girls and 13 per cent of boys have used AI tools like ChatGPT for counselling, partly because of how easy they are to access.
Farzeen T, a 23-year-old psychology major, has used it out of curiosity. “I primarily tried it to feel better about myself, and for the validation that AI could never do what we’re training to do.” And it gave her that validation. “It mirrored a lot of what I was saying back to me, almost as if trying to be agreeable,” she says. “It’s an easy system to exploit, if you know what you’re doing. It can be, for example, the perfect tool for a narcissist to get validation for certain problematic behaviours. You just have to know how to phrase your prompts correctly.” To her, the use of AI as therapy is a slippery slope, both because the system lacks a nuanced understanding of psychology and because AI is so easily manipulated.
A 2025 Stanford University study bears this out, finding that AI chatbots may inadvertently introduce biases, reinforce harmful stigma, and provide responses that could be dangerous or unhelpful. For instance, when presented with vignettes of individuals exhibiting symptoms of mental health conditions, AI chatbots displayed increased stigma and less willingness to engage compared to human therapists. These AI systems also sometimes failed to appropriately address serious issues like suicidal ideation or delusional thinking, potentially exacerbating the situation. One example was 29-year-old Sophie Rottenberg, who killed herself after using ChatGPT as her therapist for five months.
Mithun P, an 18-year-old from Bengaluru, swears by it. “I guess I mostly talk about girl stuff; nothing major, but it’s things I don’t want to talk to my friends about, because they never take it seriously,” he says. “I haven’t really gotten deep with it, like talking about my childhood or depression or whatever. But it helped me figure out how to ask this girl out and she said ‘yes’, so maybe ChatGPT knows how to give good dating advice.”
When I ran an Instagram poll to understand how many people had either used AI for advice or were open to using it, I found a fairly split house. The dangers were evident, and that naturally made a lot of people wary. Some, however, trusted themselves to exercise restraint with the advice they received, and preferred the idea of it to the work of going to a human therapist.
As an AI sceptic, an open-minded journalist and someone in therapy, I was the perfect crossover for this experiment, so I tried it myself. What I chose to discuss with ChatGPT was the experience of ending a 12-year-long friendship with my best friend last year. It was an experience I had already worked through with my actual therapist a few months ago, which made it ripe for comparison. Granted, I would be talking to ChatGPT about it over the course of a single day, in text, versus a few therapy sessions over the phone. But that seemed fair, since it was also what most people turning to AI for therapy were doing.
When I started by asking ChatGPT if it could play therapist, because I was feeling sad and needed to talk, it immediately responded by letting me know it wasn’t a clinically licensed therapist, but “could listen, offer grounding tools, safety guidance, and practical steps” I could use right now. Alternatively, it could help connect me to a licensed therapist. It also asked if I was “in immediate danger of hurting yourself or someone else, or thinking about suicide right now?” and offered up resources if that were the case.
When I clarified that I understood this wasn’t a replacement for professional therapy and that I wasn’t in danger of self-harm, we really started to get into it. I used the voice-to-text converter, because typing up that level of explanation seemed taxing, and gave ChatGPT a synopsis of what happened with this friend, why we broke up, and the complications of missing someone who felt like family but had also broken my heart.
What I found interesting was that, despite a neutral tone, ChatGPT asked some meaningful questions. “When you say you ‘broke up,’ do you mean there was a clear rupture or fight, or more of a slow drifting apart? Sometimes the way it ended shapes the way we hold the grief.” And “What did she mean to you during those 12 years? And what do you find yourself missing the most now; the everyday companionship, the shared history, the way she understood you, or something else?”
The conversation, though text-based, stayed interesting. While I didn’t quite feel like I was talking to a person, I also didn’t feel like I was talking to a machine as much as I had anticipated. As a journalist who conducts interviews and a former psychology major, I could tell that ChatGPT’s voice was aiming for a balance between empathetic and neutral: asking good questions and offering helpful insight, but lacing it throughout with phrases meant to signal patience and kindness (“You can say as much or as little as you want—even one sentence is fine.” Or, “That sounds incredibly painful—thank you for opening up about it.”).
By the end, I had been ‘conversing’ with it for nearly an hour. Having a barometer for comparison definitely made a difference; because I had already discussed all of this over a series of sessions with my therapist, I could set the advice ChatGPT gave me against hers. To be fair to ChatGPT, my therapist has about two years’ worth of background on my life, and an intricate knowledge of how I function, to tie into her analysis. ChatGPT, on the other hand, was going in blind, with only the information I provided. I also realised that, unlike with a human therapist, the answers a person gets from ChatGPT are inextricably linked to the quality of the prompts they give. The better the prompts (and the more self-aware the person), the better the advice they are likely to elicit from it. The reverse, then, is also true.
ChatGPT is not a substitute for a human therapist, but it can be a gateway for those who would otherwise not seek help or advice at all. What I found was that (in my case) it never responded in extremes, and offered gentle suggestions rather than hard instructions. For people who are therapy-averse, ChatGPT provides an outlet for talking through things in the privacy of their homes, without the process or the commitment of therapy. Those who won’t seek out professional help because of the ‘stigma’ they believe therapy carries might ease into the idea of a dialogue through a medium like ChatGPT. It was good at acknowledging pain, confusion and anger, and at attempting solutions for them, ranging from breathing exercises to probing deeper in conversation, or, in critical cases, providing hotlines for assistance.
There is no danger of AI replacing therapy anytime soon; it simply cannot do what a trained psychologist does. It does not have the qualifications to get you through a divorce, the loss of a parent, or other major life events. But it can serve as a makeshift counsellor for people nervous about diving headfirst into therapy. As long as ChatGPT is treated as a temporary tool for emotional relief, with the knowledge that it is not a licensed professional (something the app itself reminds you of constantly), talking to it can be a start on the journey to self-help.