'Better than GP?': Young adults in S'pore are turning to ChatGPT for advice but experts say it can't replace human empathy
When Amelia Yoon stubbed her toe and chipped a nail earlier this year, her general practitioner told her to go home and rest with some painkillers.
Unimpressed, the 33-year-old account manager turned to ChatGPT instead - and got step-by-step instructions on how to care for her wound.
"Better than GP," said Ms Yoon.
The chatbot she turned to, ChatGPT, was launched in 2022 by OpenAI, a US-based company that develops artificial intelligence tools.
According to Reuters, its weekly active users surged past 400 million in February.
Since the beginning of the year, she has been using the chatbot weekly for "random things in life" - from cooking tips to work-related questions, and even medical advice.
She recalled how the bot once asked her to send photos of her wound so it could assess it and suggest a care plan.
To her, AI tools like ChatGPT are more appealing than talking to friends, family, or a counsellor as they provide a "judgment-free space" that is always accessible.
"Of course, I don't ask all the time, but counsellors are expensive and I have to book a time and wait to see them. It helps as a short-term thing," she said.
Turning to tech for comfort
From study buddy to therapist, partner to confidant, artificial intelligence (AI) has become a bestie of sorts - reshaping almost every facet of modern life.
And it's not just for homework or emails anymore. Increasingly, younger people are turning to AI tools such as ChatGPT for emotional support and life advice.
Psychology experts told Stomp that the trend is growing among youths and young adults, who are drawn to the convenience and anonymity of chatting with a bot.
Dr Shawn Ee, a clinical psychologist and psychoanalytic psychotherapist from The Psychology Practice, said many prefer turning to AI due to factors such as confidentiality, the speed of advice, avoidance of intimacy, and the stigma around seeking help.
He pointed out, however, that AI is "unable to provide an expert opinion that requires the nuance of human understanding because our human responses aren't pre-programmed or statistically analysed".
He added that a human response "assumes the capacity for empathy and genuineness that can appreciate the nuance of underlying themes and psychological dynamics" - skills that require years of experience to hone.
'It gives actual steps tailored to me'
A 19-year-old student who only wished to be known as Stanley said he often turns to AI for practical and emotional advice.
"I don't see a problem with asking AI for advice", he told Stomp, adding that it is always available and is useful when one needs information or advice quickly.
He added that AI generally does a "good job" at giving him "actual steps tailored to (his) situation".
Despite this, both Ms Yoon and Stanley agreed that AI can never replace the human touch and genuine connection.
Dr Lester Sim, an assistant professor of psychology at Singapore Management University, said that while AI is "not necessarily replacing human relationships", it is becoming "a kind of a first-step sounding board".
Having tested it himself, he was impressed by the "level of sophistication" such tools have developed.
But, he said, they still fall short in situations that require "deep understanding, empathy, and real-time judgment like trauma, family conflict, or serious mental health conditions".
"It can sound caring, but it's important to be mindful that AI doesn't truly feel empathy, so it can miss subtle signs of distress that a trained therapist would recognise immediately," said Dr Sim.
He added that clinical judgment plays a key role in therapy - allowing a human therapist to know when to "challenge, comfort or refer someone for more help".
"Those are complex, context-dependent calls that can't be replaced by an algorithm," he said.
'AI should not play any role in giving emotional advice'
Not everyone is convinced that AI belongs in emotional spaces.
Victor Soh, a 24-year-old teacher, told Stomp that while he has used AI for practical guidance that gives him "direction" in his tasks, he personally does not believe it should play "any role in giving emotional advice".
He warned that over-reliance on AI could lead people to stop seeking emotional support and trust from others, potentially driving "wedges in relationships".
He noted that while these bots learn from human records, they remain "logic based" - and emotions are often "the complete opposite".
With the rise of AI, Mr Soh observed a decline in critical thinking, adding that reliance on such tools and the tendency to "take the easy way out" may stunt emotional growth, especially for those who are "impressionable and still experiencing life".
"AI is very smart," he said. "It can browse through all of humanity's brilliance and spit out your answer in seconds - but it does not have a heart or soul. Matters of the heart and soul should be left to those who have them."
While experts agree AI can offer short-term relief or a sense of companionship, they stress that it may also lead people to believe they are fine when deeper issues remain unresolved.
Dr Sim summed it up: AI is a "useful tool if used thoughtfully".
"It can complement, but not substitute, the healing that comes from human connection," he said.