Three days into the Lunar New Year holiday, Ivy Duan had reached a breaking point.
Back in her hometown of Changde, in central China’s Hunan province, the 29-year-old gym teacher faced a barrage of questions and not-so-subtle critiques of her life, from relatives both close and distant.
“They wanted me to come back home and looked down on my life in Shanghai,” Ms. Duan said. “I couldn’t breathe and felt extremely anxious.”
As her parents watched TV, she cried in her room, feeling like a failure and not knowing what to do. Scrolling social media, she saw many discussions about the Chinese artificial intelligence company DeepSeek, whose late January release of a new model upended the AI industry and delivered China its first true champion in the new tech race.
Ms. Duan downloaded DeepSeek’s chatbot and began asking it questions, unpacking her insecurities and stresses with what felt like – finally – a neutral observer. Staring into her phone, she felt a wave of calm wash over her. It was as though she was speaking with a therapist.
“Its empathy is at a level that almost no one can achieve in reality,” she said. “In its eyes, I am an individual with a reason to exist, rather than an object to be judged.”
With access to mental health services limited in many countries, AI therapists can, at least in theory, fill a much-needed gap. This is particularly true in China, where as recently as the 2000s, more than 90 per cent of people diagnosed with some form of mental disorder did not seek treatment. Young Chinese in particular are facing intense pressure from a slowing economy, changing cultural norms around marriage and family, and a demographic crisis that puts a heavy burden on them to support older generations.
While the situation has improved significantly, and a government plan aims to give at least 80 per cent of those suffering from depression access to care by 2030, many are still unable to get the help they need.
Ms. Duan had considered therapy in the past, but felt it was too expensive. Books she’d read on mental health often felt disconnected from her actual problems.
“DeepSeek is more like a pure listener,” she said. “You can understand it, as it always provides responses within my comfort zone.”
It is by no means the first option for AI therapy. As early as the 1960s, there was DOCTOR, a script that ran on ELIZA, an early chatbot. With its rudimentary natural language processing capabilities, DOCTOR was designed to reflect users’ statements back at them as questions, helping them work through their issues. (Today, ELIZA’s limitations are very evident, and its responses can sometimes feel more like gaslighting than therapy: “Are you saying NO just to be negative?”)
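ELIZA’s trick can be illustrated with a toy sketch (this is an assumed illustration, not Weizenbaum’s original code): match a statement against a small list of keyword patterns, swap first-person pronouns for second-person ones, and echo the fragment back as a question.

```python
import re

# Pronoun swaps applied to the captured fragment ("my" -> "your", etc.)
REFLECTIONS = {"i": "you", "my": "your", "am": "are", "me": "you"}

# A few ELIZA-style rules: a regex paired with a response template.
RULES = [
    (re.compile(r"i feel (.*)", re.I), "Why do you feel {}?"),
    (re.compile(r"i am (.*)", re.I), "How long have you been {}?"),
    (re.compile(r"\bno\b", re.I), "Are you saying no just to be negative?"),
]

def reflect(fragment: str) -> str:
    """Swap first-person words for second-person ones."""
    return " ".join(REFLECTIONS.get(w.lower(), w) for w in fragment.split())

def respond(statement: str) -> str:
    """Return the first matching rule's response, or a generic prompt."""
    for pattern, template in RULES:
        match = pattern.search(statement)
        if match:
            # Rules without a capture group (like the "no" rule) are used as-is.
            if match.groups():
                return template.format(reflect(match.group(1)))
            return template
    return "Please tell me more."
```

So `respond("I feel like a failure")` yields “Why do you feel like a failure?” — the program understands nothing, but the reflected question can still feel attentive, which is the illusion ELIZA traded on.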
Since the release of modern AI chatbots such as ChatGPT, Google’s Gemini and Anthropic’s Claude, more and more people have made the same discovery as Ms. Duan.
Chen Fazhan, a Shanghai-based therapist, said there has been a spike in the number of young people seeking help for psychological problems in recent years. Mr. Chen put this partly down to improved understanding and awareness of mental health issues among parents, but also greater pressure on younger generations.
While the availability of psychotherapy has “improved greatly compared to a decade ago,” Mr. Chen said, distribution is not equal nationwide, with far more therapists working in big eastern cities like Shanghai than in China’s less-developed interior.
Mr. Chen was skeptical about the ability of chatbots such as DeepSeek to fill the gap, however.
“Some users can get timely feedback from DeepSeek for some psychological problems,” he said, noting it can be a valuable tool for directing people to mental health resources. But exactly what Ms. Duan described as a boon – the chatbot’s neutrality – could prove in time to be a hindrance.
“In psychotherapy and counselling, it is very important to form a relationship between therapist and patient,” Mr. Chen said. “Knowledge and skill are relatively minor compared to emotional communication, recognition and interaction.”
While a chatbot may be able to provide some useful responses, these depend on the inputs given. “It may be difficult for users to recognize their own real inner needs or deep-seated problems,” Mr. Chen said. “If you cannot understand the problem, you cannot ask targeted questions.”
There are also privacy concerns about how chatbots store and use data, and a risk that the AI might misread a user’s mental state or cross ethical lines a human therapist never would.
Alarm has been raised over people developing intense emotional attachments to chatbots, particularly those intended to act as virtual romantic partners. The parents of one Florida teenager are currently suing Character.AI after their son took his own life following a discussion with a chatbot in which the AI appeared to suggest suicide. (Character.AI has promised to introduce new safety measures and limits on how children can interact with its products.)
Zhang Shiqi, a 24-year-old postgraduate student in the Sichuan-province city of Chengdu who suffers from mild bipolar disorder, said she was well aware of the limitations of DeepSeek as a therapist. She described the chatbot’s responses as “rather mechanical.”
But Ms. Zhang, who also sees a human counsellor, said the chatbot could still be useful as a place to vent.
“It’s a great emotional confidant, especially compared to friends and family who don’t understand mental illness. It’s a better listener,” she said. “But for more in-depth issues, such as healing and treatment, a psychologist might be more suitable.”
With reports from Alexandra Li in Beijing