Is AI Therapy Replacing Human Connection in Asia’s Mental Health Crisis?

Young people in Taiwan and China are turning to AI chatbots for mental health support—but at what cost? As rates of anxiety and depression surge, a generation is embracing ChatGPT and domestic alternatives like Baidu’s Ernie Bot as cheaper, faster, and more discreet than traditional therapy. But experts warn this trend risks normalizing tech-driven isolation. Let’s dive in.


🌍 The Rise of AI Therapy: A Double-Edged Sword

  • 1 in 3 Gen Z users in Taiwan and China now cite mental health support as a primary reason for using AI chatbots (Harvard Business Review, 2025).
  • 72-hour wait times for therapy appointments in major Chinese cities vs. 24/7 chatbot access.
  • #AIBestTherapist has over 500,000 posts on Weibo, with users praising AI’s “non-judgmental” responses.
  • Tragic cases like the 2024 suicide of a Beijing student who relied solely on chatbots highlight systemic gaps.

✅ The Promise: AI as Mental Health’s New First Responder

Tech giants and therapists alike see potential:

  • ✅ Baidu’s Ernie Bot now uses sentiment analysis to detect crisis keywords, routing users to hotlines (a rough sketch of this kind of routing follows this list).
  • ✅ DeepSeek, a Chinese chatbot popular with Taiwanese users, offers culturally tailored CBT exercises, addressing collectivist stigma around “burdening” others.
  • ✅ Therapists report 30% faster breakthroughs when patients use AI journals between sessions.
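
The article doesn’t explain how Ernie Bot’s crisis detection works under the hood, so the snippet below is only a minimal sketch of keyword-plus-sentiment routing: the keyword list, negative-word lexicon, threshold, and hotline message are illustrative assumptions, not Baidu’s actual system.

```python
# Minimal sketch of crisis-keyword routing; NOT Baidu's implementation.
# Keyword list, lexicon, threshold, and hotline text are illustrative assumptions.

CRISIS_KEYWORDS = {"suicide", "kill myself", "end it all", "self-harm"}  # assumed examples
HOTLINE_MESSAGE = "You are not alone. Please contact your local crisis hotline now."  # placeholder


def naive_sentiment_score(text: str) -> float:
    """Crude lexicon-based score: each negative word pushes the score below zero."""
    negative_words = {"hopeless", "worthless", "alone", "unbearable"}
    words = text.lower().split()
    hits = sum(1 for word in words if word in negative_words)
    return -hits / max(len(words), 1)


def route_message(text: str) -> str:
    """Escalate to a hotline prompt on crisis keywords or strongly negative sentiment."""
    lowered = text.lower()
    if any(k in lowered for k in CRISIS_KEYWORDS) or naive_sentiment_score(text) < -0.2:
        return HOTLINE_MESSAGE
    return "continue_normal_chat"


print(route_message("I feel hopeless and want to end it all"))  # prints the hotline message
```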

Dr. Yi-Hsien Su, a clinical psychologist in Taiwan, notes: “For those too ashamed to seek help, AI can be the bridge—not the destination.”


🚧 The Risks: When Algorithms Miss the Human Nuance

  • ⚠️ 55% of chatbot users delay professional care, believing AI advice is sufficient (Taiwan Counselling Psychology Association).
  • ⚠️ “Overly positive” responses from bots risk minimizing severe symptoms like suicidal ideation.
  • ⚠️ No ethical guidelines govern AI therapy tools—unlike licensed practitioners.

As one user confessed: “ChatGPT told me to ‘stay positive’ about my cancer diagnosis. I felt more alone than ever.”


🚀 Final Thoughts: Can AI and Humans Coexist in Mental Healthcare?

The path forward requires:

  • 📈 Hybrid models: Use AI for triage and journaling, but mandate human follow-ups for high-risk cases (see the sketch after this list).
  • 🤖 Transparent AI training: Involve diverse therapists to reduce cultural bias in responses.
  • 🎯 Government action: Subsidize teletherapy to compete with free chatbots’ convenience.
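
To make the hybrid-model idea concrete, here is a minimal triage sketch under assumed rules: the 0–10 self-report score, the cut-offs, and the follow-up actions are illustrative assumptions, not an existing clinical protocol.

```python
# Minimal sketch of hybrid triage: AI handles low-risk journaling,
# humans are mandatory for high-risk cases. All thresholds are assumptions.

from dataclasses import dataclass
from enum import Enum


class Risk(Enum):
    LOW = "low"
    MODERATE = "moderate"
    HIGH = "high"


@dataclass
class TriageResult:
    risk: Risk
    next_step: str


def triage(self_reported_score: int, crisis_flag: bool) -> TriageResult:
    """Route a user based on an assumed 0-10 distress score and a crisis flag."""
    if crisis_flag or self_reported_score >= 8:
        return TriageResult(Risk.HIGH, "human clinician within 24 hours; AI chat paused")
    if self_reported_score >= 5:
        return TriageResult(Risk.MODERATE, "AI journaling plus a scheduled human check-in")
    return TriageResult(Risk.LOW, "AI journaling with periodic re-screening")


print(triage(self_reported_score=9, crisis_flag=False))  # -> HIGH, human follow-up required
```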

As Yang, a user in Guangdong quoted in the Guardian’s reporting, put it: “AI taught me to voice my pain—but real healing began with a human.” Where should we draw the line between tech and touch in mental health?

Let us know on X (formerly Twitter).


Sources: Helen Davidson, “In Taiwan and China, young people turn to AI chatbots for ‘cheaper, easier’ therapy”, The Guardian, 22 May 2025. https://www.theguardian.com/world/2025/may/22/ai-therapy-therapist-chatbot-taiwan-china-mental-health

H1headline

AI & Tech. Stay Ahead.