In the face of overwhelming demand and prolonged wait times within the UK’s National Health Service (NHS), a growing number of young people are turning to artificial intelligence tools—particularly ChatGPT—for mental health support. This emerging trend highlights both the increasing mental health struggles among youth and the gaps in access to timely professional care.
Mental health concerns have surged dramatically over the past decade, especially among adolescents and young adults. However, the healthcare system has struggled to keep pace. With NHS waitlists for psychological services extending for months—and in some cases, over a year—young individuals are seeking alternative means of support. AI chatbots, with their 24/7 accessibility and low barrier to entry, are filling that gap, though not without controversy.
The Growing Reliance on AI for Emotional Support

TikTok, one of the most influential platforms among younger demographics, has seen an explosion in content related to AI therapy. In March 2025 alone, over 16.7 million posts carried tags such as “ChatGPT therapist” or “AI therapy.” Many users have shared personal experiences of using the chatbot for everything from daily journaling and anxiety relief to simulated therapy sessions.
This growing trend appears to be driven by multiple factors. The anonymity and immediacy of AI interactions offer a low-pressure alternative to traditional mental health services. For some, it’s a stopgap while awaiting therapy; for others, it’s perceived as a sufficient solution in itself.
Benefits and Limitations of Using AI as a Mental Health Tool
Advocates of AI tools like ChatGPT highlight their benefits in providing accessible mental health support to underserved populations. For individuals facing stigma, geographic limitations, or financial barriers, AI can offer a private and free outlet to express emotions and process thoughts. The constant availability of these tools is especially valuable in moments of acute emotional distress.
However, mental health professionals warn that AI lacks the emotional intelligence, contextual understanding, and ethical responsibility of trained therapists. While a chatbot can mimic empathetic responses and even apply therapeutic language, it cannot replace human connection, clinical diagnosis, or nuanced intervention. Relying exclusively on AI may lead to misdiagnosis, delayed treatment, or reinforcement of maladaptive behaviours.
Expert Concerns and Ethical Implications

The rise of AI as a mental health aid has raised ethical questions about data privacy, misinformation, and accountability. Unlike licensed professionals who operate under strict confidentiality agreements and ethical codes, AI systems are governed by algorithms and terms of service that may not adequately protect sensitive user information.
Mental health experts also express concern about how young users interpret and act on AI-generated advice. Without professional guidance, individuals may unknowingly follow incorrect or potentially harmful recommendations. The illusion of emotional support can provide comfort, but it may not translate into genuine healing or the resolution of deeper psychological issues.
What This Trend Says About Mental Health Infrastructure
The popularity of AI-driven therapy reflects a larger issue: the systemic failure to provide timely, affordable, and effective mental health care. Young people are not turning to ChatGPT out of mere curiosity—they are doing so because existing systems are failing them. With rising levels of depression, anxiety, and suicide among youth, the need for reform of mental health services has never been more urgent.
Governments and healthcare institutions must view this trend as a wake-up call. While AI can play a role in supporting mental health, it should not become a substitute for comprehensive, human-led care. Instead, efforts must focus on reducing wait times, expanding access to therapy, and integrating technology responsibly within existing treatment models.
Main Takeaways
- A growing number of young people in the UK are using ChatGPT for mental health support due to long NHS wait times.
- Social media, especially TikTok, has played a key role in popularizing AI tools for emotional support.
- While AI offers accessibility and anonymity, it lacks the depth, empathy, and expertise of trained mental health professionals.
- Experts warn of ethical concerns, including data privacy and the risk of misinformation.
- The trend highlights a broader systemic issue—insufficient access to timely and affordable mental health care.