Are Gen Z becoming too reliant on AI for emotional support?


Technology trends suggest that scores of Gen Z users are turning to generative AI not just for productivity or entertainment, but as a confidant, an emotional outlet, and even a companion.

ChatGPT, originally pitched as a productivity assistant, has evolved into something far more intimate for this generation. Born into a digitally immersive world, Gen Z is uniquely predisposed to connect with artificial entities. Yet as AI assumes roles once reserved for therapists, friends, and diaries, psychologists and sociologists are raising concerns about what’s being lost in the trade-off.

The Rise of the 24/7 Listener

Since its public launch in late 2022, ChatGPT has experienced considerable growth, hitting 100 million users within just two months. By early 2025, usage had doubled again, with weekly active users reaching 800 million, up from 400 million just months earlier. Data shows that a significant share of these users are under 25, confirming Gen Z’s central role in driving AI engagement.

Behind this usage surge lies a generational mental health crisis. The Jed Foundation reports that 42% of Gen Z respondents experience persistent sadness or hopelessness. With limited access to affordable therapy and overstretched mental health systems, the appeal of a free, always-available AI that mimics empathy is clear.

Nonetheless, therapeutic imitation does not equal therapeutic value. While AI offers an accessible outlet, it may also deepen the emotional vulnerabilities it claims to address.

Comfort That Fades

AI’s appeal as an emotional companion lies in its perceived neutrality. It doesn’t interrupt, doesn’t judge, and remembers nothing unless asked. Yet this simulation of safety may mask a deeper problem. Users often find themselves comforted in the moment, only to feel more isolated afterward. In one analysis, heavy ChatGPT users were found to be significantly lonelier than casual users or non-users. Moreover, technology is never ‘neutral’; it reflects the norms and concerns of those who design and develop it.

This emotional dissonance was explored in Psychology Today, which found that AI companions may increase feelings of loneliness over time. By offering an idealized version of human connection—free from friction or failure—AI can set unrealistic expectations for real relationships.

Unease is growing among some users, who voice concern about the emotional dependence forming around chatbots. These aren’t isolated anecdotes. Emerging behavioural patterns suggest that some users are substituting chatbot interactions for human ones, not supplementing them.

Workplace, Rewritten

This phenomenon is not confined to personal life. In professional settings, generative AI is subtly reshaping communication. Young employees, especially those in remote or hybrid roles, increasingly use ChatGPT to craft emails, prepare for performance reviews, or simulate difficult conversations. While that can reduce anxiety, it may also erode interpersonal confidence.

In the workplace, AI is linked to a rise in subtle social isolation, particularly among younger workers who rely on digital tools to navigate complex office hierarchies. The efficiency gained through AI might be costing spontaneous human connection—small talk at the coffee machine now replaced by Slack threads and auto-drafted responses.

The shift is cultural as much as technological. For a generation already facing reduced in-person interaction, reliance on AI to mediate emotional and professional communication may make authentic relationships harder to build.

The Illusion of Intimacy

As AI grows more sophisticated, so does its role in Gen Z’s emotional lives. Users are increasingly personalizing their chatbot experiences, assigning names, backstories, and emotional roles to AI systems. Replika, Character.AI, and emotionally tuned versions of ChatGPT are being used to simulate romantic partners, best friends, and therapists.

The emotional realism of these platforms can be striking, but so can their side effects. A growing body of user accounts points to a pattern: people often feel more alone after engaging deeply with AI “friends”. The perfection of these interactions, in which the AI always listens and always validates, can undermine tolerance for the messiness of human relationships.

One usage analysis found that many Gen Z users are turning to ChatGPT for emotional support, even using it as a daily mental health outlet. While some report improved mood and reduced anxiety, others described a hollow after-effect—feeling good in the moment, but more disconnected overall.

Unregulated and Unprepared

Despite the depth of emotional reliance, there is no dedicated regulation governing how generative AI handles mental health conversations. ChatGPT and similar tools are not trained mental health professionals. They do not recognize suicidal ideation consistently, cannot ensure user safety, and are not held accountable when harm occurs.

A 2024 case in France involving a young user who received inadequate AI responses during a mental health crisis reignited debates about the ethical boundaries of AI support. With tech companies disclaiming responsibility, the regulatory vacuum is growing increasingly visible.

The legal and moral ambiguity presents serious risks. AI systems are already being treated like emotional caregivers, but without the safeguards required for such roles.

A Blurred Future

For some, especially those in mental health deserts or underserved communities, ChatGPT provides a crucial sense of connection. For others, it’s a crutch that could be weakening emotional muscles. The question isn’t whether AI should be used for emotional support—it’s whether it should become a substitute for human empathy.

AI companionship is not inherently harmful. Yet it demands boundaries. When Gen Z turns to AI not just for help communicating but to feel seen, heard, and validated, society must grapple with whether it is solving loneliness or simply coding it deeper into our lives.
