September 10, 2025
As a trauma therapist, I’ve been tracking the impact of Artificial Intelligence (AI) on our mental well-being. Not just how we use mental health apps, but how being surrounded by AI all day affects our nervous systems, our thoughts, and our relationships.
Let’s talk about the intersection of mental health and technology, so you can distinguish opinions and fears from fact.
Artificial Intelligence Is Now a Constant Companion
AI is no longer something we “use occasionally.” It’s embedded in our routines—whether you’re getting voice reminders from Alexa, reading AI-curated news, interacting with a chatbot at work, or getting parenting tips from social media that were selected by an algorithm.
Here’s the thing: not all AI is bad for our mental health. In fact, some research shows it can help—especially for older adults. A recent study reported that 78% of seniors using AI-powered devices for daily tasks reported high satisfaction, and 80% reported strong mental health and reduced loneliness (New York Post, 2024).
But in work settings? The picture isn’t as sunny.
The UK’s Institute for the Future of Work found that people working alongside Artificial Intelligence tools—like chatbots, productivity trackers, or automated schedulers—report significantly lower well-being and higher job insecurity (The Guardian, 2024). Their days are more efficient—but more stressful.

The Psychological Weight of AI in Everyday Life
Even outside of work, constant Artificial Intelligence engagement brings new kinds of mental load. A 2025 study from Romania examined 217 adults and found that AI saturation contributed to “technostress”—a mix of confusion, pressure to keep up, and a sense of loss of control. That stress was strongly linked to anxiety and depression symptoms (Enăchescu et al., 2025).
Another study from South Korea tracked over 400 professionals and found that while AI didn’t cause burnout directly, it did increase job stress, which then led to burnout—particularly among people who felt unsure about their ability to adapt (Kim & Park, 2024).
There’s also early evidence that some people are becoming emotionally entangled with Artificial Intelligence. In a recent observational study, users formed dependency loops with AI chatbots, especially those designed for companionship. Some individuals developed delusional thinking or worsened depressive symptoms, particularly if they were already isolated (Yin et al., 2025).
That’s not a tech problem—it’s a human nervous system trying to attach to whatever connection it can find.

So, is Artificial Intelligence Bad for Us?
This is a tricky question, and it will be a moving target as AI advances. But for now, let’s break it down with the information we have.
When AI can help mental health:
- Reduces isolation
- Frees up bandwidth
- Detects early signs of distress
- Improves access to care
When AI can harm mental health:
- Increases pressure or surveillance at work
- Replaces human connection instead of supplementing it
- Fosters emotional dependency or distorted thinking
- Is used without understanding its limits
Even public health agencies are weighing in. The UK’s NHS issued a warning in 2025 that Artificial Intelligence therapy bots can give misleading advice and even reinforce harmful beliefs, especially for people managing trauma, OCD, or suicidal thoughts (The Times UK, 2025).
That’s why some U.S. states are stepping in. Illinois, Nevada, and Utah have passed laws requiring licensed professionals to oversee any AI-based mental health services—banning stand-alone bots for therapy (Washington Post, 2025).

AI Tools That Support Mental Health
Not all AI tools are problematic. In the right hands, and with the right expectations, they can be game-changers.
- Early detection of depression:
Artificial Intelligence can catch early signs of depression through speech, tone, and facial expression. When paired with clinical oversight, this helps reduce missed symptoms (Mardini et al., 2025).
- Personalized coping strategies:
Some apps adapt to your responses, offering better prompts and mindfulness tools over time. These are not therapy—but they can increase awareness.
- Empathy support for peer helpers:
A Stanford study found that peer support agents using a subtle AI “coach” showed a 38% improvement in empathic communication (Park et al., 2022).
- Helping clinicians, not replacing them:
Only 3.8% of psychiatrists in a recent survey believed Artificial Intelligence could replace them. Most saw it as a tool to reduce admin tasks—not relational work (Yang et al., 2019).

6 Ways to Protect Your Mind While Engaging with Artificial Intelligence
Here’s your real-world playbook for mentally healthy AI engagement.
1. Track how AI affects your nervous system
Pause after interacting with AI tools—chatbots, voice assistants, content feeds—and check in. Are you calmer or more tense? Connected or hollowed out? Awareness is your first defense.
2. Use AI as a support tool, not an emotional crutch
Let Artificial Intelligence remind you to breathe, track your mood, or organize your to-do list—but don’t rely on it solely to validate, comfort, or emotionally process with you. Supplement human connection with AI, not the other way around.
3. Learn just enough to feel empowered, not overwhelmed
You don’t need to master AI—just understand the basics. Learn how recommendation algorithms work. Know what “AI-generated” really means. Understanding the system helps you stay in charge of your mental space. This video gives some great advice on how to use ChatGPT safely, efficiently, and effectively.
4. Prioritize real human relationships
Whether it’s texting a friend, checking in with a therapist, or talking face-to-face with your partner—protect your emotional bandwidth for humans who can attune to you, consider your perspective, and offer empathic concern. Artificial Intelligence can imitate all of these things, but sometimes we need someone “with skin on.”
5. Set clear boundaries around your AI exposure
Build “tech-off” zones: during meals, ninety minutes or more before bed, or when spending time with your kids. AI shouldn’t shape your mood from morning scroll to midnight sleep aid. In short, you want to engage with AI mindfully, not mindlessly.
6. Model balanced AI use for your family
Especially if you’re a parent, your behavior is the blueprint. Let kids see you question Artificial Intelligence, take breaks from it, and choose connection over convenience. This models healthy technology hygiene for your children.

Bringing it all together: mindful vs. mindless use
AI is woven into how we live, work, and feel. It can lift us up—or wear us down.
The difference comes from how you use it.
When used mindfully, AI can be supportive and even healing. When used mindlessly, on autopilot and without boundaries, it can erode well-being.
Understanding that distinction—and teaching it to those around us—is the key. Let me show you how. Email me today to get started. You don’t have to navigate this new tool alone.
References
- Seniors report mental health boost from AI use – New York Post
- AI and well-being in the workplace – The Guardian
- Technostress from AI adoption – Frontiers in Psychology
- Burnout, job stress, and AI self-efficacy – Nature Humanities and Social Sciences Communications
- AI-induced delusion and dependency – arXiv
- NHS warns against AI therapy bots – The Times UK
- Illinois bans stand-alone AI therapy – Washington Post
- AI-enhanced depression detection – BMC Psychiatry
- Empathy boost via AI guidance – arXiv
- Psychiatrists on AI’s role in mental health – arXiv
If you are already stressed out by all things technology, check out my tools page or my going deeper page for valuable, easy-to-use, free resources to help you come back into balance.
