AI and Mental Health
AI is everywhere. And I mean everywhere.
I’m not a big fan of it. Whenever I can, I disable the AI features in my email, phone, web search, and even my charting software.
I understand that AI can be helpful for some people, especially those living in Western countries where English is not their first language. However, using AI to write a book or asking it for advice can be problematic. Let’s not even go into the environmental impact.
Yes, AI can be helpful for writing a resume or drafting a better email response for someone with English as a second language, because, let’s be honest here: we live in a capitalistic society where people are judged on their spelling and grammar.
Take this blog, for example: if it were full of grammatical or spelling errors, what impression would you have of me? Uneducated? Sloppy? Lazy for not double-checking my post?
So while some features of AI can be helpful, especially for folks of the global majority, relying on AI for advice that could easily come from friends or family may actually contribute to people feeling more isolated.
The Danger of Using AI for Your Mental Health
When I hear people say they ask ChatGPT what to do when they feel anxious or depressed, it worries me.
One concern about using AI for mental health support is that it encourages people to rely on their cognitive brain to solve emotional experiences. Instead of checking in with themselves and exploring the conditions that may be causing their anxiety or depression, they try to “think” their way out of their feelings.
There have already been several lawsuits where families claim their children used AI platforms for comfort, and it led to tragic outcomes.
An Example
While writing this blog, I decided to test one of the popular AI platforms by asking a simple question:
“What should I do? I feel really sad today.”
The response included some helpful suggestions, such as:
“If the sadness feels really intense or overwhelming, or you’re feeling unsafe or thinking about hurting yourself, it’s important to talk to someone right away. If you’re in Canada, you can call or text 988 to reach the Suicide Crisis Helpline any time. They’re there 24/7 and you don’t have to be in a crisis to talk.”
However, the reply ended with this:
“If you want, you can also tell me:
• Did something specific happen today, or did the sadness just show up?
• What usually helps you, even a little, when you feel like this? I’m here to listen.”
Do you see the issue with that last response?
The platform encourages users to share personal details—which raises privacy concerns—and fails to foster human-to-human connections.
While AI can be helpful when you feel stuck or need a quick answer, relying on it for mental health support can be risky.
Humans need connection. In this situation, if the person had reached out to a loved one or their therapist, they might have received better support. When we feel down, stressed, or in need of support, we need to be seen, listened to, and held. A computer cannot do that.
Yes, the system must change. The increasing isolation, lack of connection, and rising drug use among young people cannot be solved with quick fixes. What we need is a deeper shift in our society—one where asking for help is seen as strength, where people can openly share their feelings without judgment, and where our self-worth is not measured by how we look or how much money we have.
If you’re struggling, please reach out to your therapist, a trusted friend, a family member, or your community. You do not have to go through difficult moments alone.
📞 If you need help with your mental health, contact me today. Book a free 15-minute consultation.