“I talked to ChatGPT before coming to you.”
I’ve heard this more and more lately in my therapy office.
And honestly? I get it.
We live in a world where answers are instant. You can ask a question at 2:00am and get a response in seconds. When you’re overwhelmed, anxious, or confused, that kind of access is incredibly tempting.
So the real question becomes:
Is AI a good tool for managing your mental health?
The honest answer is… it depends.
Not the most satisfying answer in a time when certainty feels hard to find, but mental health has never been a simple equation.
AI was created to help with efficiency — organizing information, answering questions, streamlining tasks. But mental health isn’t something that can be optimized for efficiency.
It’s subjective.
It’s relational.
It’s deeply human.
And that’s where things get complicated.
Where AI Can Be Helpful
AI can absolutely be useful.
It can help you:
organize your thoughts
generate journal prompts
create routines
learn coping strategies
find resources
In that sense, it can function like a tool.
The problem begins when it starts replacing connection.
What AI Cannot Replicate
In therapy, there is something happening beneath the surface that technology cannot recreate.
Humans have something called mirror neurons — brain cells that fire both when we do something and when we watch someone else do it, subconsciously helping us read and respond to the emotional states of others.
Have you ever yawned after someone else yawned?
That’s mirror neurons at work.
But they’re also involved in empathy, emotional attunement, and the subtle ways we regulate each other in relationships.
When someone is navigating heartbreak, grief, trauma, or deep confusion about their life, information alone is not what heals them.
What heals people is often the experience of being felt and understood by another human being.
There’s a reason therapists often say:
What is broken in relationship heals in relationship.
A robot can offer information.
But it cannot offer presence.
The Line That Concerns Me
The biggest concern I have about AI and mental health is not that it exists.
It’s that self-regulation is already hard for humans.
Where do we draw the line between:
“I need help creating a grocery list”
and
“I feel hopeless and need someone to talk to”?
We are social beings living in a world that increasingly prioritizes instant responses over meaningful connection.
And when you’re distressed, there is nothing more appealing than an immediate answer.
But sometimes what we actually need is a slower conversation with another human.
My Personal Boundaries With AI
Because of this, I’ve decided to keep a few boundaries with AI in my own life.
Not because AI is bad — but because overreliance on anything can slowly erode skills we actually need.
My boundaries are:
• I will not use AI to process my own mental health.
• I will not use it to cut corners in work that deserves human care and attention.
• I will pause and use discernment before asking it something.
• I will notice if I’m using it because I’m overwhelmed or overcommitted.
AI is a tool.
But tools should support our lives — not replace the parts of being human that actually help us heal.
These are simply my boundaries.
Consider them food for thought as we all figure out how to live with technology that is evolving faster than our emotional systems.
Tools to Help Use AI Mindfully
If you do use AI, here are a few mental-health-friendly guardrails:
1. The 2-Question Rule
Before asking AI something, ask yourself:
Is this an information question?
Or is this an emotion question?
Information → AI may help
Emotion → Talk to a human
2. The 10-Minute Pause
If you’re upset, wait 10 minutes before asking AI for advice.
Often the pause helps you realize you actually want connection, not just answers.
3. The “Would I Tell a Friend?” Test
If the question is something you would normally ask a friend, therapist, or trusted person…
That’s a clue that human connection may be the better option.
4. Use AI for Structure, Not Processing
Healthy uses:
journaling prompts
meal planning
scheduling
brainstorming ideas
Less healthy uses:
replacing emotional conversations
relationship advice in moments of distress
processing trauma or deep personal struggles