Why it’s imperative to take a grounded look at where AI helps—and where it absolutely doesn’t.
Over the past couple of years, tools like ChatGPT have quietly slipped into people’s daily lives. What began as a curiosity—a clever chatbot—has evolved into something far more embedded: a sounding board, a problem-solver, even, for some, a late-night confidant.
And that’s exactly where things start to get blurred.
Because while ChatGPT can sound insightful, supportive, and even compassionate… it is not a therapist. It’s not a trained emotional health practitioner. And crucially, it doesn’t work at the level where real, lasting transformation happens.
So let’s unpack this properly.
Why People Are Turning to AI for Emotional Support
It’s not hard to see the appeal.
- It’s available 24/7
- There’s no judgement
- No cost per session
- No vulnerability required to “open up” to a human
- Responses are often thoughtful, structured, and calm
For anyone feeling overwhelmed, anxious, or stuck, that can feel like relief.
In many ways, AI provides something people are often missing: space to think, reflect, and articulate what’s going on.
And that, in itself, has value. But here’s the problem…
The Illusion of Therapy
ChatGPT can simulate therapeutic language.
It can:
- Reframe thoughts
- Suggest coping strategies
- Offer perspective
- Validate feelings
And because it does this fluently, it can create the impression that something deeper is happening.
But the truth is, it isn’t. What we’re getting is pattern recognition and language generation, not therapeutic intervention.
There’s no:
- Emotional attunement
- Somatic awareness
- Subconscious access
- Real-time calibration to your nervous system
- Skilled challenge or disruption of deep-rooted beliefs
In other words… it works at the level of thinking (cognition), not at the level of transformation.
The Core Issue: Thinking Isn’t the Problem
Most people who struggle with anxiety, self-sabotage or emotional reactivity don’t lack answers. They already know:
- “I shouldn’t feel this way”
- “I need to think more positively”
- “I just need to let it go”
And yet… nothing changes. Why?
Because the issue isn’t logical. It’s subconscious and emotional.
It’s rooted in:
- Early conditioning
- Learned responses
- Stored emotional patterns
- Identity-level beliefs
This is where approaches like RTT (Rapid Transformational Therapy) operate: beneath the surface of conscious thought, where real change occurs.
AI doesn’t go there.
“Kicking the Can Down the Road”
One of the biggest risks of using AI as a pseudo-therapist is this: it can make us feel like we’re addressing the issue… without actually resolving it.
We get:
- A new perspective
- A temporary sense of clarity
- A rational explanation
But the underlying emotional pattern remains untouched. So what happens?
The issue returns. Again and again. Just dressed in slightly different language.
This is what I often describe as just “kicking the can down the road”. We’re merely managing symptoms, not transforming the cause – it’s a ‘sticking plaster’.
Where ChatGPT Does Have Value
Now, to be clear—this isn’t an attack on AI. Used correctly, it can be incredibly helpful.
For example:
- Clarity and Structure: if our thoughts are scattered or overwhelming, AI can help organise them.
- Practical Problem-Solving: it's excellent at breaking down decisions, offering options and creating action plans.
- Education and Insight: it can help us understand concepts like anxiety, habits and behaviour patterns.
- Reflection Tool: it can act as a mirror for our thinking.
But notice something… all of this sits firmly in the realm of:
Thinking, analysing and solving
Not healing.
The Risks of Misuse
This is where a “health warning” becomes important.
If we begin to rely on AI as a substitute for real therapeutic work, several problems can arise:
- False Sense of Progress
We feel like we're "working on ourselves"… but nothing fundamentally shifts.
- Emotional Avoidance
We stay in our head, in the cognitive, rational, logical mind, avoiding deeper emotional processing. It’s as if we’re still operating “in The Matrix”.
- Reinforcement of Existing Patterns
AI often reflects our input back to us—meaning it can unintentionally reinforce our current worldview. It lacks empathy, perspective and mirroring.
- Lack of Challenge
A skilled therapist knows when to gently challenge us, our beliefs and perceptions. AI tends to stay agreeable and safe.
- No Accountability
There’s no real relationship, no commitment, no follow-through.
And perhaps most importantly…
- No Subconscious Change
Without accessing the subconscious, lasting transformation simply doesn't occur. ChatGPT might help our rational, cognitive mind, but it doesn't work with our underlying beliefs, feelings and perceptions of our place in the world.
Therapy Is Not Just Talking
There’s a common misconception that therapy is just: “Talking about our problems and getting advice”. But real therapeutic work—especially modalities like RTT—goes far beyond that.
It involves:
- Identifying root causes
- Rewiring subconscious beliefs
- Releasing stored emotional responses
- Creating new neural pathways
- Embodying change, not just understanding it
This is why people can spend years "thinking about" their problems without ever resolving them. Because insight alone is not transformation.
So Where Does That Leave Us?
To be clear: ChatGPT is not a therapist. It’s not an emotional health app. And it shouldn’t be used as one. Ask it and it will tell you this!
What it is, is a highly intelligent solutions provider.
It helps you:
- Think more clearly
- See options
- Solve practical challenges
And that’s valuable.
But if you’re dealing with:
- Anxiety
- Emotional triggers
- Deep-rooted patterns
- Self-sabotage
- Lack of direction or purpose
Then you’re not looking for better thinking. You’re looking for transformation.
And that requires going beyond the conscious, cognitive mind.
A Simple Way to Think About It
If your problem is:
- Practical, solutions-based → AI can help
- Emotional or subconscious → You need therapeutic work
Use the right tool for the right job.
Final Thought
We’re entering a world where AI will become increasingly sophisticated, increasingly human-like in its responses. But no matter how advanced it becomes, it will still lack one crucial element:
True human connection and the ability to work at a subconscious, emotional level.
So by all means use ChatGPT. Use it well; use it wisely. But don't confuse clarity with change.
Because real transformation doesn't come from better answers; it comes from changing the part of you that's asking the question in the first place.
For a free, confidential and no-obligation chat, you can use this link.
Sent with love
Dorian