Smartphones brought constant connectivity. AI brings something different: a presence that responds, reasons, and adapts. As psychologists, we're watching this shift closely.
Over the past few years, AI has moved from a background technology to something many of us interact with daily: at work, at home, and increasingly in contexts that feel very personal. People are using AI tools to process difficult emotions, talk through problems, and even seek advice they might otherwise take to a therapist.
This raises genuinely interesting questions for those of us who work in mental health. Not "is AI good or bad?" (that framing is too simple), but: what psychological effects might this technology have, and what do we need to pay attention to?
The appeal is real
It's worth starting by taking seriously why people are turning to AI for emotional support. The reasons aren't trivial:
- It's available at 3am, when human support isn't
- There's no fear of judgment or burden: the AI won't be hurt by what you share
- It's patient, consistent, and doesn't get tired
- For some people, it feels like a low-stakes way to practise articulating thoughts and feelings
These are real needs. And for many people, particularly those without access to professional support, AI may fill genuine gaps. We shouldn't dismiss that.
What the research is starting to show
Research on AI and mental health is still in its early days. But a few things are emerging:
For mild to moderate distress, AI-assisted tools (particularly those based on cognitive-behavioural principles) show some evidence of benefit, especially around accessibility. If someone is on a waitlist for therapy, having a structured self-help tool can be meaningfully better than nothing.
For complex presentations (trauma, personality difficulties, severe depression, psychosis), the picture is different. These presentations require the kind of nuanced, attuned human relationship that current AI cannot replicate. A therapeutic relationship is itself a mechanism of change, not just a delivery vehicle for techniques.
A useful question to sit with: Am I using AI to process and move forward, or am I using it to avoid the discomfort of sitting with something that needs more than a chatbot can give?
The dependency question
One thing I watch for clinically is the role that any coping strategy plays in a person's life: whether it's expanding their capacity to tolerate difficult experiences, or contracting it. This applies to AI as much as anything else.
Used well, AI tools can build skills: helping someone practise self-reflection, think through a problem more clearly, or access information they didn't have. Used poorly, they can become another form of avoidance: a way of never quite sitting with the discomfort long enough for something to shift.
This isn't unique to AI. The same question applies to social media, alcohol, busyness: any of the things we reach for when we're not quite ready to be present to ourselves.
What I actually tell my clients
I'm not anti-technology. Some of my clients use AI tools in ways that complement their therapeutic work: journalling with an AI prompt, using a mood-tracking app, accessing psychoeducation between sessions. That can be genuinely useful.
What I encourage is intentionality: knowing why you're reaching for a tool, what you're hoping to get from it, and whether it's serving you over time. The same reflective stance that helps in therapy helps here too.
And for anyone who finds themselves relying heavily on AI for emotional support: it might be worth asking what that reliance is telling you about unmet needs, and whether a human conversation might serve those needs more fully.
A note on the future
AI is going to change healthcare, including mental healthcare, in ways we can't fully predict. There will be genuinely positive developments: better accessibility, improved tools for monitoring and early intervention, reduced burden on overstretched systems.
But some things are unlikely to change: the capacity for one human to truly see another, the repair that happens in a genuine relationship, the particular power of feeling understood by someone who knows what it is to struggle. These aren't features that can be engineered. They're what make us human.
Ready to talk to a real psychologist?
Our team offers evidence-based support across a wide range of presentations. Book online or call us on 07 5573 2200.