
AI, Psychotherapy and Mental Health
A hub for understanding how artificial intelligence is reshaping emotional life, mental health, and the contemporary psyche.
Artificial intelligence is no longer something that sits at the edges of psychological life. It is already shaping how people seek support, regulate emotion, and relate to themselves and others. What once looked like a technological problem has become a profoundly human one. AI has begun to act as a companion, a container, and even a kind of mirror — and all of this is happening before we have fully understood what it means.
This page brings together the core ideas, research, and writings that underpin my work at the intersection of AI, social media, psychotherapy, mental health, and the contemporary self, linking to my wider body of writing, including my long-form piece, AI, Therapy, and the Digitally Extended Self. It is a place to orient yourself to the questions that matter most: how AI alters the inner world; what happens to the therapeutic frame when the “other” is a simulation; and why depth psychology is essential if we are to meet this moment with clarity rather than panic.
Why AI Matters Beyond the Therapy Room
AI tools are increasingly being used to manage distress, explore personal concerns, and, in the case of AI companions, simulate forms of attunement. Many people now turn to chatbots and AI-driven wellness apps long before they ever consider speaking to another human. The psychological implications are complicated: on one hand, there is potential for containment, reflection, and early support; on the other, the risk of dependency, pseudo-intimacy, and a flattening of relational life.
The speed at which these tools evolve far exceeds the pace at which psychological thinking typically develops. That gap — between the speed of innovation and the depth of understanding — is where much of my work is focused. If you want to explore the broader cultural backdrop, see The Digitally Extended Self.
The Perspective from Depth Psychology
The question isn’t whether AI will replace therapists. It won’t. The more important question is how the presence of AI transforms the psychic environment in which therapy occurs. Projection, introjection, transference, mentalisation — these processes take on new qualities when the object of our fantasy and frustration is something that cannot think, feel, or rupture.
AI becomes a screen for unmet needs, a stage for fantasies of perfect attunement, and sometimes an escape from the friction of real relationships. These dynamics matter not only for clinical practice but for the wider culture, where the boundary between human and machine has never felt more porous. Related themes appear in the Psychology of Modern Life section.
Clinical Concerns Emerging Now
The evidence base around AI and mental health is rapidly developing. Early research shows that AI can be helpful in certain contexts, particularly in providing psychoeducation or supporting surface-level emotional regulation. At the same time, hallucination, emotional over-identification, and the illusion of relationality present real risks.
A growing concern is emotional displacement: when AI feels easier than people, users can struggle with the inevitable complexity of human relationships. This impacts therapeutic work, leadership, adolescence, and organisational life. For wider context, see the Articles archive, where shorter essays on these dynamics are collected.
What this Means for Clinicians
Therapists and clinicians are being asked to navigate a new psychological terrain. Clients may arrive with AI-influenced narratives, AI-mediated self-understandings, or even AI-derived advice. The therapeutic frame stretches to accommodate an unseen third, whose influence is felt inside the consulting room even though it is never physically there.
Talking therapists and other mental health clinicians will need to develop a new kind of literacy: one that understands AI not as a clinical tool but as a psychic object. This is where psychodynamic thinking is especially valuable, offering a vocabulary for the fantasies, projections, and emotional dynamics that emerge around AI.
For Leaders, Organisations, Conferences, and Event Bookers
AI’s emotional impact is quickly becoming a leadership issue. People’s interactions with AI shape their expectations of communication, feedback, conflict, and support. They also influence organisational culture, decision-making styles, and the emotional climate of teams.
Understanding these dynamics allows organisations to adapt with realism and clarity. It also helps leaders develop ways of working that remain human-centred in a technological age.
This is why businesses, universities, law firms, healthcare organisations, and cultural institutions bring me in: to help interpret the human side of technological change and the psychological forces that shape behaviour in a world increasingly mediated by AI. For more on the organisational implications of these issues, you can read more on the Leadership, Emotional Intelligence & Technology hub page.
If you’re planning a conference or event, you can book me here.
Stay up to date with my Substack Newsletter.