
Listening Algorithms

Will AI Replace Psychotherapists?

A quiet therapy room. A clock ticks. A notebook lies open on a chair. But the chair is empty.

Around the world, millions of people are silently struggling. Mental, neurological, and substance use disorders make up 14% of the global burden of disease - yet in low-income countries, three out of four people don’t receive the treatment they need.

In a world where conversational AI already helps us bank, shop, and manage our lives, it’s now being trained to care for our minds. Chatbots are becoming mental health companions - always available, always listening.

Can a chatbot replace a therapist — someone who’s studied for years, trained to listen not just with their ears, but with their eyes, their instincts, their humanity?



Designed to Please, Not to Heal

What AI Can't Do in Therapy

When reporting on the use of AI in psychotherapy, it's crucial to distinguish between two key categories of tools. According to Professor Marcin Rządeczka, a Polish expert in artificial intelligence and mental health, the landscape includes both general-purpose language model chatbots and specialized mental health chatbots.

The first group consists of tools not originally designed for therapeutic support—such as ChatGPT or other mainstream AI assistants—but which are widely used for mental health purposes simply because they are easily accessible. In contrast, dedicated mental health chatbots are specifically developed for therapeutic contexts, yet they remain far less commonly used.


Good therapy should make you step out of your comfort zone. ChatGPT takes the complete opposite approach and lets you stay constantly in your comfort zone.

Prof. Marcin Rządeczka

He also points out that it’s important to keep in mind that language model AI like ChatGPT is a product, and products need to be user-friendly. As a result, conversations with ChatGPT are designed to keep users engaged on the platform. The interaction is meant to feel pleasant and smooth, and ultimately, the chatbot exists to make the user feel seen and understood. Therapy should be the complete opposite.


Although he highlights many of the risks of using AI in psychotherapy, he also sees its potential — especially when it comes to accessibility. Unlike traditional therapy, which is often limited by cost, location, or long waiting times, AI could offer support that’s more immediate and widely available.

Why People Are Choosing AI Over Traditional Therapy


Monika Malania, founder and CEO of a Georgian AI psychotherapy chatbot, believes these challenges are keeping far too many people from getting the help they need.

That's why she created "Mesaubre" - which means "Talk with me" - an app designed to bring mental health support to those who can't access conventional therapy.

A recent cross-sectional survey at the University of South Australia explored how self-stigma influences attitudes toward both human and AI-supported psychotherapy. The findings reveal a striking pattern: those who felt ashamed about seeking human therapy tended to view AI-based therapy more positively. Conversely, individuals who felt stigma around AI-based therapy were more likely to favor human therapists.

The study suggests that young adults who avoid human therapy due to fear of judgment may be more open to turning to AI alternatives — such as chatbots — for mental health support.

Human vs. Machine:

Who Would You Trust with Your Mind?

Although young students are often referred to as 'digital natives' - people who have grown up with technology from childhood - they remain skeptical. Of the four young people we asked in Warsaw, only one had used ChatGPT as support during a breakup. The remaining three said they had never used ChatGPT or any other AI tool for mental support - and they couldn't see themselves doing so anytime soon.

We asked them how they feel about the development we're seeing, where AI is gradually supplementing - and in some cases even replacing - the human aspect of psychotherapy. Hear their responses in the video below:


A Generation at Risk: Poland's Growing Mental Health Crisis

Poland is grappling with a mental health crisis, as access to psychological support remains limited and societal stigma continues to deter people from seeking help. According to recent data, 60% of those who report needing mental health support do not reach out due to fear of judgment or shame.

The situation is particularly alarming among young people. Polish children today rank among the lowest in Europe for mental well-being, and the country has one of the highest rates of suicide attempts among youth. In 2022 alone, police investigated 2,031 suicide attempts by individuals under the age of 18 - a staggering increase compared to 2020.



These figures paint a stark picture of the urgent need for both cultural and systemic change in how mental health is addressed in Poland.

Blurring the Line Between Real and Virtual

Insights from a VR Lab in Warsaw

Dr. Grzegorz Pochwatko, head of the VR Lab at the Institute of Psychology of the Polish Academy of Sciences, and his team study how people respond to virtual environments and how VR itself influences human behavior.

We had the chance to ask him about the integration of Large Language Models into VR. He explains why our brains often cannot distinguish between virtual and real humans:


Pochwatko notes that people respond to virtual humans much as they do to real ones — our brains don't always distinguish between them. Still, on an unconscious level, we tend to sense whether there's a real person behind the digital body, and genuine human presence leads to stronger emotional connection and trust.

VR makes it possible to simulate unique therapeutic scenarios — like talking to a virtual version of yourself — which, as Dr. Grzegorz Pochwatko notes, can support self-compassion. Still, due to its complexity, such technology isn’t widely accessible, unlike more common tools like mobile chatbots.


He does, however, offer a warning: people might start relying too much on technological therapists and drift away from real human contact.


It has to be under the supervision of professionals because, in my view, it’s very dangerous to rely solely on Large Language Models, no matter how human-like they seem or how many books, texts, and papers they’ve been trained on. Right now, these models aren’t truly intelligent; they’re essentially just mimicking understanding. They’re perceived as somewhat empathetic, although it’s not real empathy.

Dr. Grzegorz Pochwatko

Is the Era of Human Therapy Ending?

Professor Marcin Rządeczka points out that, for now, the vast majority still prefer a real human when it comes to psychotherapy. But how long that preference will last is hard to predict. In five to ten years, we may find ourselves facing a major shift, where more people turn to chatbots over human therapists.


Rządeczka fears that the price of human psychotherapy is likely to keep rising, and in a worst-case scenario, a large segment of society could be priced out of access to real human care altogether. Therefore, we need to have more open conversations not just about the potential downsides, but also about how to use chatbots wisely as supportive tools that can complement human therapy.



The Team

Authors: Davit Tsakadze, Alberte Greve, Laura Pfundner, Anastasiia Kovtun

Mentor: Tatiana Kolesnychenko

Sources:

https://www.who.int/europe/news/item/10-10-2023-giving-mental-health-the-attention-it-deserves----poland-adopts-who-tool-to-boost-efforts-to-address-mental-health-needs
https://journals.sagepub.com/doi/full/10.1177/09727531231221612
https://www.jmir.org/2020/7/e16021/
https://www.sciencedirect.com/science/article/pii/S294988212400046X
https://www.who.int/teams/mental-health-and-substance-use/treatment-care/mental-health-gap-action-programme
https://link.springer.com/article/10.1007/s11245-024-10018-x