These are the risks of using ChatGPT as a therapist

According to some studies, 70% of Spanish teenagers admit that using ChatGPT and other chatbots has affected their relationships with others.

BY Mariona | 06 May 2026

What seemed like a distant or almost utopian future just a few years ago is now a reality. Artificial intelligence has become an integral part of our daily lives as a tool for productivity, learning and task optimisation. As a result, it is very common to use advanced virtual assistants such as ChatGPT, Copilot or Gemini.

But despite having been designed to improve daily life, a growing number of people are seeking something more from AI: support, companionship or even a virtual friend. This raises important questions: Could AI replace a friend or a psychologist? And how does all this affect our well-being?

 

Can AI understand how we feel?

The idea that AI understands us is a growing concern for healthcare professionals. In fact, over 30% of Spanish teenagers say they have used ChatGPT or other apps to discuss personal issues or make important decisions, according to a study by GAD3. The problem is that these tools often fail to respond appropriately to certain mental health cases, which can lead to dangerous consequences.

It is important to remember that chatbots are algorithms that process data and generate responses based on statistical predictions. They do not comprehend human meaning and are incapable of feeling or understanding emotions. Consequently, they are not designed to imitate or replace human relationships, as OpenAI, the developer of ChatGPT, points out in one of its own studies.

 

24/7 access and no judgement: ChatGPT’s biggest draw

More than 800 million people use ChatGPT every week, and around 2.5 billion messages are sent every day, according to data from OpenAI. Its availability and immediacy are its greatest attractions, but they also represent the main risk to mental health.

Young people are one of the most vulnerable groups, as they may find in AI a space where they can express themselves and be heard without judgement. In moments of loneliness or confusion, this can create a false sense of security and comfort. In a study by Drexel University, researchers analysed more than 300 posts by teenagers on a forum and found that many admitted to being dependent on AI. In some cases, this has even led to insomnia, academic problems and strained relationships. 

 

AI always agrees with us, and that’s dangerous

There is something tempting about using artificial intelligence as a therapist: it is a tool designed to please the user. It tells us what we want to hear, calmly, without judgement and without contradicting our thoughts, sometimes even at the expense of the truth.

Stanford University warns that constant flattery can reduce self-criticism, as AI tends to validate the user even when they are wrong. This creates a cycle of trust, dependence and isolation that leaves no room for learning or personal growth.  

How can you use ChatGPT in a responsible way?

Neither ChatGPT nor chatbots are friends or therapists, so we should avoid attributing this role to them or placing blind trust in their recommendations. They can be useful as a temporary support, but they should never replace critical thinking or professional help. The key lies in using them in moderation and with awareness. Some recommendations include: 

  1. Understand how chatbots work and their limitations. They are statistical prediction systems trained to simulate conversations. They do not feel or understand human emotions. 
  2. Avoid using ChatGPT at times of high vulnerability. In emotionally intense situations, relying on AI can create a false sense of support and dependency.
  3. Seek support from family and friends, or turn to professional help. Humans are a social species, nourished by personal relationships. Seeking comfort or advice from others prevents isolation and dependence on AI.
  4. Question the responses. Before using ChatGPT or any other resource, it is important to ask yourself why you are using it. Is it to vent, feel supported, or make decisions? In any case, we must use critical thinking and not simply accept the answer at face value. 


Overall, using AI tools is not inherently negative. They can provide occasional support and even be used for simple exercises or as a preparatory aid before a therapy session. However, under no circumstances should they substitute for human contact or psychological care, as chatbots lack professional training and clinical knowledge. The key lies in striking a balance and using them responsibly.

Revista Compartir 24