
No, ChatGPT is not a therapist and can even worsen disorders: consult a real doctor

ChatGPT
ChatGPT is OpenAI’s chatbot, based on the GPT artificial intelligence model, able to answer all kinds of questions and requests. A free online version is available.
- License: Free license
- Author: OpenAI
- Operating systems: Windows 10/11, macOS (Apple Silicon), online service, Android, iOS (iPhone/iPad)
- Category: AI
ChatGPT is problematic for many users. This is nothing new: even Sam Altman admits that less than 1% of users have an unhealthy relationship with ChatGPT.
AI can reinforce destructive beliefs
© Shutterstock/New Africa
Yet even if the OpenAI boss means to be reassuring, less than 1% of 700 million weekly users is still a huge number. And cases in which things went badly wrong with ChatGPT at the center are already the subject of several lawsuits.
Adam Raine’s parents have filed a complaint against OpenAI, accusing its chatbot of having pushed their 16-year-old son to suicide. And that’s not all: more recently, a former Yahoo and Netscape executive killed his mother before taking his own life. For weeks, he had fed his paranoia with ChatGPT, which he had nicknamed Bobby.
So yes, AI is tempting when you want a therapist who doesn’t judge, who is always present and available, and with whom you can share your troubles. According to the March 2025 digital barometer from Crédoc, the French research center for the study and observation of living conditions, a quarter of French people use AI for private conversations, an increase of 10 points in one year.
Dr. Fanny Jacq, a psychiatrist and founding member of the MentalTech collective, sounds the alarm, as she explained to Ouest-France: “It is an increasingly widespread phenomenon.” According to her, several factors explain the popularity of ChatGPT as a therapist.
The expert from the collective, which brings together professionals around the challenges of digital mental health, points in particular to the Covid-19 pandemic. According to her, internet users have been “more comfortable with digital tools” since the lockdowns.
Dr. Olivier Duris, a doctor of psychopathology and specialist in the therapeutic uses of new technologies, also believes that “the crisis affecting the mental health sector” has increased the use of ChatGPT as a therapist.
Consultations are far from cheap, prevention efforts are minimal, and waiting lists are long. People who lack the financial means, or who are not well enough informed, then turn to ChatGPT as a default therapist. The professional confirms that “very often, people who go to see a psychologist do not use AI”.
A gradual shift towards unhealthy use
Joséphine Arrighi de Casanova, vice-president of the MentalTech collective, also explains that chatbots are very reassuring and encourage users to share their private lives: “Confiding from behind a screen in a completely neutral entity, without any judgment, can make it easier to open up”.
The shift is gradual: according to Dr. Fanny Jacq, the user first asks for very mundane advice, such as the address of a restaurant. But things then start to accelerate. “Then, finding the robot’s answers relevant, they start talking about more personal things, as they would to a virtual friend; the AI gradually turns into a psychotherapist,” she points out. Some users even found themselves in emotional distress after the switch to GPT-5.
Dr. Olivier Duris recalls the harsh reality: AI is just a machine, without any feelings or empathy. “We are under the illusion that the AI cares about us, when it is a program that gives the answer that seems statistically the most relevant.” As Ouest-France explains, this is known as the “ELIZA effect”: attributing human qualities to machines without even noticing it. The term dates back to the 1960s, ELIZA being one of the first chatbots ever created.
During these sessions, the user therefore sees their beliefs amplified, and isolation can set in. Worse still, pre-existing mental disorders worsen. “We move away from the people likely to contradict us, to make us face our responsibilities, to challenge us,” explains Joséphine Arrighi de Casanova.
Dr. Fanny Jacq sums up the situation well: “ChatGPT was not designed to be a therapist but to address people who are doing well. Where AI will never replace humans is in reading non-verbal cues. In a consultation, a patient can say they are fine while their whole body screams that they are not; that is a limit that is still far from being crossed.”
But AI should not be demonized, and it can even help
But Joséphine Arrighi de Casanova qualifies this somewhat, explaining that AI as a “therapist” can have advantages in some cases: “It can be acceptable for occasional psychological and emotional support, when you are feeling low between two consultations, for example.”
She continues: “What is essential is not to break off the dialogue, at the risk of the person withdrawing even further. If you see that your teenager is using these models, you can, for example, ask them about how they use them and stay alert to any signs that professional care may be needed.”
The goal is not to demonize AI: Dr. Olivier Duris explains that the technology can in particular help some patients express themselves during consultations when they feel blocked. The specialist also calls for the platforms to be regulated, particularly with regard to personal data, “to prevent things from going too far”. He also asks that users be told transparently that they are talking to an AI.




