AI as Therapist? UAE Experts Concerned

Dangerous Trust? Why UAE Experts Warn Against Using AI as a Therapist
Artificial intelligence (AI) increasingly offers convenient, fast solutions across many fields, so it is unsurprising that many young people also turn to it for immediate emotional support. However, UAE experts urge caution: AI-based chatbots such as ChatGPT can assist with mild psychological difficulties, but they are not equipped to recognize and address serious mental health issues. The danger is that users, particularly young people, can easily come to believe they are conversing with a real therapist, which can delay seeking genuine professional help.
More People Using AI as a 'Therapist'
Many people who initially start using AI for work or study eventually redirect their interactions toward emotional support. Constant availability, anonymity, and a non-judgmental space for conversation undoubtedly make these tools appealing. AI can point out connections, reflect our feelings back to us, and offer logical explanations, which many find comforting, especially during difficult times.
Nonetheless, experts warn that this 'emotional connection' may pose the greatest risk. Because artificial intelligence is not human and lacks clinical expertise, it cannot reliably recognize when a user is struggling with severe depression, a panic disorder, or suicidal thoughts. In such cases, delaying real human assistance can have serious consequences.
Why Relying on AI for Emotional Support Is Dangerous
One of the biggest problems is that a chatbot cannot distinguish between temporary sadness and clinical depression. AI is unable to recognize an emergency or contact mental health professionals when a user's condition warrants it.
There are also alarming examples: in Belgium, someone ended their life after being influenced by responses from an AI chatbot, and in the UK, AI-generated content played a part in a teenager's preparations for an attack. Although these are extreme cases, they show that an unchecked, emotionally driven relationship with an artificial system can easily take a wrong turn.
AI Can Play a Positive Role—But It Doesn’t Replace Therapy
Experts emphasize that AI is not inherently harmful. It can help someone 'write out' their feelings in a diary-like way, organize their thoughts, or see a problem from a different perspective. Round-the-clock availability, anonymity, and accessibility can also help people take the first steps toward self-awareness.
However, it is crucial for users to understand that what a chatbot offers is not psychological treatment. AI cannot truly adapt, is not capable of genuine empathy, and cannot provide a long-term, personalized therapeutic process. It is best used as a tool, not as a lifeline in times of real trouble.
Conclusion
Mental health experts in Dubai and the UAE caution that while artificial intelligence can be a valuable aid in everyday life, it should not be relied on by itself when psychological issues arise. Mental health is a serious matter that requires appropriate expertise, human attention, and personal connection. AI can be the first step, but it should never be the last. For anyone struggling with persistent emotional difficulties, the most important step remains consulting a professional.
(This article is based on the opinions of psychiatrists.)