ChatGPT as a therapist: the main risks

A chatbot can sound empathetic. That does not make it a therapist.

Important

If you are in an acute crisis or thinking about harming yourself, contact emergency services, a crisis hotline, or a trusted person immediately. AI cannot replace professional help in these situations.

Empathy is not the same as care

ChatGPT can generate warm, validating language, and in a difficult moment that can feel genuinely meaningful.

But therapy is more than tone. It involves boundaries, professional judgment, risk assessment, and a relationship in which the practitioner is accountable to the person receiving help.

Where the risk appears

  • The bot may agree too readily with harmful assumptions instead of challenging them.
  • It may miss warning signs that human support is needed.
  • It may keep a user in long conversation loops without a clear next step.
  • It cannot take professional responsibility or provide crisis care.

A better frame

AI can be part of a supportive product experience when the product is designed specifically for mental health, makes its limits explicit, and guides users toward human help whenever a situation goes beyond everyday support.

Frequently asked questions

Can I use ChatGPT as a therapist?

No. ChatGPT may help with everyday reflection, but it is not a substitute for professional care and should not be treated as a therapist.

What is different about a mental health support app?

A support app should include purpose-built flows, safety boundaries, check-ins, exercises, and clear guidance for crisis or clinical situations.

Important boundary

Alera does not replace psychotherapy, medical diagnosis, treatment, or emergency support. If you are in immediate danger, contact local emergency or crisis services.