Communicating about AI Chatbots and Mental Health


It is estimated that 52% of all adults in the United States use artificial intelligence chatbots like ChatGPT, Claude, or Gemini, and usage continues to grow. Reasons people may turn to chatbots for mental health support include cost and information barriers to health care, reduced stigma and fear of judgment, and distrust of the health care system.

While chatbot tools can provide quick information about health issues, they are not a replacement for a therapist or medical professional. In October 2025, OpenAI released data indicating that more than a million ChatGPT users each week show “explicit indicators of potential suicidal planning or intent” in their conversations. Using AI in place of a therapist or medical treatment can be harmful to an individual and the people around them. Here are key points to share with your communities about using AI chatbots to discuss or diagnose mental health issues.

  • Conversation isn’t the same as therapy. Trained therapists can diagnose, reduce harm, and create a treatment plan tailored to your mental health needs.
  • Artificial intelligence can be dangerous to people in crisis or experiencing serious mental health symptoms. Some chatbots are built to validate or mimic the emotions you share with them, which can reinforce harmful behavior rather than provide a healthy path forward.
  • Chatbots are designed to generate a response, not provide treatment. Information scraped from the internet can be false, fabricated, or taken out of context, and some AI models use emotional language to keep you in conversation. This can create a reliance on AI that actually harms mental health.
  • Every brain is one-of-a-kind; chatbots are one-size-fits-all. Artificial intelligence is engineered to generate answers from a prompt, not from education, experience, or training. While your question may get answered, a medical professional will ask questions, explore options, and tailor treatment to your specific needs.
  • The information you share is not always private. With a chatbot, doctor-patient confidentiality does not exist and your privacy is not guaranteed. 
  • Chatbots are not telehealth professionals. If in-person therapy visits are not possible, artificial intelligence is not your only option for support. Explore virtual therapy sessions that you can access from anywhere. 
  • If you need immediate mental health support, call or text the 988 Lifeline. For ongoing support, research professional care options or ask your doctor for a referral. You can also contact [insert a local resource tailored to your community or audience].

Related Communication Tools