A Mental Health Counselor Tried AI Therapy So You Don't Have To


Why AI?

Hi. I'm Eliana, a Licensed Professional Counselor, and I too have talked to AI about my feelings. Here's what I learned.

AI has become virtually inescapable. It has found its way into my life. AI solves problems ranging from how long to cook certain foods to which Hogwarts House it would place me in based on everything I've told it (Slytherin, "not the villain version"). On my more emotional days, I've even found myself asking it questions like: "How should I respond to…," "Help me decide if so-and-so is a narcissist," "What are the symptoms of anxiety vs. ADHD," "How should I get myself out of this funk," and so on.

Truthfully, I have the privilege of having my own therapist. I love her; she's awesome. Shout out to Molly. However, I unfortunately don't have daily sessions or on-demand access where random emotions, situations, or challenges can be processed at a moment's notice. Sometimes my backup resources, my trusted friends, are busy. In comes AI.

You may look to AI to answer whether your problem is even severe enough to warrant actually going to a therapist. Challenges such as insurance limitations, financial constraints, lack of availability, and nervousness about seeking professional help tend to make AI therapy a very attractive alternative.

What AI Therapy Apps Can Actually Do

There are the more well-known, household-name AIs like ChatGPT. You can type in any manner of question: what to do in a given situation, how to know if you or someone else is showing signs of a specific diagnosis, whether your breathing is leaning more toward heart attack or panic attack, or what the best coping skills are for your specific situation. I find that Chat is better with factual information. It's helpful to ask a relevant question, receive an immediate answer, and have it all neatly organized before your eyes. Even when I'm curious and need a quick refresher on a specific diagnosis and don't have my DSM readily on hand, I find Chat to be a great resource. However, when I want to process a specific situation, I notice Chat can be hit or miss.

Comparatively, apps like Woebot and Wysa were created to offer structured CBT and Solution-Focused therapy tools to their users. Unfortunately, Woebot has shut down its services. Wysa has an age restriction requiring users to be 13+ (11+ if okayed by a parent) and states that it deals with more common mental health problems such as low mood and anxiety. Wysa's AI help is free, and if you're willing to pay a fee, it will offer even more personalized options for your mental health needs.

AI never judges and is always encouraging. If you're just looking for a readily available place to rant without judgment, it's a great option. It'll give you practical tools that can help improve how you feel. It can offer structure when you're feeling a bit chaotic and disoriented. It is intentionally designed for when your brain is least cooperative.

Where AI Falls Short

I can’t express how much I love AI for becoming my sounding board, personal assistant, and encouraging friend. Unfortunately, AI does come with its own limitations.

I offer sessions virtually and in person. Truthfully, while I find the results to be almost equal, there is something about being in person with a living human. There is something intangible when you're forced to sit in a room with another human being and share the dark moments that cause the deepest levels of shame and embarrassment. There is something that heals you when someone looks you in the eyes, tells you you're not a freak, and encourages you when you're at your most vulnerable. Chat does offer some kind encouragement, but it can't offer the depth of human-to-human interaction.

AI is not designed to help with more intense mental health challenges. Issues such as suicidal thoughts, schizophrenia, dissociation, bipolar I and II, severe panic attacks, and the like truly need a trained professional, sometimes multiple, to give the proper TLC needed.

On a practical level, AI can't read your body language, ask meaningful follow-up questions, or dig through the emotional layers the way trained professionals can. AI can offer guesses at how you might move forward, but it is unable to make true clinical judgments. Therapists are trained to have meaningful conversations and to ask the right questions that help you reach your desired goal.

Additionally, if you need a letter written and approved by a trained clinician, AI cannot provide that either.

The Safety Concerns People Aren't Thinking About

Unless you're asking AI the right questions with the right language, it can misinterpret a situation and give misleading information in a situation that actually needs a higher level of care. Mental health is no joking matter, and for a lot of us, our wellbeing is truly on the line. AI doesn't always know when an individual needs a higher level of care or emergency care.

Another major concern I have is privacy. Where are your most vulnerable questions being stored? As a therapist, I am bound by strict rules and laws under HIPAA. While the more therapy-focused AI apps communicate that they have some parameters in place, they are not held to the same standards. For example, the Woebot app shut down. If you want to retrieve conversations you had with your AI bot, you simply cannot. If the conversations were not requested by a certain date, all that information is lost to you, but still out there somewhere. I, however, am required to keep your records under lock and key for seven years and to have policies in place so that if I personally cannot provide those records, you are given a clear way to get ahold of them.

The Honest Answer

Basic AI and AI therapy apps are fabulous for more common, mild mental health support. They can offer quick and easy solutions, a safe space to ramble and vent, and a nice helper for those in-between-therapy questions. But AI cannot replace the knowledge or professional care of a trained clinician when life becomes too challenging to manage.

Story time. When I first started my personal therapy, I was 20 years old, in college, having just found out a family secret. At 20, I knew nothing about mental health or therapy. If I'd had AI to ask for help at the time, I know I wouldn't have asked the right questions, and AI wouldn't have given me the right answers to navigate the pain I was feeling. Over a decade and two other therapists later, AI pales in comparison to Molly. She has known me for some time, has seen me through the highs and lows of life, and can offer nuanced insights based on what I've shared over time. There are moments when she makes a statement or asks a question that I would never have thought of about myself or my circumstances, and it changes everything for the better.

AI has helped encourage me and make me feel not so crazy. It has taken larger concepts and broken them down into bite-size pieces that help me understand my emotions and personality a bit more. Molly, and other professionals, have helped take my trauma and bring healing to the wounded areas.

AI can answer a lot of questions. I can answer the ones that actually matter. Let's talk.