3 things to know before talking to ChatGPT about your mental health

ChatGPT has advice to offer, but should you take it?
By Rebecca Ruiz
People are using ChatGPT for mental health advice, even though it wasn't designed for that. Credit: baranozdemir / istock / Getty Images Plus

Freddie Chipres couldn't shake the melancholy that lurked at the edges of his otherwise "blessed" life. He occasionally felt lonely, particularly when working from home. The married 31-year-old mortgage broker wondered if something was wrong: Could he be depressed?

Chipres knew friends who'd had positive experiences seeing a therapist. He was more open to the idea than ever before, but it would also mean finding someone and scheduling an appointment. Really, he just wanted a little feedback about his mental health.

That's when Chipres turned to ChatGPT, a chatbot powered by artificial intelligence that responds in a surprisingly conversational manner. After the latest iteration of the chatbot launched in December, he watched a few YouTube videos suggesting that ChatGPT could be useful not just for things like writing professional letters and researching various subjects, but also for working through mental health concerns.

ChatGPT wasn't designed for this purpose, which raises questions about what happens when people turn it into an ad hoc therapist. While the chatbot is knowledgeable about mental health, and may respond with empathy, it can't diagnose users with a specific mental health condition, nor can it reliably and accurately provide treatment details. Indeed, some mental health experts are concerned that people seeking help from ChatGPT may be disappointed or misled, or may compromise their privacy by confiding in the chatbot.

OpenAI, the company that hosts ChatGPT, declined to respond to specific questions from Mashable about these concerns. A spokesperson noted that ChatGPT has been trained to refuse inappropriate requests and block certain types of unsafe and sensitive content.

In Chipres' experience, the chatbot never offered unseemly responses to his messages. Instead, he found ChatGPT to be refreshingly helpful. To start, Chipres googled different styles of therapy and decided he'd benefit most from cognitive behavioral therapy (CBT), which typically focuses on identifying and reframing negative thought patterns. He prompted ChatGPT to respond to his queries like a CBT therapist would. The chatbot obliged, though with a reminder to seek professional help.

Chipres was stunned by how swiftly the chatbot offered what he described as good and practical advice, like taking a walk to boost his mood, practicing gratitude, doing an activity he enjoyed, and finding calm through meditation and slow, deep breathing. The advice amounted to reminders of things he'd let fall by the wayside; ChatGPT helped Chipres restart his dormant meditation practice.

He appreciated that ChatGPT didn't bombard him with ads and affiliate links, like many of the mental health webpages he encountered. Chipres also liked that it was convenient, and that it simulated talking to another human being, which set it notably apart from perusing the internet for mental health advice.

"It's like if I'm having a conversation with someone. We're going back and forth," he says, momentarily and inadvertently calling ChatGPT a person. "This thing is listening, it's paying attention to what I'm saying...and giving me answers based off of that."

Chipres' experience may sound appealing to people who can't or don't want to access professional counseling or therapy, but mental health experts say they should consult ChatGPT with caution. Here are three things you should know before attempting to use the chatbot to discuss mental health.

1. ChatGPT wasn't designed to function as a therapist and can't diagnose you.

While ChatGPT can produce a lot of text, it doesn't yet approximate the art of engaging with a therapist. Dr. Adam S. Miner, a clinical psychologist and epidemiologist who studies conversational artificial intelligence, says therapists may frequently acknowledge when they don't know the answer to a client's question, in contrast to a seemingly all-knowing chatbot.

This therapeutic practice is meant to help the client reflect on their circumstances to develop their own insights. A chatbot that's not designed for therapy, however, won't necessarily have this capacity, says Miner, a clinical assistant professor in Psychiatry and Behavioral Sciences at Stanford University.

Importantly, Miner notes that while therapists are prohibited by law from sharing client information, people who use ChatGPT as a sounding board do not have the same privacy protections.


"We kind of have to be realistic in our expectations where these are amazingly powerful and impressive language machines, but they're still software programs that are imperfect, and trained on data that is not going to be appropriate for every situation," he says. "That's especially true for sensitive conversations around mental health or experiences of distress."

Dr. Elena Mikalsen, chief of pediatric psychology at The Children's Hospital of San Antonio, recently tried querying ChatGPT with the same questions she receives from patients each week. Each time Mikalsen tried to elicit a diagnosis from the chatbot, it rebuffed her and recommended professional care instead.

This is, arguably, good news. After all, a diagnosis ideally comes from an expert who can make that call based on a person's specific medical history and experiences. At the same time, Mikalsen says people hoping for a diagnosis may not realize that numerous clinically validated screening tools are available online.

For example, a Google mobile search for "clinical depression" immediately points to a screener known as the PHQ-9, which can help determine a person's level of depression. A healthcare professional can review those results and help the person decide what to do next. When suicidal thinking is referenced directly, language the chatbot says may violate its content policy, ChatGPT provides contact information for the 988 Suicide and Crisis Lifeline and the Crisis Text Line.

2. ChatGPT may be knowledgeable about mental health, but it's not always comprehensive or right.

When Mikalsen used ChatGPT, she was struck by how the chatbot sometimes supplied inaccurate information. (Others have criticized ChatGPT for presenting its responses with disarming confidence.) When Mikalsen asked about treating childhood obsessive compulsive disorder, the chatbot focused on medication, but clinical guidelines clearly state that a type of cognitive behavioral therapy is the gold standard.

Mikalsen also noticed that a response about postpartum depression didn't reference more severe forms of the condition, like postpartum anxiety and psychosis. By comparison, a Mayo Clinic explainer on the subject included that information, along with links to mental health hotlines.

It's unclear whether ChatGPT has been trained on clinical information and official treatment guidelines, but Mikalsen likened much of its conversation to browsing Wikipedia. The generic, brief paragraphs of information left Mikalsen feeling like it shouldn't be a trusted source for mental health information.

"That's overall my criticism," she says. "It provides even less information than Google."

3. There are alternatives to using ChatGPT for mental health help.

Dr. Elizabeth A. Carpenter-Song, a medical anthropologist who studies mental health, said in an email that it's completely understandable why people are turning to a technology like ChatGPT. Her research has found that people are especially interested in the constant availability of digital mental health tools, which they feel is akin to having a therapist in their pocket.

"Technology, including things like ChatGPT, appears to offer a low-barrier way to access answers and potentially support for mental health." wrote Carpenter-Song, a research associate professor in the Department of Anthropology at Dartmouth College. "But we must remain cautious about any approach to complex issues that seems to be a 'silver bullet.'"

"We must remain cautious about any approach to complex issues that seems to be a 'silver bullet.'"
- Dr. Elizabeth A. Carpenter-Song, research associate professor, Dartmouth College

Carpenter-Song noted that research suggests digital mental health tools are best used as part of a "spectrum of care."

Those seeking more digital support, in a conversational context similar to ChatGPT, might consider chatbots designed specifically for mental health, like Woebot and Wysa, which offer AI-guided therapy for a fee.

Digital peer support services are also available to people looking for encouragement online, connecting them with listeners who are ideally prepared to offer that support sensitively and without judgment. Some, like Wisdo and Circles, require a fee, while others, like TalkLife and Koko, are free. (People can also access Wisdo free through a participating employer or insurer.) However, these apps and platforms range widely and aren't meant to treat mental health conditions.

In general, Carpenter-Song believes that digital tools should be coupled with other forms of support, like mental healthcare, housing, and employment, "to ensure that people have opportunities for meaningful recovery."

"We need to understand more about how these tools can be useful, under what circumstances, for whom, and to remain vigilant in surfacing their limitations and potential harms," wrote Carpenter-Song.

UPDATE: Jan. 30, 2023, 12:59 p.m. PST This story has been updated to include that people can access Wisdo for free through a participating employer or insurer.

If you're feeling suicidal or experiencing a mental health crisis, please talk to somebody. You can reach the 988 Suicide and Crisis Lifeline at 988; the Trans Lifeline at 877-565-8860; or the Trevor Project at 866-488-7386. Text "START" to Crisis Text Line at 741-741. Contact the NAMI HelpLine at 1-800-950-NAMI, Monday through Friday from 10:00 a.m. – 10:00 p.m. ET, or email [email protected]. If you don't like the phone, consider using the 988 Suicide and Crisis Lifeline Chat at crisischat.org. Here is a list of international resources.

Rebecca Ruiz

Rebecca Ruiz is a Senior Reporter at Mashable. She frequently covers mental health, digital culture, and technology. Her areas of expertise include suicide prevention, screen use and mental health, parenting, youth well-being, and meditation and mindfulness. Prior to Mashable, Rebecca was a staff writer, reporter, and editor at NBC News Digital, special reports project director at The American Prospect, and staff writer at Forbes. Rebecca has a B.A. from Sarah Lawrence College and a Master's in Journalism from U.C. Berkeley. In her free time, she enjoys playing soccer, watching movie trailers, traveling to places where she can't get cell service, and hiking with her border collie.

