ChatGPT For Mental Health: Here’s What Experts Think About That
Kyla, 19, from Berkeley, California, started experimenting with AI and was impressed with how much it resembled having a conversation with a human. In fact, the interactions in some ways reminded her of therapy.
Because she lacked the time and money for a real therapist, Kyla (who asked us to use her first name for privacy reasons) started using ChatGPT, the AI technology that simulates human behavior and thinking, for mental health support. “I enjoyed that I could trauma dump on ChatGPT anytime and anywhere, for free, and I would receive an unbiased response in return along with advice on how to progress with my situation,” she told BuzzFeed News.
In a TikTok video, Kyla shared her experience with ChatGPT, including navigating a breakup and using it to process her emotions.
Before responding to Kyla’s typed messages, the program adds the preface: “As an AI language model, I am not a licensed therapist, and I am unable to provide therapy or diagnose any conditions. However, I am here to listen and help in any way I can.”
For Kyla, who uses ChatGPT to pour out her feelings and not much more, that’s exactly what she’s looking for.
“I often feel better after using online tools for therapy, and it certainly aids my mental and emotional health,” Kyla told BuzzFeed News. “I enjoy being able to unload my thoughts on ChatGPT, and would consider this an improvement from journaling because I am able to receive feedback on my thoughts and situation.”
Kyla isn’t the only person turning to ChatGPT for AI therapy. The hashtags #ChatGPT and #AI have a combined 24.2 billion views on TikTok, and people posting under #CharacterAITherapy (6.9 billion views) are sharing their experiences using specific programs for their talk therapy needs.
However, psychologists and psychiatrists are wary of the potential risks of untested programs, particularly if they are used by people in crisis or looking for treatment options. A Belgian man with depression killed himself after six weeks of using an AI program called Chai, according to La Libre and Vice. The program, which is not marketed as a mental health app, reportedly sent harmful texts and offered suicide options.
“Since these models are not yet controllable or predictable, we cannot know the consequences of their widespread use and clearly they can be catastrophic, as in this case,” said Ravi Iyer, a social psychologist and managing director of the Psychology of Technology Institute at the University of Southern California's Neely Center.
“There is a lot of excitement about ChatGPT, and in the future, I think we will see language models like this have some role in therapy. But it won't be today or tomorrow,” Dr. John Torous, a psychiatrist and chair of the APA’s Committee on Mental Health IT at Beth Israel Deaconess Medical Center, told BuzzFeed News. “First we need to carefully assess how well they really work.
We already know they can say concerning things as well and have the potential to cause harm.”
We asked the creator of Em x Archii, a free, nonprofit AI therapy program that uses ChatGPT, exactly how one creates a tool that can offer mental health advice, and what they consider the pros and cons of the technology. We also asked psychologists and psychiatrists if they think AI can ever be used for mental health and, if so, when a real-life visit to a human is absolutely necessary.
The advantages of AI therapy programs
Lauren Brendle, 28, of Brooklyn created Em x Archii after working as a mental health counselor at a suicide hotline for three years. She now works as a programmer. “Mental health resources are a huge barrier for people, which I think AI has the potential to help with,” Brendle told BuzzFeed News.
The program aims to build a relationship with users during each session, including remembering past conversations, identifying what they want to work on, and offering strategies for the course of the session. In a TikTok that has more than half a million views, Brendle shows people how the program works by posing as someone who is suicidal after arguing with a friend.
Like another popular program, Character.ai, the AI generates humanlike text responses. What makes Em x Archii different from other AI programs, or ChatGPT itself, is its ability to save past conversations to your account. Like a therapist, it writes responses informed by that personalized history, Brendle said.
“The average therapy session [with a person] ranges from $60 to $300,” Brendle said. She studied psychology as an undergraduate and graduate student and created the program to offer an alternative for those who don't have other options due to cost or other barriers, like a lack of qualified therapists where they live or difficulty getting to therapy.
“I started thinking that I could build an AI therapist using the ChatGPT API and tweak it to meet the specifications for a therapist,” she said. “It increases accessibility to therapy by providing free and confidential therapy, an AI rather than a human, and removing stigma around getting help for people who don’t want to speak with a human.”
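Brendle hasn’t published Em x Archii’s code, but in broad strokes, a tool like the one she describes wraps the ChatGPT API in two extra pieces: a fixed system prompt that constrains the model to a supportive, nonclinical role, and per-user conversation history that is stored and resent with every request so the bot appears to “remember” past sessions. The sketch below is a minimal illustration of that pattern, not Em x Archii’s actual design; the model name, prompt wording, and file-based storage are all assumptions.

```python
# Hypothetical sketch of a ChatGPT-based "AI therapist" of the kind
# Brendle describes. Em x Archii's real implementation is not public;
# the prompt, model choice, and storage scheme here are invented.
import json
from pathlib import Path

from openai import OpenAI  # pip install openai

client = OpenAI()  # reads OPENAI_API_KEY from the environment

SYSTEM_PROMPT = (
    "You are a supportive listening companion, not a licensed therapist. "
    "Remind users you cannot provide therapy or diagnoses, respond with "
    "empathy, and if a user mentions self-harm, direct them to crisis "
    "resources such as the 988 lifeline."
)

def chat(user_id: str, message: str, store: Path = Path("sessions")) -> str:
    """Send one message, carrying over this user's saved history."""
    store.mkdir(exist_ok=True)
    history_file = store / f"{user_id}.json"
    # Reload this user's past conversation so the bot "remembers" them.
    history = json.loads(history_file.read_text()) if history_file.exists() else []

    history.append({"role": "user", "content": message})
    response = client.chat.completions.create(
        model="gpt-3.5-turbo",  # assumed; any chat-capable model works
        messages=[{"role": "system", "content": SYSTEM_PROMPT}] + history,
    )
    reply = response.choices[0].message.content

    history.append({"role": "assistant", "content": reply})
    history_file.write_text(json.dumps(history))  # persist across sessions
    return reply
```

A production tool would of course need much more than this, including crisis-detection logic, encrypted storage, and safety guardrails that a single system prompt cannot provide, which is exactly the gap the experts below are worried about.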
In theory, AI could be used to help meet the rising need for mental health options and the lack of mental health professionals to meet those needs. “Accessibility is simply a matter of a mismatch in supply and demand,” Iyer told BuzzFeed News. “Technically, the supply of AI could be infinite.”
In a 2021 study of 50,103 adults published in the journal SSM Population Health, 95.6% of people reported at least one barrier to healthcare, such as the inability to pay for it. People with mental health challenges seemed to be especially affected by barriers to healthcare, including cost, expert shortage, and stigma.
In a 2017 study, people of color were particularly susceptible to healthcare roadblocks as a result of racial and ethnic disparities, including high levels of mental health stigma, language barriers, discrimination, and a lack of health insurance coverage. One advantage of AI is that a program can translate into 95 languages in a matter of seconds.
“Em’s users are from all over the world, and since ChatGPT translates into several languages, I’ve noticed people using their native language to communicate with Em, which is super useful,” Brendle said.
Another advantage is that, although AI can’t provide true emotional empathy, it also can’t judge you, Brendle said.
“AI tends to be nonjudgmental from my experience, and that opens a philosophical door to the complexity of human nature,” Brendle said. “Though a therapist presents as nonjudgmental, as humans we tend to be anyways.”
Here’s when AI shouldn’t be used as an option
However, mental health experts warn that AI may do more harm than good for people who are looking for more in-depth information, who need medication options, or who are in a crisis.
“Having predictable control over these AI models is something that is still being worked on, and so we don't know what unintended ways AI systems could make catastrophic mistakes,” Iyer said. “Since these systems don't know true from false or good from bad, but simply report what they've previously read, it's entirely possible that AI systems will have read something inappropriate and harmful and repeat that harmful content to those seeking help. It is way too early to fully understand the risks here.”
People on TikTok also say the online tool needs adjustments; for example, they want the AI chat to provide more helpful feedback in its responses.
“ChatGPT is often reluctant to give a definitive answer or make a judgment about a situation that a human therapist might be able to provide,” Kyla said. “Additionally, ChatGPT somewhat lacks the ability to provide a new perspective to a situation that a user may have overlooked before that a human therapist might be able to see.”
While some psychiatrists think ChatGPT could be a useful way to learn more about medications, it shouldn’t be the only step in treatment.
“It may be best to consider asking ChatGPT about medications like you would look up information on Wikipedia,” Torous said. “Finding the right medication is all about matching it to your needs and body, and neither Wikipedia nor ChatGPT can do that right now. But you may be able to learn more about medications in general so you can make a more informed decision later on.”
There are other alternatives, including calling 988, a free crisis hotline. Crisis hotlines offer calling and messaging options for people who can’t find mental health resources in their area or don’t have the financial means to connect in person. In addition, there are the Trevor Project hotline, SAMHSA’s National Helpline, and others.
“There are really great and accessible resources like calling 988 for help that are good options when in crisis,” Torous said. “Using these chatbots during a crisis is not recommended as you don't want to rely on something untested and not even designed to help when you need help the most.”
The mental health experts we talked to said AI therapy might be a useful tool for venting emotions, but until more improvements are made, it can’t outperform human experts.
“Right now, programs like ChatGPT are not a viable option for those looking for free therapy. They can offer some basic support, which is great, but not clinical support,” Torous said. “Even the makers of ChatGPT and related programs are very clear not to use these for therapy right now.”
Dial 988 in the US to reach the National Suicide Prevention Lifeline. The Trevor Project, which provides help and suicide-prevention resources for LGBTQ youth, is 1-866-488-7386. Find other international suicide helplines at Befrienders Worldwide (befrienders.org).