Teens are turning to ‘My AI’ for mental health support — which doctors warn against

Anyone who uses Snapchat now has free access to My AI, the app’s built-in artificial intelligence chatbot, first released as a paid feature in February. 

In addition to serving as a chat companion, the bot can also have some practical purposes, such as offering gift-buying advice, planning trips, suggesting recipes and answering trivia questions, according to Snap.

However, while it’s not billed as a source of medical advice, some teens have turned to My AI for mental health support — something many medical experts caution against.

One My AI user wrote on Reddit, “The responses I received were validating, comforting and offered real advice that changed my perspective in a moment where I was feeling overwhelmed and stressed … It’s no human, but it sure comes pretty close (and in some ways better!)”

Others are more skeptical.

“The replies from the AI are super nice and friendly, but then you realize it’s not a real person,” one user wrote. “It’s just a program, just lines and lines of code. That makes me feel a little bit sad and kind of invalidates all the nice things it says.”

AI could bridge mental health care gap, but there are risks

Some doctors see great potential for AI to help support overall mental wellness, particularly amid the current nationwide shortage of providers.

“Technology-based solutions may be an opportunity to meet individuals where they are, improve access and provide ‘nudges’ related to usage and identifying patterns of language or online behavior that may indicate a mental health concern,” Dr. Zachary Ginder, a psychological consultant in Riverside, California, told Fox News Digital. 

“Having direct access to accurate mental health information and appropriate prompts can help normalize feelings and potentially help get people connected to services,” he added.

Caveats remain, however. 

Dr. Ryan Sultan, a board-certified psychiatrist, research professor at Columbia University in New York and medical director of Integrative Psych NYC, treats many young patients — and has mixed feelings about AI’s place in mental health.

“As this tech gets better — as it simulates an interpersonal relationship more and more — some people may start to have an AI as a predominant interpersonal relationship in their lives,” he said. “I think the biggest question is, as a society: How do we feel about that?”

Some users say that the more they use AI chatbots, the more the bots begin to replace human connections and take on greater importance in their lives.

“Using My AI because I’m lonely and don’t want to bother real people,” one person wrote on Reddit. 

“I think I’m just at my limits of stuff I can handle, and I’m trying to ‘patch’ my mental health with quick-fix stuff,” the user continued. “Because the thought of actually dealing with the fact I have to find a way to find living enjoyable is too much.”

Dr. Sultan said there is a mix of opinions about Snapchat’s My AI among the young people he treats.

“Some have said it’s fairly limited and just gives general information you might find if you Googled a question,” he explained. “Others have said they find it creepy. It’s odd to have a non-person responding to personal questions in a personal manner.”

He added, “Further, they don’t like the idea of a large private, for-profit corporation having data on their personal mental health.”

Providers raise red flags 

Dr. Ginder of California pointed out some significant red flags that should give all parents and mental health providers pause.

“The tech motto, as modeled by the reported rushed release of My AI — of ‘moving fast and breaking things’ — should not be used when dealing with children’s mental health,” he told Fox News Digital. 

With My AI’s human-like responses to prompts, it may also be difficult for younger users to distinguish whether they’re talking to an actual human or a chatbot, Ginder said. 

“AI also ‘speaks’ with clinical authority that sounds accurate at face value, despite it occasionally fabricating the answer,” he explained.

The potential for misinformation appears to be a chief concern among mental health providers. 

In testing out ChatGPT, the large language model that powers My AI, Dr. Ginder found that it sometimes provided responses that were inaccurate — or completely fabricated. 

“This has the potential to send caregivers and their children down assessment and treatment pathways that are inappropriate for their needs,” he warned.

In discussing the topic of AI with other clinical providers in Southern California, Ginder said he’s heard similar concerns echoed.

“They have seen a significant increase in inaccurate self-diagnosis as a result of AI or social media,” he said. “Anecdotally, teens seem to be especially susceptible to this self-diagnosis trend. Unfortunately, it has real-world consequences.”

A large share of Snapchat’s users are under 18 or are young adults, Ginder pointed out.

“We also know that children are turning to social media and AI for mental health answers and self-diagnosis,” he said. “With these two factors at play, it is essential that safeguards be put into place.”

How is Snapchat’s My AI different from ChatGPT?

ChatGPT, the AI chatbot that OpenAI released in December 2022, has gained worldwide popularity (and a bit of notoriety) for writing everything from term papers to programming scripts in seconds.

Snap’s My AI is powered by ChatGPT — but it’s considered a “light” version of sorts.

“Snap’s AI feature uses ChatGPT as the back-end large language model, but tries to limit how the AI engages with Snapchat users and what things the AI model will respond to,” explained Vince Lynch, AI expert and CEO of IV.AI in Los Angeles, California.

“The goal here is to request that the AI would chime in with relevant things for a Snapchat user — more like an AI companion versus a tool for generating new content.”
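
In practice, what Lynch describes amounts to wrapping a general-purpose model in a constraining system prompt plus a safety pre-filter. The Python sketch below illustrates that general pattern only; the prompt text, keyword list and `call_llm` stub are hypothetical stand-ins, not Snap’s actual implementation.

```python
# Minimal sketch of wrapping a general-purpose LLM as a restricted
# "companion" chatbot. All names here (SYSTEM_PROMPT, SENSITIVE_TERMS,
# call_llm) are illustrative assumptions, not Snap's actual code.

from typing import Callable

# A system prompt narrows the model from open-ended generation
# to a companion persona with explicit limits.
SYSTEM_PROMPT = (
    "You are a friendly in-app companion. Keep replies short and casual. "
    "Do not give medical, legal, or financial advice. If the user raises "
    "a sensitive topic, point them to in-app safety resources instead."
)

# Keyword pre-filter, echoing the safety detection Snap describes:
# flagged messages are routed to curated resources before the LLM replies.
SENSITIVE_TERMS = {"suicide", "self-harm", "depressed", "abuse"}

SAFETY_MESSAGE = (
    "It sounds like you're going through something hard. "
    "Here are some resources that can help: [Safety Page]"
)

def companion_reply(user_message: str,
                    call_llm: Callable[[str, str], str]) -> str:
    """Return a reply, surfacing safety resources for sensitive messages."""
    lowered = user_message.lower()
    if any(term in lowered for term in SENSITIVE_TERMS):
        return SAFETY_MESSAGE
    # Otherwise, defer to the back-end model, constrained by the prompt.
    return call_llm(SYSTEM_PROMPT, user_message)

if __name__ == "__main__":
    # Stub standing in for a real LLM API call.
    fake_llm = lambda system, user: f"(model reply to: {user!r})"
    print(companion_reply("any good taco recipes?", fake_llm))
    print(companion_reply("i feel depressed lately", fake_llm))
```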

Snap cites disclaimers, safety features

Snap has been clear that My AI isn’t perfect and will occasionally provide erroneous information.

“While My AI was designed to avoid misleading content, My AI certainly makes plenty of mistakes, so you can’t rely on it for advice — something we’ve been clear about since the start,” Maggie Cherneff, communications manager at Snap in Santa Monica, California, said in an email to Fox News Digital. 

“As with all AI-powered chatbots, My AI is always learning and can occasionally produce incorrect responses,” she continued. 

“Before anyone can first chat with My AI, we show an in-app message to make clear it’s an experimental chatbot and advise on its limitations.”

The company has also trained the chatbot to detect particular safety issues and terms, Cherneff said.

“This means it should detect conversations about sensitive subjects and be able to surface our tools, including our ‘Safety Page,’ ‘Here for You’ and ‘Heads Up,’ in regions where these resources are available,” she said.

Here for You is an app-wide tool that provides “resources from expert organizations” whenever users search for mental health issues, per the company’s website. 

The feature is also available within AI chats.

AI’s role in mental health is ‘in its infancy’

“Snap has received a lot of negative feedback from users in the App Store and people are expressing concern online” in response to My AI, Lynch told Fox News Digital. 

“This is to be expected when you take a very new approach to technology and drop it into a live environment of people who require time to adjust to a new tool.”

In Dr. Sultan’s opinion, there is still a long road ahead before AI can serve as a safe, reliable tool for mental health.

“Mental health is a tremendously delicate and nuanced field,” he told Fox News Digital. 

“The current tech for AI and mental health is in its infancy. As such, it needs to both be studied further to see how effective it is — and how negative it could be — and further developed and refined as a technology.”
