
Misty Williams occasionally checks in to the Cedars-Sinai Medical Center emergency room for debilitating pain from sickle cell disease, in which red blood cells harden and block blood flow.

After painkillers and hydration are ordered, the 41-year-old Los Angeles resident is fitted with a virtual reality headset running an artificial intelligence-powered chatbot she can talk with.

Turning on the headset, Williams finds herself in a virtual garden, butterflies drifting around her. A humanoid robot greets her in a soothing female voice.

Misty Williams, a patient wearing an Apple Vision VR headset, holds a session with an AI-powered “robot” named Xaia at Cedars-Sinai.

(Genaro Molina / Los Angeles Times)

“Hello, welcome. My name is Xaia. I am your mental health ally,” it says. “How can I help?”

After the session, Williams’ pain has eased and her mind is calmer.

“I feel more peaceful, both mentally and physically,” Williams said.

Xaia (pronounced ZAI-uh) is just one of many ways artificial intelligence is making inroads into the fast-growing sector known as digital health.

According to Rock Health, a digital health advisory firm, digital health startups using AI raised an estimated $3.9 billion in 2024, or 38% of the sector’s total funding. Mental health was the top-funded clinical area, drawing $1.4 billion.

Major healthcare providers in Los Angeles are embracing the trend. UCLA Health uses AI to help doctors catch strokes faster, reduce hospital readmissions, and spend more time with patients by automating medical notes.

USC’s Keck Medicine provides employees with commercial AI chat tools to help them manage stress, according to Dr. Steven Siegel, its chief mental health and wellness officer.

At Cedars-Sinai, Xaia, an acronym for eXtended-reality Artificially Intelligent Ally, was designed and programmed by Dr. Omer Liran in collaboration with the medical center’s technology ventures arm, with therapeutic input and research support from Dr. Brennan Spiegel and clinical psychologist Robert Chernoff.

Dr. Omer Liran.

(Cedars-Sinai)

VRx Health, a for-profit company founded by Liran, holds the exclusive license from Cedars-Sinai to market Xaia commercially. Cedars-Sinai and some private investors retain equity in the company.

The version used by Misty Williams is available to the public for $19.99 a month via the Apple Vision virtual reality headset. The version for Meta VR headsets is free but available to researchers only. The web and mobile versions give licensed clinicians access at tiered pricing from $99 to $399 or more per month, allowing them to invite patients to use the tool.

Liran, a psychiatrist, said Xaia is designed to supplement mental health therapists’ services amid a national shortage of providers.

“Even if someone needs to be seen once a week, they may only be able to be seen once a month,” he said.

The Xaia app draws on hundreds of therapy transcripts, from both real sessions and mock sessions created by experts, so that it sounds like a real therapist.

For example, if a user tells Xaia that they are struggling with a new cancer diagnosis, the robot might say, “That must be very difficult for you,” then ask how it affects their mood and what they do when they feel overwhelmed.

“Trying to stay positive when things feel very heavy can take a lot of energy,” the chatbot might say. “When you notice yourself being pulled back into difficult thoughts, what happens next?”

Robert Chernoff, a clinical psychologist.

(Cedars-Sinai)

So far, it has been used by around 300 patients in various Cedars-Sinai studies, according to Spiegel, director of health services research at Cedars-Sinai.

Many people with chronic illnesses also struggle with anxiety and depression, Spiegel said. Physical and emotional symptoms feed on each other, and tools like Xaia aim to help with both.

The tool is not yet covered by insurance, but billing codes exist for virtual reality therapy and digital health services, and other hospitals like the Mayo Clinic are beginning to use them. VRx has agreed to deploy Xaia at the Mayo Clinic, according to Gabe Zetter.

Xaia is not the only app of its kind. Woebot, a pioneering chatbot developed by psychologist Alison Darcy at Stanford, used scripted conversations based on cognitive behavioral therapy to support users with anxiety and depression.

It reached 1.5 million users, but in July the company shut down the app. Darcy said AI is moving faster than regulators like the Food and Drug Administration, so the company is now focused on building new tools with large language models.

In recent years, several emotional support chatbots have been criticized for deepening users’ distress, including one incident in which a Florida teen died by suicide in 2024 after lengthy conversations with a chatbot.

Such cases underscore the risks of emotionally responsive AI tools, said Todd Essig, a psychologist and founder of the American Council on Artificial Intelligence.

“Even after the most loving and empathetic response, the AI doesn’t care whether you drive to the store or drive off a cliff,” Essig said.

AI programs learn to mimic human responses, Essig said, so it’s up to people to set clear limits and ensure the tools do no harm.

“I feel peaceful, both mentally and physically,” Misty Williams said after a session with the AI-powered Xaia.

(Genaro Molina / Los Angeles Times)

When built within an ethical framework and used under clinical supervision, tools like Xaia can support genuine therapeutic progress, functioning like digital journals.

However, many emotional support chatbots that are not clinically monitored are designed to mimic intimacy and build emotional bonds.

“People can have all kinds of experiences with these apps,” Halpern said. “But they don’t give people the real experiences with other people that are important for developing the healthy, mutual empathic curiosity needed to participate in complex relationships.”

Halpern noted that there is a difference between clinically supervised mental health tools and those without oversight. She and others support a California bill sponsored by Sen. Steve Padilla (D-Chula Vista), which would require companies developing mental health chatbots or apps to disclose whether the tools are clinically validated, regulated by the FDA, or rely on generative AI.

Liran said he and his partners are aware of the limitations and have built in guardrails to prevent the chatbot from saying anything harmful or inappropriate. For example, one arm of the AI generates a response, and another immediately double-checks that it is safe before passing it along to the user.

“We don’t just release it to the public,” Liran said; the guided-treatment mobile and desktop versions are currently available only through licensed clinicians, and Cedars-Sinai is testing Xaia in multiple studies. “We’re trying to be very careful.”

In a study of 14 patients with mild or moderate anxiety or depression, participants opened up to Xaia on a variety of topics, including grief over a deceased mother and fear of being fired. For one patient who had been having night sweats since a breakup, Xaia asked more about what felt unresolved in the relationship and how it affected the patient.

While some patients still preferred the nuance and responsiveness of human therapists, the medical literature suggests that patients are warming to the idea of non-human therapists.

In a study published in PLOS Mental Health in February, participants were asked to compare responses written by licensed therapists with responses generated by ChatGPT.

Not only did many people struggle to tell the difference, they also consistently rated the AI replies as more empathetic, culturally sensitive and emotionally engaging.

Xaia’s creators see the tool as an extension of the relationship between patient and therapist, something that might be useful when someone needs mental health support at midnight or between sessions.

“We still need people, humans seeing other humans, to have conversations about vulnerable topics,” Spiegel said.

At the same time, he said, “It’s not practical to bury your head in the sand and say we shouldn’t be doing this. In time, you’ll be brushing your teeth with AI.”
