How effective can AI be as a mental health resource?

SALT LAKE CITY — Artificial intelligence is used for all sorts of things, like explaining complex topics or even finding a new spring recipe. But what about using it for your mental health?

“We’re growing lonelier as a society. That’s not a void AI can fill — other people can,” said Matt Draper, a psychologist based in Springville.

During this year's legislative session, Utah lawmakers passed House Bill 452, which regulates mental health chatbots that use artificial intelligence, including prohibiting the chatbots from using individuals' personal information.

One form of AI being used for mental health support is generative AI, which includes tools like ChatGPT and Copilot.

“I was going through a rough time, I really needed someone to talk to, I reached out to my friends, but they were busy, so I turned to the AI,” said 17-year-old Mac Martin, a mental health advocate. “That kind of helped me figure out some of my thoughts. Even just the act of being alone and speaking out loud can help.”

Some Utahns are on board with using the technology for their mental health; others are not. Delaney Lanham, a barista in Midvale, is among the skeptics.

“I wouldn’t want my deepest darkest feelings out there on the internet because it really never goes away,” Lanham said.

According to a report from the Utah Behavioral Health Coalition, kids and teens who have been diagnosed with mental or behavioral health conditions have been unable to get treatment due to a shortage of available therapists.

“It’s my hope that we can see generative AI progress to a point where it has strong specialized abilities in this area and can engage with people in a way that’s safe and ethical for them while they perhaps are having difficulty accessing care or waiting to get care,” said Nina de Lacy, an assistant professor at the University of Utah, who uses AI tools to find out who is at risk of developing mental illnesses.

There are benefits, yes. But where do we draw the line?

“The difference between medicine and poison is dose,” Draper said. “If they’re getting microdoses of something, that’s nourishing. If they don’t have access to emotional or cognitive nourishment, they have this AI there — it’s far better than nothing. What I worry about is when it’s available and people are doing the easy thing of talking to an AI that will always reflect back to them the most bright and shiny version of them rather than somebody who sees the bright and shiny version and is willing to challenge us to shine brighter.”