Artificial Intelligence Could Help Solve America's Impending Mental Health Crisis

Five years from now, the U.S.’ already overburdened mental health system may be short by as many as 15,600 psychiatrists as the growth in demand for their services outpaces supply, according to a 2017 report from the National Council for Behavioral Health. But some proponents say that, by then, an unlikely tool—artificial intelligence—may be ready to help mental health practitioners mitigate the impact of the deficit.

Medicine is already a fruitful area for artificial intelligence; it has shown promise in diagnosing disease, interpreting images and zeroing in on treatment plans. Though psychiatry is in many ways a uniquely human field, requiring emotional intelligence and perception that computers can’t simulate, even here, experts say, AI could have an impact. The field, they argue, could benefit from artificial intelligence’s ability to analyze data and pick up on patterns and warning signs so subtle humans might never notice them.

“Clinicians actually get very little time to interact with patients,” says Peter Foltz, a research professor at the University of Colorado Boulder who this month published a paper about AI’s promise in psychiatry. “Patients tend to be remote, it’s very hard to get appointments and oftentimes they may be seen by a clinician [only] once every three months or six months.”

AI could be an effective way for clinicians both to make the best of the time they do have with patients and to bridge gaps in access, Foltz says. AI-aided data analysis could help clinicians make diagnoses more quickly and accurately, getting patients on the right course of treatment faster. Perhaps more exciting, Foltz says, apps or other programs that incorporate AI could allow clinicians to monitor their patients remotely, alerting them to issues or changes that arise between appointments and helping them incorporate that knowledge into treatment plans. That information could be lifesaving, since research has shown that regularly checking in with patients who are suicidal or in mental distress can help keep them safe.

Some mental-health apps and programs already incorporate AI—like Woebot, an app-based mood tracker and chatbot that combines AI with principles from cognitive behavioral therapy—but it will probably be five to 10 years before algorithms are routinely used in clinics, according to psychiatrists interviewed by TIME. Even then, Dr. John Torous, director of digital psychiatry at Beth Israel Deaconess Medical Center in Boston and chair of the American Psychiatric Association’s Committee on Mental Health Information Technology, cautions that “artificial intelligence is only as strong as the data it’s trained on,” and, he says, mental health diagnostics have not yet been quantified well enough to train an algorithm. That may change with more and larger psychological studies, but, Torous says, “it’s going to be an uphill challenge.”

Not everyone shares that position. Speech and language analysis have emerged as two of the clearest applications for AI in psychiatry, says Dr. Henry Nasrallah, a psychiatrist at the University of Cincinnati Medical Center who has written about AI’s place in the field. Speech and mental health are closely linked, he explains. Talking in a monotone can be a sign of depression; fast speech can point to mania; and disjointed word choice can be connected to schizophrenia. When these traits are pronounced enough, a human clinician might pick up on them—but AI algorithms, Nasrallah says, could be trained to flag signals and patterns too subtle for humans to detect.
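
To make that idea concrete, here is a minimal Python sketch of the kind of rule a speech-monitoring tool might apply once features have been extracted from a recording. Everything in it is an assumption for illustration: the feature names, the personal-baseline comparison and the 30% thresholds are placeholders, not clinically validated measures or any system Nasrallah describes.

```python
# Hypothetical sketch: flagging coarse speech features that clinicians
# associate with mood changes. Extracting these features from raw audio
# (pitch tracking, speech-rate estimation) is assumed to happen upstream.
from dataclasses import dataclass


@dataclass
class SpeechFeatures:
    pitch_variance: float    # low variance ~ flat, monotone delivery
    words_per_minute: float  # unusually fast speech can accompany mania
    coherence_score: float   # 0-1; low values ~ disjointed word choice


def flag_warning_signs(current: SpeechFeatures, baseline: SpeechFeatures,
                       threshold: float = 0.3) -> list:
    """Return human-readable flags when features drift far from the
    patient's own baseline. Thresholds are illustrative placeholders,
    not clinically validated cutoffs."""
    flags = []
    if current.pitch_variance < baseline.pitch_variance * (1 - threshold):
        flags.append("speech noticeably flatter than usual")
    if current.words_per_minute > baseline.words_per_minute * (1 + threshold):
        flags.append("speech noticeably faster than usual")
    if current.coherence_score < baseline.coherence_score * (1 - threshold):
        flags.append("word choice less coherent than usual")
    return flags


if __name__ == "__main__":
    baseline = SpeechFeatures(pitch_variance=24.0, words_per_minute=140, coherence_score=0.82)
    today = SpeechFeatures(pitch_variance=11.0, words_per_minute=150, coherence_score=0.80)
    print(flag_warning_signs(today, baseline))
    # ['speech noticeably flatter than usual']
```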

Foltz and his team in Boulder are working in this space, as are big-name companies like IBM. Foltz and his colleagues designed a mobile app that takes patients through a series of repeatable verbal exercises, like telling a story and answering questions about their emotional state. An AI system then assesses those soundbites for signs of mental distress, both by analyzing how they compare to the individual’s previous responses and by measuring the clips against responses from a larger patient population. The team tested the system on 225 people living in either Northern Norway or rural Louisiana—two places with inadequate access to mental health care—and found that the app was at least as accurate as clinicians at picking up on speech-based signs of mental distress.
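
The two comparisons described above, against a patient’s own history and against a wider population, can be illustrated with a rough sketch. This is not the Boulder team’s actual model: the single summary “distress score” per recording, the z-score test and the cutoff of two standard deviations are all assumptions made for the example.

```python
# Hypothetical sketch of scoring a new recording against the same patient's
# history and against a pooled population sample. The "distress score" is a
# stand-in for whatever features a real system would extract from audio.
import numpy as np


def zscore(value: float, reference: np.ndarray) -> float:
    """How many standard deviations `value` sits from the reference mean."""
    return float((value - reference.mean()) / (reference.std() + 1e-9))


def assess_response(new_score: float,
                    patient_history: np.ndarray,
                    population_scores: np.ndarray,
                    cutoff: float = 2.0) -> dict:
    personal = zscore(new_score, patient_history)      # drift from own baseline
    population = zscore(new_score, population_scores)  # position among other patients
    return {
        "personal_z": round(personal, 2),
        "population_z": round(population, 2),
        "flag_for_clinician": personal > cutoff or population > cutoff,
    }


if __name__ == "__main__":
    history = np.array([0.21, 0.18, 0.25, 0.22, 0.19])           # prior check-ins
    population = np.random.default_rng(0).normal(0.3, 0.1, 500)  # synthetic cohort
    print(assess_response(new_score=0.55, patient_history=history,
                          population_scores=population))
```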

Written language is also a promising area for AI-assisted mental health care, Nasrallah says. Studies have shown that machine learning algorithms trained to assess word choice and order are better than clinicians at distinguishing between real and fake suicide notes, meaning they’re good at picking up on signs of distress. Using these systems to regularly monitor a patient’s writing, perhaps through an app or periodic remote check-ins with mental health professionals, could offer a way to assess their risk of self-harm.
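
A common recipe for this kind of text model is TF-IDF features over word n-grams feeding a simple classifier. The sketch below assumes that recipe rather than any specific published system, and uses an obviously synthetic toy corpus where real work would require large, carefully labeled and ethically handled clinical data.

```python
# Hypothetical sketch of a word-choice-and-order classifier: TF-IDF over
# unigrams and bigrams feeding a logistic regression. The placeholder
# corpus below is purely illustrative, not real patient text.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

texts = [
    "placeholder note expressing hopelessness and despair",
    "placeholder note expressing being overwhelmed and wanting to give up",
    "placeholder note about a good day and restful sleep",
    "placeholder note about plans with friends this weekend",
]
labels = [1, 1, 0, 0]  # 1 = distressed language, 0 = neutral (toy labels)

model = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2)),  # bigrams capture some word order
    LogisticRegression(),
)
model.fit(texts, labels)

# Probability that a new piece of writing resembles the distressed class.
print(model.predict_proba(["placeholder note expressing despair and giving up"])[:, 1])
```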

Wearable devices offer further opportunities. Many people already use wearables to track their sleep and physical activity, both of which are closely related to mental well-being, Nasrallah says; using artificial intelligence to analyze those behaviors could lead to valuable insights for clinicians.
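
As a rough illustration, a clinician-facing tool might start with something as simple as comparing the latest week of wearable data against the week before. The column names, the seven-day window and the percentage-change summary are assumptions for the example; real products rely on far richer signals and models.

```python
# Hypothetical sketch: surfacing week-over-week shifts in wearable-derived
# sleep and step counts that a clinician might want to review.
import pandas as pd


def weekly_shift(daily: pd.DataFrame, column: str, window: int = 7) -> pd.Series:
    """Difference between the most recent `window`-day average and the
    preceding `window`-day average, as a fraction of the earlier average."""
    recent = daily[column].tail(window).mean()
    previous = daily[column].tail(2 * window).head(window).mean()
    return pd.Series({"previous_avg": previous, "recent_avg": recent,
                      "relative_change": (recent - previous) / previous})


if __name__ == "__main__":
    days = pd.date_range("2019-11-01", periods=14, freq="D")
    data = pd.DataFrame({
        "sleep_hours": [7.2, 7.0, 7.4, 6.9, 7.1, 7.3, 7.0,   # steady week
                        5.9, 5.4, 6.0, 5.6, 5.8, 5.5, 5.7],  # sleep drops off
        "steps":       [8200, 7900, 8500, 8100, 8000, 8300, 7800,
                        4200, 3900, 4500, 4100, 4300, 4000, 4400],
    }, index=days)
    print(weekly_shift(data, "sleep_hours"))
    print(weekly_shift(data, "steps"))
```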

Even if these applications do pan out, Torous cautions that “nothing has ever been a panacea.” On one hand, he says, it’s exciting that technology is being pitched as a solution to problems that have long plagued the mental health field; but, on the other hand, “in some ways there’s so much desperation to make improvements to mental health that perhaps the tools are getting overvalued.”

Nasrallah and Foltz emphasize that AI isn’t meant to replace human psychiatrists or completely reinvent the wheel. (“Our brain is a better computer than any AI,” Nasrallah says.) Instead, they say, it can provide data and insights that will streamline treatment.

Alastair Denniston, an ophthalmologist and honorary professor at the U.K.’s University of Birmingham who this year published a research review about AI’s ability to diagnose disease, argues that, if anything, technology can help doctors focus on the human elements of medicine, rather than getting bogged down in the minutiae of diagnosis and data collection.

Artificial intelligence “may allow us to have more time in our day to spend actually communicating effectively and being more human,” Denniston says. “Rather than being diagnostic machines… [doctors can] provide some of that empathy that can get swallowed up by the business of what we do.”
