Red Flag: Suicide Risk

How AI is helping prevent suicide in veterans


Illustration by Iris Johnson

By Sonya Collins

Medically reviewed by Jennifer Casarella, MD

July 14, 2022

Dan Miller sat in his Nissan Altima, parked on the side of the road near a field outside Chicago, holding a gun to his head.

Haunted for years by the compounded trauma of tours of duty in the Middle East and his work as a police officer in Chicago, Miller saw no reason to live in that moment. There were troubles at home, too: his wife and children had grown fearful of his behavior.

"My whole world was falling apart," he says of that dark night in 2014. "It left a hole I didn't know how to fill."

He chose not to pull the trigger after a brochure on the passenger seat of his car gave him an unexpected perspective - and launched him on a path to help others in his situation.

Had Miller taken his life that night, he would have joined thousands of other veterans who died by suicide. About 17 U.S. veterans lose their lives this way each day, on average, according to the Department of Veterans Affairs. In 2019, the last year for which records are available, 6,261 veterans took their own lives - and the suicide rate for veterans was 52% higher than for non-veterans, the agency's records show. 

The problem has become so severe that the Veterans Health Administration (VHA) now uses artificial intelligence (AI) to help identify veterans at the highest risk of suicide - and reach out to them before a crisis strikes.

But that wasn't available when Dan Miller's life was unraveling.

In the years leading up to his near-suicide, his wife had pushed him to get help. "She said, 'You're not the same person you were when you left. The kids are scared of you. The pets are scared of you,'" he recalls.

He resisted, even when his wife threatened divorce. Rising through the ranks of the Marines, Miller had become more emotionally isolated. He feared losing his job and the respect of others if he let anyone know what he was going through.

Finally, he gave the VHA a chance. He went in for an initial consultation in 2010 and didn't find it helpful. He didn't like being told what to do. So he stopped going - and turned instead to obsessive exercise and excessive drinking.

That day in 2014, Miller's wife told him she was taking the kids out for a playdate. After she left, he was served with divorce papers. Less than an hour later, he was parked in his car with his gun, ready to end his life.

But if it all had happened just a few years later, things might never have gotten to that point.

Scanning for Suicide Risk

In 2017, the VHA piloted REACH VET, an AI program that aims to help prevent veterans from dying by suicide.

Every month, a computer scans the electronic health records of all VHA patients who've had a health care visit for any reason in the last 2 years. It checks more than 140 variables, weighting each one to estimate the person's overall suicide risk at that moment.

To build the risk algorithm, a computer combed through the medical records of 6,360 veterans confirmed to have died by suicide between 2009 and 2011. (The VHA continually updates the list of variables using the health records of its patients, including veterans who have died by suicide since then.)

Some variables are things you'd expect:

  • A past suicide attempt
  • A diagnosis of depression or other mental illness
  • A diagnosis of a terminal illness

Others are more surprising. For example, a diagnosis of arthritis or diabetes adds weight.

REACH VET flags the riskiest cases - the top 0.1% - for a mental health or primary care provider to review. The provider reaches out to the patient to explain how and why their record was flagged, discuss any recommended treatment changes, and invite them in for a visit.

"It's an opportunity to talk about their risk factors, which is designed to lead to a conversation about safety planning," says clinical psychologist Matthew Miller, PhD, national director of the U.S. Department of Veterans Affairs' Suicide Prevention Program. He's not related to Dan Miller.

Making a Suicide Safety Plan

A safety plan is a document that outlines how a person can help prevent their own suicide in a crisis. 

The plan may include:

  • A list of personal triggers or warning signs
  • What's helped them in the past
  • Names of people or organizations who can support them
  • Plans to remove means of suicide, such as guns, from their environment
  • Their reasons for living

Research shows that in people at risk for suicide, having a safety plan reduces suicidal thoughts and attempts, lowers rates of depression and hopelessness, and boosts engagement with the health care system. It may also help people manage the things that trigger their suicidal thoughts.

Getting the Call

What if REACH VET had been around when Dan Miller was in crisis - and he'd gotten a call from the VHA?

"One of the biggest things on that day ... was feeling completely alone and that I had no one to turn to." 

- Dan Miller

"It absolutely, positively would have helped because one of the biggest things on that day when I got served was feeling completely alone and that I had no one to turn to," Miller says. He's now a speaker for the Wounded Warrior Project, a nonprofit that serves veterans and active duty service people.

Vets' reactions to the unexpected VHA phone call, psychologist Miller says, "run the gamut from 'Thank you for contacting me. Let's talk,' to 'What are you talking about? Leave me alone!' "

Nothing stops all suicides. But REACH VET is having an impact. In a clinical trial, vets contacted through REACH VET had more doctor visits, were more likely to have a written suicide prevention safety plan, and had fewer mental health hospital admissions, ER visits, and suicide attempts.

An Assist From AI

Even simple outreach can make a big difference. And there's research to prove it.

One study included 4,730 veterans recently discharged from psychiatric care at the VHA, a group considered at high risk for suicide. 

Half of them got 13 caring emails from hospital staff in the weeks after leaving the hospital. The emails mentioned personal things the patient had shared, like a love of hiking, and wished them well. The other veterans got routine follow-up but no emails.

Two years later, those who got the caring emails were less likely to have died by suicide than the other vets. The study was published in 2014 in Contemporary Clinical Trials.

Researchers have run studies like this many times - with handwritten notes from the primary care doctor, postcards from the ER, and so forth. The results are remarkably consistent: the notes reduce suicide risk.

"If we could use AI to identify people to receive notes or phone calls, it would be a very effective and inexpensive way to guide follow-up care," says Rebecca Bernert, PhD, director and founder of the Suicide Prevention Research Laboratory at Stanford University School of Medicine in Palo Alto, CA.

AI doesn't replace clinical judgment.

"AI can capture data that we miss due to the limits of our humanity," psychologist Miller says. "There's suicide prevention processes founded on big data and AI, and there are processes founded in clinical intuition and acumen."

"When you're able to put time and space between the suicidal thought and the access to the method to act on that thought, you save lives."

- Rebecca Bernert, PhD

AI is only as good as the data it's trained on. If that data lacks diversity, the model may miss things. And variables that predict risk in veterans may not apply to civilians.

Stopping Suicidal Thoughts

Google is putting AI to work against suicide, too. Its MUM (Multitask Unified Model) technology seeks to understand the intent behind what we google.

MUM powers Google Search. It can often tell the difference between a search for information about suicide by someone writing a research paper on the topic and a search for how or where to carry out a suicide.

When Google Search detects that someone in the U.S. might be in crisis and at risk of suicide, the first search results that person gets are the number for the National Suicide Prevention Lifeline and other resources for people in crisis.

Google Home Assistant works in the same way. When a user makes a query that signals a suicide-related crisis, the gadget serves up resources that offer help.
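
Google hasn't disclosed how MUM makes this distinction, and a large language model is nothing like a keyword rule. Still, the outward behavior described here - detect likely crisis intent, then put help first - can be mimicked in a toy Python sketch. The cue lists and resource strings below are stand-ins, not Google's actual logic.

    # Toy illustration of crisis-aware search routing. This keyword
    # heuristic only mimics the behavior described above; MUM itself
    # is a large multitask language model, not a rule list.
    CRISIS_RESOURCES = [
        "National Suicide Prevention Lifeline: 1-800-273-8255",
        "Crisis Text Line: text HOME to 741741",
    ]

    def looks_like_crisis(query):
        """Crudely separate research-style queries from crisis-style ones."""
        research_cues = ("statistics", "research paper", "history of")
        crisis_cues = ("how to", "where can i", "end my life")
        q = query.lower()
        return any(c in q for c in crisis_cues) and not any(c in q for c in research_cues)

    def results_page(query, organic_results):
        """Put crisis resources above organic results when a query signals risk."""
        if looks_like_crisis(query):
            return CRISIS_RESOURCES + organic_results
        return organic_results

    print(results_page("suicide statistics for a research paper", ["A", "B"]))
    # ['A', 'B'] - an informational query gets ordinary results
    print(results_page("how to end my life", ["A", "B"])[0])
    # 'National Suicide Prevention Lifeline: 1-800-273-8255'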

Google is working to have MUM understand the nuances of crisis language in 75 languages, so that Google Search can provide people in crisis with hotlines or other resources in many countries.

"We want to find partners that are accessible to users in terms of hours of operation. We have a strong preference for finding partners that promise confidentiality and privacy to the extent that those are permitted [in that country]," says Anne Merritt, MD, a product manager at Google Search.

Other companies are working on apps that use AI to spot suicide risk in other ways, including voice technology that may pick up subtle changes in the voice of someone who's depressed and may be thinking of suicide. Those tools are still in development but show promise. Keep in mind that such apps don't require government approval, so if you try one, be sure to let your health care provider know.

Changing the Channel

Seeing a hotline number on your phone or computer screen can help, Dan Miller says. "If I happened to be online, searching maybe for a bridge to jump off of ... and suddenly that pops up on the screen, it's like it changes the channel."

It may not work for everyone, he says, but that search result could interrupt someone's suicidal train of thought.

That's crucial, psychologist Miller says, because most suicide attempts escalate from first thought to potentially fatal action in just 1 hour. That's how fast it happened for Dan Miller in 2014.

"When you're able to put time and space between the suicidal thought and the access to the method to act on that thought, you save lives," Bernert says.

Making a Different Choice

An interruption in his own thinking is what saved Dan Miller's life.

Holding the gun to his head, Miller glanced at the passenger seat, where a brochure from the Wounded Warrior Project - an organization he had only just learned about - lay. He noticed a photo of a man in a wheelchair, a veteran like him, who had no legs. The man, he thought, was worse off than he was, but hadn't given up.

Miller put down his gun and decided to get help.

Recovering from a near-suicide, he says, is a journey. It doesn't happen overnight. Now, 8 years later, Miller is planning a brief break from the speaker circuit to spend 2 weeks in an outpatient counseling program for posttraumatic stress disorder and traumatic brain injury.

"Telling my story to strangers - part of it is healing me in a way, but I'm learning that repeating the story over and over again is also keeping me from letting it go. And I'm still healing."

Hear what a therapist says about suicidal thinking on this episode of WebMD's podcast, Health Discovered.