
FOX: AI tech identifies suicide risk in military veterans before it’s too late: ‘Flipping the model’

Mission Roll Call 6 min read June 22, 2023

This article originally appeared on FOX.

This story discusses suicide. If you or someone you know is having thoughts of suicide, please contact the Suicide & Crisis Lifeline at 988 or 1-800-273-TALK (8255).

As the mental health of U.S. military veterans remains a major concern nationwide, new technology could become a lifesaver.

An AI platform developed by ClearForce, a tech company in Vienna, Virginia, aims to identify the risk of suicide among veterans before it’s too late.

Col. Michael Hudson, vice president at ClearForce, spoke to Fox News Digital about his efforts on the veteran suicide initiative.

A former Marine of 30 years, Hudson spent time working in sexual assault prevention and with his command’s behavioral suicide team before completing his service.

The veteran joined ClearForce’s efforts to identify “insider risk and insider threat” through technology that “allows organizations to become aware of individuals who are struggling” in the workplace, including with harassment and mental health, he said.

“We’re focused on the veteran, working inside the active-duty space to look at the current model and find ways — using technology we’ve developed — to have better outcomes related to reducing suicide,” he said.

ClearForce is using artificial intelligence “for good” by incorporating a “human into the conversation” instead of relying on generative AI, Hudson explained.

“What we’re doing with our models is tuning them — curating them, if you will — with the appropriate data sets that identify individuals who are struggling,” he said.
This data is pulled from decades of evidence and research from agencies such as the Department of Defense, Veterans Affairs and the Centers for Disease Control and Prevention on the leading indicators of mental health struggles.

“We tune the model to see those, but we also [use] our machine learning functionality to make sure they understand it’s not just one event,” Hudson said. “It’s a series of events, and it’s a dynamic problem, so it’s never a one and done.”

While June is PTSD Awareness Month, Hudson mentioned that ClearForce supports veterans daily — rather than checking in on them only during certain periods of time.

“How often are you going to hit someone on their particular sweet spot, when they could really use the help?” he asked.

“Why don’t we better align those resources and knock on their door … and open a line of communication?”

As an example, Hudson revealed that financial difficulties, which often lead to homelessness, are a known indicator of the potential risk of suicide.

ClearForce’s AI-driven data is shared with various veterans groups, government agencies and states — which can then act on these early indications within their own communities.

Many jurisdictions and organizations, said Hudson, are interested in “flipping the model” on suicide prevention, an effort in which he said ClearForce is “leading the fight.”

“Today, most of the suicide prevention efforts nationally are anchored on the individual … taking the first step,” he said. “But we’re leaning into that, saying: ‘Let’s change the model.’”

He added, “Let’s use data. Let’s anchor it on technology and science … and then reach the individual sooner. Go to them where they’re at. Talk about resources. Allow them to course-correct. That’s where we’re going. I think that’s how we need to change this model to change outcomes.”
Suicides among veterans are a national concern: veterans are 1.5 times more likely to die by suicide than civilians, with an average of 17 veteran suicide deaths per day, according to ClearForce and the VA.

Nearly a quarter of military families do not receive the mental health care they need, according to a 2022 survey by Blue Star Families, a California-based nonprofit that aims to connect military families with support from neighborhood civilians.

Blue Star Families’ Dr. Lindsay Knight, executive vice president of social impact, told Fox News Digital in an interview that the organization has taken its own steps toward suicide prevention, complementing ClearForce’s approach to getting ahead of the issue.

“We want to see, essentially, how we can get resources into advocate kits for veterans, so that instead of coming up with an answer at the moment of a mental health crisis, people within that community can respond to much earlier signs,” she said.

While Blue Star Families as an organization is not using AI to develop its programming, Knight urged that those involved with AI-backed programs proceed with caution.

“Anything that uses AI, particularly in the health care and mental health care space, should be treated very carefully and very cautiously in how it is implemented,” she said.

Knight stressed the need for a sense of belonging within the community.

She noted that Blue Star Families is working on using its own data and analytics to connect military personnel and their families through a digital platform called The Neighborhood.

“There is absolutely no shame in reaching out or asking for help,” she said. “There is still a stigma around it … and we want to essentially normalize that this is a part of health.”
Cole Lyle, a former Marine and now executive director of Mission Roll Call, applauded the potential for AI to assist in this capacity.

“Artificial intelligence has an incredible potential to modernize and optimize health care solutions and play a proactive role in intervention, providing the necessary referrals to care for veterans,” Lyle said in a statement to Fox News Digital.

The Virginia-based Lyle, whose organization works on suicide prevention among veterans, added that while the possibilities of AI should be embraced, privacy should also be prioritized.

“We should also ensure that veterans get the final say on their health care options, made in collaboration with their health care providers, who take their unique circumstances into account,” he said.

Col. Hudson of ClearForce shared that “people want to make change” as communities recognize that “we’re losing too many veterans to suicide.”

To his fellow veterans, Hudson said that struggling with mental health is “not a sign of weakness.”

“It is a sign of self-awareness,” he said. “Do it for yourself. Do it for those who love you. Do it because it’s the right thing to do.”

He added, “It’s the same thing we did while we were serving [in] active duty. If we ran into a problem that was too hard for us to do … we called in supporting fires. So call in those supporting fires, those friends, those family [members]. Stay connected and stay in the fight.”

ClearForce is currently “prototyping” its AI model with different states, including Virginia, to build out and refine the process.

The technology has so far identified mental health risks among veterans with 91% accuracy, according to the company.

Hudson said any organizations that would like to get involved can visit clearforce.com/suicideprevention.
