Veteran Mental Health Still Depends on Human Care Despite AI Intervention


Despite both local and national efforts to improve veteran mental health, the veteran suicide epidemic continues to cause widespread concern among U.S. leaders and veteran service organizations. In its 2024 National Veteran Suicide Prevention Annual Report, the Department of Veterans Affairs reported that an average of 17.6 veterans took their lives each day in 2022, the most recent year of available data, up from 16.5 in 2001. Many outside organizations consider the VA’s figures an underestimate.

Elected leaders also have concerns about the lack of progress in lowering the veteran suicide rate. In a 2026 funding bill for the VA, Congress directed the department “to use predictive modeling [AI] and analytics for veteran suicide prevention.” 

The VA currently employs four AI initiatives supporting veteran mental health, including an AI-driven effort to identify at-risk veterans. Recovery Engagement and Coordination for Health-Veterans Enhanced Treatment (REACH VET) is a machine learning system that uses statistical modeling to identify, each month, the veterans in the top 0.1% of suicide risk. Those veterans then receive directed mental health outreach.

REACH VET has come under scrutiny over accusations of bias. An earlier model failed to account for some elevated suicide risks among women veterans, including domestic violence and intimate partner violence, and was also dogged by accusations of racial bias. A newer version of the program, REACH VET 2.0, includes new risk factors and removes variables linked to bias concerns.

A better impact at VA

When Rep. Nikki Budzinski (D-IL) asked the VA to “commit that AI would never replace human intervention in suicide prevention,” the Acting Director of the VA’s National Artificial Intelligence Institute said that there were no current plans to use AI as a “treatment device.”

Dr. Ryan Ziegler, the Special Operations Association of America’s Director of Strategic Partnerships, argues that AI cannot replace a physician. “AI does not have compassion. And in my experience, the most impactful part of being a medical provider for somebody in mental distress is the ability to show them compassion.”

Having been part of the VA system, Ziegler said that integrating the AI tools available to practitioners would reduce their administrative load, giving them “more ability to put more compassion” toward patients.

This would mean eliminating the need to navigate antiquated systems that are not integrated with the new technologies the VA is attempting to incorporate. Ziegler also emphasized that training on current AI systems is paramount; without it, the tools will be of little use to practitioners.

Ziegler suggested an agentic AI that could scan documents from doctors outside the VA system. At present, he explained, these documents are faxed to the VA and scanned into a system where they are not quickly or easily searchable. Agentic AI could create documents that are both queryable by physicians and accessible through a streamlined system. Ziegler argued that agentic AI could also manage calls to coordinate with patients’ community provider offices, ensuring patient charts are sent in time for VA provider review.

AI tools that save time for staff are “actually going to change the outcome of how much time I have to spend where I can actually focus on the patient and listen, which is what everyone wants when they are sitting with a provider,” Ziegler said. Leveraging AI’s strengths (i.e., administrative functions) can free up provider mental space, directly contributing to better provider-patient interactions. AI-enabled technologies helping veterans are no longer farfetched science fiction.

AI with a purpose

Some burgeoning AI programs hold promise for supporting the veteran community outside the VA. AI health technology company Canary Speech scans a user’s speech to screen for mood, energy levels, stress, and neurological disorders in just 40 seconds. In 2024, Canary Speech collaborated with the 501(c)(3) Veteran Business Project to equip veterans with technology intended to improve their mental health and enhance their business skills.

When veterans are in crisis, finding mental health care can be a difficult and burdensome process. Trusted Mission AI can connect veterans with mental health care on demand by searching for available therapy appointments with providers who have the experience to meet a patient’s needs. Like Canary Speech, Trusted Mission AI can detect speech patterns that indicate an immediate crisis.

A Path Forward

Ziegler raised important concerns about current AI tools in the veteran mental health space, noting that they do not resolve serious barriers to care, such as the limited number of mental health clinicians available for support.

Ziegler reiterated that AI tools cannot replace a “psychologist or a psychiatrist [who is] going to put you into a certain framework of treatment.” The technology, he said, is “not a superhuman yet, but we see it that way because it’s able to pull all this data from the entire corpus of the internet. It’s almost like it’s a child in adults’ clothes.”

The future of veteran suicide prevention will not be decided by algorithms alone. It will depend on whether institutions deploy AI with humility, transparency, and a clear understanding of its limits. Used responsibly, AI will never replace human care. But it may help ensure that care arrives sooner, lasts longer, and reaches veterans who might otherwise suffer in silence. The VA’s nascent implementation of AI-enabled systems into its veteran care process is noteworthy and may help reduce the veteran suicide epidemic.