Voice-Based Diagnostics Offers Potential Precision Medicine Advances as NIH Invests in Speech-Based Technologies

NIH invests $14 million to develop database of patients’ voices for personalized, AI-driven diagnostics

The National Institutes of Health (NIH) has committed up to $14 million in funding to build a database of patient voices that will be used to develop personalized artificial intelligence (AI) diagnostics. The funding, which launches the initiative, is expected to enable breakthroughs in multiple areas, from neurological disorders to developmental problems.

The newly funded initiative will be led by the University of South Florida (USF) but will also include Weill Cornell Medicine as well as 10 other research institutions in the United States and Canada. The new initiative, called Voice as a Biomarker of Health, also includes a collaboration with Owkin, a French-American AI startup.

The new project is part of the NIH’s Bridge2AI program, which focuses on enabling the widespread adoption of AI in healthcare by developing high-quality datasets suitable for machine learning analysis. Developing these datasets is seen as a key step toward advancing AI technologies and identifying breakthroughs in precision medicine applications of AI.
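For a concrete sense of what a machine-learning-ready voice dataset involves, each recording generally must be de-identified and paired with standardized clinical and acoustic metadata before models can be trained on it. The sketch below shows one hypothetical way such a record could be structured in Python; the field names and values are illustrative assumptions, not the actual Bridge2AI data standard.

```python
from dataclasses import dataclass, field
from typing import Optional

# Hypothetical record layout for an ML-ready voice sample.
# Field names are illustrative and are NOT the actual Bridge2AI data standard.
@dataclass
class VoiceSample:
    sample_id: str                  # de-identified recording ID
    cohort: str                     # e.g., "voice_disorder", "neurological"
    diagnosis_label: Optional[str]  # clinician-confirmed label, if available
    audio_path: str                 # path to the de-identified audio file
    sample_rate_hz: int             # recording sample rate
    duration_s: float               # recording length in seconds
    acoustic_features: dict = field(default_factory=dict)  # e.g., jitter, shimmer, MFCC means

# Example record with made-up values
example = VoiceSample(
    sample_id="anon-0001",
    cohort="neurological",
    diagnosis_label=None,
    audio_path="data/anon-0001.wav",
    sample_rate_hz=16000,
    duration_s=12.4,
)
print(example.cohort, example.duration_s)
```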

Using Voice and Sounds Together With Advanced AI Algorithms

Voice as a Biomarker of Health is co-led by Yaël Bensoussan, MD, of the USF Health Morsani College of Medicine. The research team has identified five disease cohorts in which voice changes are associated with conditions that have well-recognized unmet needs.

“Voice has the potential to be a biomarker for several health conditions,” said Yaël Bensoussan, MD, in a USF press release. “Creating an effective framework that incorporates huge datasets using the best of today’s technology in a collaborative manner will revolutionize the way that voice is used as a tool for helping clinicians diagnose diseases and disorders.” Bensoussan is an assistant professor in the Department of Otolaryngology and director of the USF Health Voice Center at the USF Health Morsani College of Medicine.

 The data collected for the Voice as a Biomarker of Health project will focus on five key areas:

  1. Voice disorders
  2. Neurological diseases, including neurodegenerative disorders
  3. Psychiatric and mood disorders
  4. Respiratory diseases
  5. Pediatric speech disorders

“The potential for using voice and sounds together with advanced AI algorithms to accurately diagnose certain diseases is incredible,” explained Olivier Elemento, PhD, director of the Englander Institute for Precision Medicine and professor of physiology and biophysics at Weill Cornell Medicine. “Our future findings could lead to a revolution in health care where continuous voice monitoring could alert physicians earlier than currently possible to certain conditions, such as infections or neurological diseases.”
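At a high level, such diagnostic algorithms reduce each recording to numerical acoustic features and then learn which feature patterns separate a disease cohort from controls. The sketch below, which assumes the open-source librosa and scikit-learn libraries along with made-up file names and labels, illustrates that general workflow; it is not the consortium's actual pipeline.

```python
import numpy as np
import librosa
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

def acoustic_features(path: str) -> np.ndarray:
    """Summarize a recording as mean MFCCs, a common acoustic representation."""
    y, sr = librosa.load(path, sr=16000)
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13)  # 13 coefficients x frames
    return mfcc.mean(axis=1)                            # one feature vector per recording

# Hypothetical file lists; labels: 1 = disease cohort, 0 = control
paths = ["cohort_01.wav", "cohort_02.wav", "control_01.wav", "control_02.wav"]
labels = np.array([1, 1, 0, 0])

X = np.vstack([acoustic_features(p) for p in paths])
clf = RandomForestClassifier(n_estimators=200, random_state=0)
print(cross_val_score(clf, X, labels, cv=2))  # toy cross-validation on toy data
```

In practice, research pipelines would use far richer feature sets (or learned representations) and many thousands of recordings, but the basic structure of feature extraction followed by supervised learning is the same.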

Goal: Develop Infrastructure and Form Collaborative

While the project initially focuses on specific categories of diseases, its potential precision medicine applications could extend far beyond them. “If the infrastructure is well developed, this could represent the start of an international collaborative mission, such as the Human Genome Project, where voice data would be used by thousands of researchers, and then—based on that research—by clinicians worldwide,” said Vardit Ravitsky, PhD, a principal investigator on the project, in a University of Montreal interview. “It could allow new and important discoveries and enhance what precision medicine has to offer patients.”

The NIH’s Voice as a Biomarker of Health program is not the only precision medicine initiative with a focus on leveraging voice as a diagnostic tool. Recent private sector funding has also focused on this space.

Other Voice AI Projects Also Funded

A recently announced investment by Northwell Holdings, the venture capital arm of Northwell Health, will provide $3 million to give clinicians new tools for AI-based assessment of nonverbal vocal expression. The investment was awarded to Hume, an AI startup developing voice technology that recognizes subtle nonverbal expressions, such as laughter, sighs, and gasps, as well as the tune, rhythm, and timbre of speech.

Hume reportedly will use the investment to refine its machine learning models and expand the health applications of its technology. Applications currently under development include using voice changes to recognize signs of depression, cognitive impairment, and pain.
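To make “tune, rhythm, and timbre” concrete, the sketch below shows one common way such properties are measured from a recording: a pitch contour for tune, an onset rate as a rough proxy for speaking rhythm, and spectral features for timbre. It uses the open-source librosa library and a hypothetical file name, and illustrates the general idea only, not Hume's proprietary models.

```python
import numpy as np
import librosa

# Illustrative prosodic/timbral feature extraction; not Hume's actual method.
y, sr = librosa.load("speech_sample.wav", sr=16000)  # hypothetical file name

# "Tune": fundamental-frequency (pitch) contour via probabilistic YIN
f0, voiced_flag, voiced_prob = librosa.pyin(
    y, fmin=librosa.note_to_hz("C2"), fmax=librosa.note_to_hz("C6"), sr=sr
)
pitch_mean = np.nanmean(f0)  # average pitch over voiced frames
pitch_var = np.nanvar(f0)    # pitch variability (monotone vs. expressive speech)

# "Rhythm": onset rate as a crude proxy for speaking rate
onsets = librosa.onset.onset_detect(y=y, sr=sr)
onset_rate = len(onsets) / (len(y) / sr)

# "Timbre": spectral centroid and MFCCs summarize voice quality
centroid = librosa.feature.spectral_centroid(y=y, sr=sr).mean()
mfcc_means = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13).mean(axis=1)

print(pitch_mean, pitch_var, onset_rate, centroid, mfcc_means.shape)
```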

“With Hume, what we are so excited about is this breakthrough technology, a tool that is going to be able to advance the way that we deliver care and perhaps conduct operations throughout clinical and business enterprises,” said Rich Mulry, CEO and president of Northwell Holdings, in a Fierce Healthcare article. “The other huge factor for us in deciding to invest was the broad evidence-based population upon which the algorithms are built. The company went to extensive lengths to obtain data in multiple countries throughout the world, so we believe that is going to help create a much stronger product that’s less prone to bias or error.”

“Where we can help is by providing the other information that’s absent from large language models, which is nonverbal expression,” explained Alan Cowen, PhD, founder and CEO of Hume. “Where there’s metrics of treatment progress of patient well-being and health, we can see if our technologies predict those metrics first and then provide potential interventions that will enable applications to improve those outcomes.”

Voice-based technologies are still in their infancy; however, they offer a low-cost, noninvasive diagnostic tool that may soon be available to healthcare institutions pursuing new precision medicine applications. New funding and advances in this niche illustrate the range of novel AI technologies being explored to advance precision medicine.

—Caleb Williams

Related Information:

USF Health, Weill Cornell Medicine Earn Inaugural Funding in NIH’s Newly Launched Bridge2AI Initiative, Will Create Artificial Intelligence Platform for Using Voice to Diagnose Disease

USF Health Voice Center

Using Voice as a Biomarker for Diagnosis

A Global Collaborative Teamed Up to Help Diagnose Rare Lung Disease with AI. Here’s How They Did It

Yael Bensoussan, MD

Olivier Elemento, PhD

Vardit Ravitsky, PhD

Rich Mulry

Alan Cowen, PhD
