The same devices used to take selfies are being repurposed and commercialized for quick access to the information needed to monitor a patient’s health. A fingertip pressed against a phone’s camera lens can measure heart rate. A bedside microphone can test for sleep apnea.
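
How does a camera read a pulse? The underlying technique is photoplethysmography: each heartbeat slightly changes how much light passes through the fingertip, and the camera records that flicker as a tiny variation in brightness. Below is a minimal sketch of the signal-processing step, assuming per-frame red-channel brightness values have already been extracted from the video; it illustrates the general technique, not any particular app’s implementation.

```python
import numpy as np

def estimate_heart_rate_bpm(red_means: np.ndarray, fps: float) -> float:
    """Estimate heart rate from per-frame mean red-channel brightness.

    red_means: 1-D array, one mean-brightness sample per video frame,
               captured with a fingertip pressed over the camera lens.
    fps:       camera frame rate in frames per second.
    """
    # Remove the slowly varying baseline so only the pulse ripple remains.
    signal = red_means - np.mean(red_means)

    # Find the dominant frequency with an FFT.
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)

    # Only consider physiologically plausible heart rates (40-200 bpm).
    band = (freqs >= 40 / 60) & (freqs <= 200 / 60)
    peak_freq = freqs[band][np.argmax(spectrum[band])]
    return float(peak_freq * 60.0)  # cycles/second -> beats/minute


# Demo on synthetic data: a 72-bpm pulse buried in noise, sampled at 30 fps.
fps, seconds = 30.0, 10
t = np.arange(int(fps * seconds)) / fps
fake_ppg = 0.02 * np.sin(2 * np.pi * (72 / 60) * t) + np.random.normal(0, 0.01, t.size)
print(f"Estimated heart rate: {estimate_heart_rate_bpm(fake_ppg, fps):.0f} bpm")
```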

In the best version of this new world, data is transmitted remotely to a medical professional for the comfort and convenience of the patient, without the need for expensive equipment.

But the use of smartphones as diagnostic tools remains a work in progress. While doctors and their patients have seen some real-world success, experts say their overall potential remains unrealized and uncertain.

Smartphones are equipped with sensors that can monitor a patient’s vital signs. Nascent apps can help assess people for concussions, monitor for atrial fibrillation, and conduct mental health screenings, to name a few uses.

Enterprising companies and researchers are taking advantage of phones’ built-in cameras and light sensors; microphones; accelerometers that detect body movements; gyroscopes; and even speakers. The apps then use artificial intelligence software to analyze the collected sights and sounds, creating an easy connection between patients and doctors. In 2021, more than 350,000 digital health products were available in app stores, according to a Grand View Research report.

“It’s very difficult to put devices in a patient’s home or hospital, but everyone just walks around with a cell phone that has a network connection,” said Andrew Gostine, MD, CEO of the sensor network company Artisight. Most Americans own a smartphone, including more than 60 percent of people age 65 and older, according to the Pew Research Center. The pandemic has also made people more comfortable with virtual care.

Manufacturers of some of these products have sought clearance from the Food and Drug Administration to market them as medical devices. Others are exempt from that regulatory process, placed in the same low-risk classification as a Band-Aid. But how the agency handles medical devices powered by artificial intelligence and machine learning is still being adjusted to reflect the adaptive nature of the software.

Accuracy and clinical validation are critical to securing buy-in from healthcare providers. And many tools still need refinement, said Eugene Yang, MD, clinical professor of medicine at the University of Washington.

Judging these new technologies is difficult because they rely on algorithms built with machine learning and artificial intelligence to collect data, rather than on the physical instruments commonly used in hospitals. So researchers can’t “compare apples to apples” against medical industry standards, Yang said. Failing to meet such safeguards could undermine the technology’s goals of easing costs and expanding access, because doctors would still have to verify the results, he added.

Major technology companies such as Google have invested heavily in the field, serving clinicians and home caregivers as well as consumers. Currently, users of the Google Fit app can check their heart rate by placing their finger on the rear camera lens or track their breathing rate using the front camera.

Google’s research uses machine learning and computer vision, an area of AI that relies on visual data such as videos or images. So instead of using a blood pressure cuff, for example, the algorithm can interpret subtle visual changes in the body that serve as blood pressure proxies and biomarkers, said Shwetak Patel, Google’s director of health technology and a professor of electrical and computer engineering at the University of Washington.
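
Google has not published the details of its pipeline, but the breathing-rate feature mentioned above hints at what “subtle visual changes” can mean in practice: as the chest rises and falls, the image shifts slightly, and that shift repeats at the breathing frequency. The sketch below uses the vertical brightness centroid as a crude motion proxy; it is a generic illustration under those assumptions, not Google’s method.

```python
import numpy as np

def breaths_per_minute(frames: np.ndarray, fps: float) -> float:
    """Estimate respiration rate from subtle vertical image motion.

    frames: array of shape (n_frames, height, width), grayscale video
            of a person's torso (e.g., from a phone's front camera).
    fps:    frame rate in frames per second.
    """
    n, h, w = frames.shape
    # The vertical brightness centroid shifts as the chest rises and
    # falls, giving a signed, slowly oscillating respiration waveform.
    row_profile = frames.mean(axis=2)                 # (n_frames, height)
    rows = np.arange(h)
    centroid = (row_profile * rows).sum(axis=1) / row_profile.sum(axis=1)
    centroid -= centroid.mean()

    spectrum = np.abs(np.fft.rfft(centroid))
    freqs = np.fft.rfftfreq(n, d=1.0 / fps)
    # Plausible adult breathing rates: roughly 6-40 breaths per minute.
    band = (freqs >= 6 / 60) & (freqs <= 40 / 60)
    return float(freqs[band][np.argmax(spectrum[band])] * 60.0)


# Demo: 32 s of synthetic 64x64 video; a bright band drifts up and down
# at 15 breaths per minute.
fps, n, h, w = 30.0, 960, 64, 64
t = np.arange(n) / fps
center = h / 2 + 3 * np.sin(2 * np.pi * (15 / 60) * t)   # band position, +/-3 px
band_img = np.exp(-0.5 * ((np.arange(h)[None, :, None] - center[:, None, None]) / 5) ** 2)
frames = band_img + np.random.normal(0, 0.05, (n, h, w))
print(f"Estimated rate: {breaths_per_minute(frames, fps):.0f} breaths/min")
```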

Google is also investigating whether its smartphones’ built-in microphones can detect heartbeats and murmurs, and whether the camera can help preserve eyesight by screening for diabetic eye disease, the company said in 2022.

The tech giant recently bought Sound Life Sciences, a Seattle startup with an FDA-cleared sonar technology application. It uses a smart device’s speaker to bounce inaudible pulses off a patient’s body to detect movement and monitor breathing.
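
Sound Life Sciences has not disclosed its algorithm, but the basic principle of phone sonar can be sketched: the speaker plays a steady tone near the top of its range, largely inaudible to adults; the microphone records the echo; and chest movement shows up as a slow modulation of that echo. In the toy example below, the tone frequency, frame size, and the amplitude-modulation model of the echo are all illustrative assumptions; real systems typically use frequency-modulated chirps and range gating.

```python
import numpy as np

def breathing_envelope(mic: np.ndarray, sr: float, tone_hz: float,
                       frame: int = 1024) -> np.ndarray:
    """Recover a slow motion signal from an ultrasonic echo recording.

    mic:     microphone samples while the speaker plays a steady tone.
    sr:      audio sample rate in Hz.
    tone_hz: frequency of the emitted (near-inaudible) tone.
    frame:   samples per analysis frame.
    """
    t = np.arange(len(mic)) / sr
    # Mix the recording down to baseband: a moving reflector (the chest)
    # appears as a slow drift in the echo's complex amplitude and phase.
    baseband = mic * np.exp(-2j * np.pi * tone_hz * t)
    n_frames = len(mic) // frame
    # Averaging each frame acts as a low-pass filter, keeping only the
    # slow envelope and discarding the remaining high-frequency content.
    env = baseband[: n_frames * frame].reshape(n_frames, frame).mean(axis=1)
    return np.abs(env)


# Demo: a 19 kHz tone whose echo is amplitude-modulated at 12 breaths/min.
sr, seconds, tone = 48_000, 20, 19_000.0
t = np.arange(sr * seconds) / sr
echo = (1 + 0.3 * np.sin(2 * np.pi * (12 / 60) * t)) * np.cos(2 * np.pi * tone * t)
mic = echo + np.random.normal(0, 0.5, t.size)
env = breathing_envelope(mic, sr, tone)
print("envelope varies with breathing:", env.min().round(2), "to", env.max().round(2))
```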

Binah.ai, based in Israel, also uses a smartphone’s camera to calculate vital signs. Its software examines the region around the eyes and analyzes the light reflected from blood vessels back to the camera lens, said Mona Popilian-Yona, a spokeswoman for the company.

Applications even reach disciplines such as optometry and mental health:

  • With a microphone, Canary Speech uses the same underlying technology as Amazon’s Alexa to analyze patients’ voices for mental health conditions. The software can be integrated into telemedicine appointments and allows doctors to screen for anxiety and depression using a library of audio biomarkers and predictive analytics, said company CEO Henry O’Connell.
  • Australia-based ResApp Health received FDA clearance in 2022 for an iPhone app that screens for moderate to severe obstructive sleep apnea by listening to breathing and snoring. SleepCheckRx, which requires a prescription, is minimally invasive compared with the sleep studies currently used to diagnose sleep apnea.
  • Brightlamp’s Reflex app is a clinical decision support tool that helps manage concussions and vision rehabilitation, among other things. Using an iPad or iPhone camera, the mobile app measures how a person’s pupils react to changes in light; a sketch of the kind of metrics involved follows this list. Through machine learning analysis, the images provide practitioners with data points for assessing patients. Brightlamp sells directly to healthcare providers and is used in more than 230 clinics, which pay a standard annual fee of $400 per account; the product is not covered by insurance. The Department of Defense has an ongoing clinical trial using Reflex.
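
As referenced above, a pupillary light reflex is commonly summarized with a few standard metrics: baseline diameter, constriction amplitude, and latency (how long after the light stimulus the pupil starts to shrink). The sketch below computes those from a pupil-diameter time series; the computer vision step of extracting diameters from camera frames is assumed to have already happened, and the 0.1 mm detection threshold is an illustrative choice, not Brightlamp’s parameter.

```python
import numpy as np

def light_reflex_metrics(diam_mm: np.ndarray, fps: float, stim_frame: int) -> dict:
    """Summarize a pupillary light reflex from a pupil-diameter trace.

    diam_mm:    pupil diameter per video frame, in millimeters.
    fps:        camera frame rate.
    stim_frame: frame index at which the light stimulus turned on.
    """
    baseline = diam_mm[:stim_frame].mean()          # pre-stimulus diameter
    trough = diam_mm[stim_frame:].min()             # maximum constriction
    # Latency: first post-stimulus frame where the diameter drops
    # noticeably below baseline (0.1 mm is an illustrative threshold).
    below = np.where(diam_mm[stim_frame:] < baseline - 0.1)[0]
    latency_s = below[0] / fps if below.size else float("nan")
    return {
        "baseline_mm": round(float(baseline), 2),
        "constriction_mm": round(float(baseline - trough), 2),
        "latency_s": round(float(latency_s), 2),
    }


# Demo: 2 s at 60 fps; the pupil starts constricting ~230 ms post-stimulus.
fps, stim = 60.0, 30
t = np.arange(120) / fps
diam = np.full(120, 5.0)
onset = stim + int(0.23 * fps)                      # reflex begins here
diam[onset:] = 5.0 - 1.5 * (1 - np.exp(-(t[onset:] - t[onset]) / 0.4))
print(light_reflex_metrics(diam, fps, stim))
```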

In some cases, such as with the Reflex app, the data is processed directly on the phone rather than in the cloud, said Brightlamp CEO Kurtis Sluss. By processing everything on the device, the app avoids privacy issues; streaming data elsewhere would require patient consent.

But algorithms need to be trained and tested by collecting masses of data, and this is an ongoing process.

Researchers have found, for example, that some computer vision applications, including some for monitoring heart rate and blood pressure, can be less accurate for people with darker skin. Research is underway to find better solutions.

“We’re not there yet,” Yang said. “That’s the main thing.”

This article was produced by Kaiser Health News, a program of the Kaiser Family Foundation, an endowed nonprofit organization that provides information on health issues to the nation.
