Phillip Alvelda, a serial entrepreneur, put it to me this way: “Imagine you’re lying in the hospital and every few hours a very disruptive nurse with a clanging cart of diagnostic tools shows up to interrupt your rest and take your vitals. Intrusive, annoying, high contact, and it is not a constantly-monitored assessment of your vitals.”

That’s a scenario Alvelda set out to eliminate with his latest project, Medio.AI. The idea is to use a simple camera on a mobile device, a laptop, or one mounted in an office or hospital room. The camera reads your vital signs simply by looking at live video of your face. While a number of companies are working on computer imaging systems for analysis and diagnostics in clinical settings, Medio is the first I’ve seen that you can try at home.


And try it I did. I headed over to the Medio.AI website and took my vitals for a spin in its web-based application. Impressive: not my vitals, but the process. Without any sensors or external hardware, I simply looked into a small camera window for a few moments and got a readout of my heart rate, respiration rate, and blood oxygenation. Combined with my temperature reading (the app doesn’t do this yet) and a few questions about other symptoms like cough or headache, the app churned out a rating that estimated the likelihood that I was COVID-free. (That’s just one analysis it can do.) When I checked the results of my Medio visit against my smartwatch (for heart rate and blood oxygenation), they proved accurate. If the app had shown I might have COVID, it would have recommended further testing.

Under the hood is some really intense computer imaging. I looked at the image in the app and saw a not especially large or high-res image of myself. On its side, Medio detects slight variations in blood movement through the skin and the small movements of breathing in my shoulders. Alvelda calls it “video-based ambient biometrics.” The app’s imaging system measures changes in skin tone that help it calculate all those vital signs. By coupling that with CDC-based questions about symptoms like cough, headache, fever, and abdominal issues, it rates your likelihood of having COVID-19. Alvelda holds a number of patents on the technique.
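Medio doesn’t publish its pipeline, but the textbook version of extracting a pulse from skin-tone changes (known in the research literature as remote photoplethysmography) can be sketched in a few lines: average the green channel over a patch of facial skin in each frame, then find the dominant frequency within the range of plausible heart rates. Everything below — the function name, the synthetic test signal — is illustrative, not the company’s actual code.

```python
import numpy as np

def estimate_heart_rate(green_means, fps):
    """Estimate pulse rate (bpm) from the per-frame mean green intensity
    of a facial skin region. A simplified, hypothetical rPPG pipeline."""
    x = np.asarray(green_means, dtype=float)
    x = x - x.mean()                              # remove baseline skin tone
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fps)  # frequency of each FFT bin
    power = np.abs(np.fft.rfft(x)) ** 2
    band = (freqs >= 0.7) & (freqs <= 4.0)        # 42-240 bpm: plausible pulses
    peak = freqs[band][np.argmax(power[band])]    # strongest frequency in band
    return peak * 60.0

# Synthetic demo: a 72 bpm pulse (1.2 Hz) buried in noise, sampled at 30 fps
fps = 30
t = np.arange(0, 20, 1.0 / fps)
rng = np.random.default_rng(0)
signal = 0.5 * np.sin(2 * np.pi * 1.2 * t) + rng.normal(0, 0.3, t.size)
print(round(estimate_heart_rate(signal, fps)))  # prints 72
```

A real system has to do much more — locate the face, track it as you move, and reject lighting changes — but the frequency analysis at the core is roughly this.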

Medio is not alone in exploring the use of contactless computer imaging to extract vital signs. Binah.AI has a similar solution. Its consumer application is not yet ready, but a B2B version has been deployed. Binah’s website explains that it applies signal processing and AI technologies, combined with proprietary mathematical back-end processing, to analyze video taken from the upper-cheek skin region of a human face. According to the site, it uses PPG (photoplethysmogram) signal processing, the same technology a smartwatch uses to gather blood-flow information (via those green lights on the back of the watch). Binah looks at micromovements in the blood vessels of your skin and claims to be able to calculate heart rate, oxygen saturation (SpO2), and respiration rate. Based on that data it infers heart rate variability (HRV) and even mental stress. Both Medio and Binah promise further vitals information as they perfect their machine learning algorithms. Binah won the CES Innovation Award in 2020 (watch the video) and said a consumer app would be coming, but thus far, no app; the company is focusing on B2B applications for now.
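Binah doesn’t say how it derives HRV from the pulse signal, but one standard time-domain HRV metric, RMSSD, illustrates the kind of calculation involved: once you have beat-to-beat intervals, HRV is just statistics on how much those intervals vary. The interval values below are made up for the demo.

```python
import math

def rmssd(rr_intervals_ms):
    """Root mean square of successive differences between heartbeats,
    a standard time-domain HRV metric (Binah's actual method is not public)."""
    diffs = [b - a for a, b in zip(rr_intervals_ms, rr_intervals_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

# Hypothetical beat-to-beat intervals, in milliseconds
beats = [812, 830, 805, 851, 822, 840]
print(round(rmssd(beats), 1))  # prints 29.1
```

Low RMSSD values are commonly associated with stress, which is presumably the kind of reasoning behind Binah’s mental-stress inference.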


A third app, called Vital Sign AI, adds audio to the equation. By asking you to cough into your smartphone’s microphone, it detects various breath patterns in addition to checking vital signs.

Med geeks looking for deeper science can have at it here.

How will these technologies be used? Telemedicine sessions will increasingly be able to acquire vital data remotely, and accurately. Hospitals can monitor patients continuously, remotely, and non-invasively. Companies and workplaces can screen for health. Consumers can track their own health much as they do with smartwatches today. The biggest plus in COVID times is that no contact is required. And, interestingly, these solutions are not engaged in controversial facial recognition. They don’t care about your face; they care about the blood flowing through your skin. And just imagine how accurate these systems would be if all the various measurements were somehow combined! I suspect that will eventually happen, given the obvious benefits for healthcare.

A top concern with these programs should be the robustness of their data sets. I’m a white woman with fairly routine skin color. But what if I were a child, an elderly person, or a person of color? Since so much of the technology revolves around slight changes in skin tone, it would be good to know that all skin tones yield equally good results. There have been reports of potential bias in this kind of selfie-video analysis.

Beauty may not just be skin deep, but apparently in some key ways, our health actually is.