A University of Washington-led team has developed a method that uses the camera on a person's smartphone or computer to measure their pulse and respiration from a real-time video of their face. The system runs on the device rather than in the cloud and uses machine learning to capture subtle changes in how light reflects off the face.
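The underlying idea, known as remote photoplethysmography, is that blood flow subtly modulates how much light the skin reflects from frame to frame. The sketch below is not the team's machine-learning method; it is a minimal illustration of the principle, assuming you already have a per-frame mean green-channel intensity from a face region and a known frame rate. It recovers the pulse rate as the dominant frequency in the physiologically plausible band.

```python
import numpy as np

def estimate_pulse_hz(green_means, fps):
    """Estimate pulse frequency (Hz) from per-frame mean green-channel
    intensity of a face region, using the dominant FFT peak within the
    typical heart-rate band (0.7-4 Hz, i.e. 42-240 beats per minute)."""
    x = np.asarray(green_means, dtype=float)
    x = x - x.mean()                          # remove the DC offset
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fps)
    power = np.abs(np.fft.rfft(x)) ** 2
    band = (freqs >= 0.7) & (freqs <= 4.0)    # plausible pulse range
    return freqs[band][np.argmax(power[band])]

# Synthetic demo: a 1.2 Hz (72 BPM) pulse buried in noise, 30 fps video.
fps, seconds = 30, 10
t = np.arange(fps * seconds) / fps
signal = (0.5 * np.sin(2 * np.pi * 1.2 * t)
          + np.random.default_rng(0).normal(0, 0.2, t.size))
bpm = estimate_pulse_hz(signal, fps) * 60
print(round(bpm))  # ≈ 72
```

In practice, raw color averages are corrupted by motion and lighting changes, which is why the researchers turned to machine learning rather than a fixed filter like this one.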