DeepBreath: Breathing Exercise Assessment with a Depth Camera

Practicing breathing exercises is crucial for patients with chronic obstructive pulmonary disease (COPD) to enhance lung function. Breathing mode (chest or belly breathing) and lung volume are two important metrics for supervising breathing exercises. Previous works propose sensing these metrics separately in a contactless way, but they rely on unrealistic assumptions such as clearly distinguishable chest and belly breathing patterns, per-user calibration, and the absence of body motion. In response, this research proposes DeepBreath, a novel depth-camera-based breathing exercise assessment system that overcomes the limitations of existing methods. DeepBreath, for the first time, treats breathing mode and lung volume as two correlated measurements and estimates them cooperatively with a multitask learning framework, a design that boosts breathing mode classification performance. To achieve calibration-free, one-model-fits-all lung volume measurement, DeepBreath takes a data-driven approach built on a novel UNet-based deep-learning model, paired with a lightweight silhouette segmentation model whose knowledge is transferred from a state-of-the-art large segmentation model to enhance estimation performance. In addition, DeepBreath is resilient to involuntary motion artifacts thanks to a temporal-aware body motion compensation algorithm. We collaborated with a clinical center and conducted experiments with 22 healthy subjects and 14 COPD patients to evaluate DeepBreath. The experimental results show that DeepBreath achieves high breathing metric estimation accuracy under a much more realistic setup than previous works.
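
The core of this design is a multitask objective that couples the two correlated estimates. Below is a minimal sketch of such a joint loss in plain Python, combining a cross-entropy term for breathing mode with a squared-error term for lung volume; the two-way chest/belly labels, function names, and weighting factor `lam` are illustrative assumptions, not the paper's actual formulation:

```python
import math

def multitask_loss(mode_logits, mode_label, vol_pred, vol_true, lam=0.5):
    """Joint loss for breathing-mode classification + lung-volume regression.

    mode_logits: raw scores per breathing mode (e.g. [chest, belly])
    mode_label:  index of the true mode
    vol_pred, vol_true: predicted and reference lung volume (litres)
    lam: weight balancing the regression term against classification
    """
    # softmax cross-entropy for the breathing-mode head
    m = max(mode_logits)                       # subtract max for stability
    exps = [math.exp(z - m) for z in mode_logits]
    prob_true = exps[mode_label] / sum(exps)
    ce = -math.log(prob_true)
    # squared error for the lung-volume head
    mse = (vol_pred - vol_true) ** 2
    return ce + lam * mse
```

Training both heads against this single scalar is what lets gradients from the volume task shape the shared features used by the mode classifier, which is the intuition behind the reported classification boost.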

FlexibleBP: Blood Pressure Monitoring Using Wrist-worn Flexible Sensor

We propose FlexibleBP, a novel cuffless blood pressure monitoring system that uses a wrist-worn flexible sensor to improve comfort and accuracy. Capturing pulse-wave signals from the radial artery, we develop a personalized estimation framework that incorporates a Transformer model with fine-tuning. Experiments with 36 participants confirm FlexibleBP's accuracy, which meets AAMI standards. This work marks a step toward more user-friendly, advanced wearable BP monitoring solutions.
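
The personalization step can be illustrated at a much-reduced scale by fitting a small per-user correction on top of a population model's outputs from a few enrollment readings. The least-squares calibration below is a deliberately simple stand-in for the paper's Transformer fine-tuning; the linear form and function name are assumptions for illustration only:

```python
def fit_personal_calibration(pop_preds, true_bp):
    """Fit bp ~ a * pred + b by ordinary least squares.

    pop_preds: population-model BP estimates for one user's enrollment beats
    true_bp:   matching cuff reference readings (mmHg)
    Returns (a, b) so that a * pred + b personalizes future estimates.
    """
    n = len(pop_preds)
    mx = sum(pop_preds) / n
    my = sum(true_bp) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(pop_preds, true_bp))
    var = sum((x - mx) ** 2 for x in pop_preds)
    a = cov / var
    b = my - a * mx
    return a, b
```

The actual system fine-tunes the Transformer's weights rather than fitting a post-hoc linear layer, but the workflow is analogous: a generic model plus a small amount of per-user reference data yields a personalized estimator.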

EarPass: Continuous User Authentication with In-ear PPG

In the rapidly expanding universe of smart IoT, earable devices, such as smart headphones and hearing aids, are gaining remarkable popularity. As we anticipate a future where a myriad of sophisticated applications—interaction, communication, health monitoring, and fitness guidance—migrate to earable devices handling sensitive and private information, the need for a robust, continuous authentication system for these devices becomes more critical than ever. Yet, current earable-based solutions, which rely predominantly on audio signals, are marred by inherent drawbacks such as privacy concerns, high costs, and noise interference. In light of these challenges, we investigate the potential of leveraging photoplethysmogram (PPG) sensors, which monitor key cardiac activities and reflect the uniqueness of an individual’s cardiac system, for earable authentication. Our study presents EarPass, an innovative ear-worn system that introduces a novel pipeline for the extraction and classification of in-ear PPG features to enable continuous user authentication. Initially, we preprocess the input in-ear PPG signals to facilitate this feature extraction and classification. Additionally, we present a method for detecting and eliminating motion artifacts (MAs) caused by head motions. Through extensive experiments, we not only demonstrate the effectiveness of our proposed design, but also establish the feasibility of using in-ear PPG for continuous user authentication—a significant stride towards more secure and efficient earable technologies.

EarSpiro: Earphone-based Spirometry for Lung Function Assessment

Spirometry is the gold standard for evaluating lung function. Recent research has proposed that mobile devices can measure lung function indices cost-efficiently. However, these designs fall short in two aspects. First, they cannot provide the flow-volume (F-V) curve, which is more informative than lung function indices alone. Second, these solutions lack inspiratory measurement, which is sensitive to lung diseases such as variable extrathoracic obstruction. In this paper, we present EarSpiro, an earphone-based solution that interprets the recorded airflow sound during a spirometry test into an F-V curve, including both the expiratory and inspiratory measurements. EarSpiro leverages a convolutional neural network (CNN) and a recurrent neural network (RNN) to capture the complex correlation between airflow sound and airflow speed. Meanwhile, EarSpiro adopts a clustering-based segmentation algorithm to track the weak inspiratory signals in the raw audio recording to enable inspiratory measurement. We also enable EarSpiro to work with everyday mouthpiece-like objects, such as a funnel, using transfer learning and a decoder network, with the help of only a few true lung function indices from the user. Extensive experiments with 60 subjects show that EarSpiro achieves mean errors of 0.20 L/s and 0.42 L/s for expiratory and inspiratory flow rate estimation, and 0.61 L/s and 0.83 L/s for expiratory and inspiratory F-V curve estimation. The mean correlation coefficient between the estimated F-V curve and the ground truth is 0.94. The mean estimation error for four common lung function indices is 7.3%.
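
The clustering-based segmentation idea, locating weak inspiratory activity among mostly quiet audio frames, can be sketched as a one-dimensional 2-means split on per-frame energies. This toy version (energies as input, exactly two clusters, a fixed iteration count) is an illustrative stand-in for the paper's algorithm, not its implementation:

```python
def split_breath_frames(energies, iters=20):
    """Label each audio frame as breath (True) or background (False).

    Runs 2-means on scalar per-frame energies, seeded at the min and max
    energy, and assigns each frame to its nearest cluster centre.
    """
    c = [min(energies), max(energies)]          # initial centres: quiet / loud
    for _ in range(iters):
        groups = ([], [])
        for e in energies:
            # boolean index: False -> cluster 0 (quiet), True -> cluster 1 (loud)
            groups[abs(e - c[0]) > abs(e - c[1])].append(e)
        c = [sum(g) / len(g) if g else c[i] for i, g in enumerate(groups)]
    return [abs(e - c[0]) > abs(e - c[1]) for e in energies]
```

Frames landing in the high-energy cluster would be treated as candidate breath segments, letting the pipeline recover inspiratory phases that are too faint for a fixed energy threshold.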