Physical Activity Recognition Based on Deep Learning Using Photoplethysmography and Wearable Inertial Sensors
Issued Date
2023-02-01
eISSN
20799292
Scopus ID
2-s2.0-85147861299
Journal Title
Electronics (Switzerland)
Volume
12
Issue
3
Rights Holder(s)
SCOPUS
Bibliographic Citation
Electronics (Switzerland) Vol.12 No.3 (2023)
Suggested Citation
Hnoohom N., Mekruksavanich S., Jitpattanakul A. Physical Activity Recognition Based on Deep Learning Using Photoplethysmography and Wearable Inertial Sensors. Electronics (Switzerland) Vol.12 No.3 (2023). doi:10.3390/electronics12030693 Retrieved from: https://repository.li.mahidol.ac.th/handle/20.500.14594/81782
Abstract
Human activity recognition (HAR) relies extensively on wearable inertial sensors, since among non-visual data sources they provide the most informative time series. HAR research has advanced significantly in recent years with the proliferation of sensor-equipped wearable devices. To improve recognition performance, HAR researchers have also investigated other biosignals, such as photoplethysmography (PPG), for this task. PPG sensors measure the rate at which blood flows through the body, a rate regulated by the heart's continuous pumping action. Although detecting body movement and gestures was not the original purpose of PPG signals, we propose an innovative method for extracting relevant features from the PPG signal and use deep learning (DL) to predict physical activities. To this end, we developed a deep residual network, referred to as PPG-NeXt, designed around convolutional operations, shortcut connections, and aggregated multi-branch transformations to efficiently identify different types of daily-life activities from the raw PPG signal. In experiments on three benchmark datasets, the proposed model achieved an F1-score above 90% using only PPG data. Moreover, our results indicate that combining PPG and acceleration signals can further enhance activity recognition. Both biosignals, electrocardiography (ECG) and PPG, can differentiate between stationary activities (such as sitting) and non-stationary activities (such as cycling and walking) with sufficient success. Overall, our results suggest that combining features from the ECG signal can be helpful in situations where pure tri-axial acceleration (3D-ACC) models have trouble differentiating between activities with similar motion (e.g., walking, stair climbing) but significant differences in their heart-rate signatures.
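The abstract names the three building blocks of PPG-NeXt: convolutional operations, shortcut connections, and aggregated multi-branch transformations (the ResNeXt idea). The record does not give architecture details, so the following is only a minimal NumPy sketch of how one such residual block could combine several parallel 1-D convolution branches with an identity shortcut; the function names, kernel choices, and ReLU activation are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def conv1d(x, w):
    """'Valid' 1-D convolution (cross-correlation) of signal x with kernel w."""
    k = len(w)
    return np.array([np.dot(x[i:i + k], w) for i in range(len(x) - k + 1)])

def multibranch_residual_block(x, branch_kernels):
    """Aggregated multi-branch transform with a shortcut connection (sketch).

    Each branch convolves the input with its own kernel; the branch outputs
    are summed (aggregation) and added to a cropped copy of the input
    (identity shortcut), followed by a ReLU, ResNeXt-style.
    Assumes all kernels share the same (odd) length.
    """
    outs = [conv1d(x, w) for w in branch_kernels]
    agg = np.sum(outs, axis=0)
    # Crop the shortcut so its length matches the 'valid' convolution output.
    k = len(branch_kernels[0])
    pad = (k - 1) // 2
    shortcut = x[pad: pad + len(agg)]
    return np.maximum(agg + shortcut, 0.0)  # ReLU activation

# Toy example: a raw PPG window and four identical moving-average branches.
ppg_window = np.arange(10.0)
kernels = [np.ones(3) / 3.0] * 4
out = multibranch_residual_block(ppg_window, kernels)
```

In a real network the branches would be learned grouped convolutions stacked over many layers; this sketch only shows the aggregation-plus-shortcut wiring the abstract refers to.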