Orthopedic Walker Fall Detection and Motion Classification Using Bidirectional Gated Recurrent Unit Neural Network
Issued Date
2025-01-01
Scopus ID
2-s2.0-105004553018
Journal Title
10th International Conference on Digital Arts, Media and Technology, DAMT 2025 and 8th ECTI Northern Section Conference on Electrical, Electronics, Computer and Telecommunications Engineering, NCON 2025
Start Page
447
End Page
452
Rights Holder(s)
SCOPUS
Bibliographic Citation
10th International Conference on Digital Arts, Media and Technology, DAMT 2025 and 8th ECTI Northern Section Conference on Electrical, Electronics, Computer and Telecommunications Engineering, NCON 2025 (2025), 447-452
Suggested Citation
Mekruksavanich S., Hnoohom N., Phaphan W., Jitpattanakul A. Orthopedic Walker Fall Detection and Motion Classification Using Bidirectional Gated Recurrent Unit Neural Network. 10th International Conference on Digital Arts, Media and Technology, DAMT 2025 and 8th ECTI Northern Section Conference on Electrical, Electronics, Computer and Telecommunications Engineering, NCON 2025 (2025), 447-452. doi:10.1109/ECTIDAMTNCON64748.2025.10962109 Retrieved from: https://repository.li.mahidol.ac.th/handle/123456789/110124
Title
Orthopedic Walker Fall Detection and Motion Classification Using Bidirectional Gated Recurrent Unit Neural Network
Abstract
Accurate fall detection for orthopedic walker users is essential for timely medical intervention and for enhancing the safety of elderly and mobility-impaired individuals. This paper introduces a new method that uses a bidirectional gated recurrent unit (BiGRU) neural network to detect falls and classify motion patterns during walker usage. We use the Walker dataset, which includes inertial measurement unit (IMU) sensor data for four motion classes: idle, motion, step, and fall. The dataset consists of 6-axis IMU readings (3-axis accelerometer and 3-axis gyroscope) from sensors mounted on orthopedic walkers, with around 620 samples per class. Our BiGRU model leverages temporal dependencies in both forward and backward directions to improve classification accuracy. We assess the model's performance in two scenarios: binary classification (fall vs. non-fall) and multi-class classification of all four motion types. The proposed architecture shows strong performance, achieving 99.79% accuracy and a 99.73% F1-score in binary classification, and 98.51% accuracy and a 98.50% F1-score in multi-class classification. It outperforms traditional deep learning models such as CNN, LSTM, and GRU. The experimental results confirm the effectiveness of bidirectional processing in capturing temporal motion patterns, indicating significant potential for real-time fall detection in walker-assisted mobility. The high precision and recall values in both scenarios demonstrate the model's reliability for practical healthcare monitoring applications.
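
For illustration, below is a minimal sketch of a BiGRU classifier consistent with the setup described in the abstract: fixed-length windows of 6-channel IMU data (3-axis accelerometer plus 3-axis gyroscope) mapped to four motion classes (idle, motion, step, fall). The window length of 128 samples, the hidden size of 64, the two-layer depth, and the use of PyTorch are illustrative assumptions and are not taken from the paper; the binary fall/non-fall scenario would correspond to setting n_classes=2.

import torch
import torch.nn as nn

class BiGRUClassifier(nn.Module):
    """Bidirectional GRU over fixed-length windows of 6-axis IMU data."""
    def __init__(self, n_channels=6, hidden_size=64, n_layers=2, n_classes=4, dropout=0.3):
        super().__init__()
        self.bigru = nn.GRU(
            input_size=n_channels,        # 3-axis accelerometer + 3-axis gyroscope
            hidden_size=hidden_size,
            num_layers=n_layers,
            batch_first=True,             # input shape: (batch, time, channels)
            bidirectional=True,           # forward and backward temporal context
            dropout=dropout if n_layers > 1 else 0.0,
        )
        # Forward and backward hidden states are concatenated -> 2 * hidden_size
        self.classifier = nn.Linear(2 * hidden_size, n_classes)

    def forward(self, x):
        # x: (batch, time, 6)
        out, _ = self.bigru(x)
        # Classify from the final time step, which carries context from both directions
        return self.classifier(out[:, -1, :])

# Example: a dummy batch of 8 windows, each 128 time steps long (illustrative sizes)
model = BiGRUClassifier(n_classes=4)   # idle, motion, step, fall
logits = model(torch.randn(8, 128, 6))
print(logits.shape)                    # torch.Size([8, 4])
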
