
Zihang You et al.

Lower limb exoskeletons play an increasingly important role in rehabilitation and mobility assistance, often requiring real-time human activity recognition (HAR) to deliver seamless assistance and natural interaction with the user. This study evaluates the effectiveness of various sensor types for HAR, employing three deep neural network models (MLP, LSTM, and CNN-LSTM) on an open-source dataset of 21 subjects performing six different locomotion activities. The dataset includes data from inertial measurement units (IMUs), goniometers, and joint angles derived from marker-based motion capture via inverse kinematics (IK). To ensure real-time applicability, we trained these networks on a very short sliding window (50 ms) of time-series data from different sensor combinations. Our results show that bilateral IK joint angle data from all three lower limb joints achieves an accuracy of 98.98%. Accuracy decreases as fewer joints are used: 97.77% for the hip+knee combination and 91.79% for the hip joint alone. Adding a thigh IMU to the IK data consistently improves HAR accuracy across all configurations. Moreover, incorporating derived joint angular velocity notably enhances HAR accuracy, improving goniometer accuracy by up to 10% for a single joint (knee) and by 2% for two joints; for bilateral IK data, this improvement reaches up to 15% for a single joint (knee). These findings highlight the substantial benefits of the joint angle, velocity, and acceleration data obtainable from wearable exoskeletons and IMUs for real-time HAR and adaptive assistance in lower limb exoskeletons. The insights gained are pivotal for developing minimalistic HAR approaches and optimizing the seamless integration of exoskeletons into everyday activities.
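
To illustrate the windowing and velocity-derivation steps described above, the following is a minimal Python sketch. The 50 ms window length follows the study; the sampling rate (100 Hz), the stride, and the rule of labeling each window by its final sample are assumptions, since the abstract does not specify them.

```python
import numpy as np

def derive_velocity(angles, fs):
    """Numerically differentiate joint angles to angular velocity.

    `fs` is the sensor sampling rate in Hz; the abstract does not state
    the dataset's rate, so the 100 Hz used below is an assumed placeholder.
    """
    return np.gradient(angles, 1.0 / fs, axis=0)

def sliding_windows(signal, labels, fs, win_ms=50, stride_ms=10):
    """Segment a (T, channels) time series into short windows for real-time HAR.

    The window length follows the paper's 50 ms setting; the stride and
    labeling rule are assumptions (the abstract does not specify overlap).
    """
    win = int(round(win_ms * fs / 1000))
    stride = int(round(stride_ms * fs / 1000))
    X, y = [], []
    for start in range(0, len(signal) - win + 1, stride):
        X.append(signal[start:start + win])
        # Label each window by the activity at its final sample (an assumption).
        y.append(labels[start + win - 1])
    return np.stack(X), np.array(y)

# Example: 100 Hz (assumed), 3 bilateral lower limb joints = 6 angle channels.
fs = 100
angles = np.random.randn(10_000, 6)       # placeholder joint-angle stream
labels = np.random.randint(0, 6, 10_000)  # six locomotion activities
features = np.concatenate([angles, derive_velocity(angles, fs)], axis=1)
X, y = sliding_windows(features, labels, fs)
print(X.shape, y.shape)  # (n_windows, 5, 12) with these settings
```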
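
Similarly, the abstract names the three model families (MLP, LSTM, and CNN-LSTM) without architectural details. The sketch below shows one plausible CNN-LSTM in Keras for such short windows; all layer sizes and hyperparameters are illustrative assumptions, not the authors' configuration.

```python
import tensorflow as tf
from tensorflow.keras import layers, models

def build_cnn_lstm(window_len, n_channels, n_classes=6):
    """Illustrative CNN-LSTM for short-window HAR (assumed architecture)."""
    model = models.Sequential([
        layers.Input(shape=(window_len, n_channels)),
        # 1-D convolution extracts local temporal features within the window.
        layers.Conv1D(64, kernel_size=3, padding="same", activation="relu"),
        # LSTM summarizes the (short) temporal sequence.
        layers.LSTM(64),
        layers.Dense(64, activation="relu"),
        layers.Dense(n_classes, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

# 50 ms at an assumed 100 Hz gives 5 samples; 6 angles + 6 velocities = 12 channels.
model = build_cnn_lstm(window_len=5, n_channels=12)
model.summary()
```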