Shanakht-Net: Person Re-identification using Inertial Sensor Data
Generated by Smart Wearables during Daily Activities
Abstract
Person re-identification is an important component of automated
surveillance environments. The problem is typically addressed using
data from vision sensors and appearance-based features, which rely
heavily on visual cues such as colour and texture, limiting the
reliability of re-identification. Considerable research has instead
addressed re-identification using human gait captured through inertial
measurement unit (IMU) data, since gait is considered a distinctive
biometric signature that is particularly well suited to re-ID in
uncontrolled conditions. Such work has focused primarily on the
locomotive activity of walking. The current study utilises not only
locomotive activities but also non-locomotive activities of daily
living. The data was obtained from the WISDM lab and comprises
recordings of 36 volunteers engaging in six distinct everyday
activities. Although the dataset was originally gathered for human
activity recognition, each participant is assigned a unique ID, and
this information is used here to re-identify individuals.
Shanakht-Net, a novel convolutional neural network, is introduced and
achieves an F1-score of 93\%; precision, recall, and accuracy are also
assessed and reported.