Abstract
Occluded person re-identification (Re-ID) aims to identify a particular
person when parts of the person's body are occluded. However, effectively
representing the visible body information while suppressing background
clutter remains challenging in occluded scenes. In this paper, we
propose a novel Attention Map-Driven Network (AMD-Net) for occluded
person Re-ID. In AMD-Net, human parsing labels are introduced to
supervise the generation of partial attention maps, while we suggest a
Spatial-frequency Interaction Module (SIM) to complement the
higher-order semantic information from the frequency domain.
Furthermore, we propose a Taylor-inspired Feature Filter (TFF) for
mitigating background disturbance and extracting fine-grained features.
Moreover, we design a part-soft triplet loss that is robust to
non-discriminative body-part features. Experimental results on the
Occluded-Duke, Occluded-REID, Market-1501, and DukeMTMC-reID datasets show
that our method outperforms existing state-of-the-art methods. The code
is available at: https://github.com/ISCLab-Bistu/SA-ReID.