Multi-Modal Long-Term Person Re-Identification Using Physical Soft Bio-Metrics and Body Figure

Shoukry, Nadeen; Abd El Ghany, Mohamed A.; Salem, Mohammed A.-M. (2022)
Multi-Modal Long-Term Person Re-Identification Using Physical Soft Bio-Metrics and Body Figure.
In: Applied Sciences, 2022, 12 (6)
doi: 10.26083/tuprints-00021107
Article, Secondary publication, Publisher's Version

Text: applsci-12-02835.pdf (3 MB)
Copyright Information: CC BY 4.0 International - Creative Commons, Attribution.
Item Type: Article
Type of entry: Secondary publication
Title: Multi-Modal Long-Term Person Re-Identification Using Physical Soft Bio-Metrics and Body Figure
Language: English
Date: 8 April 2022
Place of Publication: Darmstadt
Year of primary publication: 2022
Publisher: MDPI
Journal or Publication Title: Applied Sciences
Volume of the journal: 12
Issue Number: 6
Collation: 18 pages
DOI: 10.26083/tuprints-00021107
Origin: Secondary publication DeepGreen
Abstract:

Person re-identification is the task of recognizing a subject across non-overlapping cameras, views, and times. Most state-of-the-art datasets and proposed solutions address the problem of short-term re-identification; such models can re-identify a person only as long as they are wearing the same clothes. The work presented in this paper addresses the task of long-term re-identification, so the proposed model is trained on a dataset that incorporates clothes variation. This paper proposes a multi-modal person re-identification model. The first modality covers physical soft bio-metrics: hair, face, neck, shoulders, and part of the chest. The second modality is the remaining body figure, which mainly captures clothes. The proposed model is composed of two separate neural networks, one for each modality. For the first modality, a two-stream Siamese network with a pre-trained FaceNet as the feature extractor is utilized. For the second modality, a Part-based Convolutional Baseline (PCB) classifier with OSNet as the feature extractor network is used. Experiments confirm that the proposed model outperforms several state-of-the-art models, achieving 81.4% accuracy at Rank-1, 82.3% at Rank-5, 83.1% at Rank-10, and 83.7% at Rank-20.
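The abstract describes a two-branch architecture: a two-stream Siamese branch built on a pre-trained FaceNet for the head-and-shoulders soft bio-metrics, and a PCB branch built on OSNet for the remaining body figure. The PyTorch sketch below only illustrates how such a two-branch model could be wired together; it is not the authors' implementation. The backbones are toy stand-ins for FaceNet and OSNet, and the embedding size, number of part stripes, and identity count are placeholder assumptions.

# PyTorch sketch of the two-modality design outlined in the abstract.
# The backbone modules passed in below are hypothetical stand-ins; the paper
# uses a pre-trained FaceNet (branch 1) and OSNet inside a PCB head (branch 2).
import torch
import torch.nn as nn
import torch.nn.functional as F

class SoftBiometricBranch(nn.Module):
    """Two-stream Siamese branch for the head/shoulder crop (modality 1)."""
    def __init__(self, embedder: nn.Module, emb_dim: int = 512):  # 512 is an assumed embedding size
        super().__init__()
        self.embedder = embedder               # stand-in for pre-trained FaceNet
        self.fc = nn.Linear(emb_dim, emb_dim)

    def forward(self, x_a: torch.Tensor, x_b: torch.Tensor):
        # The same weights are applied to both inputs -> Siamese structure.
        e_a = F.normalize(self.fc(self.embedder(x_a)), dim=1)
        e_b = F.normalize(self.fc(self.embedder(x_b)), dim=1)
        return e_a, e_b                        # compare with a distance/contrastive loss

class BodyFigureBranch(nn.Module):
    """PCB-style part-based branch for the body-figure crop (modality 2)."""
    def __init__(self, backbone: nn.Module, feat_dim: int, n_parts: int = 6, n_ids: int = 150):
        super().__init__()
        self.backbone = backbone               # stand-in for the OSNet feature extractor
        self.n_parts = n_parts
        # One identity classifier per horizontal part stripe, as in PCB.
        self.classifiers = nn.ModuleList(
            [nn.Linear(feat_dim, n_ids) for _ in range(n_parts)]
        )

    def forward(self, x: torch.Tensor):
        fmap = self.backbone(x)                                    # (B, C, H, W)
        parts = F.adaptive_avg_pool2d(fmap, (self.n_parts, 1))     # (B, C, P, 1)
        return [clf(parts[:, :, p, 0]) for p, clf in enumerate(self.classifiers)]

if __name__ == "__main__":
    # Toy backbones so the sketch runs end to end without the real pre-trained weights.
    face_stub = nn.Sequential(nn.Flatten(), nn.Linear(3 * 160 * 160, 512))
    body_stub = nn.Sequential(nn.Conv2d(3, 64, 3, padding=1), nn.ReLU())

    siamese = SoftBiometricBranch(face_stub)
    pcb = BodyFigureBranch(body_stub, feat_dim=64)

    head_a, head_b = torch.randn(2, 3, 160, 160), torch.randn(2, 3, 160, 160)
    body = torch.randn(2, 3, 384, 128)

    e_a, e_b = siamese(head_a, head_b)
    part_logits = pcb(body)
    print(e_a.shape, len(part_logits), part_logits[0].shape)

Running the stub yields one normalized embedding per Siamese input and one classifier output per body-part stripe, mirroring the two modalities described in the abstract.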

Uncontrolled Keywords: FaceNet, long-term person re-identification, OSNet, PCB, PRCC dataset, Siamese network
Status: Publisher's Version
URN: urn:nbn:de:tuda-tuprints-211075
Classification DDC: 600 Technology, medicine, applied sciences > 620 Engineering and machine engineering
Divisions: 18 Department of Electrical Engineering and Information Technology > Institute of Computer Engineering > Integrated Electronic Systems (IES)
Date Deposited: 08 Apr 2022 11:21
Last Modified: 14 Nov 2023 19:04
SWORD Depositor: Deep Green
URI: https://tuprints.ulb.tu-darmstadt.de/id/eprint/21107
PPN: 500783160