TU Darmstadt / ULB / TUprints

ExerTrack - Towards Smart Surfaces to Track Exercises

Fu, Biying ; Jarms, Lennart ; Kirchbuchner, Florian ; Kuijper, Arjan (2022)
ExerTrack - Towards Smart Surfaces to Track Exercises.
In: Technologies, 2022, 8 (1)
doi: 10.26083/tuprints-00016290
Article, Secondary publication, Publisher's Version

Copyright Information: CC BY 4.0 International - Creative Commons, Attribution.

Item Type: Article
Type of entry: Secondary publication
Title: ExerTrack - Towards Smart Surfaces to Track Exercises
Language: English
Date: 2022
Place of Publication: Darmstadt
Year of primary publication: 2022
Publisher: MDPI
Journal or Publication Title: Technologies
Volume of the journal: 8
Issue Number: 1
Collation: 21 pages
DOI: 10.26083/tuprints-00016290
Corresponding Links:
Origin: Secondary publication DeepGreen

The concept of the quantified self has gained popularity in recent years with the hype around miniaturized gadgets that monitor vital fitness levels. Smartwatches, smartphone apps, and other fitness trackers are flooding the market. Most aerobic exercises, such as walking, running, or cycling, can be accurately recognized using wearable devices. However, whole-body exercises such as push-ups, bridges, and sit-ups are performed on the ground and thus cannot be precisely recognized by wearing only one accelerometer. A floor-based approach is therefore preferred for recognizing whole-body activities. Computer vision techniques on image data also report high recognition accuracy; however, the presence of a camera tends to raise privacy issues in public areas. We therefore focus on combining the advantages of ubiquitous proximity sensing with non-optical sensors to preserve privacy in public areas and to maintain low computation cost with a sparse sensor implementation. Our solution is ExerTrack, an off-the-shelf sports mat equipped with eight sparsely distributed capacitive proximity sensors, which recognizes eight whole-body fitness exercises with a user-independent recognition accuracy of 93.5 % and a user-dependent recognition accuracy of 95.1 %, based on a test study with 9 participants, each performing 2 full sessions. We adopt a template-based approach to count repetitions and reach a user-independent counting accuracy of 93.6 %. The final model runs on a Raspberry Pi 3 in real time. This work describes the data processing of our proposed system, the model selection performed to improve recognition accuracy, and the data-augmentation techniques used to regularize the network.
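The template-based repetition counting mentioned in the abstract can be illustrated with a minimal sketch: slide a single-repetition template over the sensor signal, compute the normalized cross-correlation at each offset, and count well-separated peaks above a threshold. The function name, the threshold value, and the synthetic sinusoidal "exercise" signal below are illustrative assumptions, not the paper's actual pipeline.

```python
import numpy as np

def count_reps(signal, template, threshold=0.8, min_gap=None):
    """Count repetitions by sliding a single-repetition template over the
    signal and counting peaks of the normalized cross-correlation.
    Hypothetical sketch; the paper's actual matching procedure may differ."""
    n, m = len(signal), len(template)
    t = (template - template.mean()) / (template.std() + 1e-12)
    if min_gap is None:
        min_gap = m // 2  # forbid two detections closer than half a template
    scores = np.empty(n - m + 1)
    for i in range(n - m + 1):
        w = signal[i:i + m]
        w = (w - w.mean()) / (w.std() + 1e-12)
        scores[i] = float(np.dot(w, t)) / m  # normalized correlation in [-1, 1]
    # count threshold crossings, enforcing a minimum gap between repetitions
    count, last = 0, -min_gap
    for i in range(len(scores)):
        if scores[i] >= threshold and i - last >= min_gap:
            count += 1
            last = i
    return count

# synthetic example: 5 sinusoidal "exercise" cycles plus sensor noise
rng = np.random.default_rng(0)
period = 50
sig = np.tile(np.sin(np.linspace(0, 2 * np.pi, period, endpoint=False)), 5)
sig = sig + 0.05 * rng.standard_normal(sig.size)
tmpl = np.sin(np.linspace(0, 2 * np.pi, period, endpoint=False))
print(count_reps(sig, tmpl))  # → 5
```

In practice the template would be extracted from a labeled recording of one repetition per exercise class, and the minimum-gap constraint keeps noisy correlation plateaus from being counted twice.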

Uncontrolled Keywords: capacitive sensing, capacitive proximity-sensing, human activity recognition, exercise recognition, exercise counting, ubiquitous sensing, smart surfaces
Status: Publisher's Version
URN: urn:nbn:de:tuda-tuprints-162901
Classification DDC: 000 Generalities, computers, information > 004 Computer science
600 Technology, medicine, applied sciences > 600 Technology
Divisions: 20 Department of Computer Science > Interactive Graphics Systems
Date Deposited: 09 Feb 2022 15:22
Last Modified: 08 Mar 2023 10:10
SWORD Depositor: Deep Green
URI: https://tuprints.ulb.tu-darmstadt.de/id/eprint/16290
PPN: 505575337