TY - JOUR
T1 - Analysis of User-defined Radar-based Hand Gestures Sensed through Multiple Materials
AU - Sluÿters, Arthur
AU - Lambot, Sébastien
AU - Vanderdonckt, Jean
AU - Villarreal Narvaez, Santiago
N1 - Publisher Copyright:
© 2013 IEEE.
PY - 2024/2/16
Y1 - 2024/2/16
N2 - Radar sensing can penetrate non-conducting materials, such as glass, wood, and plastic, which makes it appropriate for recognizing gestures in environments with poor visibility, limited accessibility, and privacy sensitivity. While the performance of radar-based gesture recognition in these environments has been extensively researched, the preferences that users express for these gestures are less known. To analyze such gestures simultaneously according to their user preference and their system recognition performance, we conducted three gesture elicitation studies, each with n1=30 participants, to identify user-defined, radar-based gestures sensed through three distinct materials: the glass of a shop window, the wood of an office door, and polyvinyl chloride in an emergency. On this basis, we created a new dataset of nine selected gesture classes for n2=20 participants repeating the same gesture twice, captured by radar through three materials, i.e., glass, wood, and polyvinyl chloride. To uniformly compare recognition rates in these conditions with sensing variations, a specifically tailored procedure was defined and conducted with one-shot radar calibration to train and evaluate a gesture recognizer. 'Wood' achieved the best recognition rate (96.44%), followed by 'Polyvinyl chloride' and 'Glass'. We perform a preference-performance analysis of the gestures by combining the agreement rate from the elicitation studies and the recognition rate from the evaluation.
AB - Radar sensing can penetrate non-conducting materials, such as glass, wood, and plastic, which makes it appropriate for recognizing gestures in environments with poor visibility, limited accessibility, and privacy sensitivity. While the performance of radar-based gesture recognition in these environments has been extensively researched, the preferences that users express for these gestures are less known. To analyze such gestures simultaneously according to their user preference and their system recognition performance, we conducted three gesture elicitation studies, each with n1=30 participants, to identify user-defined, radar-based gestures sensed through three distinct materials: the glass of a shop window, the wood of an office door, and polyvinyl chloride in an emergency. On this basis, we created a new dataset of nine selected gesture classes for n2=20 participants repeating the same gesture twice, captured by radar through three materials, i.e., glass, wood, and polyvinyl chloride. To uniformly compare recognition rates in these conditions with sensing variations, a specifically tailored procedure was defined and conducted with one-shot radar calibration to train and evaluate a gesture recognizer. 'Wood' achieved the best recognition rate (96.44%), followed by 'Polyvinyl chloride' and 'Glass'. We perform a preference-performance analysis of the gestures by combining the agreement rate from the elicitation studies and the recognition rate from the evaluation.
KW - Gesture elicitation study
KW - Gesture sensing through materials
KW - Hand gesture recognition
KW - Radar-based gesture recognition
KW - User-defined gestures
KW - One-shot radar calibration
KW - New datasets
UR - http://www.scopus.com/inward/record.url?scp=85185377378&partnerID=8YFLogxK
U2 - 10.1109/ACCESS.2024.3366667
DO - 10.1109/ACCESS.2024.3366667
M3 - Article
SN - 2169-3536
VL - 12
SP - 27895
EP - 27917
JO - IEEE Access
JF - IEEE Access
IS - 1
M1 - 1
ER -