In recent years, interest in gestural interaction and gesture recognition has risen across numerous fields. While some of these fields can afford to use devices such as gloves and armbands, artists favor more discreet techniques so that the integration of gestural interaction has the greatest possible impact on their audience. The goal of this master's thesis was to study and assess the potential of such interactions for interactive artistic systems using computer vision alone. By means of an Intel RealSense D435 camera and the Cubemos Skeleton Tracking SDK, different mappings from movement features to sound features were tested in order to identify the ones users found most intuitive and guessable. This work presents the approach followed and the results it produced.
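As a purely illustrative sketch of what a movement-feature-to-sound-feature mapping can look like, the snippet below linearly rescales a tracked hand height into a MIDI pitch range. The feature choice, joint names, and ranges are hypothetical examples, not the actual mappings evaluated in the thesis.

```python
def linear_map(value, in_min, in_max, out_min, out_max):
    """Linearly rescale a movement feature into a sound-parameter range."""
    value = max(in_min, min(in_max, value))  # clamp to the input range
    t = (value - in_min) / (in_max - in_min)
    return out_min + t * (out_max - out_min)


def hand_height_to_pitch(hand_y, hip_y, head_y):
    """Hypothetical mapping: hand height between hip and head -> MIDI pitch.

    The joint heights would come from a skeleton tracker; here they are
    plain numbers. Output range C3..C6 (MIDI 48..84) is arbitrary.
    """
    return round(linear_map(hand_y, hip_y, head_y, 48, 84))
```

A mapping like this is "guessable" in the sense studied here: raising the hand raises the pitch, so users can predict the effect of a gesture before trying it.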
Date of defense: 25 June 2020
Supervisor: Bruno Dumas (Promoteur)