Full Body Interaction – Design, Implementation, and User Support

Felix Kistler

Published: 2016

Publisher: Universität Augsburg

While researchers have worked on device-free gestural interaction for decades, full body interaction was first introduced to the mass market by the Microsoft Kinect for Xbox 360 in November 2010. Full body interaction is meant to be an unobtrusive and natural interaction modality: users interact with a computer through motions of the whole body, without touching, wearing, or holding any device or special gear. However, full body interaction differs in many ways from traditional interaction technologies such as mouse and keyboard, and its lower precision and higher complexity make it difficult to provide good usability throughout the interaction. Thus, multiple challenges remain before full body interaction can gain wider acceptance.

In this dissertation, I investigate these challenges and present three major contributions. First, I follow a user-centered design process to create gesture sets that, on the one hand, are intuitive and easy for the actual users to reproduce and, on the other hand, are consistent, unambiguous, and recognizable with low-cost technology. The second and main technical contribution of this dissertation is the Full Body Interaction (FUBI) framework, which can be used to easily integrate full body interaction into arbitrary applications via an XML-based gesture-definition language that supports powerful gesture recognition. In addition, FUBI can be used to implement freehand interaction with a graphical user interface (GUI) or to realize avatar control. Besides integrating full body interaction into an application, it is also important to support the user during the interaction. The third contribution therefore focuses on mechanisms such as affordances, feedback, and feedforward that help users understand which gestures are currently available, how they should be performed, and why they may not be recognized in certain cases.
In this work, I focus mainly on virtual environments, which are especially well suited for full body interaction. To demonstrate the generalizability of my research, I further look at application scenarios in which a user controls GUIs or humanoid robots. Overall, I present concepts, implementations, and study results that provide insights into how to improve the process of creating full body interaction applications. I thereby take into account all stakeholders of full body interaction: the interaction designer, the developer, and the end user.
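For illustration, an XML-based gesture definition of the kind described above might look roughly like the following sketch. All element and attribute names here (GestureSet, PostureRecognizer, JointRelation, and so on) are hypothetical placeholders and do not reproduce FUBI's actual schema; the sketch only conveys the general idea of declaring static postures and dynamic movements in XML.

```xml
<!-- Hypothetical sketch of an XML gesture definition.
     Element and attribute names are illustrative only,
     not FUBI's actual gesture-definition schema. -->
<GestureSet name="demo">
  <!-- A static posture: right hand held above the head -->
  <PostureRecognizer name="rightHandUp">
    <JointRelation joint="rightHand" relative="head" minY="0.1"/>
  </PostureRecognizer>
  <!-- A dynamic gesture: a fast leftward swipe of the right hand -->
  <MovementRecognizer name="swipeLeft">
    <JointMovement joint="rightHand" direction="left" minVelocity="0.8"/>
  </MovementRecognizer>
</GestureSet>
```

A declarative format like this lets an interaction designer add or tune gestures without recompiling the application, while the framework handles the actual recognition against the tracked skeleton data.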
