In this paper we present an exploratory work on the use of foot movements to support fundamental 3D interaction tasks. Depth cameras such as the Microsoft Kinect are now able to track users’ motion unobtrusively, making it possible to draw on the spatial context of gestures and movements to control 3D UIs. Whereas multitouch and mid-air hand gestures have been explored extensively for this purpose, little work has looked at how the same can be accomplished with the feet. We describe the interaction space of foot movements in a seated position and propose applications for such techniques in three-dimensional navigation, selection, manipulation and system control tasks in a 3D modelling context. We explore these applications in a user study and discuss the advantages and disadvantages of this modality for 3D UIs.
Title of host publication: 2014 IEEE Symposium on 3D User Interfaces (3DUI 2014)
Editors: Anatole Lécuyer, Rob Lindeman, Frank Steinicke
Publisher: Institute of Electrical and Electronics Engineers Inc.
Published: 29 Mar 2014
Event: IEEE 9th Symposium on 3D User Interfaces (3DUI 2014), Minneapolis, United States, 29 Mar 2014 → 30 Mar 2014