LoCoMoTe – a framework for classification of natural locomotion in VR by task, technique and modality

Charlotte Croucher*, Wendy Powell, Brett Stevens, Matt Miller-Dicks, Vaughan Powell, Travis Wiltshire, Pieter Spronck

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review


Abstract

Virtual reality (VR) research has provided overviews of locomotion techniques, how they work, their strengths, and the overall user experience they offer. Considerable research has investigated new methodologies, particularly the use of machine learning to develop redirection algorithms. To best support this development, we must understand how to replicate human navigation and behaviour in VR, which can be informed by the accumulated results of live-user experiments. However, in an ever-growing research field it can be difficult to identify, select, and compare relevant research without a pre-existing framework. This work therefore aimed to facilitate the ongoing structuring and comparison of the VR-based natural walking literature by providing a standardised framework for researchers to use. We applied thematic analysis to the methodology descriptions of 140 VR-based papers that contained live-user experiments. From this analysis, we developed the LoCoMoTe framework, comprising three themes: navigational decisions, technique implementation, and modalities. The LoCoMoTe framework provides a standardised approach to structuring and comparing experimental conditions. The framework should be continually updated to categorise and systematise knowledge, and to aid in identifying research gaps and directions for discussion.
Original language: English
Number of pages: 19
Journal: IEEE Transactions on Visualization and Computer Graphics
Early online date: 11 Sept 2023
DOIs
Publication status: Early online - 11 Sept 2023

Keywords

  • human-computer interaction
  • machine learning
  • navigation
  • redirected walking
  • virtual reality
