Abstract
Integrating mobile eye tracking and optoelectronic motion capture enables point of gaze to be expressed within the laboratory co-ordinate system and presents a method not commonly applied in research examining dynamic behaviours, such as locomotion. This paper examines the quality of gaze data collected through the integration. Based on research suggesting increased viewing distances are associated with reduced data quality, the accuracy and precision of gaze data as participants (N = 11) viewed floor-based targets at distances of 1–6 m was investigated. A mean accuracy of 2.55 ± 1.12° was identified; however, accuracy and precision measures (relative to targets) were significantly (p < .05) reduced at greater viewing distances. We then consider whether signal-processing techniques may improve accuracy and precision, and overcome issues associated with missing data. A 4th-order Butterworth low-pass filter with cut-off frequencies determined via autocorrelation did not significantly improve data quality; however, interpolation via quintic spline was sufficient to overcome gaps of up to 0.1 s. We conclude that the integration of gaze and motion capture presents a viable methodology for the study of human behaviour and offers advantages for data collection, treatment, and analysis. We provide considerations for the collection, analysis, and treatment of gaze data that may help inform future methodological decisions.
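To illustrate the two signal-processing steps named in the abstract, the sketch below shows a 4th-order zero-lag Butterworth low-pass filter and quintic-spline interpolation of short gaps using SciPy. It is a minimal example under assumed conditions (regularly sampled gaze coordinates with NaNs marking dropped samples); the function names, the gap-filling heuristic, and the fixed cut-off frequency are illustrative only and are not the authors' implementation, and the autocorrelation-based cut-off selection described in the paper is not reproduced here.

```python
import numpy as np
from scipy.signal import butter, filtfilt
from scipy.interpolate import InterpolatedUnivariateSpline


def lowpass_gaze(signal, fs, cutoff_hz, order=4):
    """Zero-lag (forward-backward) Butterworth low-pass filter.

    cutoff_hz is assumed to have been chosen beforehand, e.g. via an
    autocorrelation-based procedure as described in the abstract.
    """
    b, a = butter(order, cutoff_hz / (fs / 2), btype="low")
    return filtfilt(b, a, signal)


def fill_gaps_quintic(t, signal, max_gap_s=0.1):
    """Fill NaN gaps with a quintic spline, only for gaps <= max_gap_s.

    The 0.1 s threshold follows the abstract's finding; the gap-detection
    logic is a simple illustrative heuristic.
    """
    valid = ~np.isnan(signal)
    spline = InterpolatedUnivariateSpline(t[valid], signal[valid], k=5)
    filled = signal.copy()
    isnan = np.isnan(signal)
    i = 0
    while i < len(signal):
        if isnan[i]:
            j = i
            while j < len(signal) and isnan[j]:
                j += 1
            # approximate gap duration from the time stamps of the missing run
            if t[j - 1] - t[i] <= max_gap_s:
                filled[i:j] = spline(t[i:j])
            i = j
        else:
            i += 1
    return filled
```

The forward-backward filtering (`filtfilt`) avoids introducing a phase lag into the gaze signal, which matters when gaze is later aligned with motion-capture data in time.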
| Original language | English |
| --- | --- |
| Number of pages | 25 |
| Journal | i-Perception |
| Volume | 13 |
| Issue number | 5 |
| DOIs | |
| Publication status | Published - 26 Sept 2022 |
Keywords
- gaze behaviour
- eye tracking
- motion capture