Two-eye model-based gaze estimation from a Kinect sensor

Xiaolong Zhou, Haibin Cai, You Fu Li, Honghai Liu

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution



In this paper, we present an effective and accurate gaze estimation method based on a two-eye model of a subject, tolerant of free head movement, using a Kinect sensor. To determine the point of gaze accurately and efficiently, i) we employ a two-eye model to improve the estimation accuracy; ii) we propose an improved convolution-based means-of-gradients method to localize the iris center in 3D space; iii) we present a new personal calibration method that needs only one calibration point. The method approximates the visual axis as a line from the iris center to the gaze point in order to determine the eyeball centers and the Kappa angles. The final point of gaze is then calculated using the calibrated personal eye parameters. We experimentally evaluate the proposed gaze estimation method on eleven subjects. Experimental results demonstrate an average estimation accuracy of around 1.99°, outperforming many leading state-of-the-art methods. © 2017 IEEE.
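The paper's improved convolution-based variant is not detailed in this abstract. As an illustration only, below is a minimal NumPy sketch of the classic means-of-gradients objective (Timm and Barth) that such iris-center localizers build on: the iris center is taken as the point whose displacement vectors to strong-gradient pixels best align with the image gradients at those pixels. The function name, gradient threshold, and brute-force search are my own choices, not the authors' implementation.

```python
import numpy as np

def means_of_gradients_center(eye_patch):
    """Estimate the iris/eye center in a grayscale eye patch.

    Scores each candidate center c by the mean squared dot product
    between unit displacement vectors (c -> pixel) and unit image
    gradients; the circular iris boundary makes the true center the
    point of best alignment. Illustrative sketch, not the paper's
    improved convolution-based method.
    """
    img = eye_patch.astype(float)
    gy, gx = np.gradient(img)          # np.gradient returns (d/dy, d/dx)
    mag = np.hypot(gx, gy)

    # Keep only strong gradients (e.g. the iris boundary); normalize to unit length.
    mask = mag > mag.mean() + 0.5 * mag.std()
    gx_n = np.where(mask, gx / (mag + 1e-9), 0.0)
    gy_n = np.where(mask, gy / (mag + 1e-9), 0.0)

    h, w = img.shape
    ys, xs = np.mgrid[0:h, 0:w]
    best_score, best_c = -1.0, (0, 0)
    for cy in range(h):                # brute-force search over candidate centers
        for cx in range(w):
            dx, dy = xs - cx, ys - cy
            norm = np.hypot(dx, dy)
            norm[norm == 0] = 1.0      # avoid division by zero at the candidate itself
            dot = (dx / norm) * gx_n + (dy / norm) * gy_n
            score = np.mean(np.maximum(dot, 0.0) ** 2)
            if score > best_score:
                best_score, best_c = score, (cx, cy)
    return best_c                      # (x, y) in patch coordinates
```

On a synthetic dark disc (a crude iris) over a light background, the maximizer lands at the disc center, since every boundary gradient points radially outward and aligns with the displacement from the true center.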
Original language: English
Title of host publication: 2017 IEEE International Conference on Robotics and Automation
Publisher: Institute of Electrical and Electronics Engineers Inc.
Number of pages: 8
ISBN (Electronic): 978-1509046331
ISBN (Print): 978-1509046348
Publication status: Published - 24 Jul 2017
Event: 2017 IEEE International Conference on Robotics and Automation - Singapore, Singapore
Duration: 29 May 2017 - 3 Jun 2017


Conference: 2017 IEEE International Conference on Robotics and Automation
Abbreviated title: ICRA

