
Two-eye model-based gaze estimation from a Kinect sensor

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

In this paper, we present an effective and accurate gaze estimation method based on a two-eye model of the subject, with tolerance of free head movement, using a Kinect sensor. To determine the point of gaze accurately and efficiently, i) we employ a two-eye model to improve estimation accuracy; ii) we propose an improved convolution-based means-of-gradients method to localize the iris center in 3D space; iii) we present a new personal calibration method that needs only one calibration point. The method approximates the visual axis as the line from the iris center to the gaze point in order to determine the eyeball centers and the Kappa angles. The final point of gaze can then be calculated using the calibrated personal eye parameters. We experimentally evaluate the proposed gaze estimation method on eleven subjects. Experimental results demonstrate that our method achieves an average estimation accuracy of around 1.99°, which outperforms many leading state-of-the-art methods. © 2017 IEEE.
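The final step described in the abstract, intersecting the visual axis with the display to obtain the point of gaze, can be sketched geometrically as follows. This is a minimal illustration under my own assumptions (function names and the toy coordinates are mine, and the calibrated Kappa-angle correction the paper applies is omitted here for brevity):

```python
import numpy as np

def ray_plane_intersection(origin, direction, plane_point, plane_normal):
    """Intersect a ray (the approximated visual axis) with the screen plane."""
    denom = np.dot(direction, plane_normal)
    if abs(denom) < 1e-9:
        return None  # ray is parallel to the screen
    t = np.dot(plane_point - origin, plane_normal) / denom
    return origin + t * direction

def estimate_gaze_point(eyeball_center, iris_center, plane_point, plane_normal):
    """Approximate the visual axis as the line from the eyeball center
    through the 3D iris center, then intersect it with the screen plane."""
    direction = iris_center - eyeball_center
    direction = direction / np.linalg.norm(direction)
    return ray_plane_intersection(eyeball_center, direction,
                                  plane_point, plane_normal)

# Toy example: screen plane at z = 0, eye 600 mm in front of it.
eye = np.array([0.0, 0.0, 600.0])
iris = np.array([1.0, -0.5, 590.0])  # iris displaced toward a target
gaze = estimate_gaze_point(eye, iris,
                           np.array([0.0, 0.0, 0.0]),   # point on screen
                           np.array([0.0, 0.0, 1.0]))   # screen normal
print(gaze)  # → [ 60. -30.   0.]
```

In the paper, the eyeball centers and Kappa angles are first recovered from the one-point calibration; with two eyes, a gaze point can be estimated per eye and combined.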
Original language: English
Title of host publication: 2017 IEEE International Conference on Robotics and Automation
Publisher: Institute of Electrical and Electronics Engineers Inc.
Number of pages: 8
ISBN (Electronic): 978-1509046331
ISBN (Print): 978-1509046348
Publication status: Published - 24 Jul 2017
Event: 2017 IEEE International Conference on Robotics and Automation - Singapore, Singapore
Duration: 29 May 2017 – 3 Jun 2017


Conference: 2017 IEEE International Conference on Robotics and Automation
Abbreviated title: ICRA


  • ICRA2017

    Rights statement: © 2017 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.

    Accepted author manuscript (post-print), 1.03 MB, PDF document


ID: 9004049