
3D eye model-based gaze estimation from a depth sensor

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

In this paper, we address the 3D eye gaze estimation problem using a low-cost, simple-setup, and nonintrusive consumer depth sensor (the Kinect sensor). We present an effective and accurate method based on a 3D eye model that estimates a subject's point of gaze while tolerating free head movement. To determine the parameters of the proposed eye model, we propose: i) an improved convolution-based means-of-gradients method to accurately and efficiently locate the iris center in 3D space; ii) a geometric-constraints-based method to estimate the eyeball center, under the constraints that all iris center points lie on a sphere centered at the eyeball center and that a subject's two eyeballs are of identical size; iii) an effective Kappa angle calculation method based on the fact that the visual axes of both eyes intersect the screen plane at the same point. The final point of gaze is computed from the estimated eye model parameters. We experimentally evaluate our gaze estimation method on five subjects. The results show that the proposed method performs well, with an average estimation accuracy of 3.78 degrees, outperforming several state-of-the-art methods.
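The eyeball-center step in ii) amounts to fitting a sphere to the observed 3D iris-center positions. The sketch below is a rough illustration only, not the authors' implementation: a standard linear least-squares sphere fit in Python/NumPy, with made-up sample values. The paper's additional constraint that both eyeballs of a subject share the same size would require a joint fit over both eyes, which is omitted here.

```python
import numpy as np

def fit_sphere_center(points):
    """Linear least-squares sphere fit.

    Finds a center c and radius r such that ||p_i - c||^2 ~= r^2 for all
    observed 3D iris-center points p_i. Expanding the constraint gives
    2 p_i . c + (r^2 - ||c||^2) = ||p_i||^2, a linear system in the
    unknowns [c_x, c_y, c_z, r^2 - ||c||^2].
    """
    p = np.asarray(points, dtype=float)              # shape (N, 3), N >= 4
    A = np.hstack([2.0 * p, np.ones((len(p), 1))])   # coefficients of the unknowns
    b = (p ** 2).sum(axis=1)                         # ||p_i||^2
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    center = x[:3]
    radius = float(np.sqrt(x[3] + center @ center))
    return center, radius

# Synthetic check with a hypothetical eyeball center and radius (values in mm,
# chosen only for illustration): scatter noisy iris centers on the sphere and
# recover its parameters.
rng = np.random.default_rng(0)
true_center, true_radius = np.array([10.0, -5.0, 600.0]), 12.0
dirs = rng.normal(size=(50, 3))
dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
samples = true_center + true_radius * dirs + rng.normal(scale=0.2, size=(50, 3))
print(fit_sphere_center(samples))    # approximately ([10, -5, 600], 12)
```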
Original language: English
Title of host publication: Proceedings of the 2016 IEEE International Conference on Robotics and Biomimetics
Publisher: IEEE
ISBN (Electronic): 978-1509043644
ISBN (Print): 978-1509043651
DOIs
Publication status: Published - 2 Mar 2017
Event: 2016 IEEE International Conference on Robotics and Biomimetics - Qingdao, China
Duration: 3 Dec 2016 - 7 Dec 2016

Conference

Conference: 2016 IEEE International Conference on Robotics and Biomimetics
Abbreviated title: ROBIO 2016
Country: China
City: Qingdao
Period: 3/12/16 - 7/12/16

Documents

  • Gaze_ROBIO16

    Rights statement: © 2017 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.

    Accepted author manuscript (Post-print), 209 KB, PDF document

