Gaze estimation driven solution for interacting children with ASD

Haibin Cai, Xiaolong Zhou, Hui Yu, Honghai Liu

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution


Abstract

This paper investigates gaze estimation solutions for interacting with children with Autism Spectrum Disorders (ASD). Previous research shows that satisfactory gaze estimation accuracy can be achieved in constrained settings. However, most existing methods cannot cope with the large head movement (LHM) that frequently occurs in interaction scenarios with children with ASD. We propose a gaze estimation method designed to handle large head movement while achieving real-time performance. An intervention table equipped with multiple sensors is designed to capture images under LHM. First, reliable facial features and head poses are tracked using the supervised descent method. Second, a convolution-based integro-differential eye localization approach locates the eye center efficiently and accurately. Third, a rotation-invariant gaze estimation model is built from the located facial features, eye center, head pose, and the depth data captured by a Kinect. Finally, a multi-sensor fusion strategy is proposed to adaptively select the optimal camera for gaze estimation and to fuse the Kinect depth information with that of the web camera. Experimental results show that the method achieves acceptable accuracy even under LHM and could potentially be applied in therapy for children with ASD.
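The convolution-based integro-differential step in the abstract can be illustrated with a minimal sketch. The idea (following the integro-differential formulation commonly used for iris/eye-center localization) is that convolving the image with normalized ring kernels evaluates the mean intensity on a circle around every pixel at once, and the eye center is the pixel with the largest dark-to-bright intensity jump between successive radii, since the dark iris sits against the brighter sclera. The function names, radius range, and kernel shape below are illustrative assumptions, not the paper's implementation.

```python
import numpy as np
from scipy import ndimage


def ring_kernel(radius, thickness=1.0, size=None):
    """Normalized binary ring (annulus) kernel of the given radius."""
    if size is None:
        size = int(2 * radius + 3)
    c = size // 2
    y, x = np.ogrid[:size, :size]
    d = np.sqrt((x - c) ** 2 + (y - c) ** 2)
    ring = (np.abs(d - radius) <= thickness).astype(float)
    return ring / ring.sum()


def locate_eye_center(eye_patch, radii=range(3, 12)):
    """Sketch of a convolution-based integro-differential operator.

    Each convolution gives, at every pixel, the mean intensity on a
    circle of one radius centered there; differencing along the radius
    axis approximates the radial derivative of that circular integral.
    The eye center maximizes the jump (dark iris -> bright sclera).
    The radius range is an assumption tied to the patch scale.
    """
    img = eye_patch.astype(float)
    integrals = [ndimage.convolve(img, ring_kernel(r), mode="nearest")
                 for r in radii]
    diffs = np.diff(np.stack(integrals), axis=0)
    score = diffs.max(axis=0)            # best radial jump at each pixel
    cy, cx = np.unravel_index(np.argmax(score), score.shape)
    return cx, cy
```

Because each radius is a single convolution over the whole patch, the search over all candidate centers is done in a handful of filtering passes, which is what makes this formulation fast enough for real-time use.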
Original language: English
Title of host publication: 2015 International Symposium on Micro-NanoMechatronics and Human Science (MHS)
Publisher: IEEE
Pages: 419-424
ISBN (Electronic): 978-1-4673-8217-5
ISBN (Print): 978-1-4673-8218-2
Publication status: Published - Jun 2016

