Human eye-gaze tracking has been receiving increasing attention over the years. Recent advancements in mobile technology and a growing interest in capturing natural human behaviour have motivated an emerging interest in tracking and analysing eye movements continuously in unconstrained real-life conditions, referred to as pervasive eye-gaze tracking. The paradigm of pervasive eye-gaze tracking is multi-faceted and typically relates to characteristics that facilitate eye-gaze tracking in uncontrolled real-life scenarios, such as robustness to varying illumination conditions and extensive head rotations, the capability of estimating the eye-gaze at an increased distance from the imaging hardware, reduced or implicit calibration to accommodate situations in which user co-operation and calibration awareness cannot be assumed, and the estimation of eye-gaze on mobile devices using their integrated hardware alone, without further hardware modification. This will potentially broaden the application areas for eye-gaze tracking to scenarios that do not permit controlled conditions, such as gaze-based interaction in public spaces.
Our research work has been motivated by this increasing interest in pervasive eye-gaze tracking and aims to address several of the main challenges associated with this field. Specifically, we aim to achieve eye-gaze tracking through joint head and eye pose estimation from image frames captured by a consumer-grade camera under ambient illumination alone. To this end, we propose methods that estimate the eye-gaze from low-resolution eye images, while allowing head and face movement and without requiring prolonged user co-operation during calibration prior to gaze estimation. Our work defines a spherical eye-in-head rotation model that permits gaze estimation under head movement by compensating for the change in eye region appearance caused by head rotation. Furthermore, we have developed a method for the estimation of head pose under non-rigid face movement that exploits the information contained within the trajectories of a set of feature points spread randomly over the face region, without seeking specific facial landmarks for model-fitting, which are susceptible to occlusion during head rotations.
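To make the underlying geometry concrete, the following is a minimal sketch of how an eye-in-head rotation can be composed with a head rotation to yield a gaze direction in the camera frame. The yaw-pitch parameterisation, the axis conventions, and the function names are illustrative assumptions rather than the actual model developed in this work.

```python
import numpy as np

def yaw_pitch_rotation(yaw, pitch):
    """Rotation built from yaw (about the y-axis) and pitch (about the
    x-axis), both in radians. The parameterisation is an assumption made
    for illustration only."""
    cy, sy = np.cos(yaw), np.sin(yaw)
    cp, sp = np.cos(pitch), np.sin(pitch)
    R_yaw = np.array([[cy, 0.0, sy],
                      [0.0, 1.0, 0.0],
                      [-sy, 0.0, cy]])
    R_pitch = np.array([[1.0, 0.0, 0.0],
                        [0.0, cp, -sp],
                        [0.0, sp, cp]])
    return R_yaw @ R_pitch

def gaze_in_camera_frame(head_yaw, head_pitch, eye_yaw, eye_pitch):
    """Compose the head rotation with the eye-in-head rotation: the eye
    rotates within a head-fixed frame, so the gaze direction in the camera
    frame is the head rotation applied to the rotated resting optical axis
    of the eye."""
    R_head = yaw_pitch_rotation(head_yaw, head_pitch)
    R_eye = yaw_pitch_rotation(eye_yaw, eye_pitch)
    optical_axis_at_rest = np.array([0.0, 0.0, -1.0])  # eye looking along -z
    return R_head @ (R_eye @ optical_axis_at_rest)

# Example: 20 degrees of head yaw combined with 10 degrees of eye yaw back
# towards the camera partially cancels out in the resulting gaze direction.
g = gaze_in_camera_frame(np.deg2rad(20), 0.0, np.deg2rad(-10), 0.0)
```

The value of decoupling the two rotations in this manner is that the same eye-in-head appearance can be interpreted correctly under different head poses, which is the intuition behind compensating for head rotation when estimating gaze.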
Part of this research work was conducted within the project R&I-2016-010-V WildEye, financed by the Malta Council for Science and Technology through FUSION: The R&I Technology Development Programme 2016, in collaboration with Seasus Ltd. This project aims to develop a low-cost eye-gaze tracking platform as an alternative communication channel for disabled persons who may not otherwise be able to control a computer via traditional peripheral devices, such as the mouse and keyboard.
