Advanced Eye Gaze Detector
Abstract
This paper addresses eye gaze tracking using a low-cost and more convenient web camera in a desktop environment, as opposed to gaze tracking techniques that require specific hardware, e.g., infrared high-resolution cameras and infrared light sources, as well as a cumbersome calibration process. In the proposed method, we first track the human face in a real-time video sequence to extract the eye regions. Then, we combine intensity energy and edge strength to locate the iris centre and use a piecewise eye corner detector to detect the eye corners. We adopt a sinusoidal head model to simulate the 3-D head shape and propose adaptive weighted facial features embedded in the pose from orthography and scaling with iterations (POSIT) algorithm, whereby the head pose can be estimated. Finally, eye gaze tracking is accomplished by integrating the eye vector and the head movement information. Experiments are performed to estimate eye movement and head pose on the BioID dataset and a pose dataset, respectively. In addition, gaze tracking experiments are performed on real-time video sequences in a desktop environment. The proposed method is not sensitive to lighting conditions. Experimental results show that our method achieves an average accuracy of around 1.28° without head movement and 2.27° with minor head movement.
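As a rough illustration of the iris-localisation step described above, the sketch below scores candidate circle centres in a cropped grayscale eye patch by combining an intensity (darkness) term with an edge-strength term. The function name, equal weighting of the two terms, and radius range are assumptions made for illustration only; this is not the paper's exact formulation.

```python
# Minimal sketch, assuming a pre-cropped grayscale eye patch:
# combine an intensity term (the iris is darker than the sclera)
# with an edge-strength term (gradient magnitude peaks at the iris
# boundary) and pick the best-scoring circle centre.
import cv2
import numpy as np

def iris_centre(eye_gray, radii=range(6, 14)):
    """Return (x, y) of the best-scoring iris centre candidate."""
    eye = cv2.GaussianBlur(eye_gray, (5, 5), 0).astype(np.float32)
    gx = cv2.Sobel(eye, cv2.CV_32F, 1, 0, ksize=3)
    gy = cv2.Sobel(eye, cv2.CV_32F, 0, 1, ksize=3)
    edge = cv2.magnitude(gx, gy)          # edge-strength map

    best_score, best_centre = -np.inf, None
    h, w = eye.shape
    theta = np.linspace(0, 2 * np.pi, 32, endpoint=False)
    for r in radii:
        for cy in range(r, h - r):
            for cx in range(r, w - r):
                # Edge energy: mean gradient magnitude on the candidate circle.
                xs = (cx + r * np.cos(theta)).astype(int)
                ys = (cy + r * np.sin(theta)).astype(int)
                edge_energy = edge[ys, xs].mean()
                # Intensity energy: darker interior gives a higher score.
                disk = eye[cy - r:cy + r, cx - r:cx + r]
                intensity_energy = 255.0 - disk.mean()
                score = edge_energy + intensity_energy   # equal weights (assumed)
                if score > best_score:
                    best_score, best_centre = score, (cx, cy)
    return best_centre
```

In practice the exhaustive search above would be restricted to a small eye region (as produced by the face-tracking step), and the resulting iris centre would be paired with the detected eye corners to form the eye vector that, together with the POSIT head-pose estimate, drives the gaze computation.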
DOI: http://dx.doi.org/10.36039/AA072016006.