A New Eye Tracking Method as a Smartphone Interface

  • Lee, Eui Chul (Department of Computer Science, College of Software, Sangmyung University) ;
  • Park, Min Woo (Department of Computer Science, Graduate School, Sangmyung University)
  • Received : 2012.12.31
  • Accepted : 2013.04.08
  • Published : 2013.04.30

Abstract

Smartphones provide many functions, and to use them effectively, many kinds of human-phone interfaces are employed, such as touch, voice, and gesture. However, the touch interface, the most important of these, cannot be used by people with hand disabilities or when both hands are busy. Although eye tracking is a superb human-computer interface method, it has not been applied to smartphones because of the small screen size, the frequently changing geometric relation between the user's face and the phone screen, and the low resolution of frontal cameras. In this paper, a new eye tracking method is proposed to act as a smartphone user interface. To maximize eye image resolution, a zoom lens and three infrared LEDs are adopted. The proposed method has the following novelties. First, appropriate camera specifications and image resolution are analyzed for a smartphone-based gaze tracking method. Second, facial movement is allowed as long as one eye region is included in the image. Third, the method operates in both landscape and portrait screen modes. Fourth, only two LED reflection positions are used to calculate the gaze position, based on the 2D geometric relation between the reflective rectangle and the screen. Fifth, a prototype mock-up module was built to confirm the feasibility of applying the method to an actual smartphone. Experimental results showed that the gaze estimation error was about 31 pixels at a screen resolution of $480{\times}800$ and that the average hit ratio on a $5{\times}4$ icon grid was 94.6%.
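The fourth point — calculating the gaze position from the 2D geometric relation between the LED reflection rectangle and the screen — can be illustrated with a minimal sketch. Note that the function name, the assumption that the two corneal glints mark diagonally opposite corners of the reflective rectangle, and the simple proportional mapping below are our own illustrative choices, not details taken from the paper:

```python
def estimate_gaze(pupil, glint_a, glint_b, screen_w=480, screen_h=800):
    """Map the pupil center to screen pixels (hypothetical sketch).

    pupil, glint_a, glint_b: (x, y) image coordinates; the two glints
    are assumed to span the rectangle formed by the LED reflections.
    """
    # Normalize the rectangle so the corner order does not matter.
    x0, x1 = sorted((glint_a[0], glint_b[0]))
    y0, y1 = sorted((glint_a[1], glint_b[1]))
    # Relative position of the pupil inside the glint rectangle.
    u = (pupil[0] - x0) / (x1 - x0)
    v = (pupil[1] - y0) / (y1 - y0)
    # Scale the relative position to the screen resolution.
    return u * screen_w, v * screen_h
```

A pupil centered in the glint rectangle maps to the screen center; in a real system this linear mapping would typically be refined by a per-user calibration step.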

