Evaluation of Gaze Depth Estimation Using a Wearable Binocular Eye Tracker and Machine Learning


  • Received : 2017.08.14
  • Accepted : 2017.12.11
  • Published : 2018.03.01

Abstract

In this paper, we propose a gaze depth estimation method based on a wearable binocular eye tracker for virtual reality and augmented reality applications. The proposed method collects a wide range of information about each eye from the eye tracker, such as the pupil center, gaze direction, and inter-pupillary distance. It then builds gaze depth estimation models using a multilayer perceptron (MLP) that infers gaze depth from the eye-tracking information. Finally, we evaluated the method with 13 participants in two ways: performance based on individual per-participant models and performance based on a generalized model trained on all participants. The evaluation showed that the proposed method recognized gaze depth with 90.1% accuracy using the individual models and with 89.7% accuracy using the generalized model.
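The pipeline described above can be sketched with scikit-learn, which the authors reference for their experiments. The following is a minimal illustration only: the feature layout (binocular pupil centers, gaze directions, and inter-pupillary distance), the discrete depth levels, the network size, and the synthetic data are all assumptions made for the example, not the authors' actual configuration.

```python
# Illustrative sketch of an MLP gaze-depth classifier, assuming binocular
# eye-tracker features and discrete depth levels (not the authors' setup).
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)

# Assumed 9 features per sample: left/right pupil centers (x, y),
# left/right gaze directions (x, y), and inter-pupillary distance.
n_samples, n_features, n_depth_levels = 2000, 9, 4

# Synthetic stand-in data with vergence-like structure: the IPD-style
# feature (column 8) shifts with the depth level so depth is learnable.
depth = rng.integers(0, n_depth_levels, n_samples)
X = rng.normal(size=(n_samples, n_features))
X[:, 8] += 1.5 * depth

X_train, X_test, y_train, y_test = train_test_split(
    X, depth, test_size=0.25, random_state=0)

# Small MLP mapping eye-tracking features to a gaze-depth level.
model = MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=500,
                      random_state=0)
model.fit(X_train, y_train)
print(f"test accuracy: {model.score(X_test, y_test):.2f}")
```

A per-participant model would be trained on one user's recordings, while the generalized model pools training data from all participants before the same fit/score steps.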

