Design of Immersive Walking Interaction Using Deep Learning for Virtual Reality Experience Environment of Visually Impaired People

  • Oh, Jiseok (Department of Software, Catholic University of Pusan) ;
  • Bong, Changyun (Department of Software, Catholic University of Pusan) ;
  • Kim, Jinmo (Division of Computer Engineering, Hansung University)
  • Received : 2019.05.29
  • Accepted : 2019.06.21
  • Published : 2019.07.14

Abstract

This study proposes a novel virtual reality (VR) experience environment that helps visually impaired people adapt to walking. The core of the proposed environment consists of immersive walking interactions and deep-learning-based braille block recognition. To provide a realistic walking experience from the perspective of visually impaired people, we design a tracker-based walking process that determines the walking state by detecting marching in place, and a controller-based VR white cane that brings the walking assistance tool used by visually impaired people into VR. In addition, we propose a learning model that performs comprehensive decision-making during VR white cane guidance, such as recognizing and responding to braille blocks on the road. Based on this, a VR application composed of an outdoor urban environment was produced to evaluate the proposed walking experience environment, and a survey experiment and performance analysis were conducted with participants. The results confirm that the proposed VR environment provides a walking experience with a high sense of presence from the perspective of visually impaired people, and that the proposed learning and inference process recognizes sidewalks, roadways, and braille blocks on sidewalks with high accuracy.
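The abstract describes a tracker-based walking process that infers the walking state by detecting marching in place. As an illustration only, the Python sketch below flags "walking" when a leg-mounted tracker's vertical position oscillates beyond a small lift threshold within a sliding window; the tracker placement, window size, and threshold are assumptions for illustration and are not taken from the paper.

```python
# Hypothetical sketch: deciding the walking state from a leg-mounted tracker's
# vertical position samples, in the spirit of the marching-in-place detection
# described in the abstract. Thresholds and window size are illustrative only.

from collections import deque


class StepDetector:
    """Flags 'walking' when the tracker height oscillates above a threshold."""

    def __init__(self, window_size=30, lift_threshold=0.05):
        self.samples = deque(maxlen=window_size)  # recent tracker heights (m)
        self.lift_threshold = lift_threshold      # minimum foot-lift height (m)

    def update(self, tracker_height):
        """Feed one height sample; return True while marching in place."""
        self.samples.append(tracker_height)
        if len(self.samples) < self.samples.maxlen:
            return False
        # Peak-to-trough swing within the window approximates a step cycle.
        swing = max(self.samples) - min(self.samples)
        return swing >= self.lift_threshold


if __name__ == "__main__":
    import math

    detector = StepDetector()
    # Simulated tracker stream: standing still, then marching in place.
    for t in range(120):
        height = 0.10 if t < 60 else 0.10 + 0.08 * abs(math.sin(t * 0.3))
        walking = detector.update(height)
        if t % 30 == 29:
            print(f"t={t:3d}  walking={walking}")
```

In an actual VR application the same decision would be driven by live tracker poses each frame; the sliding-window swing test is just one simple way to separate standing from stepping.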

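The abstract also mentions a learning model that recognizes sidewalks, roadways, and braille blocks. The sketch below shows one plausible way to build such an image classifier with transfer learning on a pretrained Inception-v3 backbone in Keras; the class list, input size, and training configuration are assumptions for illustration, not the paper's reported setup.

```python
# Hypothetical sketch: a transfer-learning classifier for road-surface images
# (sidewalk / roadway / braille block), assuming a pretrained Inception-v3
# backbone. Class names, image size, and training details are illustrative.

import tensorflow as tf

NUM_CLASSES = 3  # e.g., sidewalk, roadway, braille block


def build_braille_block_classifier(input_shape=(299, 299, 3)):
    """Inception-v3 backbone with a small classification head."""
    backbone = tf.keras.applications.InceptionV3(
        include_top=False, weights="imagenet", input_shape=input_shape)
    backbone.trainable = False  # train only the new head at first

    inputs = tf.keras.Input(shape=input_shape)
    x = tf.keras.applications.inception_v3.preprocess_input(inputs)
    x = backbone(x, training=False)
    x = tf.keras.layers.GlobalAveragePooling2D()(x)
    x = tf.keras.layers.Dropout(0.3)(x)
    outputs = tf.keras.layers.Dense(NUM_CLASSES, activation="softmax")(x)

    model = tf.keras.Model(inputs, outputs)
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model


if __name__ == "__main__":
    model = build_braille_block_classifier()
    model.summary()
```

Per-frame predictions from such a model could then feed the decision-making step the abstract describes, e.g. triggering an audio or haptic cue when the view ahead is classified as a braille block.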