A Gesture Interface based on Hologram and Haptics Environments for Interactive and Immersive Experiences

  • Pyun, Hae-Gul (Global School of Media, Soongsil University)
  • An, Haeng-A (Global School of Media, Soongsil University)
  • Yuk, Seongmin (Global School of Media, Soongsil University)
  • Park, Jinho (Global School of Media, Soongsil University)
  • Received : 2014.12.22
  • Accepted : 2015.02.11
  • Published : 2015.02.20

Abstract

This paper proposes a user interface that enhances immersion and usability by combining a hologram and a haptic device with the common Leap Motion controller. The Leap Motion controller delivers the physical motion of the user's hand to control a virtual environment, but it is limited to manipulating a virtual hand on a screen, and the interaction is one-way: effects in the virtual environment cannot be conveyed back to the user. In our system, a hologram is coupled with the Leap Motion controller to improve immersion by placing the real hand and the virtual hand in the same space. Moreover, we present a prototype of tactile interaction by designing a haptic device that conveys touch sensations from the virtual environment to the user's hand.

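The record above does not include the system's source code. As a rough illustration of the interaction loop described in the abstract, the following Python sketch shows one way the pieces could fit together; read_palm_position and send_haptic_pulse are hypothetical placeholders for the Leap Motion hand-tracking call and the haptic actuator driver, and the calibration constants are assumptions, not values taken from the paper.

# Minimal sketch of the interaction loop described in the abstract (hypothetical):
# 1) read the tracked hand position (placeholder for the Leap Motion SDK),
# 2) map it into the coordinate space of the holographic scene so the real and
#    virtual hands occupy the same place,
# 3) on contact with a virtual object, drive the haptic device (placeholder).

import math
import time

# --- placeholders for the real device interfaces (assumptions, not from the paper) ---

def read_palm_position():
    """Return the tracked palm position (x, y, z) in millimeters.
    In a real build this would come from the Leap Motion SDK."""
    return (0.0, 120.0, 30.0)  # dummy value so the sketch runs stand-alone

def send_haptic_pulse(strength):
    """Drive the haptic actuator; here we just log the command."""
    print(f"haptic pulse: strength={strength:.2f}")

# --- mapping tracked coordinates into the holographic scene ---

SCALE = 0.01                 # mm -> scene units (assumed calibration)
OFFSET = (0.0, -1.0, 0.0)    # aligns the tracking origin with the hologram origin

def to_scene_coords(palm_mm):
    return tuple(p * SCALE + o for p, o in zip(palm_mm, OFFSET))

# --- a single virtual object the user can "touch" ---

SPHERE_CENTER = (0.0, 0.2, 0.3)
SPHERE_RADIUS = 0.15

def touching(point, center, radius):
    dist = math.dist(point, center)
    return dist <= radius, dist

def main():
    for _ in range(10):      # a few iterations of the tracking loop
        hand = to_scene_coords(read_palm_position())
        hit, dist = touching(hand, SPHERE_CENTER, SPHERE_RADIUS)
        if hit:
            # stronger feedback the deeper the hand reaches into the object
            send_haptic_pulse(1.0 - dist / SPHERE_RADIUS)
        time.sleep(0.05)     # ~20 Hz update, placeholder rate

if __name__ == "__main__":
    main()

In the actual system, the mapping step would be calibrated so that the holographic image coincides with the user's real hand, and the pulse strength would be sent to the wearable haptic device rather than printed.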

