An Analysis of Human Gesture Recognition Technologies for Electronic Device Control

  • Min-Seok Choi (Department of Media Software, Sangmyung University)
  • Baek-Cheol Jang (Department of Media Software, Sangmyung University)
  • Received: 2014.08.26
  • Reviewed: 2014.11.05
  • Published: 2014.12.31

Abstract

In this paper, we categorize existing human gesture recognition technologies into camera-based, additional-hardware-based, and frequency-based technologies. We then describe several representative techniques in each category, emphasizing their strengths and weaknesses. We define the key performance issues for human gesture recognition technologies and analyze recent technologies against these issues. Our analysis shows that camera-based technologies are easy to use and highly accurate, but they are limited in recognition range and require additional devices that add cost. Additional-hardware-based technologies are not limited by recognition range and are not affected by light or noise, but users must wear or carry extra devices, which also adds cost. Finally, frequency-based technologies are not limited by recognition range and require no additional devices; however, they have not yet been commercialized, and their accuracy can be degraded by other signals and frequencies.
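The frequency-based approach above is easiest to picture with a concrete, hypothetical instance: a device emits a tone, a moving hand reflects it, and the reflection returns Doppler-shifted, so the sign of the shift reveals whether the hand moved toward or away from the device. The following is a minimal Python sketch of that idea, not code from any of the surveyed systems; the sample rate, pilot frequency, guard and band widths, and hand speeds are illustrative assumptions.

```python
# Minimal sketch (illustrative only, not from the surveyed papers) of the
# frequency-based idea: a device emits a pilot tone, a moving hand reflects it,
# and the reflection comes back Doppler-shifted. Comparing spectral energy just
# above vs. just below the pilot frequency tells a "push" from a "pull".
import numpy as np

FS = 48_000          # sample rate in Hz (assumed)
PILOT = 18_000       # emitted near-ultrasonic pilot tone in Hz (assumed)
SOUND_SPEED = 343.0  # speed of sound in m/s

def doppler_shift(hand_speed: float) -> float:
    """Two-way Doppler shift in Hz for a reflector moving at hand_speed m/s
    (positive = toward the device)."""
    return 2.0 * hand_speed / SOUND_SPEED * PILOT

def simulate_mic(hand_speed: float, duration: float = 0.1) -> np.ndarray:
    """Synthesize a microphone signal: the pilot tone plus a weaker,
    Doppler-shifted echo from the moving hand."""
    t = np.arange(int(FS * duration)) / FS
    echo = PILOT + doppler_shift(hand_speed)
    return np.sin(2 * np.pi * PILOT * t) + 0.3 * np.sin(2 * np.pi * echo * t)

def classify(signal: np.ndarray) -> str:
    """Classify the gesture from the asymmetry of energy around the pilot tone."""
    windowed = signal * np.hanning(len(signal))
    spectrum = np.abs(np.fft.rfft(windowed))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / FS)
    guard, band = 25.0, 200.0  # Hz: skip the pilot's spectral leakage, then inspect this far out
    above = spectrum[(freqs > PILOT + guard) & (freqs < PILOT + band)].sum()
    below = spectrum[(freqs < PILOT - guard) & (freqs > PILOT - band)].sum()
    return "push (toward device)" if above > below else "pull (away from device)"

if __name__ == "__main__":
    print(classify(simulate_mic(+0.5)))   # hand approaching at 0.5 m/s -> push
    print(classify(simulate_mic(-0.5)))   # hand receding at 0.5 m/s -> pull
```

The same reasoning explains the accuracy caveat in the abstract: any other source of energy near the pilot frequency (or, for radio-based variants, other transmitters) shifts the above/below energy balance and can be misread as motion.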

Keywords
