Design of a Background Image Based Multi-Degree-of-Freedom Pointing Device


  • Jang, Suk-Yoon (Graduate Program in Biomedical Engineering, Yonsei University)
  • Kho, Jae-Won (Dept. of Computer Control, Yuhan College)
  • Published: 2008.11.25

Abstract

As interactive multimedia devices have come into wide use, conventional user interfaces such as remote controllers and computer mice impose limitations that often make them inconvenient. To resolve this problem, we propose a vision-based pointing device. By analyzing the background video captured by a camera embedded in the pointing device, we estimate the movement of the device and determine the cursor position from this estimate. To run in real time, we use a low-resolution 288×208-pixel camera and track the corner points of the screen with a local optical-flow method; the distance between the screen and the device is calculated from the size of the screen in the image. The proposed device has a simple configuration and low cost, and, like a traditional mouse, it is held in one hand and used intuitively without any learning. Moreover, unlike existing vision-based multi-degree-of-freedom input devices, it shows reliable performance even in dark conditions.
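
The two technical steps in the abstract, tracking the screen's corner points with a local optical-flow method and recovering the screen-to-device distance from the apparent screen size, can be sketched in a few lines of Python. The sketch below is only an illustration under assumed values, not the authors' implementation: it uses OpenCV's pyramidal Lucas-Kanade tracker (Ref. 23) for the local optical flow, and the camera index, focal length, physical screen width, and initial corner coordinates are hypothetical placeholders.

    import cv2
    import numpy as np

    FOCAL_LENGTH_PX = 300.0  # assumed focal length of the embedded camera, in pixels
    SCREEN_WIDTH_M = 0.80    # assumed physical width of the target screen, in metres

    def track_corners(prev_gray, cur_gray, prev_pts):
        """Track the screen-corner points with pyramidal Lucas-Kanade optical flow."""
        cur_pts, status, _err = cv2.calcOpticalFlowPyrLK(
            prev_gray, cur_gray, prev_pts, None, winSize=(15, 15), maxLevel=2)
        return cur_pts, status.ravel().astype(bool)

    def estimate_distance(pts):
        """Pinhole-model distance to the screen: Z = f * W_real / w_image (top edge)."""
        width_px = np.linalg.norm(pts[1] - pts[0])  # top-left to top-right, in pixels
        return FOCAL_LENGTH_PX * SCREEN_WIDTH_M / max(width_px, 1e-6)

    def cursor_delta(prev_pts, cur_pts):
        """Mean corner displacement, negated: when the device pans right the screen
        appears to move left in the image, so the cursor should move right."""
        return -np.mean(cur_pts - prev_pts, axis=0)

    # Example loop. The initial corner coordinates are placeholders; in the paper
    # they would come from detecting the screen's edges in the first frame.
    cap = cv2.VideoCapture(0)
    ok, frame = cap.read()
    prev_gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    pts = np.array([[[60, 40]], [[230, 40]], [[230, 170]], [[60, 170]]], np.float32)

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        new_pts, valid = track_corners(prev_gray, gray, pts)
        if valid.all():
            dxy = cursor_delta(pts.reshape(-1, 2), new_pts.reshape(-1, 2))
            dist = estimate_distance(new_pts.reshape(-1, 2))
            print("distance ~%.2f m, cursor delta (%+.1f, %+.1f)" % (dist, dxy[0], dxy[1]))
        prev_gray, pts = gray, new_pts

Only the four corner points are tracked, so the per-frame cost stays low enough for the 288×208 stream described in the abstract.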


Keywords

References

  1. G. M. Eom, K. S. Kim, C. S. Kim, J. Lee, S. C. Chung, B. S. Lee, H. Higa, N. Furuse, R. Futami, and T. Watanabe, "Gyro-Mouse for the disabled: click and position control of the mouse cursor," IJCAS, vol. 5, no. 2, pp. 147-154, 2007
  2. J. Rekimoto, "Tilting operations for small screen interfaces," UIST'96, pp. 167-168, 1996
  3. B. Frohlich and J. Plate, "The Cubic Mouse: A new device for three-dimensional input," Proc. ACM CHI 2000, pp. 526-531, April 2000
  4. M. S. Sohn and G. H. Lee, "SonarPen: An ultrasonic pointing device for an interactive TV," IEEE Transactions on Consumer Electronics, vol. 50, no. 2, pp. 413-419, May 2004 https://doi.org/10.1109/TCE.2004.1309402
  5. P. Baudisch, M. Sinclair, and A. Wilson, "Soap: a pointing device that works in mid-air," UIST'06, Montreux, Switzerland, Oct. 15-18, 2006
  6. http://www.gyration.com
  7. http://www.sico.co.kr/Z3/index.asp
  8. I. S. MacKenzie and S. Jusoh, "An evaluation of two input devices for remote pointing," Proceedings of the eighth IFIP international conference on EHCI 2001, pp. 235-249, Heidelberg, Germany, Springer-Verlag, 2001
  9. I. S. MacKenzie, T. Kauppinen, and M. Silfverberg, "Accuracy measures for evaluating computer pointing devices," CHI 2001, pp. 9-16, 2001
  10. D. DeMenthon and L. S. Davis, "Exact and approximate solutions of the perspective-three-point problem," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 14, no. 11, pp. 1100-1105, Nov. 1992 https://doi.org/10.1109/34.166625
  11. D. DeMenthon, "From artificial vision to synthetic reality: System for interaction with a computer using video image analysis (in French)," PhD thesis, University Joseph Fourier, Grenoble, France, Oct. 4, 1993
  12. M. Munich and P. Perona, "Visual input for pen-based computers," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 24, no. 3, pp. 313-328, March 2002 https://doi.org/10.1109/34.990134
  13. Z. Zhang, Y. Wu, Y. Shan, and S. Shafer, "Visual Panel: Virtual mouse, keyboard and 3D controller with an ordinary piece of paper," PUI 2001, Orlando, Florida, Nov. 15-16, 2001
  14. P. Nesi and A. D. Bimbo, "A vision-based 3-D mouse," Int. J. Human-Computer Studies, vol. 44, no. 1, pp. 73-91, 1996 https://doi.org/10.1006/ijhc.1996.0004
  15. M. Betke, J. Gips, and P. Fleming, "The camera mouse: Visual tracking of body features to provide computer access for people with severe disabilities," IEEE Trans. Neural Syst. Rehabil. Eng., vol. 10, no. 1, Mar. 2002
  16. Y. Wu and T. S. Huang, "Hand modeling, analysis, and recognition for vision-based human computer interaction," IEEE Signal Processing Magazine, pp. 51-60, May 2001
  17. S. Han, Y. Hwang, B. Lee, and J. Lee, "A study on a pointing device using stereo vision (in Korean)," KSIAM IT Series, vol. 10, no. 2, pp. 67-80, 2006
  18. K. Hinckley, M. Sinclair, E. Hanson, R. Szeliski, and M. Conway, "The VideoMouse: A camera-based multi-degree-of-freedom input device," ACM UIST'99, pp. 103-112, 1999
  19. H. Cantzler and C. Hoile, "A novel form of a pointing device," in Proc. Vision, Video and Graphics, pp.57-62, 2003
  20. P. Vartiainen, S. Chande, and K. Ramo, "Mobile visual interaction: enhancing local communication and collaboration with visual interactions," MUM'06, Stanford, California, Dec. 2006
  21. H. Jiang, E. Ofek, N. Moraveji, and Y. Shi, "Direct Pointer: direct manipulation for large-display interaction using handheld cameras," CHI 2006, Canada, April 2006
  22. J. Wang, S. Zhai, and J. Canny, "Camera phone based motion sensing: Interaction, techniques, applications and performance study," UIST'06, Montreux, Switzerland, pp. 101-110, Oct. 2006
  23. B. Lucas and T. Kanade, "An iterative image registration technique with an application to stereo vision," Proceedings of DARPA Image Understanding Workshop, pp. 121-130, 1981
  24. J. Canny, "A computational approach to edge detection," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 8, no. 6, pp. 679-698, Nov. 1986 https://doi.org/10.1109/TPAMI.1986.4767851
  25. http://wii.nintendo.com