An Experimental Study on the Usability of Indirect Control Using Finger Gesture Interaction in Three-Dimensional Space

  • 함경선 (Graduate Program in Technology Management, Yonsei University);
  • 이다혜 (Graduate School of Information, Yonsei University);
  • 홍희정 (Graduate Program in Technology Management, Yonsei University);
  • 박성재 (Graduate Program in Technology Management, Yonsei University);
  • 김진우 (Graduate Program in Technology Management, Yonsei University)
  • Received: 2014.07.31
  • Reviewed: 2014.09.16
  • Published: 2014.11.28

Abstract

Emerging technologies for natural computer interaction can give manufacturers new opportunities for product innovation. Among the many ways humans communicate, this study focuses on finger gesture interaction. With commercial-grade technology now available after more than a decade of development, the prevailing expectation is that products and services built on it will soon reach the mass market. Against this backdrop, the authors ask whether finger gesture interaction is actually worth using and run an experiment to examine its cognitive effects on users. Three finger gestures are defined, namely poking, picking, and grasping, and the usability that users perceive is measured when each gesture is performed in two-dimensional (2D) and three-dimensional (3D) space. 2D and 3D experimental tools were built with Leap Motion technology and tested with 48 participants. The results show no difference in usability among the gestures in 2D space, whereas a meaningful difference was found in 3D space. In addition, every gesture showed better usability in 2D than in 3D space, and, interestingly, using a single finger proved better than using multiple fingers in 3D space.
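The paper does not describe how the three gestures were detected, but a minimal sketch can make the idea concrete. The Python below is purely illustrative: the HandFrame record, the classify_gesture function, and all thresholds are assumptions introduced here, not the authors' implementation. It only assumes per-frame hand-tracking features of the kind a controller such as the Leap Motion reports (number of extended fingers and normalized pinch and grab strengths) and maps them onto the poke, pick, and grasp labels used in the study.

    # Hypothetical sketch: classifying the study's three gestures
    # (poking, picking, grasping) from generic hand-tracking features.
    # Feature names and thresholds are assumptions, not the authors' method.

    from dataclasses import dataclass


    @dataclass
    class HandFrame:
        extended_fingers: int   # how many fingers the tracker reports as extended
        pinch_strength: float   # 0.0 (open hand) .. 1.0 (thumb-index pinch closed)
        grab_strength: float    # 0.0 (open hand) .. 1.0 (full fist)


    def classify_gesture(frame: HandFrame) -> str:
        """Map one tracked-hand frame to one of the study's three gestures."""
        if frame.grab_strength > 0.8:
            return "grasp"      # whole hand closed around the target
        if frame.pinch_strength > 0.8 and frame.extended_fingers <= 2:
            return "pick"       # thumb and index pinched together
        if frame.extended_fingers == 1:
            return "poke"       # single extended finger pushed toward the target
        return "none"


    if __name__ == "__main__":
        samples = [
            HandFrame(extended_fingers=1, pinch_strength=0.1, grab_strength=0.0),
            HandFrame(extended_fingers=2, pinch_strength=0.9, grab_strength=0.2),
            HandFrame(extended_fingers=0, pinch_strength=0.7, grab_strength=0.95),
        ]
        for s in samples:
            print(s, "->", classify_gesture(s))

A real experimental tool would additionally smooth these per-frame labels over time and combine them with fingertip position to decide which 2D or 3D target is being manipulated; the sketch only shows the per-frame classification step.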

