Conditions of Applications, Situations and Functions Applicable to Gesture Interface

  • Ryu, Tae-Beum (Department of Industrial and Management Engineering, Hanbat National University) ;
  • Lee, Jae-Hong (Department of Industrial Engineering, Seoul National University) ;
  • Song, Joo-Bong (Department of Industrial Engineering, Seoul National University) ;
  • Yun, Myung-Hwan (Department of Industrial Engineering, Seoul National University)
  • Received : 2012.07.13
  • Accepted : 2012.07.28
  • Published : 2012.08.31

Abstract

Objective: This study developed a hierarchy of conditions for applications (devices), situations, and functions that are applicable to gesture interfaces. Background: Gesture interfaces are among the promising interfaces for natural and intuitive interaction with intelligent machines and environments. Although many studies have developed new gesture-based devices and gesture interfaces, little was known about which applications, situations, and functions are suitable for gesture interfaces. Method: This study reviewed about 120 papers on the design and application of gesture interfaces and gesture vocabularies to identify the conditions under which applications, situations, and functions are suitable for gesture interfaces. The conditions extracted from 16 closely related papers were rearranged, and a hierarchy of them was developed to evaluate the applicability of applications, situations, and functions to gesture interfaces. Results: This study summarized 10, 10, and 6 conditions for applications, situations, and functions, respectively. In addition, the hierarchy of gesture-applicable conditions for applications, situations, and functions was developed based on the semantic similarity, ordering, and serial or parallel relationships among them. Conclusion: This study collected gesture-applicable conditions for applications, situations, and functions, and a hierarchy of them was developed to evaluate applicability to gesture interfaces. Application: The gesture-applicable conditions and hierarchy can be used to develop a framework and detailed criteria for evaluating the applicability of applications, situations, and functions. Moreover, they can enable designers of gesture interfaces and gesture vocabularies to determine which applications, situations, and functions are applicable to gesture interfaces.
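
The 10, 10, and 6 conditions and the hierarchy itself are reported in the paper, not on this page. As a rough illustration only of how such a condition hierarchy could be turned into detailed criteria for evaluating applicability, the Python sketch below represents groups of conditions as a tree and scores a candidate device by the fraction of conditions it satisfies. The group names, condition wordings, and the simple fraction-based score are assumptions made here for illustration; they are not the authors' actual conditions or evaluation criteria.

```python
from dataclasses import dataclass, field
from typing import List


@dataclass
class Condition:
    """A single gesture-applicability condition (leaf of the hierarchy)."""
    name: str
    met: bool = False


@dataclass
class ConditionGroup:
    """A node of the condition hierarchy, grouping semantically similar conditions."""
    name: str
    conditions: List[Condition] = field(default_factory=list)
    subgroups: List["ConditionGroup"] = field(default_factory=list)

    def all_conditions(self) -> List[Condition]:
        """Collect the leaf conditions of this group and all of its subgroups."""
        leaves = list(self.conditions)
        for group in self.subgroups:
            leaves.extend(group.all_conditions())
        return leaves

    def satisfaction(self) -> float:
        """Fraction of leaf conditions that are met (a naive applicability score)."""
        leaves = self.all_conditions()
        return sum(c.met for c in leaves) / len(leaves) if leaves else 0.0


# Hypothetical screening of a living-room TV as a candidate gesture-controlled device.
# The groups and conditions below are invented for illustration, not taken from the paper.
application_conditions = ConditionGroup(
    name="Application (device) conditions",
    subgroups=[
        ConditionGroup("Interaction context", [
            Condition("User's hands are free to gesture", met=True),
            Condition("User stays within sensing range", met=True),
        ]),
        ConditionGroup("Task demands", [
            Condition("Only a small set of frequent commands is needed", met=True),
            Condition("High input precision is not required", met=False),
        ]),
    ],
)

print(f"Applicability score: {application_conditions.satisfaction():.2f}")  # 0.75
```

In such a checklist-style evaluation, a higher score would suggest that the candidate application, situation, or function is a better fit for gesture control; the paper's actual criteria are more detailed than this equal-weight sketch.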

Keywords

References

  1. Bhuiyan, M. and Picking, R., "Gesture-controlled user interfaces, what have we done and what's next?", 5th Collaborative Research Symposium on Security, E-Learning, Internet and Networking (pp. 59-60), Darmstadt, Germany, 2009.
  2. Bhuiyan, M. and Picking, R., A gesture controlled user interface for inclusive design and evaluative study of its usability, Journal of Software Engineering and Applications, 4, 513-521, 2011. https://doi.org/10.4236/jsea.2011.49059
  3. Blatt, L. and Schell, A., "Gesture Set Economics for Text and Spreadsheet Editors", Proceedings of the Human Factors and Ergonomics Society 34th Annual Meeting (pp. 410-414), Orlando, FL, 1990.
  4. Guo, C. and Sharlin, E., "Exploring the use of tangible user interfaces for human-robot interaction: a comparative study", CHI '08: Proceedings of the Twenty-Sixth Annual SIGCHI Conference on Human Factors in Computing Systems (pp. 121-130), Florence, Italy, 2008.
  5. Hummels, C. and Stappers, P. J., "Meaningful gestures for human computer interaction: beyond hand postures", Proceedings of the Third IEEE International Conference on Automatic Face and Gesture Recognition (pp. 591-596), Nara, Japan, 1998.
  6. Hurtienne, J., Stößel, C. and Sturm, C., Physical gestures for abstract concepts: Inclusive design with primary metaphors. Interacting with Computers, 22(6), 475-484, 2010. https://doi.org/10.1016/j.intcom.2010.08.009
  7. Jia, P., Hu, H., Lu, T. and Yuan, K., Head gesture recognition for hands-free control of an intelligent wheelchair. Industrial Robot: An International Journal, 34(1), 60-68, 2007. https://doi.org/10.1108/01439910710718469
  8. Kela, J., Korpipaa, P., Mäntyjarvi, J., Kallio, S., Savino, G., Jozzo, L. and Marca, D., Accelerometer-based gesture control for a design environment, Personal and Ubiquitous Computing, 10(5), 285-299, 2006. https://doi.org/10.1007/s00779-005-0033-8
  9. Kuhnel, C., Westermann, T. and Hemmert, F., I'm home: Defining and evaluating a gesture set for smart-home control, International Journal of Human-Computer Studies, 69(11), 693-704, 2011. https://doi.org/10.1016/j.ijhcs.2011.04.005
  10. Li, J., Communication of Emotion in Social Robots through Simple Head and Arm Movements, International Journal of Social Robotics, 3, 125-142, 2010.
  11. Mitra, S. and Acharya, T., Gesture recognition: A survey, IEEE Transactions on Systems, Man, and Cybernetics, Part C: Applications and Reviews, 37(3), 311-324, 2007. https://doi.org/10.1109/TSMCC.2007.893280
  12. Nielsen, M., Storring, M., Moeslund, T. B. and Granum, E., "A procedure for developing intuitive and ergonomic gesture interfaces for man-machine interaction", Proceedings of the 5th International Gesture Workshop (pp. 1-12), Aalborg, Denmark, 2003.
  13. Nishikawa, A., Hosoi, T., Koara, K., Negoro, D., Hikita, A., Asano, S., Kakutani, H., Miyazaki, F., Sekimoto, M., Yasui, M., Miyake, Y., Takiguchi, S. and Monden, M., FAce MOUSe: A novel human-machine interface for controlling the position of a laparoscope, IEEE Transactions on Robotics and Automation, 19(5), 825-841, 2003. https://doi.org/10.1109/TRA.2003.817093
  14. Nesselrath, R., Lu, C., Schulz, C. H., Frey, J. and Alexandersson, J., A gesture based system for context-sensitive interaction with smart homes, In R. Wichert and B. Eberhardt (Eds.), Advanced Technologies and Societal Change, Springer, Berlin, 209-219, 2011.
  15. Oviatt, S., DeAngeli, A. and Kuhn, K., Integration and synchronization of input modes during multimodal human-computer interaction. Referring Phenomena in a Multimedia Context and their Computational Treatment, 1-13, 1997.
  16. Oviatt, S., Ten Myths of Multimodal Interaction. Communications of the ACM, 42(11), 74-81, 1999. https://doi.org/10.1145/319382.319398
  17. Rauschert, I., Agrawal, P., Sharma, R., Fuhrmann, S., Brewer, I. and MacEachren, A. M., "Designing a human-centered, multimodal GIS interface to support emergency management", Proceedings of the 10th ACM International Symposium on Advances in Geographic Information Systems (pp. 119-124), McLean, VA, 2002.
  18. Rico, J., "Usable gestures for mobile interfaces: evaluating social acceptability", Proceedings of the 28th International Conference on Human Factors in Computing Systems (pp. 887-896), Atlanta, GA, 2010.
  19. Ronkainen, S., Koskinen, E., Liu, Y. and Korhonen, P., Environment Analysis as a Basis for Designing Multimodal and Multidevice User Interfaces, Human-Computer Interaction, 25(2), 148-193, 2010. https://doi.org/10.1080/07370020903586712
  20. Rhyne, J., Dialogue Management for Gestural Interfaces, Computer Graphics, 21(2), 137-142, 1987. https://doi.org/10.1145/24919.24933
  21. Shan, C., Gesture Control for Consumer Electronics, Multimedia Interaction and Intelligent User Interfaces, 107-128, 2010. https://doi.org/10.1007/978-1-84996-507-1_5
  22. Wachs, J., Kolsch, M. and Stern, H., Vision-based hand-gesture applications. Communications of the ACM, 54(2), 60-71, 2011. https://doi.org/10.1145/1897816.1897838
  23. Wickens, C. D. and Hollands, J. G., Engineering psychology and human performance, 3rd ed., Prentice Hall, 1999.
  24. Wilson, A. and Oliver, N., "GWindows: Towards Robust Perception-Based UI", First IEEE Workshop on Computer Vision and Pattern Recognition for Human Computer Interaction, 2003.
  25. Young, J., Sung, J., Voida, A. and Sharlin, E., Evaluating human-robot interaction, International Journal of Social Robotics, 3, 53-67, 2011. https://doi.org/10.1007/s12369-010-0081-8