• Title/Abstract/Keyword: Gesture-based Interaction

Study on Gesture and Voice-based Interaction in Perspective of a Presentation Support Tool

  • Ha, Sang-Ho; Park, So-Young; Hong, Hye-Soo; Kim, Nam-Hun
    • Journal of the Ergonomics Society of Korea / Vol. 31, No. 4 / pp.593-599 / 2012
  • Objective: This study aims to implement a non-contact gesture-based interface for presentations and to analyze its effect as an information transfer support device. Background: With rapid technological growth in the UI/UX area and the appearance of smart service products that require new human-machine interfaces, research on control devices using gesture or speech recognition has been actively conducted. However, relatively few quantitative studies on the practical effects of these new interface types have been carried out, while work on system implementation is very popular. Method: The system presented in this study is implemented with the Kinect® sensor offered by Microsoft Corporation. To investigate whether the proposed system is effective as a presentation support tool, we conducted experiments by giving several lectures to 40 participants in both a traditional lecture room (keyboard-based presentation control) and a non-contact gesture-based lecture room (Kinect-based presentation control), evaluating their interest and immersion with respect to the lecture contents and lecturing methods, and analyzing their understanding of the lecture contents. Result: Using ANOVA, we examined whether the gesture-based presentation system can play an effective role as a presentation support tool depending on the difficulty of the contents. Conclusion: A non-contact gesture-based interface is a meaningful supportive device when delivering easy and simple information; however, the effect can vary with the contents and the difficulty of the information provided. Application: The results presented in this paper may help in designing new human-machine (computer) interfaces for communication support tools.
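The comparison described in the Result section lends itself to a one-way ANOVA over comprehension scores per control condition. A minimal sketch with SciPy follows; the score arrays are illustrative placeholders, not the study's data:

```python
# Sketch of the one-way ANOVA comparison described above.
# The score arrays are hypothetical, standing in for the study's measurements.
from scipy import stats

# Comprehension scores per presentation-control condition (illustrative)
keyboard_scores = [78, 85, 92, 70, 88, 81, 75, 90]
kinect_scores = [74, 83, 95, 68, 85, 79, 77, 91]

f_stat, p_value = stats.f_oneway(keyboard_scores, kinect_scores)
print(f"F = {f_stat:.3f}, p = {p_value:.3f}")  # p < 0.05 would indicate a condition effect
```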

Augmented Reality Authoring Tool with Marker & Gesture Interactive Features

  • 심진욱; 공민제; 김하영; 채승호; 정경호; 서종훈; 한탁돈
    • Journal of Korea Multimedia Society / Vol. 16, No. 6 / pp.720-734 / 2013
  • In this paper, we propose an augmented reality (AR) authoring tool that lets users easily create AR content incorporating marker-based and gesture interaction methods. Existing AR authoring tools have focused on augmenting virtual objects, and interacting with the resulting AR content has required the user to rely on markers or sensors. We address the limitations of these restricted interaction methods by combining marker-based interaction techniques with gesture interaction using Kinect, a depth-sensing camera. In the proposed system, a user can easily create simple marker-based AR content through the interface, and the system provides methods by which users can actively interact with that content. The marker-based interaction methods provided in this work include a technique using two markers and a technique using marker occlusion. In addition, the system recognizes and tracks the user's bare hands to provide gesture interaction for scaling, translating, and rotating objects. A usability evaluation of the authoring tool and a comparative evaluation of the marker and gesture interactions confirmed the positive results of this work.
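The marker-occlusion interaction mentioned in the abstract can be approximated with OpenCV's ArUco module; this is an assumption for illustration, since the authors' marker system is not specified. The idea: a marker that was visible but disappears for several consecutive frames is treated as deliberately covered by the hand, which triggers an action.

```python
# Sketch of occlusion-based marker interaction (not the authors' implementation).
# Assumes OpenCV >= 4.7 with the contrib ArUco module and a webcam.
import cv2

dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
detector = cv2.aruco.ArucoDetector(dictionary, cv2.aruco.DetectorParameters())

cap = cv2.VideoCapture(0)
missing_frames = 0          # consecutive frames the marker has been unseen
OCCLUSION_THRESHOLD = 10    # frames of absence interpreted as a deliberate cover

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    corners, ids, _ = detector.detectMarkers(gray)
    if ids is not None and 0 in ids.flatten():   # marker id 0 is visible
        missing_frames = 0
    else:
        missing_frames += 1
        if missing_frames == OCCLUSION_THRESHOLD:
            print("Marker occluded: trigger content interaction")
cap.release()
```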

Design and Implementation of a Stereoscopic Image Control System based on User Hand Gesture Recognition

  • 송복득; 이승환; 최홍규; 김성훈
    • Journal of the Korea Institute of Information and Communication Engineering / Vol. 26, No. 3 / pp.396-402 / 2022
  • User interactions for visual media are being developed in various forms, and interactions using human gestures in particular are being actively studied. Among them, hand gesture recognition based on a 3D hand model is used as a human interface in the realistic media field. Interfaces based on hand gesture recognition help users access media more easily and conveniently. Such interaction should allow users to view video with fast, accurate hand gesture recognition that is free of computing-environment constraints. This paper proposes a fast and accurate user hand gesture recognition algorithm using the open-source MediaPipe framework and the k-NN (k-Nearest Neighbor) machine learning method. To minimize computing-environment constraints, we also design and implement a stereoscopic image control system based on user hand gesture recognition, using an Internet-accessible web service environment and Docker containers as a virtual environment.
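The MediaPipe-plus-k-NN pipeline described here maps onto a few library calls: MediaPipe Hands extracts 21 landmarks per hand, which are flattened into a feature vector for a scikit-learn k-NN classifier. A minimal sketch, assuming pre-collected landmark features and labels (the file names and gesture labels are hypothetical, not the paper's dataset):

```python
# Minimal sketch of the MediaPipe + k-NN pipeline described above.
# X_train / y_train are assumed pre-collected features, not the paper's data.
import cv2
import mediapipe as mp
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

def landmarks_to_features(hand_landmarks):
    """Flatten 21 MediaPipe hand landmarks into a 63-dim feature vector."""
    return np.array([[lm.x, lm.y, lm.z] for lm in hand_landmarks.landmark]).flatten()

# Hypothetical training data: one feature vector and gesture label per sample
X_train = np.load("gesture_features.npy")   # shape (n_samples, 63)
y_train = np.load("gesture_labels.npy")     # e.g. "play", "pause", "zoom"
knn = KNeighborsClassifier(n_neighbors=3).fit(X_train, y_train)

hands = mp.solutions.hands.Hands(static_image_mode=True, max_num_hands=1)
frame = cv2.imread("frame.jpg")             # stand-in for a video frame
results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
if results.multi_hand_landmarks:
    features = landmarks_to_features(results.multi_hand_landmarks[0])
    print("Predicted gesture:", knn.predict([features])[0])
```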

Development for Multi-modal Realistic Experience I/O Interaction System

  • 박재언; 황민철; 이정년; 허환; 정용무
    • Science of Emotion and Sensibility / Vol. 14, No. 4 / pp.627-636 / 2011
  • The goal of this study is to overcome the limitations of simple input-based unimodal interaction and to propose an interaction system that improves realism and immersion by exploiting the user's multimodal behavior, intention, and level of concentration rather than simple input alone. In prior work for this study, a meta-analysis of the existing literature was used to analyze the accuracy of 3D motion recognition technologies for interaction, a final sensor-based interaction method was selected, and intuitive gesture interaction elements were extracted and reflected in the present system. A technique for judging concentration from physiological responses was also developed to infer user intention. The proposed system consists of three parts. Applying the interaction elements selected in the prior work, we implemented a hand-motion recognition system using accelerometer and flexible (bend) sensors, a glasses-type eye tracker that enables interaction through pupil movement, and finally a system that reflects user intention by measuring cardiovascular and skin-temperature responses. This work can serve as a foundation for developing realistic digital entertainment platform technologies.

Investigating Key User Experience Factors for Virtual Reality Interactions

  • Ahn, Junyoung; Choi, Seungho; Lee, Minjae; Kim, Kyungdoh
    • Journal of the Ergonomics Society of Korea / Vol. 36, No. 4 / pp.267-280 / 2017
  • Objective: The aim of this study is to investigate key user experience factors of interactions for Head Mounted Display (HMD) devices in the Virtual Reality Environment (VRE). Background: Virtual reality interaction research has been conducted steadily as interaction methods and virtual reality devices have improved. Recently, virtually all virtual reality devices have been head mounted display based, and HMD-based interaction types include Remote Controller, Head Tracking, and Hand Gesture. However, there are few studies on the usability evaluation of virtual reality; in particular, the usability of HMD-based virtual reality has not been investigated. Therefore, it is necessary to study the usability of HMD-based virtual reality. Method: Recently released HMD-based VR devices support only three interaction types: 'Remote Controller', 'Head Tracking', and 'Hand Gesture'. We surveyed 113 studies to identify the user experience factors or evaluation scales used for each interaction type, and summarized the key factors and relevant scales by the frequency with which they appeared. Results: Each interaction type has its own key user experience factors. The Remote Controller's key factors are 'Ease of learning', 'Ease of use', 'Satisfaction', 'Effectiveness', and 'Efficiency'. Head Tracking's key factors are 'Sickness', 'Immersion', 'Intuitiveness', 'Stress', 'Fatigue', and 'Ease of learning'. Finally, Hand Gesture's key factors are 'Ease of learning', 'Ease of use', 'Feedback', 'Consistent', 'Simple', 'Natural', 'Efficiency', 'Responsiveness', 'Usefulness', 'Intuitiveness', and 'Adaptability'. Conclusion: We identified key user experience factors for each interaction type through a literature review. However, we did not consider objective measures because each study adopted different performance factors. Application: The results of this study can be used when evaluating HMD-based interactions in virtual reality in terms of usability.

A Decision Tree based Real-time Hand Gesture Recognition Method using Kinect

  • Chang, Guochao; Park, Jaewan; Oh, Chimin; Lee, Chilwoo
    • Journal of Korea Multimedia Society / Vol. 16, No. 12 / pp.1393-1402 / 2013
  • Hand gesture is one of the most popular communication methods in everyday life. In human-computer interaction applications, hand gesture recognition provides a natural way of communication between humans and computers. There are two main approaches to hand gesture recognition: glove-based methods and vision-based methods. In this paper, we propose a vision-based hand gesture recognition method using Kinect. Using depth information makes the hand detection process efficient and robust. Finger labeling lets the system classify poses according to the name of each finger and the relationships between fingers, which also makes the classification more effective and accurate. Two kinds of gesture sets can be recognized by our system. In our experiments, the average accuracy on the American Sign Language (ASL) number gesture set is 94.33%, and that on a general gesture set is 95.01%. Since our system runs in real time and has a high recognition rate, it can be embedded into various applications.
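For illustration, the finger-labeling-based classification can be cast as a small decision tree over per-finger states; the binary extended/folded features below are a deliberate simplification of the finger names and relationships the paper uses, not its actual feature set:

```python
# Illustrative decision-tree gesture classifier (simplified stand-in features).
# Each sample: [thumb, index, middle, ring, pinky extended? (0/1)].
from sklearn.tree import DecisionTreeClassifier

X_train = [
    [0, 1, 0, 0, 0],  # only index extended  -> "one"
    [0, 1, 1, 0, 0],  # index + middle       -> "two"
    [1, 1, 1, 1, 1],  # open palm            -> "five"
    [0, 0, 0, 0, 0],  # fist                 -> "zero"
]
y_train = ["one", "two", "five", "zero"]

tree = DecisionTreeClassifier(max_depth=5).fit(X_train, y_train)
print(tree.predict([[0, 1, 1, 0, 0]])[0])  # -> "two"
```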

Motion-Understanding Cell Phones for Intelligent User Interaction and Entertainment

  • 조성정; 최은석; 방원철; 양징; 조준기; 기은광; 손준일; 김동윤; 김상룡
    • Proceedings of the HCI Society of Korea Conference / HCI Society of Korea 2006 Conference, Part 1 / pp.684-691 / 2006
  • As many functionalities such as cameras and MP3 players converge into mobile phones, more intuitive and interesting interaction methods become essential. In this paper, we present applications and their enabling technologies for gesture-interactive cell phones. They employ a gesture recognition algorithm and a real-time shake detection algorithm to support a motion-based user interface and entertainment applications, respectively. The gesture recognition algorithm classifies the user's movement into one of the predefined gestures by modeling basic components of the acceleration signals and their relationships; recognition performance is further enhanced by discriminating frequently confused classes with support vector machines. The shake detection algorithm detects in real time the exact moment the phone is shaken significantly by using the variance and mean of the acceleration signals. The gesture interaction algorithms show performance reliable enough for commercialization: with 100 novice users, the average recognition rate was 96.9% on 11 gestures (digits 1-9, O, X), and users' movements were detected in real time. We have applied these motion-understanding technologies to Samsung cell phones in the Korean, American, Chinese, and European markets since May 2005.
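The shake-detection rule described above, thresholding the variance and mean of recent acceleration samples, can be sketched as follows; the window size and thresholds are illustrative guesses, not the paper's tuned values:

```python
# Sketch of variance/mean shake detection over a sliding window of
# acceleration magnitudes; all constants are illustrative, not the paper's.
from collections import deque
import statistics

WINDOW = 20            # number of recent samples considered
VAR_THRESHOLD = 4.0    # variance above this suggests vigorous motion
MEAN_THRESHOLD = 12.0  # mean magnitude above resting gravity (~9.8 m/s^2)

window = deque(maxlen=WINDOW)

def on_accel_sample(magnitude: float) -> bool:
    """Feed one acceleration magnitude; return True when a shake is detected."""
    window.append(magnitude)
    if len(window) < WINDOW:
        return False   # not enough history yet
    return (statistics.variance(window) > VAR_THRESHOLD
            and statistics.mean(window) > MEAN_THRESHOLD)
```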

Hand Gesture Recognition Suitable for Wearable Devices using Flexible Epidermal Tactile Sensor Array

  • Byun, Sung-Woo; Lee, Seok-Pil
    • Journal of Electrical Engineering and Technology / Vol. 13, No. 4 / pp.1732-1739 / 2018
  • With the explosion of digital devices, interaction technologies between humans and devices are required more than ever. Hand gesture recognition is particularly advantageous in that it can be used easily. Approaches divide into two groups: contact sensors and non-contact sensors. Compared with non-contact gesture recognition, contact gesture recognition has the advantage that it can classify gestures that disappear from a camera's line of sight; also, since the sensor is in direct contact with the user, relatively accurate information can be acquired. Electromyography (EMG) and force-sensitive resistors (FSRs) are the typical methods used for contact gesture recognition based on muscle activity. These sensors, however, are generally too sensitive to environmental disturbances such as electrical noise and electromagnetic signals. In this paper, we propose a novel contact gesture recognition method based on a Flexible Epidermal Tactile Sensor Array (FETSA) that measures electrical signals according to movements of the wrist. To recognize gestures using FETSA, we extracted feature sets, and the gestures were subsequently classified using a support vector machine. The performance of the proposed gesture recognition method is very promising in comparison with two previous non-contact and contact gesture recognition studies.
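The final classification step, features from the tactile array fed to a support vector machine, can be sketched with scikit-learn; the feature files below are hypothetical stand-ins for whatever statistics the authors extracted from the FETSA channels:

```python
# Sketch of the SVM classification stage; X is a placeholder feature matrix
# (e.g. per-channel means/variances from the tactile array), not the paper's data.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

X = np.load("fetsa_features.npy")   # shape (n_samples, n_features), hypothetical file
y = np.load("fetsa_labels.npy")     # one gesture label per sample

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
clf = SVC(kernel="rbf", C=1.0).fit(X_tr, y_tr)
print(f"Test accuracy: {clf.score(X_te, y_te):.2%}")
```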

A method for image-based shadow interaction with virtual objects

  • Ha, Hyunwoo; Ko, Kwanghee
    • Journal of Computational Design and Engineering / Vol. 2, No. 1 / pp.26-37 / 2015
  • Many researchers have been investigating interactive portable projection systems such as mini-projectors. In exhibition halls and museums, there is also a trend toward using interactive projection systems to make viewing more exciting and impressive, and they can likewise be applied in the arts, for example in creating shadow plays. The key idea of interactive portable projection systems is to recognize the user's gestures in real time. In this paper, a vision-based shadow gesture recognition method is proposed for interactive projection systems. The recognition method is based on the screen image obtained by a single web camera. It separates only the shadow area by combining a binary image with the input image, using a learning algorithm that isolates the background from the input image. Regions of interest are identified by labeling the separated shadow regions, and hand shadows are then isolated using the convexity defects, convex hull, and moments of each region. To distinguish hand gestures, Hu's invariant moments are used, and an optical flow algorithm tracks the fingertip. Several interactive applications developed with this method are presented in the paper.
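The segmentation-to-shape-description chain here maps closely onto standard OpenCV calls. A minimal sketch follows, with a stock background subtractor standing in for the paper's learning algorithm and the hull/defect analysis reduced to a placeholder:

```python
# Sketch of the shadow segmentation -> contour -> Hu moments chain.
# createBackgroundSubtractorMOG2 stands in for the paper's learning algorithm.
import cv2

subtractor = cv2.createBackgroundSubtractorMOG2()

def describe_shadow(frame):
    """Return the Hu invariant moments of the largest shadow region, or None."""
    mask = subtractor.apply(frame)                        # foreground/shadow mask
    _, binary = cv2.threshold(mask, 127, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    largest = max(contours, key=cv2.contourArea)          # region of interest
    hull = cv2.convexHull(largest)                        # the paper's hull/defect
                                                          # hand analysis would go here
    return cv2.HuMoments(cv2.moments(largest)).flatten()  # gesture descriptor
```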