• Title/Summary/Keyword: Gesture Type Input

Search results: 15 items

제스처 형태의 한글입력을 위한 오토마타에 관한 연구 (A Study on the Automata for Hangul Input of Gesture Type)

  • 임양원;임한규
    • 한국산업정보학회논문지 / Vol.16 No.2 / pp.49-58 / 2011
  • The spread of touchscreen-based smart devices has diversified Hangul input methods. This paper surveys and analyzes Hangul input methods suited to smart devices and, using automata theory, presents a simple and efficient automaton that can be used in a gesture-type Hangul input method suited to touch UIs.
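The syllable-composition idea behind such an automaton can be sketched as a small state machine over jamo input. The states, jamo subsets, and transitions below are a simplified illustration (initial consonant, then vowel, then optional final consonant), not the automaton actually proposed in the paper:

```python
# Hypothetical sketch of a Hangul syllable-composition automaton.
# The jamo sets are reduced subsets and the transition structure is
# simplified for illustration; it is not the authors' automaton.

CHOSEONG = set("ㄱㄴㄷㄹㅁㅂㅅㅇㅈㅊㅋㅌㅍㅎ")  # initial consonants (subset)
JUNGSEONG = set("ㅏㅓㅗㅜㅡㅣㅑㅕㅛㅠ")         # medial vowels (subset)
JONGSEONG = CHOSEONG                             # final consonants (simplified)

def accepts(jamo_sequence):
    """Return True if the jamo sequence forms one syllable:
    initial consonant + vowel + optional final consonant."""
    state = "START"
    for j in jamo_sequence:
        if state == "START" and j in CHOSEONG:
            state = "CHO"
        elif state == "CHO" and j in JUNGSEONG:
            state = "JUNG"
        elif state == "JUNG" and j in JONGSEONG:
            state = "JONG"
        else:
            return False
    return state in ("JUNG", "JONG")  # accepting states

print(accepts("ㄱㅏ"))    # True  ("가")
print(accepts("ㄱㅏㅁ"))  # True  ("감")
print(accepts("ㅏㄱ"))    # False (a vowel cannot start a syllable)
```

A real input automaton additionally has to handle compound jamo and the re-attachment of a final consonant to the next syllable, which this sketch omits.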

Hand Gesture Recognition Using an Infrared Proximity Sensor Array

  • Batchuluun, Ganbayar;Odgerel, Bayanmunkh;Lee, Chang Hoon
    • International Journal of Fuzzy Logic and Intelligent Systems / Vol.15 No.3 / pp.186-191 / 2015
  • Hand gestures are the most common tool used to interact with and control various electronic devices. In this paper, we propose a novel hand gesture recognition method that uses fuzzy logic based classification with a new type of sensor array. In some cases, the feature patterns of hand gesture signals cannot be uniquely distinguished and recognized when people perform the same gesture in different ways. Moreover, differences in hand shape and in the skeletal articulation of the arm influence the process. Manifold features were extracted, and efficient features that make gestures distinguishable were selected. However, similar feature patterns exist across different hand gestures, so fuzzy logic is applied to classify them. Fuzzy rules are defined based on the many feature patterns of the input signal. An adaptive neuro-fuzzy inference system was used to generate fuzzy rules automatically for classifying hand gestures from a low number of feature patterns as input. In addition, emotion expression was performed after hand gesture recognition for the resulting human-robot interaction. Our proposed method was tested on many hand gesture datasets and validated with different evaluation metrics. Experimental results show that, compared to other existing methods, our method detects more hand gestures in real time, with robust hand gesture recognition and the corresponding emotion expressions.
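As a toy illustration of fuzzy classification over overlapping feature patterns, each gesture class can be given a membership function over a feature, and the class with the highest membership degree wins. The feature (a normalized duration), the membership ranges, and the gesture names below are invented for illustration; they are not the paper's rules or its neuro-fuzzy model:

```python
# Toy fuzzy-logic gesture classifier. Membership functions and gesture
# classes are made-up examples, not the paper's fuzzy rule base.

def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

# One illustrative feature (normalized gesture duration) with a fuzzy set per class.
RULES = {
    "tap":   lambda x: tri(x, 0.0, 0.1, 0.3),
    "swipe": lambda x: tri(x, 0.2, 0.5, 0.8),
    "hold":  lambda x: tri(x, 0.6, 1.0, 1.4),
}

def classify(duration):
    """Pick the gesture whose membership degree is highest."""
    return max(RULES, key=lambda g: RULES[g](duration))

print(classify(0.12))  # tap
print(classify(0.5))   # swipe
print(classify(0.95))  # hold
```

The point of the fuzzy sets is visible around 0.25, where "tap" and "swipe" memberships overlap: ambiguous inputs still get a graded, comparable score for each class rather than a hard boundary.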

A Measurement System for 3D Hand-Drawn Gesture with a PHANToM™ Device

  • Ko, Seong-Young;Bang, Won-Chul;Kim, Sang-Youn
    • Journal of Information Processing Systems / Vol.6 No.3 / pp.347-358 / 2010
  • This paper presents a measurement system for 3D hand-drawn gesture motion. Many pen-type input devices with Inertial Measurement Units (IMU) have been developed to estimate 3D hand-drawn gestures using the measured acceleration and/or angular velocity of the device. A crucial procedure in developing these devices is measuring and analyzing their motion or trajectory. In order to verify the trajectory estimated by an IMU-based input device, it is necessary to compare the estimated trajectory to the real trajectory. For measuring the real trajectory of the pen-type device, a PHANToM™ haptic device is utilized because it allows us to measure the 3D motion of the object in real time. Even though the PHANToM™ measures the position of the hand gesture well, poor initialization may produce a large amount of error. Therefore, this paper proposes a calibration method which can minimize measurement errors.
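The initialization problem the abstract mentions can be illustrated with a one-dimensional toy: double-integrating acceleration turns even a small initial-velocity error into a position error that grows over time, which is why a reference trajectory and calibration are needed. The constant-acceleration signal and the bias value below are made up for illustration:

```python
# Toy illustration of trajectory drift from poor initialization when an
# IMU-estimated path is compared against a reference (e.g. haptic-device)
# measurement. The signal and bias are invented example values.
import math

def integrate(accels, dt, v0=0.0, x0=0.0):
    """Double-integrate 1D acceleration samples into positions (Euler steps)."""
    v, x, xs = v0, x0, []
    for a in accels:
        v += a * dt
        x += v * dt
        xs.append(x)
    return xs

def rmse(est, ref):
    """Root-mean-square error between two equally long trajectories."""
    return math.sqrt(sum((e - r) ** 2 for e, r in zip(est, ref)) / len(ref))

accels = [1.0] * 100                       # constant 1 m/s^2 for 1 second
dt = 0.01
reference = integrate(accels, dt)          # "true" path, correct initial velocity
biased = integrate(accels, dt, v0=0.05)    # small initial-velocity error

print(rmse(biased, reference))             # error accumulates over the stroke
```

Because the velocity offset is integrated once more into position, the gap between the two paths widens linearly with time, so even a 0.05 m/s initialization error is visible by the end of a one-second stroke.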

A Comparison of the Characteristics between Single and Double Finger Gestures for Web Browsers

  • Park, Jae-Kyu;Lim, Young-Jae;Jung, Eui-S.
    • 대한인간공학회지 / Vol.31 No.5 / pp.629-636 / 2012
  • Objective: The purpose of this study is to compare the characteristics of single and double finger gestures for web browsers and to extract appropriate finger gestures. Background: As electronic equipment emphasizes miniaturization to improve portability, various interfaces are being developed as input devices. Because electronic devices are becoming smaller, gesture recognition technology using touch-based interfaces is favored for easy editing. In addition, users focus primarily on the simplicity of intuitive interfaces, which propels further research on gesture-based interfaces. In particular, finger gestures in these intuitive interfaces are simple, fast, and user friendly. Recently, single and double finger gestures have become more popular, so more applications for these gestures are being developed. However, systems and software that employ such finger gestures lack consistency, in addition to having unclear standards and guidelines. Method: In order to learn the application of these gestures, we performed the sketch map method, a method for memory elicitation. In addition, we used the MIMA (Meaning in Mediated Action) method to evaluate the gesture interface. Results: This study created gestures appropriate for intuitive judgment. We conducted a usability test consisting of single and double finger gestures. The results showed that double finger gestures had shorter performance times than single finger gestures. Single finger gestures showed a wide satisfaction gap between similar and different types: they can be judged intuitively for a similar type, but it is difficult to associate functions for a different type. Conclusion: This study found that double finger gestures are effective for associating functions for web navigation, especially for associating complex forms such as curve-shaped gestures.
Application: This study aimed to facilitate the design of products which utilize finger and hand gestures.

A Structure and Framework for Sign Language Interaction

  • Kim, Soyoung;Pan, Younghwan
    • 대한인간공학회지 / Vol.34 No.5 / pp.411-426 / 2015
  • Objective: The goal of this thesis is to design the interaction structure and framework of a system that recognizes sign language. Background: In sign language, meaningful individual gestures are combined to construct a sentence, so it is difficult for a system to interpret and recognize the meaning of a hand gesture within a sequence of continuous gestures. Therefore, in order to interpret the meaning of each individual gesture correctly, an interaction structure and framework are needed that can segment the boundaries of individual gestures. Method: We analyzed 700 sign language words to structure sign language gesture interaction. First, we analyzed the transformational patterns of the hand gestures. Second, we analyzed the movement of these transformational patterns. Third, we analyzed the types of gestures other than hand gestures. Based on this, we designed a framework for sign language interaction. Results: We elicited 8 patterns of hand gesture based on whether the gesture changes from its starting point to its ending point. We then analyzed hand movement based on 3 elements: pattern of movement, direction, and whether the movement repeats. Moreover, we defined 11 movements of gestures other than hand gestures and classified 8 types of interaction. The framework for sign language interaction designed on this basis applies to more than 700 individual gestures of the sign language, and it can classify an individual gesture even within a sequence of continuous gestures. Conclusion: This study structured sign language interaction in 3 aspects: the transformational patterns of the hand shape from starting point to ending point, hand movement, and gestures other than hand gestures.
Based on this, we designed a framework that can recognize individual gestures and interpret their meaning more accurately when a meaningful individual gesture appears within a sequence of continuous gestures. Application: When developing a sign language recognition system, this interaction framework can be applied to it. The structured gestures can be used for building sign language databases, creating automatic recognition systems, and studying action gestures in other areas.

TV 가이드 영역에서의 음성기반 멀티모달 사용 유형 분석 (Speech-Oriented Multimodal Usage Pattern Analysis for TV Guide Application Scenarios)

  • 김지영;이경님;홍기형
    • 대한음성학회지:말소리 / No.58 / pp.101-117 / 2006
  • The development of efficient multimodal interfaces and fusion algorithms requires knowledge of usage patterns that show how people use multiple modalities. We analyzed multimodal usage patterns for TV-guide application scenarios (or tasks). In order to collect usage patterns, we implemented a multimodal usage pattern collection system with two input modalities: speech and touch gesture. Fifty-four subjects participated in our study. Analysis of the collected usage patterns shows a positive correlation between task type and multimodal usage patterns. In addition, we analyzed the timing between speech utterances and their corresponding touch gestures, i.e., when the touch gesture occurs relative to the duration of the speech utterance. We believe that, to develop efficient multimodal fusion algorithms for an application, a multimodal usage pattern analysis for that application, similar to our work for the TV guide application, has to be done in advance.
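The kind of timing analysis described, locating a touch gesture relative to the interval of its speech utterance, can be sketched as a normalized onset computation. The timestamps below are invented sample data, not the study's measurements:

```python
# Sketch of relative gesture-onset timing. A value of 0.0 means the gesture
# occurred at speech onset, 1.0 at speech offset, and values outside [0, 1]
# mean the gesture fell before or after the utterance. Sample data is invented.

def relative_onset(speech_start, speech_end, gesture_time):
    """Gesture timestamp expressed as a fraction of the utterance duration."""
    return (gesture_time - speech_start) / (speech_end - speech_start)

samples = [  # (speech_start, speech_end, gesture_timestamp) in seconds
    (0.0, 2.0, 0.5),
    (5.0, 6.5, 6.4),
    (9.0, 10.0, 10.3),
]

for start, end, gesture in samples:
    r = relative_onset(start, end, gesture)
    tag = " (after speech ended)" if r > 1.0 else ""
    print(f"gesture at {r:.2f} of utterance{tag}")
```

Aggregating such normalized values across subjects is one straightforward way to see whether gestures tend to accompany, precede, or trail their utterances.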

피지컬 인터페이스의 구현에 관한 연구 (A Study on the Implementation of a Physical Interface)

  • 오병근
    • 디자인학연구 / Vol.16 No.2 / pp.131-140 / 2003
  • In computer-based interaction design, the typical input method is a limited form in which the user manipulates an interface on a monitor using a keyboard or mouse. However, by applying basic electrical and electronic technologies to create and implement other forms of input, the scope of interaction design can be broadened. Recently, media art using computers and video has also presented works in which the artistic expression is completed only when users directly operate interfaces provided in various forms. Interaction design using a physical interface likewise classifies the signals from the user-operated interface as digital or analog and makes use of them as input to or output from the computer. Because the interface signal must be in a form the computer can safely accept, an electrical circuit for this purpose needs to be implemented. The interface itself then becomes another kind of physical creation, using switches, sensors, or cameras rather than the existing forms of the computer keyboard and mouse. This kind of interaction design can give interaction the human richness (humanity) of the language and gestures people have always used.

태블릿 PC에서의 스타일러스 펜 및 손 기반인터랙션을 위한 소프트 키보드 타입 비교 (Comparison of Soft Keyboard Types for Stylus Pen and Finger-based Interaction on Tablet PCs)

  • 안진호;안준영;이재일;김경도
    • 대한산업공학회지 / Vol.42 No.1 / pp.57-64 / 2016
  • Pen-based interaction is universally available on smart devices, especially on Tablet PCs. Previous studies compared various input methods such as fingers, a mouse, or a stylus pen on PCs or on touchscreen-based devices such as smartphones. At the same time, various soft keyboard applications are being developed on the application stores of smart devices. However, these previous studies did not suggest which keyboard application is suitable for Tablet PCs for a given input method. In this study, we compared two types of input methods (finger and pen) and three types of soft keyboard applications (QWERTY, Gesture, and Swype) on a Tablet PC using performance measurements (accuracy and input speed), and discussed which types of applications showed better performance with each input method. From these results, recommendations for keyboard types depending on the input method on Tablet PCs were developed.
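The two performance measures named in the abstract can be sketched with the conventional text-entry formulas (words per minute with 5 characters per word, and character-level accuracy). These are standard metrics; the paper's exact measurement protocol is not specified here:

```python
# Standard text-entry performance metrics as commonly used in soft-keyboard
# comparisons. The example inputs are invented, not the study's data.

def wpm(chars_typed, seconds):
    """Words per minute, with the convention that 5 characters = 1 word."""
    return (chars_typed / 5) / (seconds / 60)

def accuracy(typed, target):
    """Fraction of positions where the typed text matches the target phrase."""
    correct = sum(t == r for t, r in zip(typed, target))
    return correct / max(len(target), 1)

print(wpm(250, 60))                            # 50.0
print(accuracy("hello worle", "hello world"))  # 10 of 11 characters correct
```

Position-wise comparison is the simplest accuracy variant; studies often use edit distance instead so that insertions and deletions do not cascade into every later position.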

Gesture Interaction Design based on User Preference for the Elastic Handheld Device

  • Yoo, Hoon Sik;Ju, Da Young
    • 대한인간공학회지 / Vol.35 No.6 / pp.519-533 / 2016
  • Objective: This study aims to define relevant operation methods and functions by investigating the value that a handheld smart device made of a soft, flexible, jelly-like material could bring. Background: New technologies and materials play a role in transforming interface types and operation systems. Recently, the study of Organic User Interfaces (OUI), which investigates the value of new input and output methods that adopt soft and flexible materials for various instruments, has grown in importance. Method: For this study, 27 kinds of gestures usable on a handheld device were defined based on existing studies. A quantitative survey was conducted with adult men and women in their 20s and 30s, and the functions that can be linked to the gestures with the highest satisfaction were analyzed. In order to analyze users' needs and hurdles for the defined gestures, a focus group interview was conducted with groups of early adopters and ordinary users. Results: It was found that users place high value on the usability and fun of an elastic device, and the preferred gestures and their linkable functions could be analyzed. Conclusion: What is most significant about this study is that it sheds new light on the value of a device made of elastic material. Beyond finding and defining the gestures and functions that can be applied to a handheld elastic device, the present study identified the value elements of an elastic device - 'usability' and 'fun' - which users fundamentally desire from using it. Application: The data this study produced through the preference and satisfaction tests with the gestures and associated functions will help commercialize an elastic device in the future.

가상 칠판을 위한 손 표현 인식 (Hand Expression Recognition for Virtual Blackboard)

  • 허경용;김명자;송복득;신범주
    • 한국정보통신학회논문지 / Vol.25 No.12 / pp.1770-1776 / 2021
  • Hand expression recognition combines hand pose recognition, based on the static shape of the hand, with hand gesture recognition, based on the hand's movement. This paper proposes a hand expression recognition method that recognizes symbols based on the trajectory of a hand moving on a virtual blackboard. To recognize symbols drawn by hand on a virtual blackboard, a method of recognizing the symbol from the hand's movement is needed, as well as hand pose recognition to find the start and end of data input. In this paper, MediaPipe is used for hand pose recognition, and LSTM (Long Short-Term Memory), a type of recurrent neural network, is used to recognize hand gestures in the time-series data. To demonstrate the validity of the proposed method, it was applied to recognizing digits written on a virtual blackboard, achieving a recognition rate of about 94%.
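The segmentation step this abstract describes, using a pose classifier to mark when writing starts and stops so that only the in-between trajectory reaches the sequence recognizer, can be sketched as follows. The pose labels and points are invented; the paper's MediaPipe pose detector and LSTM recognizer are not reproduced here:

```python
# Sketch of pose-gated stroke segmentation for a virtual blackboard.
# A per-frame pose label stands in for the output of a hand-pose classifier;
# points collected while the pose is "write" form one stroke, which would then
# be fed to a trajectory recognizer (an LSTM in the paper).

def extract_strokes(frames):
    """frames: list of (pose_label, (x, y)). Collect runs of points where the
    pose is 'write', i.e. the virtual pen is down."""
    strokes, current = [], []
    for pose, point in frames:
        if pose == "write":
            current.append(point)
        elif current:           # pen lifted: close the current stroke
            strokes.append(current)
            current = []
    if current:                 # stream ended mid-stroke
        strokes.append(current)
    return strokes

frames = [("idle", (0, 0)), ("write", (1, 1)), ("write", (2, 2)),
          ("idle", (3, 3)), ("write", (4, 4))]
print(extract_strokes(frames))  # [[(1, 1), (2, 2)], [(4, 4)]]
```

This makes the division of labor explicit: the pose recognizer only answers "is the pen down?", while the sequence model sees clean, bounded trajectories instead of the full frame stream.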