• Title/Summary/Keyword: Gestures


The Relationship Between the Communicative Gesture and the Vocabulary Acquisition of Infants (7~24개월 영아의 의사소통적 몸짓과 어휘 습득간의 관계)

  • Kim, Eui Hyang;Kim, Myoung Soon
    • Korean Journal of Child Studies / v.27 no.6 / pp.217-234 / 2006
  • This study examined the variability of gestures and their correlation with vocabulary acquisition in infant communication. Subjects were 96 infants, 7 to 24 months of age, residing in Seoul and its vicinity. Instruments were the Communication and Symbolic Behavior Scales (Iverson et al., 1999) and the MacArthur Communicative Development Inventory-Korean (Bae and Lim, 2002). Data were analyzed by one-way ANOVA and Pearson's correlation. Results identified monthly changes in the types of communicative gesture used by infants: more deictic at younger ages and more representational at older ages. Deictic gestures were related to the size of the receptive vocabulary and the size of the whole vocabulary. Representational gestures were related to the acquisition of expressive vocabulary and to the sizes of the receptive and whole vocabularies.

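The correlational analysis described above can be illustrated with a minimal pure-Python sketch of Pearson's correlation, the statistic the study uses to relate gesture use to vocabulary size. The sample data below are invented for illustration, not taken from the paper.

```python
import math

def pearson_r(xs, ys):
    """Pearson product-moment correlation between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical per-infant counts of deictic gestures and receptive vocabulary sizes
deictic = [3, 5, 8, 10, 12, 15]
receptive_vocab = [10, 20, 45, 60, 80, 110]
print(round(pearson_r(deictic, receptive_vocab), 3))
```

A value near +1 would indicate the kind of positive relationship the study reports between deictic gesture use and receptive vocabulary size.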

Gestures as a Means of Human-Friendly Communication between Man and Machine

  • Bien, Zeungnam
    • Proceedings of the IEEK Conference / 2000.07a / pp.3-6 / 2000
  • In this paper, ‘gesture’ is discussed as a means of human-friendly communication between man and machine. We classify various gestures into two categories, ‘contact based’ and ‘non-contact based’. Each method is reviewed and some real applications are introduced. Key design issues of each method are addressed, and contributions of soft-computing techniques, such as fuzzy logic, artificial neural networks (ANN), rough set theory, and evolutionary computation, are discussed.


Design of Hand Gestures for Smart Home Appliances based on a User Centered Approach (스마트홈 내 기기와의 상호작용을 위한 사용자 중심의 핸드 제스처 도출)

  • Choi, Eun-Jung;Kwon, Sung-Hyuk;Lee, Dong-Hun;Lee, Ho-Jin;Chung, Min-K.
    • Journal of Korean Institute of Industrial Engineers / v.38 no.3 / pp.182-190 / 2012
  • With the progress of both wired and wireless home networking technology, various smart home projects have been carried out around the world (Harper, 2003), and at the same time, new approaches for interacting with smart home systems efficiently and effectively have been investigated. A gesture-based interface is one of these approaches. Especially with the advance of gesture recognition technologies, a variety of studies on gesture interactions with the functions of IT devices have been conducted. However, few studies have suggested and investigated gestures for controlling smart home appliances. In this research, gestures for selected smart home appliances are suggested based on a user-centered approach. A total of thirty-eight functions were selected, and thirty participants generated gestures for each function. Following Nielsen (2004), Lee et al. (2010), and Kuhnel et al. (2011), the gesture with the highest frequency for each function (the top gesture) was suggested and investigated.

A Study of Pattern-based Gesture Interaction in Tabletop Environments (테이블탑 환경에서 패턴 기반의 제스처 인터렉션 방법 연구)

  • Kim, Gun-Hee;Cho, Hyun-Chul;Pei, Wen-Hua;Ha, Sung-Do;Park, Ji-Hyung
    • Proceedings of the HCI Society of Korea Conference / 2009.02a / pp.696-700 / 2009
  • In this paper, we present a framework that enables users to interact naturally with hand gestures on a digital table. In typical tabletop applications, one gesture is mapped to one function or command, so users must know these mappings and produce predefined gestures as input. In contrast, users of our system can make input gestures without this cognitive load. Instead of burdening users, the system possesses knowledge about gesture interaction and proactively infers users' gestures and intentions. When a user makes a gesture on the digital surface, the system analyzes the gesture and designs its response according to the user's intention.


Feature-Strengthened Gesture Recognition Model based on Dynamic Time Warping (Dynamic Time Warping 기반의 특징 강조형 제스처 인식 모델)

  • Kwon, Hyuck Tae;Lee, Suk Kyoon
    • KIPS Transactions on Software and Data Engineering / v.4 no.3 / pp.143-150 / 2015
  • As smart devices become popular, research on gesture recognition using their embedded accelerometers has drawn attention. Since Dynamic Time Warping (DTW) has recently been used to perform gesture recognition on accelerometer data sequences, in this paper we propose the Feature-Strengthened Gesture Recognition (FsGr) model, which can improve the recognition success rate when DTW is used. The FsGr model defines feature-strengthened parts of the data sequences of similar gestures that might produce unsuccessful recognition, and performs additional DTW on those parts to improve the recognition rate. In the training phase, the FsGr model identifies sets of similar gestures and analyzes the features of the gestures in each set. During the recognition phase, when the result of the first recognition attempt belongs to a set of similar gestures, the model makes an additional recognition attempt based on the feature analysis to improve the success rate. We present performance results of the FsGr model from experiments on recognizing the lower-case alphabet.
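The DTW step at the core of FsGr can be sketched with the classic textbook dynamic programming recurrence. This is a generic 1-D DTW distance, not the paper's implementation, and the sample "accelerometer" traces are invented: DTW scores sequences of the same shape but different speeds as close, which is why it suits gestures with temporal variability.

```python
def dtw_distance(a, b):
    """Classic dynamic time warping distance between two 1-D sequences."""
    inf = float("inf")
    n, m = len(a), len(b)
    # cost[i][j] = DTW distance between the prefixes a[:i] and b[:j]
    cost = [[inf] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            cost[i][j] = d + min(cost[i - 1][j],      # skip a sample of a
                                 cost[i][j - 1],      # skip a sample of b
                                 cost[i - 1][j - 1])  # match both samples
    return cost[n][m]

# The same peak shape performed at a slower speed aligns at zero cost
template = [0, 1, 2, 3, 2, 1, 0]
sample   = [0, 0, 1, 2, 2, 3, 2, 1, 0]
print(dtw_distance(template, sample))  # 0.0
```

FsGr's refinement, per the abstract, is to run a second DTW pass restricted to the feature-strengthened sub-sequences that distinguish similar gestures.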

A Study on Gesture Recognition Using Principal Factor Analysis (주 인자 분석을 이용한 제스처 인식에 관한 연구)

  • Lee, Yong-Jae;Lee, Chil-Woo
    • Journal of Korea Multimedia Society / v.10 no.8 / pp.981-996 / 2007
  • In this paper, we describe a method that recognizes gestures by obtaining motion feature information through principal factor analysis of sequential gesture images. In the algorithm, a two-dimensional silhouette region containing the human gesture is first segmented, and geometric features are extracted from it. Global feature information, selected by principal factor analysis as the key features that most effectively express gestures, is used. Motion history information representing the temporal variation of the gestures, obtained from the extracted features, constructs a gesture subspace. Finally, model feature values projected into the gesture space are transformed into discrete state symbols by a grouping algorithm and used as input symbols for an HMM, and the input gesture is recognized as the model gesture with the highest probability. The proposed method achieves a higher recognition rate than methods that use only the shape information of the human body, as in appearance-based methods, or that extract features intuitively from complicated gestures, because it constructs gesture models from feature factors with high contribution rates identified by principal factor analysis.

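The core dimensionality-reduction idea, projecting feature vectors onto the directions of highest variance, can be sketched in pure Python for the two-feature case, where the leading eigenvector of the covariance matrix has a closed form. This is a minimal stand-in for the paper's principal factor analysis: the per-frame (width, height) silhouette features are invented, and the gesture-subspace and HMM stages are omitted.

```python
import math

def principal_axis_projection(features):
    """Project 2-D feature vectors onto their principal factor
    (the unit direction of maximum variance)."""
    n = len(features)
    mx = sum(f[0] for f in features) / n
    my = sum(f[1] for f in features) / n
    # Entries of the 2x2 covariance matrix [[sxx, sxy], [sxy, syy]]
    sxx = sum((f[0] - mx) ** 2 for f in features) / n
    syy = sum((f[1] - my) ** 2 for f in features) / n
    sxy = sum((f[0] - mx) * (f[1] - my) for f in features) / n
    # Angle of the leading eigenvector: tan(2*theta) = 2*sxy / (sxx - syy)
    theta = 0.5 * math.atan2(2 * sxy, sxx - syy)
    ux, uy = math.cos(theta), math.sin(theta)
    # Centered projections onto the principal axis
    return [(f[0] - mx) * ux + (f[1] - my) * uy for f in features]

# Hypothetical per-frame (width, height) features of a gesture silhouette
frames = [(1.0, 2.1), (1.2, 2.4), (1.5, 3.1), (1.9, 3.8), (2.4, 4.9)]
print(principal_axis_projection(frames))
```

By construction, the variance of the projected values is at least as large as the variance of either original coordinate, which is the sense in which the leading factor has the highest contribution rate.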

Design and Evaluation of a Hand-held Device for Recognizing Mid-air Hand Gestures (공중 손동작 인식을 위한 핸드 헬드형 기기의 설계 및 평가)

  • Seo, Kyeongeun;Cho, Hyeonjoong
    • KIPS Transactions on Software and Data Engineering / v.4 no.2 / pp.91-96 / 2015
  • We propose AirPincher, a handheld pointing device for recognizing delicate mid-air hand gestures to control a remote display. AirPincher is designed to overcome the disadvantages of the two existing kinds of hand-gesture-aware techniques, glove-based and vision-based: glove-based techniques require the cumbersome wearing of gloves every time, while vision-based techniques make performance dependent on the distance between the user and the remote display. AirPincher allows a user to hold the device in one hand and generate several delicate finger gestures, which are captured by several sensors embedded in close proximity within the device. These features help AirPincher avoid the aforementioned disadvantages of the existing techniques. We experimentally find an efficient size for the virtual input space and evaluate two types of pointing interfaces with AirPincher for a remote display. Our experiments suggest appropriate configurations for using the proposed device.

Primitive Body Model Encoding and Selective / Asynchronous Input-Parallel State Machine for Body Gesture Recognition (바디 제스처 인식을 위한 기초적 신체 모델 인코딩과 선택적 / 비동시적 입력을 갖는 병렬 상태 기계)

  • Kim, Juchang;Park, Jeong-Woo;Kim, Woo-Hyun;Lee, Won-Hyong;Chung, Myung-Jin
    • The Journal of Korea Robotics Society / v.8 no.1 / pp.1-7 / 2013
  • Body gesture recognition has been a research field of interest for Human-Robot Interaction (HRI). Most conventional body gesture recognition algorithms use the Hidden Markov Model (HMM) to model gestures, which have spatio-temporal variability. However, HMM-based algorithms have difficulty excluding meaningless gestures. Furthermore, conventional algorithms must first perform gesture segmentation and then send the extracted gesture to the HMM for recognition. This separation causes a time delay between two consecutive gestures to be recognized, making such systems inappropriate for continuous gesture recognition. To overcome these two limitations, this paper suggests primitive body model encoding, which performs spatio-temporal quantization of motions from a human body model and encodes them into predefined primitive codes for each link of the body model, and a Selective/Asynchronous Input-Parallel State Machine (SAI-PSM) for multiple simultaneous gesture recognition. The experimental results showed that the proposed system using primitive body model encoding and SAI-PSM can exclude meaningless gestures well from continuous body model data, while performing multiple simultaneous gesture recognition without losing recognition rates compared to previous HMM-based work.
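The state-machine idea can be sketched as follows: each machine consumes a stream of primitive motion codes and advances only on codes that fit its pattern, resetting on meaningless codes, so no separate segmentation step is needed. The codes and the "wave" pattern below are invented for illustration; the paper's SAI-PSM runs many such machines in parallel with selective/asynchronous inputs.

```python
class GestureStateMachine:
    """Recognizes one gesture as a sequence of primitive motion codes,
    resetting on codes that do not advance the pattern (a simplified
    stand-in for one machine of the paper's SAI-PSM)."""

    def __init__(self, pattern):
        self.pattern = pattern
        self.state = 0  # number of pattern codes matched so far

    def feed(self, code):
        """Consume one primitive code; return True when the full
        pattern has just been observed."""
        if code == self.pattern[self.state]:
            self.state += 1
        elif code == self.pattern[0]:
            self.state = 1          # the pattern may restart here
        else:
            self.state = 0          # meaningless code: reset
        if self.state == len(self.pattern):
            self.state = 0
            return True
        return False

# Hypothetical 'wave' gesture = arm left, right, left; 'U' and 'D' are noise
wave = GestureStateMachine(["L", "R", "L"])
stream = ["U", "L", "R", "D", "L", "R", "L"]
print([wave.feed(code) for code in stream])
```

Note how the interrupted attempt (L, R, then the meaningless D) is discarded, and only the later complete L-R-L sequence fires; running several such machines on the same stream gives multiple simultaneous recognition.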

Real-Time Recognition Method of Counting Fingers for Natural User Interface

  • Lee, Doyeob;Shin, Dongkyoo;Shin, Dongil
    • KSII Transactions on Internet and Information Systems (TIIS) / v.10 no.5 / pp.2363-2374 / 2016
  • Communication occurs through verbal elements, which usually involve language, as well as non-verbal elements such as facial expressions, eye contact, and gestures. Among these non-verbal elements, gestures in particular are symbolic representations of physical, vocal, and emotional behaviors: they can be signals toward a target or expressions of internal psychological processes, rather than simply movements of the body or hands. Gestures with such properties have been the focus of much research on new interfaces in the NUI/NUX field. In this paper, we propose a method for detecting the hand region and recognizing the number of extended fingers, based on depth information and the geometric features of the hand, for application to an NUI/NUX. The hand region is detected using depth information provided by the Kinect system, and the number of fingers is identified by comparing distances between the contour and the center of the hand region. The contour is detected using the Suzuki85 algorithm, and fingertips are found at locations of maximum distance by comparing the distances of three consecutive contour points to the center point of the hand. The average recognition rate for the number of fingers is 98.6%, and the execution time of the algorithm is 0.065 ms. Although the method is fast and of low complexity, it shows a higher recognition rate and faster recognition speed than other methods. As an application example, this paper describes a Secret Door that recognizes a password from the number of fingers held up by a user.
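The fingertip-counting step can be sketched as finding local maxima of the contour-to-center distance profile. This is a simplified stand-in for the paper's method: it assumes an already-extracted contour, the synthetic "hand" below is invented, and the real system works on Kinect depth maps with a Suzuki85-extracted contour.

```python
import math

def count_fingers(contour, center, min_ratio=1.6):
    """Count fingertips as points on the contour that are farther from the
    hand center than both neighbours and well outside the mean (palm) radius."""
    cx, cy = center
    dist = [math.hypot(x - cx, y - cy) for x, y in contour]
    mean_d = sum(dist) / len(dist)
    n = len(dist)
    fingers = 0
    for i in range(n):
        prev_d, next_d = dist[i - 1], dist[(i + 1) % n]  # circular neighbours
        # Fingertip: local distance maximum that sticks well out of the palm
        if dist[i] > prev_d and dist[i] > next_d and dist[i] > min_ratio * mean_d:
            fingers += 1
    return fingers

# Synthetic contour around center (0, 0): palm radius 1, two "fingers" at radius 3
radii = [1, 3, 1, 3, 1, 1, 1, 1]
contour = [(r * math.cos(i * math.pi / 4), r * math.sin(i * math.pi / 4))
           for i, r in enumerate(radii)]
print(count_fingers(contour, (0.0, 0.0)))  # 2
```

The `min_ratio` threshold is a hypothetical parameter that plays the role of separating fingertip peaks from palm-level contour noise.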

Gesture Interaction Design based on User Preference for the Elastic Handheld Device

  • Yoo, Hoon Sik;Ju, Da Young
    • Journal of the Ergonomics Society of Korea / v.35 no.6 / pp.519-533 / 2016
  • Objective: This study aims to define relevant operation methods and functions by researching the value created when a handheld smart device is made of a soft, flexible, jelly-like material. Background: New technologies and materials drive transformations of interface types and changes in operation methods. Recently, the study of Organic User Interfaces (OUI), which investigates the value of new input and output methods using soft and flexible materials in various instruments, has grown in importance. Method: For this study, 27 gestures usable on a handheld device were defined based on existing studies. A quantitative survey was conducted with adult males and females in their 20s and 30s, and the functions that can be linked to the gestures with the highest satisfaction levels were analyzed. To analyze users' needs and hurdles regarding the defined gestures, a focus group interview was conducted with groups of early adopters and ordinary users. Results: Users place high value on the usability and fun of an elastic device, and the preferred gestures and their linkable functions were analyzed. Conclusion: The most significant contribution of this study is that it sheds new light on the value of a device made of elastic material. Beyond finding and defining the gestures and functions applicable to a handheld elastic device, the study identified the value elements users desire from such a device: 'usability' and 'fun'. Application: The preference and satisfaction data this study produced on the gestures and associated functions will help commercialize an elastic device in the future.