• Title/Summary/Keyword: hand gesture analysis


Gesture Control Gaming for Motoric Post-Stroke Rehabilitation

  • Andi Bese Firdausiah Mansur
    • International Journal of Computer Science & Network Security
    • /
    • v.23 no.10
    • /
    • pp.37-43
    • /
    • 2023
  • Hospital conditions, scheduling, and patient restrictions all stand in the way of an optimal therapy session. A crowded hospital leads to tight schedules and shorter therapy periods, leaving post-stroke patients in a dilemma: they need regular treatment to recover their nervous system. In this work, we propose a simple, in-house serious-game system for physical therapy. A Kinect camera captures a depth-image stream of the human skeleton, and the user controls the game with hand gestures; voice recognition is added to ease play. Users must complete the given challenges to obtain a greater benefit from the therapy system. Subjects use their upper limbs and hands to capture 3D objects appearing at different speeds and positions; as the challenge grows, speed and location are increased and randomized, and each captured object raises the score. The scores are then evaluated for their correlation with therapy progress. Users were delighted with the system and eager to use it for daily exercise. The experiments compare score against difficulty, which characterizes both user and game: users adapt quickly to the easy and medium levels, while the high level requires better focus and proper hand-eye synchronization to capture the 3D objects. Statistical analysis of the usability test (significance level α = 0.05) shows that the proposed game is accessible even without specialized training; it serves not only for therapy but also for fitness, since it doubles as body exercise. The experimental results are very satisfying: most users enjoyed the game and became familiar with it quickly, and the evaluation study demonstrates user satisfaction during testing. Future work on the proposed serious game may involve haptic devices to stimulate physical sensation.
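The challenge loop described in the abstract above (targets spawning with level-dependent speed and position, each capture raising the score) might be sketched as follows; the function names and scoring values are hypothetical, since the paper does not publish its exact game parameters:

```python
import random

def next_target(level):
    """Spawn a 3D target; speed and position ranges grow with the level.
    (Hypothetical ranges -- the paper's actual values are not given.)"""
    speed = random.uniform(1.0, 1.0 + level)                    # units/s
    pos = tuple(random.uniform(-level, level) for _ in range(3))
    return {"speed": speed, "pos": pos}

def update_score(score, captured, level):
    """Each captured target raises the score; harder levels are worth more."""
    return score + captured * level * 10

score = 0
for level in (1, 2, 3):                     # easy, medium, hard
    targets = [next_target(level) for _ in range(5)]
    captured = len(targets)                 # assume every target was captured
    score = update_score(score, captured, level)
print(score)  # 5*10 + 5*20 + 5*30 = 300
```

The score-versus-difficulty pairs collected this way are what the paper correlates with therapy progress.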

Development for Multi-modal Realistic Experience I/O Interaction System (멀티모달 실감 경험 I/O 인터랙션 시스템 개발)

  • Park, Jae-Un;Whang, Min-Cheol;Lee, Jung-Nyun;Heo, Hwan;Jeong, Yong-Mu
    • Science of Emotion and Sensibility
    • /
    • v.14 no.4
    • /
    • pp.627-636
    • /
    • 2011
  • The purpose of this study is to develop a multi-modal interaction system that provides a realistic, immersive experience. The system recognizes user behavior, intention, and attention, overcoming the limitations of uni-modal interaction. It is based on gesture interaction methods, intuitive gesture interaction, and attention evaluation technology. The gesture interaction methods were built on sensors selected through a meta-analysis of the accuracy of 3-D gesture recognition technology; the elements of intuitive gesture interaction were derived from experimental results; and the attention evaluation technology was developed through physiological signal analysis. The system is divided into three modules: a motion cognitive system, an eye-gaze detection system, and a bio-reaction sensing system. The first, the motion cognitive system, uses an accelerometer and flexible sensors to recognize the user's hand and finger movements. The second, the eye-gaze detection system, detects pupil movements and reactions. The final module, a bio-reaction sensing (attention evaluation) system, tracks cardiovascular and skin-temperature responses. This study will be used for the development of realistic digital entertainment technology.
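The three-module architecture could be wired together as in the sketch below; the normalized scores and the mean-fusion rule are illustrative assumptions, since the abstract does not specify how the module outputs are combined:

```python
from dataclasses import dataclass

@dataclass
class ModalityReading:
    name: str
    value: float   # normalized 0..1 score reported by one module

def fuse(readings):
    """Combine the module outputs into one attention estimate.
    Simple mean fusion -- a stand-in for the paper's integration rule."""
    return sum(r.value for r in readings) / len(readings)

readings = [
    ModalityReading("motion", 0.8),   # accelerometer + flexible sensors
    ModalityReading("gaze", 0.6),     # pupil movement and reaction
    ModalityReading("bio", 0.7),      # cardiovascular + skin temperature
]
print(round(fuse(readings), 2))  # 0.7
```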


Gesture Interaction Design based on User Preference for the Elastic Handheld Device

  • Yoo, Hoon Sik;Ju, Da Young
    • Journal of the Ergonomics Society of Korea
    • /
    • v.35 no.6
    • /
    • pp.519-533
    • /
    • 2016
  • Objective: This study aims to define suitable operation methods and functions by investigating the value of a handheld smart device made of a soft, flexible material, like jelly. Background: New technologies and materials drive transformations of the interface and of the operating method. Recently, the Organic User Interface (OUI), which studies the value of new input and output methods adopting soft and flexible materials for various instruments, has grown in importance. Method: For the study, 27 gestures usable on a handheld device were defined based on existing studies. A quantitative survey of adult males and females in their 20s and 30s was conducted, and the functions that could be linked to the most satisfying gestures were analyzed. To analyze users' needs and hurdles for the defined gestures, a focus group interview was conducted with groups of early adopters and ordinary users. Results: Users place considerable value on the usability and fun of an elastic device, and the preferred gestures and their linkable functions could be analyzed. Conclusion: The study's main significance is that it sheds new light on the value of a device made of elastic material. Beyond finding and defining the gestures and functions applicable to a handheld elastic device, it identified the value elements users fundamentally desire from such a device: 'usability' and 'fun'. Application: The preference and satisfaction data on gestures and their associated functions produced by this study will help commercialize elastic devices in the future.

A Study on Children Edutainment Contents Development with Hand Gesture Recognition and Electronic Dice (전자주사위 및 손동작 인식을 활용한 아동용 에듀테인먼트 게임 콘텐츠 개발에 관한 연구)

  • Ok, Soo-Yol
    • Journal of Korea Multimedia Society
    • /
    • v.14 no.10
    • /
    • pp.1348-1364
    • /
    • 2011
  • Because existing edutainment contents for children mostly consist of educational tools that unilaterally induce learners to respond passively, content-creation methodologies that enable active, voluntary learning are urgently needed. In this paper, we present the implementation of a tangible 'electronic dice' interface as an interactive tool for behavior-based edutainment contents, and propose a methodology for developing children's edutainment contents using hand-movement recognition based on depth-image information. We also propose an authoring and management tool for learning quizzes, which allows educators to set up and manage their courseware, and a log analysis system of learning achievement for real-time monitoring of educational progress. The proposed behavior-based tangible interface and edutainment contents provide easy-to-operate interaction with a real object, which heightens learners' interest and leads to an active, voluntary attitude toward learning. Furthermore, the authoring and management tool and the log analysis system make it possible to construct learning programs by achievement level and to monitor children's progress in real time, by analyzing how they solve problems on their own and using the results as evaluation material for lesson plans.

The Relationship between Lexical Retrieval and Coverbal Gestures (어휘인출과 구어동반 제스처의 관계)

  • Ha, Ji-Wan;Sim, Hyun-Sub
    • Korean Journal of Cognitive Science
    • /
    • v.22 no.2
    • /
    • pp.123-143
    • /
    • 2011
  • At what point in the process of speech production are gestures involved? According to the Lexical Retrieval Hypothesis, gestures are involved in lexicalization during the formulating stage. According to the Information Packaging Hypothesis, gestures are involved in the conceptual planning of messages during the conceptualizing stage. We investigated these hypotheses using a game situation from a TV program that induced players to engage in lexicalization and conceptualization simultaneously. The transcription of the verbal utterances was augmented with all arm and hand gestures produced by the players. Coverbal gestures were classified into two types: lexical gestures and motor gestures. Concrete words elicited lexical gestures significantly more frequently than abstract words, while abstract words elicited motor gestures significantly more frequently than concrete words. The difficulty of conceptualization for concrete words was significantly correlated with the amount of lexical gestures, but neither the number of words nor word frequency was correlated with the amount of either gesture type. These results support the Information Packaging Hypothesis. Above all, the importance of motor gestures can be inferred from the finding that abstract words elicited them more frequently than concrete words did. Motor gestures, long considered unrelated to verbal production, have been excluded from analysis in many gesture studies; this study revealed that they appear to be connected to abstract conceptualization.


Combining Dynamic Time Warping and Single Hidden Layer Feedforward Neural Networks for Temporal Sign Language Recognition

  • Thi, Ngoc Anh Nguyen;Yang, Hyung-Jeong;Kim, Sun-Hee;Kim, Soo-Hyung
    • International Journal of Contents
    • /
    • v.7 no.1
    • /
    • pp.14-22
    • /
    • 2011
  • Temporal Sign Language Recognition (TSLR) from hand motion is an active area of gesture recognition research, facilitating efficient communication with deaf people. TSLR systems consist of two stages: a motion sensing step, which extracts useful features from signers' motion, and a classification step, which classifies these features as a performed sign. This work focuses on two research problems: the unknown, time-varying nature of sign language signals in the feature extraction stage, and the computational complexity and time consumption of the classification stage caused by a very large database of sign sequences. In this paper, we propose combining Dynamic Time Warping (DTW) with Single hidden Layer Feedforward Neural networks (SLFNs) trained by the Extreme Learning Machine (ELM) to cope with these limitations. DTW has the advantage over other approaches that it can align time series of different lengths to the same prior size, while ELM is an efficient technique for classifying the warped features. Our experiments demonstrate the efficiency of the proposed method, with recognition accuracy of up to 98.67%. The proposed approach can be generalized to more detailed measurements so as to recognize hand gestures, body motion, and facial expressions.
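The DTW alignment stage described above can be sketched in a few lines. This is the textbook DTW distance on 1-D sequences; the ELM-trained SLFN classifier is omitted, and the toy sequences stand in for real sign trajectories:

```python
def dtw_distance(a, b):
    """Classic dynamic time warping distance between two 1-D sequences,
    tolerant of the speed differences the paper describes."""
    INF = float("inf")
    n, m = len(a), len(b)
    D = [[INF] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            # Extend the cheapest of the three admissible warping moves.
            D[i][j] = cost + min(D[i - 1][j], D[i][j - 1], D[i - 1][j - 1])
    return D[n][m]

# Two renditions of the "same" sign performed at different speeds,
# plus an unrelated trajectory:
slow = [0, 1, 2, 3, 2, 1, 0]
fast = [0, 2, 3, 1, 0]
other = [3, 3, 3, 3, 3]
print(dtw_distance(slow, fast) < dtw_distance(slow, other))  # True
```

Warping every training sequence against a common template yields the fixed-size feature vectors that the SLFN/ELM stage then classifies.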

Repetitive hand gesture recognition based on frequency analysis (주파수 분석을 이용한 반복적인 손동작 인식)

  • Kim, Jiye;Park, Jong-Il
    • Proceedings of the Korean Society of Broadcast Engineers Conference
    • /
    • 2017.06a
    • /
    • pp.166-167
    • /
    • 2017
  • As interest in virtual reality grows, more natural interaction between virtual objects and people has become important. One of the most widely used modalities is the hand gesture: because people can convey emotions and express opinions through hand gestures, they occupy an important place in the Natural User Interface (NUI). In this paper, we propose a method for recognizing hand gestures that trace repetitive trajectories, which account for a relatively large share of human hand motion. The 3-D coordinates of the hand's movement direction and distance are vectorized, and the data are passed through a Fast Fourier Transform (FFT) and a Support Vector Machine (SVM) to recognize repetitive hand gestures, allowing natural hand motion to be recognized with relatively high accuracy.
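The frequency-analysis step can be illustrated with a naive single-bin DFT over one coordinate of a hand trajectory; the 30 Hz frame rate and the 2 Hz circular motion are assumed values, and the SVM classification stage is omitted:

```python
import cmath
import math

def dominant_frequency(signal, fs):
    """Return the dominant non-DC frequency (Hz) via a naive DFT.
    (A sketch of the FFT step; an FFT would compute the same bins faster.)"""
    n = len(signal)
    best_k, best_mag = 1, 0.0
    for k in range(1, n // 2):
        s = sum(signal[t] * cmath.exp(-2j * math.pi * k * t / n)
                for t in range(n))
        if abs(s) > best_mag:
            best_k, best_mag = k, abs(s)
    return best_k * fs / n

fs = 30                                   # camera frame rate (Hz), assumed
t = [i / fs for i in range(60)]           # 2 s of hand x-coordinates
wave = [math.sin(2 * math.pi * 2.0 * x) for x in t]  # 2 Hz repetitive motion
print(dominant_frequency(wave, fs))       # 2.0
```

The resulting spectral features (dominant frequency and bin magnitudes) are the kind of vector the paper feeds to the SVM.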


Recognition of Hand gesture to Human-Computer Interaction (손동작 인식을 통한 Human-Computer Interaction 구현)

  • 이래경;김성신
    • Journal of the Korean Institute of Intelligent Systems
    • /
    • v.11 no.1
    • /
    • pp.28-32
    • /
    • 2001
  • Human hand gestures are a means of communication that has long served the role of a language. As modern society becomes an information society that demands faster and more accurate communication and information transfer, much recent research seeks to use the free gestures of the two human hands for human-computer interaction and the expression of human intent, compensating for the shortcomings of existing input devices. In this paper, we propose a new recognition algorithm that uses hand features extracted from 2-D input images, without dynamic hand motion, and implement hand-gesture recognition using a Radial Basis Function Network together with additional feature points for a higher recognition rate and real-time processing. Based on the meanings of the recognized gestures, we also conducted an experiment applying them to robot control, in order to evaluate the recognition rate and the accuracy with which each gesture's intended meaning was conveyed.
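A minimal sketch of RBF-style classification as proposed above, assuming hypothetical 2-D feature vectors and gesture prototypes (the paper's actual features and additional feature points are not published here):

```python
import math

def rbf_score(x, center, sigma=1.0):
    """Gaussian RBF response of one hidden unit to feature vector x."""
    d2 = sum((a - b) ** 2 for a, b in zip(x, center))
    return math.exp(-d2 / (2 * sigma ** 2))

def classify(features, prototypes):
    """Pick the gesture whose RBF unit responds most strongly."""
    return max(prototypes, key=lambda g: rbf_score(features, prototypes[g]))

# Hypothetical features (e.g. extended-finger count, hand aspect ratio):
prototypes = {"stop": (5.0, 1.0), "point": (1.0, 2.5), "fist": (0.0, 1.0)}
print(classify((0.9, 2.4), prototypes))  # point
```

A trained RBF network would additionally weight and sum the unit responses; nearest-prototype response is the simplest special case.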


Interaction Intent Analysis of Multiple Persons using Nonverbal Behavior Features (인간의 비언어적 행동 특징을 이용한 다중 사용자의 상호작용 의도 분석)

  • Yun, Sang-Seok;Kim, Munsang;Choi, Mun-Taek;Song, Jae-Bok
    • Journal of Institute of Control, Robotics and Systems
    • /
    • v.19 no.8
    • /
    • pp.738-744
    • /
    • 2013
  • According to cognitive science research, the interaction intent of humans can be estimated by analyzing their overt behaviors. This paper proposes a novel methodology for reliable intention analysis based on this approach. To identify intent, eight behavioral features are extracted from four characteristics of human-human interaction, and we outline a set of core components of nonverbal human behavior. These nonverbal behaviors are handled by various recognition modules over multimodal sensors: localizing the speaker's sound source in the audition part; recognizing frontal faces and facial expressions in the vision part; and estimating human trajectories, body pose and lean, and hand gestures in the spatial part. As a post-processing step, temporal confidence reasoning is used to improve recognition performance, and an integrated human model quantitatively classifies intent from the multi-dimensional cues by applying weight factors. Interactive robots can thus make informed engagement decisions to interact effectively with multiple persons. Experimental results show that the proposed scheme works successfully between human users and a robot in human-robot interaction.
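The weighted integration of multi-dimensional cues might look like the following sketch; the cue names and weight values are illustrative assumptions, not the paper's calibrated weight factors:

```python
def intent_score(cues, weights):
    """Weighted sum of per-cue confidences from the recognition modules."""
    return sum(weights[name] * value for name, value in cues.items())

# Illustrative weights and one frame of recognized cue confidences:
weights = {"gaze": 0.4, "pose": 0.3, "gesture": 0.2, "voice": 0.1}
cues = {"gaze": 0.9, "pose": 0.8, "gesture": 0.5, "voice": 0.0}

score = intent_score(cues, weights)
print(score >= 0.5)  # above an engagement threshold -> True
```

Smoothing this score over time (the paper's temporal confidence reasoning) would suppress single-frame recognition errors before the robot commits to an engagement decision.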

Hand Motion Signal Extraction Based on Electric Field Sensors Using PLN Spectrum Analysis (PLN 성분 분석을 통한 전기장센서 기반 손동작신호 추출)

  • Jeong, Seonil;Kim, Youngchul
    • Smart Media Journal
    • /
    • v.9 no.4
    • /
    • pp.97-101
    • /
    • 2020
  • Using a passive electric field sensor operating in non-contact mode, we can measure the electric potential induced by changes of electric charge on the sensor caused by the movement of a human body or hand. In this study, we propose a new method that uses the power line noise (PLN) induced on the sensor by the surroundings of a moving object to detect hand movement and extract gesture frames from the detected signals. Signals from electric potential sensors (EPS) include a large amount of power line noise, which is usually present in places such as rooms or buildings. Because the PLN is partly shielded when a person approaches the sensor, signals caused by body or hand movement can be detected. PLN consists mainly of a 60 Hz component and its harmonics; in our method, only the 120 Hz component in the frequency domain, separated with an FFT, is selected and used for detecting hand movement. The signal obtained this way is continuously compared against a preset threshold; once motion signals crossing the threshold are detected, the motion frame is determined as the interval between the first and the last threshold crossing. The motion detection rate of the proposed method was about 90%, and the correct frame extraction rate was about 85%. Methods that use the PLN signal to extract motion data from non-contact EPS sensors have rarely been reported to date. These results are expected to be useful, especially in environments with ambient PLN.
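The 120 Hz band-energy thresholding and frame extraction can be sketched as follows; the sampling rate, the window energies, and the threshold are assumed values, not the paper's:

```python
import math

FS = 960.0     # sensor sampling rate (Hz), assumed
PLN2 = 120.0   # second power-line harmonic the paper isolates

def band_energy(frame, fs=FS, f0=PLN2):
    """Energy at f0 via a single-bin DFT (stand-in for the paper's FFT step)."""
    n = len(frame)
    re = sum(x * math.cos(2 * math.pi * f0 * t / fs) for t, x in enumerate(frame))
    im = sum(x * math.sin(2 * math.pi * f0 * t / fs) for t, x in enumerate(frame))
    return (re * re + im * im) / n

def gesture_bounds(energies, threshold):
    """Motion frame = span from the first to the last sub-threshold window:
    a hand near the sensor partly shields the PLN, so its energy drops."""
    hits = [i for i, e in enumerate(energies) if e < threshold]
    return (hits[0], hits[-1]) if hits else None

# Illustrative per-window 120 Hz energies: high when idle, low during a gesture.
energies = [9.0, 8.5, 2.0, 1.5, 2.2, 8.8, 9.1]
print(gesture_bounds(energies, threshold=5.0))  # (2, 4)
```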