• Title/Summary/Keyword: Hand movement


Pacman Game Using Recognition of Hand Movement from Game Industry (게임 산업에서 손동작 인식을 이용한 팩맨 게임)

  • Shin, Seong-Yoon; Rhee, Yang-Won
    • Journal of Korea Society of Industrial Information Systems / v.17 no.3 / pp.51-57 / 2012
  • Pacman is known worldwide as one of the most beloved classic games. In this paper, we make the classic Pacman game playable with simple hand gestures instead of a keyboard or a mouse. In other words, hand motion, tracked by its center coordinates, substitutes for the directional keys. Likewise, the movement of the monster is controlled by the user's hand movements, with a pointer extracted from the hand in images captured by a camera in an MFC dialog. The RGB image is converted to a YCbCr image to extract skin color, and multiplication operations and hybrid median filtering are used to obtain better images. Hand movement is then obtained from the center of gravity of the hand region.
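The skin-color pipeline this abstract describes (RGB-to-YCbCr conversion, skin thresholding, center of gravity of the hand region) can be sketched as follows. The conversion coefficients are the standard ITU-R BT.601 values, and the Cb/Cr thresholds are common illustrative ranges, not the paper's exact parameters; the multiplication and hybrid median filtering steps are omitted.

```python
import numpy as np

def skin_centroid(rgb, cb_range=(77, 127), cr_range=(133, 173)):
    """Detect skin-colored pixels in an RGB image via YCbCr
    thresholding and return the centroid (center of gravity)
    of the skin region.  The Cb/Cr ranges are illustrative
    defaults, not the paper's thresholds."""
    rgb = rgb.astype(np.float32)
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    # ITU-R BT.601 RGB -> YCbCr chroma components
    cb = 128 - 0.168736 * r - 0.331264 * g + 0.5 * b
    cr = 128 + 0.5 * r - 0.418688 * g - 0.081312 * b
    mask = ((cb >= cb_range[0]) & (cb <= cb_range[1]) &
            (cr >= cr_range[0]) & (cr <= cr_range[1]))
    if not mask.any():
        return None, mask
    ys, xs = np.nonzero(mask)
    # Center of gravity of the skin region -> hand position
    return (xs.mean(), ys.mean()), mask
```

The returned centroid is what a game loop would compare against the previous frame's centroid to derive a direction-key substitute.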

Visual Feedback and Human Performance in the Foot Mouse Control

  • Hong, Seung-Kweon; Kim, Seon-Soo
    • Journal of the Ergonomics Society of Korea / v.31 no.6 / pp.725-731 / 2012
  • Objective: The aim of this study is to investigate visual feedback effects and human performance in foot mouse control. Background: Generally, computer mouse tasks are controlled by visual feedback. In order to understand the characteristics of foot mouse control, it is important to investigate the patterns of visual feedback involved in foot mouse control tasks. Human performance is also an important factor in understanding foot mouse control. Method: Three types of mouse control tasks were designed to investigate visual feedback effects and human performance in foot mouse control. Visual feedback effects in foot mouse control were compared with those of a typical hand mouse. Cursor movement speed and mental workload were measured across the three types of tasks and the two types of mice. Results: Mouse control tasks with an element of homing in on the target were performed more quickly with the hand mouse than with the foot mouse. Mental workload was also higher with the foot mouse than with the hand mouse. However, in the steering movement, human performance with the foot mouse was not lower than with the hand mouse. Less visual feedback was required in foot mouse control than in hand mouse control. Conclusion: The foot mouse was not efficient in most mouse control tasks compared to the hand mouse. However, the foot mouse was efficient in the steering movement, i.e., moving a cursor along a path with lateral constraints. Application: The results of this study may help in developing the foot mouse.

A Study on the Eye-Hand Coordination for Korean Text Entry Interface Development (한글 문자 입력 인터페이스 개발을 위한 눈-손 Coordination에 대한 연구)

  • Kim, Jung-Hwan; Hong, Seung-Kweon; Myung, Ro-Hae
    • Journal of the Ergonomics Society of Korea / v.26 no.2 / pp.149-155 / 2007
  • Recently, various devices requiring text input, such as mobile phones, IPTV, PDAs, and UMPCs, are emerging, and the frequency of text entry on them is increasing. This study focused on the evaluation of Korean text entry interfaces. Various models to evaluate text entry interfaces have been proposed, most based on the human cognitive process for text input. This process is divided into two components: visual scanning and finger movement. The time spent on visual scanning is modeled by the Hick-Hyman law, while the time for finger movement is determined by Fitts' law. Three questions arise in model-based evaluation of text entry interfaces. First, do the human cognitive processes (visual scanning and finger movement) during text entry occur sequentially, as the models assume? Second, can real text input time be predicted by previous models? Third, does the cognitive process for text input vary with users' text entry speed? A time gap was found between the measured text input time and the predicted time, and the gap was larger for participants who entered text quickly. The reason was found by investigating eye-hand coordination during text input. Contrary to the assumption that a visual scan of the keyboard is followed by a finger movement, the experienced group performed visual scanning and finger movement simultaneously. Arrival Lead Time, the interval between the eye fixation on the target button and the button click, was measured to quantify the overlap between the two processes. In addition, the experienced group used fewer fixations during text entry than the novice group. These results will contribute to improving evaluation models for text entry interfaces.
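The sequential model this abstract tests (visual search time from the Hick-Hyman law plus movement time from Fitts' law) can be written down directly. The coefficients below are illustrative placeholders, not values fitted by the paper:

```python
import math

def hick_hyman_time(n_choices, a=0.2, b=0.15):
    """Visual search time for choosing among n equally likely keys,
    T = a + b * log2(n).  Coefficients (seconds) are illustrative."""
    return a + b * math.log2(n_choices)

def fitts_time(distance, width, a=0.1, b=0.1):
    """Movement time to a key of the given width at the given distance,
    Shannon formulation T = a + b * log2(D/W + 1).  Coefficients
    are illustrative."""
    return a + b * math.log2(distance / width + 1)

def predicted_keystroke_time(n_choices, distance, width):
    """Sequential-model prediction: scan the keyboard, then move the
    finger.  The paper's finding is that experienced typists overlap
    these two stages, so this sum overestimates their keystroke time."""
    return hick_hyman_time(n_choices) + fitts_time(distance, width)
```

For a 32-key layout with a key 10 mm wide at 30 mm distance, the sequential model predicts 0.95 s of search plus 0.30 s of movement.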

Research on EEG Parameters for Movement Prediction Based on Individual Difference of Athletic Ability and Lateral Asymmetry of Hemisphere (운동능력과 뇌편측성의 개인차에 따른 사지움직임예측을 위한 EEG 변수추출에 관한 연구)

  • Whang, Min-Cheol; Lim, Joa-Sang
    • Journal of the Ergonomics Society of Korea / v.21 no.3 / pp.1-12 / 2002
  • Recently, EEG has gained much interest due to its potential for letting people communicate directly with computers without detouring through motor output. This study was designed to address whether EEG can be successfully used to predict limb movement. It was found that ordinary people showed significant differences in brainwaves between right-hand (foot) and left-hand (foot) movements. Lateral asymmetry was also found to interact significantly with EEG. Further research with refined methods is urged to provide more useful insights into EEG-based BCI.

Motion and Force Estimation System of Human Fingers (손가락 동작과 힘 추정 시스템)

  • Lee, Dong-Chul; Choi, Young-Jin
    • Journal of Institute of Control, Robotics and Systems / v.17 no.10 / pp.1014-1020 / 2011
  • This paper presents a motion and force estimation system for human fingers using the electromyography (EMG) sensor module and data glove system proposed here. Both the EMG sensor module and the data glove system are designed to minimize the number of hardware filters needed to acquire the signals, and to reduce their size for wearability. Since the onset of EMG precedes the onset of actual finger movement by tens to hundreds of milliseconds, we show that the suggested system makes it possible to predict the pattern of finger movement before the movement occurs. We also suggest how to estimate the grasping force of the hand based on the relationship between the RMS of the EMG signal and the applied load. Finally, we show the effectiveness of the suggested estimation system through several experiments.
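The RMS-to-load relationship this abstract relies on can be sketched as a moving RMS over the raw EMG followed by a least-squares linear fit. The window length and the linearity of the fit are illustrative assumptions; the paper's exact processing is not reproduced here.

```python
import numpy as np

def emg_rms(signal, window=200):
    """Moving RMS of a raw EMG signal over a sliding window
    (window length in samples; illustrative default)."""
    sq = np.asarray(signal, dtype=np.float64) ** 2
    kernel = np.ones(window) / window
    return np.sqrt(np.convolve(sq, kernel, mode="valid"))

def fit_force_model(rms_values, loads):
    """Fit a linear RMS-to-load relationship by least squares;
    returns (slope, intercept)."""
    slope, intercept = np.polyfit(rms_values, loads, 1)
    return slope, intercept

def estimate_force(rms_value, slope, intercept):
    """Estimate grasping force from a single RMS value."""
    return slope * rms_value + intercept
```

In use, `fit_force_model` would be calibrated once per subject against known loads, after which `estimate_force` runs on each new RMS sample.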

Activations of Cerebral and Cerebellar Cortex Induced by Repetitive Bilateral Motor Exercise (반복적 양측 운동학습에 따른 대뇌 및 소뇌 피질 활성화)

  • Tae, Ki-Sik; Song, Sung-Jae; Kim, Young-Ho
    • Journal of Biomedical Engineering Research / v.28 no.1 / pp.139-147 / 2007
  • The aim of this study was to evaluate the effects of short-term repetitive bilateral exercise on the activation of the motor network using functional magnetic resonance imaging (fMRI). The training program was performed 1 hr/day, 5 days/week, for 6 weeks. Fugl-Meyer Assessments (FMA) were performed every two weeks during the training. We compared cerebral and cerebellar cortical activations in two different tasks before and after the training program: (1) movement of only the unaffected hand (Task 1); and (2) passive movement of the affected hand driven by active movement of the unaffected hand (Task 2). fMRI was performed at 3T with wrist flexion-extension movement at 1 Hz during the motor tasks. All patients showed significant improvements in FMA scores in their paretic limbs after training. fMRI studies in Task 1 showed that cortical activations decreased in the ipsilateral sensorimotor cortex but increased in the contralateral sensorimotor cortex and ipsilateral cerebellum. Task 2 showed cortical reorganization in the bilateral sensorimotor cortex, premotor area, supplementary motor area, and cerebellum. Therefore, this study demonstrated that plastic changes in the motor network occurred as a neural basis of the improvement following repetitive bilateral exercise using the symmetrical upper-limb arm motion trainer.

SVM-Based EEG Signal for Hand Gesture Classification (서포트 벡터 머신 기반 손동작 뇌전도 구분에 대한 연구)

  • Hong, Seok-min; Min, Chang-gi; Oh, Ha-Ryoung; Seong, Yeong-Rak; Park, Jun-Seok
    • The Journal of Korean Institute of Electromagnetic Engineering and Science / v.29 no.7 / pp.508-514 / 2018
  • An electroencephalogram (EEG) records the electrical activity generated by interactions among brain cells, and it can capture the brain activity caused by hand movement. In this study, a 16-channel EEG was used to measure the signals generated before and after hand movement. The measured data can be classified by a supervised learning model, the support vector machine (SVM). To shorten the learning time of the SVM, feature extraction and vector dimension reduction by filtering are proposed, which minimize the loss of motion-related information while compressing the EEG data. The classification results showed an average accuracy of 72.7% between the sitting posture and hand movement at the frontal-lobe electrodes.
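The dimension-reduction step this abstract describes can be illustrated with frequency-band power features, one common way of compressing filtered EEG into a short vector before SVM training (e.g. with scikit-learn's `SVC`). The band edges, sampling rate, and choice of band power as the feature are assumptions for illustration, not the paper's exact scheme:

```python
import numpy as np

def band_power_features(epoch, fs=250.0, bands=((8, 13), (13, 30))):
    """Compress a multi-channel EEG epoch (channels x samples) into
    per-channel band-power features, e.g. alpha (8-13 Hz) and beta
    (13-30 Hz).  Band edges and sampling rate are illustrative."""
    epoch = np.asarray(epoch, dtype=np.float64)
    n = epoch.shape[1]
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    # Periodogram power per channel and frequency bin
    psd = np.abs(np.fft.rfft(epoch, axis=1)) ** 2 / n
    feats = []
    for lo, hi in bands:
        idx = (freqs >= lo) & (freqs < hi)
        feats.append(psd[:, idx].sum(axis=1))
    # With 16 channels and 2 bands this yields a 32-dim vector,
    # far smaller than the raw samples-per-epoch dimension.
    return np.concatenate(feats)
```

Each epoch reduces to channels × bands numbers, which is what keeps the SVM's training time short.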

A Study on Tangible Gesture Interface Prototype Development of the Quiz Game (퀴즈게임의 체감형 제스처 인터페이스 프로토타입 개발)

  • Ahn, Jung-Ho; Ko, Jae-Pil
    • Journal of Digital Contents Society / v.13 no.2 / pp.235-245 / 2012
  • This paper introduces quiz game contents based on a gesture interface. We analyzed off-line quiz games, extracted their presiding components, and digitalized them so that the proposed game contents can substitute for the off-line quiz games. We used a Kinect camera to obtain depth images and performed preprocessing including vertical human segmentation, head detection and tracking, and hand detection, as well as gesture recognition for hand-up, vertical hand movement, fist shape, pass, and fist-and-attraction. In particular, we designed the interface gestures as metaphors for natural gestures in the real world, so that users can tangibly experience the abstract concepts of movement, selection, and confirmation. Compared to our previous work, we added a card compensation process for completeness, improved the vertical hand movement and fist shape recognition methods used for answer selection, and presented an organized test to measure recognition performance. The implemented quiz application was tested in real time and showed very satisfactory gesture recognition results.

Recognition of Virtual Written Characters Based on Convolutional Neural Network

  • Leem, Seungmin; Kim, Sungyoung
    • Journal of Platform Technology / v.6 no.1 / pp.3-8 / 2018
  • This paper proposes a technique for recognizing online handwritten cursive data, obtained by tracing the motion trajectory while a user writes in 3D space, based on a convolutional neural network (CNN) algorithm. Recognizing virtual characters entered in 3D space is difficult because the trajectory includes both character strokes and movement strokes. In this paper, we divide each syllable into consonant and vowel units using a labeling technique, in addition to the character-stroke and movement-stroke localization of our previous study. The coordinate information of the separated consonants and vowels is converted into image data, and Korean handwriting recognition is performed using a convolutional neural network. After training the network on 1,680 syllables written by five writers, accuracy was calculated using new writers who did not participate in producing the training data. The phoneme-based recognition accuracy is 98.9% with the convolutional neural network. The proposed method has the advantage of drastically reducing the amount of training data required compared to syllable-based learning.
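The coordinate-to-image conversion step this abstract mentions, turning a separated phoneme's (x, y) trajectory into image data suitable for a CNN, can be sketched as a simple rasterization. The 28x28 resolution and the per-axis normalization are illustrative choices, not the paper's parameters:

```python
import numpy as np

def trajectory_to_image(points, size=28):
    """Rasterize a sequence of (x, y) stroke coordinates into a
    binary size x size image.  Each axis is normalized independently
    to fill the grid (an illustrative choice; the paper's exact
    conversion is not specified here)."""
    pts = np.asarray(points, dtype=np.float64)
    mins = pts.min(axis=0)
    span = np.maximum(pts.max(axis=0) - mins, 1e-9)  # avoid /0
    scaled = (pts - mins) / span * (size - 1)
    img = np.zeros((size, size), dtype=np.uint8)
    cols = np.round(scaled[:, 0]).astype(int)
    rows = np.round(scaled[:, 1]).astype(int)
    img[rows, cols] = 1  # mark each sampled trajectory point
    return img
```

The resulting fixed-size binary images are what a standard image-classification CNN can consume, regardless of how many raw trajectory samples each phoneme contained.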

A Study on the Sensory Motor Coordination to Visual and Sound Stimulation (빛과 소리 자극에 대한 지각 운동의 협력에 관한 연구)

  • Kim, Nam-Gyun; Ko, Yong-Ho; Ifukube, T.
    • Journal of Biomedical Engineering Research / v.15 no.1 / pp.77-82 / 1994
  • We investigated the characteristics of sensory-motor coordination by measuring hand pointing and gaze movement in response to visual and sound stimulation. Our results showed that the gaze velocity in response to sound stimulation did not depend on stimulation direction, but lagged 0.2 sec behind the response to visual stimulation toward the peripheral direction. Our data also showed that, for both visual and sound stimulation, the hand pointing error increased with increasing eccentricity.
