• Title/Abstract/Keyword: Gestures

Search results: 484 items (processing time: 0.025 seconds)

LCD Display 설비 Contents의 Kinect기반 동작제어 기술 구현에 관한 연구 (A Study on Implementing Kinect-Based Control for LCD Display Contents)

  • 노정규
    • 전기학회논문지
    • /
    • Vol. 63, No. 4
    • /
    • pp.565-569
    • /
    • 2014
  • Recently, various kinds of new computer-controlled devices have been introduced in a wide range of areas, and convenient user interfaces for controlling these devices are strongly needed. To implement natural user interfaces (NUIs) on top of such devices, new technologies like the touch screen, Wii Remote, wearable interfaces, and Microsoft Kinect have been presented. This paper presents a natural and intuitive gesture-based model for controlling the contents of an LCD display. The Microsoft Kinect sensor and its SDK are used to recognize human gestures, and the gestures are interpreted into corresponding commands to be executed. A command dispatch model is also proposed in order to handle the commands more naturally. I expect the proposed interface can be used in various fields, including display content control.
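
A minimal sketch of what such a command dispatch model could look like (all names hypothetical; the abstract does not detail the paper's actual design or its Kinect SDK integration):

```python
# Hypothetical gesture-to-command dispatch: recognized gestures are queued
# and executed in arrival order, so commands are handled one at a time
# rather than interrupting one another.
from collections import deque

class CommandDispatcher:
    def __init__(self):
        self.queue = deque()
        self.handlers = {}                 # gesture name -> command callable

    def register(self, gesture, handler):
        self.handlers[gesture] = handler

    def on_gesture(self, gesture):
        # Called by the recognizer; execution is deferred to the dispatch loop.
        if gesture in self.handlers:
            self.queue.append(gesture)

    def dispatch(self):
        while self.queue:
            self.handlers[self.queue.popleft()]()

dispatcher = CommandDispatcher()
dispatcher.register("swipe_left",  lambda: print("display: next page"))
dispatcher.register("swipe_right", lambda: print("display: previous page"))
dispatcher.on_gesture("swipe_left")
dispatcher.dispatch()                      # -> display: next page
```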

제스처 할당 모드를 이용한 마리오네트 조정 시스템 (Marionette Control System using Gesture Mode Change)

  • 천경민;곽수희;류근호
    • 제어로봇시스템학회논문지
    • /
    • Vol. 21, No. 2
    • /
    • pp.150-156
    • /
    • 2015
  • In this paper, a marionette control system driven by wrist and finger gestures sensed through an IMU device is studied. The signals from the sensor device are conditioned and recognized, and the resulting commands are sent to the 8 motors of the marionette via Bluetooth (5 motors control the motion of the marionette, and 3 motors control its location). It is revealed that the degrees of freedom of the fingers are not independent of each other, so some gestures are difficult to make. A gesture mode change is therefore proposed for finger postures that are hard to form due to this lack of independent finger DOF: the mode change switches the assignment of gestures as required. Experimental results show that the gesture mode change successfully produces the intended marionette postures.
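
A rough illustration of the gesture-mode-change idea, assuming a small gesture vocabulary reassigned between the two motor groups (gesture names and motor indices are invented for illustration):

```python
# Illustrative sketch: a mode switch reassigns the same limited gesture set
# across the 5 motion motors and 3 location motors, compensating for the
# fingers' coupled degrees of freedom.
MODES = {
    "motion":   {"flex_index": 0, "flex_middle": 1, "flex_ring": 2,
                 "wrist_up": 3, "wrist_down": 4},                 # 5 motors
    "location": {"flex_index": 5, "flex_middle": 6, "wrist_up": 7},  # 3 motors
}

def send_to_motor(index, value):
    print(f"motor {index} <- {value}")     # stand-in for the Bluetooth link

class MarionetteController:
    def __init__(self):
        self.mode = "motion"

    def switch_mode(self):
        # A dedicated gesture toggles which motor group the fingers drive.
        self.mode = "location" if self.mode == "motion" else "motion"

    def handle(self, gesture, value):
        motor = MODES[self.mode].get(gesture)
        if motor is not None:
            send_to_motor(motor, value)

ctrl = MarionetteController()
ctrl.handle("wrist_up", 0.5)    # drives a motion motor (3)
ctrl.switch_mode()
ctrl.handle("wrist_up", 0.5)    # same gesture now drives a location motor (7)
```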

Implementation of a Gesture Recognition Signage Platform for Factory Work Environments

  • Rho, Jungkyu
    • International Journal of Internet, Broadcasting and Communication
    • /
    • Vol. 12, No. 3
    • /
    • pp.171-176
    • /
    • 2020
  • This paper presents an implementation of a gesture recognition platform that can be used in factory workplaces. The platform consists of signage displays that show workers' job orders and a control center that is used to manage work orders for factory workers. Each worker does not need to carry work order documents and can browse the assigned work orders on the signage at his/her workplace. The contents of the signage can be controlled by the worker's hand and arm gestures. Gestures are extracted from body movement tracked by a 3D depth camera and converted into commands that control the displayed content of the signage. Using the control center, the factory manager can assign tasks to each worker, upload work order documents to the system, and see each worker's progress. The implementation has been applied experimentally to a machining factory workplace. This platform not only provides convenience for factory workers at their workplaces and improves the security of technical documents, but can also be used to build smart factories.
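
The abstract does not describe the gesture-to-command conversion step in detail; under the assumption that a swipe is detected from horizontal hand displacement over recent depth-camera frames, a sketch might look like this (thresholds and command names are hypothetical):

```python
# Hypothetical conversion of tracked hand positions into signage commands.
SWIPE_DIST = 0.30   # metres of horizontal travel that counts as a swipe
WINDOW = 15         # number of recent frames to examine

def classify(hand_xs):
    """hand_xs: recent horizontal hand coordinates in metres, oldest first."""
    if len(hand_xs) < WINDOW:
        return None
    dx = hand_xs[-1] - hand_xs[-WINDOW]
    if dx > SWIPE_DIST:
        return "NEXT_PAGE"        # swipe right -> next work-order page
    if dx < -SWIPE_DIST:
        return "PREV_PAGE"
    return None

frames = [0.025 * i for i in range(20)]   # hand moving steadily right
print(classify(frames))                    # -> NEXT_PAGE
```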

궤적의 방향 변화 분석에 의한 제스처 인식 알고리듬 (Gesture Recognition Algorithm by Analyzing Direction Change of Trajectory)

  • 박장현;김민수
    • 한국정밀공학회지
    • /
    • Vol. 22, No. 4
    • /
    • pp.121-127
    • /
    • 2005
  • The widespread use of intelligent robots creates a need for communication between robots and human beings, and gesture recognition is currently being studied to support such interaction. Previous gesture recognition approaches, however, appear to require not only complicated algorithms but also a separate training process to achieve high recognition rates. This study suggests a gesture recognition algorithm based on a computer vision system, which is relatively simple and more efficient in recognizing various human gestures. After tracing the hand gesture using a marker, direction changes of the gesture trajectory are analyzed to determine a simple gesture code that carries the minimal information needed for recognition. A map is developed to recognize the gestures that can be expressed with different gesture codes. Using numerical and geometrical trajectories, the advantages and disadvantages of the suggested algorithm were evaluated.
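
A small sketch of the core idea as described: quantize the marker trajectory into eight directions and keep only direction changes as the gesture code. The code values and gesture map below are illustrative, not the paper's:

```python
import math

def direction(p, q):
    """Quantize the segment p -> q into one of 8 directions (0 = east, CCW)."""
    angle = math.atan2(q[1] - p[1], q[0] - p[0])
    return round(angle / (math.pi / 4)) % 8

def gesture_code(points):
    """Keep only direction changes: the minimal information to recognize."""
    dirs = [direction(points[i], points[i + 1]) for i in range(len(points) - 1)]
    code = []
    for d in dirs:
        if not code or code[-1] != d:
            code.append(d)
    return tuple(code)

# Map from gesture codes to gesture names (illustrative entries).
GESTURE_MAP = {(0,): "right", (2,): "up", (0, 2): "L-shape"}

stroke = [(0, 0), (1, 0), (2, 0), (2, 1), (2, 2)]     # right, then up
print(gesture_code(stroke), "->", GESTURE_MAP.get(gesture_code(stroke)))
# (0, 2) -> L-shape
```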

제스처 및 음성 인식을 이용한 윈도우 시스템 제어에 관한 연구 (Study about Windows System Control Using Gesture and Speech Recognition)

  • 김주홍;진성일;이남호;이용범
    • 대한전자공학회:학술대회논문집
    • /
    • Proceedings of the 1998 Fall Conference of the Institute of Electronics Engineers of Korea
    • /
    • pp.1289-1292
    • /
    • 1998
  • HCI (human-computer interface) technologies have often been implemented using a mouse, keyboard, and joystick. Because the mouse and keyboard can be used only in limited situations, more natural HCI methods such as speech-based and gesture-based methods have recently attracted wide attention. In this paper, we present a multi-modal input system to control the Windows system for practical use of a multimedia computer. Our multi-modal input system consists of three parts. The first is a virtual-hand mouse part, which replaces mouse control with a set of gestures. The second is a Windows control system using speech recognition, and the third is a Windows control system using gesture recognition. We introduce neural network and HMM methods to recognize speech and gestures. The results of the three parts interface directly to the CPU and to Windows.
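
The recognizers themselves (neural network and HMM) are beyond a short sketch, but the multi-modal command layer they feed could be as simple as the following (modality labels and command names are assumptions):

```python
# Hypothetical multi-modal command layer: independent speech and gesture
# recognizers emit (modality, label) events into a shared table that maps
# them to window-system commands.
COMMANDS = {
    ("speech",  "open"):   "OPEN_WINDOW",
    ("speech",  "close"):  "CLOSE_WINDOW",
    ("gesture", "point"):  "MOVE_CURSOR",
    ("gesture", "grab"):   "DRAG_WINDOW",
}

def execute(command):
    print("window system:", command)       # stand-in for the real OS call

def on_recognized(modality, label):
    command = COMMANDS.get((modality, label))
    if command:
        execute(command)

on_recognized("speech", "open")    # -> window system: OPEN_WINDOW
on_recognized("gesture", "grab")   # -> window system: DRAG_WINDOW
```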


신경인지 검사를 위한 모션 센싱 시스템 (Motion Sensing System for Automation of Neuropsychological Test)

  • 조원서;천경민;류근호
    • 센서학회지
    • /
    • Vol. 26, No. 2
    • /
    • pp.128-134
    • /
    • 2017
  • Neuropsychological tests can diagnose brain dysfunction, but until now they have not provided objective experimental data sufficient to distinguish the relationships between brain dysfunction and brain disease (cerebropathia). In this paper, an automatic cognitive test system with 6-axis motion sensors is proposed for the automation of neuropsychological tests. The Fist-Edge-Palm (FEP) test and the Go-No go test were used to evaluate motor programming of the frontal lobe. The motion data from the specially designed motion glove are transmitted wirelessly to a computer to detect the gestures automatically. Twenty and eleven healthy persons were investigated for the FEP and Go-No go tests, respectively. The minimum gesture recognition rates for the FEP and Go-No go tests are 91.38% and 89.09%, respectively. In conclusion, the automated cognitive tests successfully support quantitative diagnosis of brain dysfunction.
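
As one plausible illustration (not the paper's algorithm), a FEP trial could be scored by checking that the per-frame posture labels produced by the glove's classifier cycle in fist-edge-palm order:

```python
# Illustrative FEP scoring: count how many complete fist -> edge -> palm
# cycles appear in the classified posture stream from the motion glove.
CYCLE = ["fist", "edge", "palm"]

def count_correct_cycles(postures):
    """postures: per-frame posture labels, e.g. from the glove classifier."""
    # Collapse consecutive duplicates into a sequence of posture changes.
    seq = [p for i, p in enumerate(postures) if i == 0 or p != postures[i - 1]]
    cycles, idx = 0, 0
    for p in seq:
        if p == CYCLE[idx]:
            idx += 1
            if idx == len(CYCLE):
                cycles, idx = cycles + 1, 0
        else:
            idx = 1 if p == CYCLE[0] else 0   # restart matching on a fist
    return cycles

frames = (["fist"] * 5 + ["edge"] * 5 + ["palm"] * 5) * 2
print(count_correct_cycles(frames))   # -> 2
```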

사지 실행증의 평가 및 신경생리학적 고찰 (Assessment and Neurophysiology of the Limb Apraxia: Review Article)

  • 최진호;박지원;권용현
    • The Journal of Korean Physical Therapy
    • /
    • Vol. 18, No. 2
    • /
    • pp.7-16
    • /
    • 2006
  • The purpose of this study was to review limb apraxia, including its evaluation and neurophysiological aspects. Limb apraxia comprises a wide spectrum of higher-order motor disorders that result from acquired brain disease and affect the performance of skilled and/or learned movements with the forelimbs. It is a common sequela of left-brain damage and consists of a deficit in performing gestures to verbal command or to imitation. There are two forms of limb apraxia: ideational apraxia and ideomotor apraxia. An assessment of limb apraxia typically includes pantomiming and imitation of transitive, intransitive, and meaningless gestures. Limb apraxia has been attributed to damage confined to the cerebral cortex, cortico-cortical connecting pathways, and the basal ganglia.


English Input Keypad Method Using Picker-Based Interface

  • Kwon, Soon-Kak;Kim, Heung-Jun
    • 한국멀티미디어학회논문지
    • /
    • Vol. 18, No. 11
    • /
    • pp.1383-1390
    • /
    • 2015
  • With the development of mobile devices, the touch screen provides a variety of character input methods and flexibility in the user interface. Currently, the simple physical touch method is widely used for English input, but simple touches do not increase the variety of inputs or the flexibility of the user interface. In this paper, we propose a new method to input English characters continuously by recognizing gestures instead of simple touches. The proposed method places rotational pickers on the screen in place of keys for changing the alphabetical sequence, and inputs English characters through flick gestures and touches. Simulation results show that the proposed keypad method performs better than conventional keypad methods.
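
A minimal sketch of the picker interaction, assuming a single alphabet wheel where a flick rotates the selection and a touch commits it (the paper's actual layout and gesture set may differ):

```python
# Hypothetical picker-based keypad: flick gestures rotate an alphabet wheel;
# a touch inputs the character currently under the cursor.
import string

class Picker:
    def __init__(self):
        self.letters = string.ascii_lowercase
        self.pos = 0
        self.text = []

    def flick(self, steps):
        """A flick up/down rotates the picker by +/- steps."""
        self.pos = (self.pos + steps) % len(self.letters)

    def touch(self):
        """A touch commits the currently selected character."""
        self.text.append(self.letters[self.pos])

p = Picker()
p.flick(7);  p.touch()    # selects 'h'
p.flick(-3); p.touch()    # selects 'e'
print("".join(p.text))    # -> he
```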

근전도와 임피던스를 이용한 손동작 추정 (Estimation of Hand Gestures Using EMG and Bioimpedance)

  • 김수찬
    • 전기학회논문지
    • /
    • Vol. 65, No. 1
    • /
    • pp.194-199
    • /
    • 2016
  • EMG carries information specific to movements according to the activities of the muscles, so users can intuitively control a prosthesis; for this reason, biosignals are very useful and convenient in this kind of application. Bioimpedance also provides movement-specific information, like EMG. In this study, we used both EMG and bioimpedance to classify typical hand gestures: hand open, hand close, no motion (rest), supination, and pronation. Data from nine able-bodied subjects and one amputee were used as the experimental data set. The accuracy was 98 ± 1.9% when 2 bioimpedance and 8 EMG channels were used together for the able-bodied subjects. The number of EMG channels affected the accuracy, but it was stable when more than 5 channels were used. For the amputee, the accuracy was higher when both signals were used than when using EMG alone. Therefore, accurate and stable hand motion estimation is possible by combining EMG with bioimpedance, which reflects structural information.
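
The abstract does not name the classifier, so the following sketch substitutes a generic linear discriminant on per-channel RMS features from the 8 EMG + 2 bioimpedance channels, with random placeholder data standing in for real recordings:

```python
# Illustrative pipeline only: RMS features per channel, generic classifier.
# The class-dependent mean shift makes the placeholder data separable.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

GESTURES = ["open", "close", "rest", "supination", "pronation"]

def rms_features(window):
    """window: (samples, 10) array of 8 EMG + 2 bioimpedance channels."""
    return np.sqrt(np.mean(window ** 2, axis=0))   # one RMS per channel

rng = np.random.default_rng(0)
X, y = [], []
for g in range(len(GESTURES)):                     # 50 windows per gesture
    for _ in range(50):
        X.append(rms_features(rng.normal(loc=0.5 * g, size=(200, 10))))
        y.append(g)

clf = LinearDiscriminantAnalysis().fit(np.array(X), np.array(y))
print(GESTURES[clf.predict([X[0]])[0]])            # classify the first window
```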

Safe and Reliable Intelligent Wheelchair Robot with Human Robot Interaction

  • Hyuk, Moon-In;Hyun, Joung-Sang;Kwang, Kum-Young
    • 제어로봇시스템학회:학술대회논문집
    • /
    • 2001 ICCAS, Institute of Control, Robotics and Systems
    • /
    • pp.120.1-120
    • /
    • 2001
  • This paper proposes a prototype of a safe and reliable wheelchair robot with Human Robot Interaction (HRI). Since wheelchair users are usually handicapped, the wheelchair robot must guarantee safety and reliability of motion while considering the user's intention. A single color CCD camera is mounted for inputting the user's commands based on human-friendly gestures, and an ultrasonic sensor array is used for sensing the external motion environment. We use face and hand directional gestures as the user's commands. By combining the user's command with the sensed environment configuration, the planner of the wheelchair robot selects an optimal motion. We implement a prototype wheelchair robot, MR. HURI (Mobile Robot with Human Robot Interaction) ...
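
One way to read "combining the user's command with the sensed environment configuration" is as a constrained choice over safe headings; a toy sketch under that assumption (distances, headings, and the safety threshold are invented):

```python
# Hypothetical planner: pick the safe heading closest to the commanded one,
# where "safe" means the ultrasonic range in that direction exceeds a limit.
SAFE_DIST = 0.8   # metres; below this a heading is considered blocked

def plan(command_heading, ranges):
    """ranges: {heading_degrees: distance_m} from the ultrasonic array."""
    safe = [h for h, d in ranges.items() if d > SAFE_DIST]
    if not safe:
        return None                              # no safe motion: stop
    return min(safe, key=lambda h: abs(h - command_heading))

ranges = {-45: 2.0, 0: 0.4, 45: 1.5}             # obstacle straight ahead
print(plan(0, ranges))                           # -> -45, avoiding the obstacle
```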
