• Title/Summary/Keyword: Gesture Analysis

Search Results: 142

A Study on Hand Gesture Recognition with Low-Resolution Hand Images (저해상도 손 제스처 영상 인식에 대한 연구)

  • Ahn, Jung-Ho
    • Journal of Satellite, Information and Communications
    • /
    • v.9 no.1
    • /
    • pp.57-64
    • /
    • 2014
  • Recently, many human-friendly communication methods have been studied for human-machine interfaces (HMI) that do not require any physical devices. One of them is vision-based gesture recognition, which this paper deals with. In this paper, we define some gestures for interaction with objects in a predefined virtual world and propose an efficient method to recognize them. For preprocessing, we detect and track both hands and extract their silhouettes from low-resolution hand images captured by a webcam. We model skin color with two Gaussian distributions in RGB color space and use a blob-matching method to detect and track the hands. Applying the floodfill algorithm, we extract hand silhouettes and recognize the hand shapes of Thumb-Up, Palm, and Cross by detecting and analyzing their modes. Then, by analyzing the context of hand movement, we recognize five predefined one-hand or both-hand gestures. Assuming that one main user appears in view for accurate hand detection, the proposed gesture recognition method has proved its efficiency and accuracy in many real-time demos.
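The skin-color model described above scores each pixel under a two-component Gaussian mixture in RGB space. A minimal sketch of that scoring step is below; the mixture parameters shown are hypothetical placeholders, since the paper learns its Gaussians from training data.

```python
import numpy as np

def skin_likelihood(pixels, means, covs, weights):
    """Likelihood of RGB pixels under a two-component Gaussian skin model.

    pixels: (N, 3) float array of RGB values.
    means, covs, weights: mixture parameters (the values below are
    hypothetical; in the paper they are fitted to skin samples).
    """
    total = np.zeros(len(pixels))
    for mu, cov, w in zip(means, covs, weights):
        diff = pixels - mu
        inv = np.linalg.inv(cov)
        # Normalization constant of a 3-D Gaussian: ((2*pi)^3 |cov|)^(-1/2)
        norm = 1.0 / np.sqrt(((2 * np.pi) ** 3) * np.linalg.det(cov))
        expo = -0.5 * np.sum((diff @ inv) * diff, axis=1)
        total += w * norm * np.exp(expo)
    return total

# Hypothetical skin-tone modes for illustration only.
means = [np.array([180.0, 120.0, 100.0]), np.array([150.0, 100.0, 90.0])]
covs = [np.eye(3) * 400.0, np.eye(3) * 300.0]
weights = [0.6, 0.4]

pixels = np.array([[178.0, 118.0, 102.0],   # near a skin mode
                   [20.0, 200.0, 30.0]])    # bright green, not skin
scores = skin_likelihood(pixels, means, covs, weights)
# A pixel near a skin mode scores far higher than an off-mode pixel.
```

Thresholding such scores per pixel yields the skin mask from which blobs and silhouettes can then be extracted.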

An Analysis of Human Gesture Recognition Technologies for Electronic Device Control (전자 기기 조종을 위한 인간 동작 인식 기술 분석)

  • Choi, Min-Seok;Jang, Beakcheol
    • Journal of the Korea Society of Computer and Information
    • /
    • v.19 no.12
    • /
    • pp.91-100
    • /
    • 2014
  • In this paper, we categorize existing human gesture recognition technologies into camera-based, additional-hardware-based, and frequency-based technologies. We then describe several representative techniques for each, emphasizing their strengths and weaknesses. We define important performance issues for human gesture recognition technologies and analyze recent technologies according to those issues. Our analyses show that camera-based technologies are easy to use and highly accurate, but they are limited in recognition range and incur additional device costs. Additional-hardware-based technologies are not limited by recognition range and are not affected by light or noise, but they have the disadvantage that users must wear or carry additional devices, which also incur extra costs. Finally, frequency-based technologies are not limited by recognition range and do not need additional devices. However, they have not yet been commercialized, and their accuracy can be degraded by other frequencies and signals.

Performance Analysis of Exercise Gesture-Recognition Using Convolutional Block Attention Module (합성 블록 어텐션 모듈을 이용한 운동 동작 인식 성능 분석)

  • Kyeong, Chanuk;Jung, Wooyong;Seon, Joonho;Sun, Young-Ghyu;Kim, Jin-Young
    • The Journal of the Institute of Internet, Broadcasting and Communication
    • /
    • v.21 no.6
    • /
    • pp.155-161
    • /
    • 2021
  • Gesture recognition through a camera in real time has been widely studied in recent years. Because only a small number of features are extracted from human joints, conventional gesture recognition studies suffer from low classification accuracy. In this paper, CBAM (Convolutional Block Attention Module), which classifies images with high accuracy, is proposed as the classification model, and an algorithm that calculates joint angles depending on the action is presented to solve these issues. Images of five exercise gestures from the fitness posture dataset provided by AI Hub are applied to the classification model. Eight joint angles, which are important for classifying the exercise gestures, are extracted from the images using MediaPipe, a graph-based framework provided by Google. These features are set as the input of the classification model, and the model is trained. The simulation results confirm that the exercise gestures are classified with high accuracy by the proposed model.
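The joint-angle features described above reduce to a simple geometric computation: the angle at a joint is the angle between the vectors to its two adjacent landmarks. A minimal sketch, assuming 2-D landmark coordinates such as those a pose framework like MediaPipe provides (the specific points below are illustrative):

```python
import math

def joint_angle(a, b, c):
    """Angle at joint b (in degrees) formed by points a-b-c.

    Points are (x, y) tuples, e.g. pose landmarks for
    shoulder-elbow-wrist. Purely illustrative geometry; the paper's
    pipeline extracts eight such angles per frame.
    """
    v1 = (a[0] - b[0], a[1] - b[1])
    v2 = (c[0] - b[0], c[1] - b[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    n1 = math.hypot(*v1)
    n2 = math.hypot(*v2)
    # Clamp to [-1, 1] to guard against floating-point drift.
    cos_t = max(-1.0, min(1.0, dot / (n1 * n2)))
    return math.degrees(math.acos(cos_t))

# Collinear shoulder-elbow-wrist: arm fully extended.
extended = joint_angle((0, 0), (1, 0), (2, 0))   # 180 degrees
# Wrist perpendicular to the upper arm: a right-angle bend.
bent = joint_angle((0, 0), (1, 0), (1, 1))       # 90 degrees
```

A vector of such angles per frame is what the classification model consumes as input.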

Gesture Interaction Design based on User Preference for the Elastic Handheld Device

  • Yoo, Hoon Sik;Ju, Da Young
    • Journal of the Ergonomics Society of Korea
    • /
    • v.35 no.6
    • /
    • pp.519-533
    • /
    • 2016
  • Objective: This study aims to define relevant operation methods and functions by researching the value that can be created by a handheld smart device made of soft and flexible materials, like jelly. Background: New technologies and materials play a role in transforming interface types and operation systems. Recently, the study of Organic User Interfaces (OUI), which investigates the value of new input and output methods adopting soft and flexible materials for various instruments, has grown in importance. Method: For this study, 27 kinds of gestures usable on a handheld device were defined based on existing studies. A quantitative survey of adult males and females in their 20s and 30s was conducted, and the functions that could be linked to the gestures with the highest satisfaction were analyzed. To analyze users' needs and hurdles regarding the defined gestures, a focus group interview was conducted with groups of early adopters and ordinary users. Results: It was found that users place much value on the usability and fun of an elastic device, and the preferred gestures and their linkable functions were analyzed. Conclusion: What is most significant about this study is that it sheds new light on the value of a device made of elastic material. Beyond finding and defining the gestures and functions that can be applied to a handheld elastic device, the present study identified the value elements of an elastic device - 'usability' and 'fun' - which users fundamentally desire from using it. Application: The data that this study produced through preference and satisfaction tests of the gestures and associated functions will help commercialize an elastic device in the future.

Hand Feature Extraction Algorithm Using Curvature Analysis For Recognition of Various Hand Gestures (다양한 손 제스처 인식을 위한 곡률 분석 기반의 손 특징 추출 알고리즘)

  • Yoon, Hong-Chan;Cho, Jin-Soo
    • Journal of the Korea Society of Computer and Information
    • /
    • v.20 no.5
    • /
    • pp.13-20
    • /
    • 2015
  • In this paper, we propose an algorithm that recognizes not only the number of stretched fingers but also whether fingers are attached to each other, extracting the features required for hand gesture recognition. The proposed algorithm detects the hand area in the input image with a skin-color range filter based on a color model and labeling, and then recognizes various hand gestures by extracting the number of stretched fingers and the attachment of fingers using curvature information computed from outlines and feature points. Experimental results show that the recognition rate and frame rate are similar to those of the conventional algorithm, but the number of gesture cases that can be defined by the extracted features is about four times higher, so the proposed algorithm can recognize a greater variety of gestures.
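Curvature-based fingertip detection of the kind described above is often approximated by measuring the interior angle at each contour point: sharp convexities (fingertips) give small angles, while smooth stretches stay near 180 degrees. A simplified stand-in for the paper's curvature analysis, with a toy contour (the neighbour offset `k` and the 30-degree cutoff are illustrative assumptions):

```python
import math

def contour_angles(contour, k=1):
    """Interior angle (degrees) at each point of a closed contour,
    measured between the vectors to its k-th neighbours. Small angles
    mark sharp convexities such as fingertips."""
    n = len(contour)
    angles = []
    for i in range(n):
        p = contour[i]
        a = contour[(i - k) % n]
        b = contour[(i + k) % n]
        v1 = (a[0] - p[0], a[1] - p[1])
        v2 = (b[0] - p[0], b[1] - p[1])
        dot = v1[0] * v2[0] + v1[1] * v2[1]
        norm = math.hypot(*v1) * math.hypot(*v2)
        cos_t = max(-1.0, min(1.0, dot / norm))
        angles.append(math.degrees(math.acos(cos_t)))
    return angles

# A flat baseline with one sharp spike (a toy "fingertip"):
contour = [(0, 0), (1, 0), (2, 0), (3, 0), (4, 0), (5, 10),
           (6, 0), (7, 0), (8, 0)]
angles = contour_angles(contour)
# The spike at index 5 has a small angle; flat points are near 180.
fingertips = [i for i, a in enumerate(angles) if a < 30]
```

Counting such low-angle points gives the number of stretched fingers; the spacing between them can indicate whether adjacent fingers are attached.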

Gadget Arms: Interactive Data Visualization using Hand Gesture in Extended Reality (가젯암: 확장현실을 위한 손 제스처 기반 대화형 데이터 시각화 시스템)

  • Choi, JunYoung;Jeong, HaeJin;Jeong, Won-Ki
    • Journal of the Korea Computer Graphics Society
    • /
    • v.25 no.2
    • /
    • pp.31-41
    • /
    • 2019
  • Extended Reality (XR), such as virtual and augmented reality, has huge potential for immersive data visualization and analysis. In XR, users can interact with data and with other users realistically by navigating a shared virtual space, allowing for more intuitive data analysis. However, creating a visualization in XR also poses a challenge because complicated, low-level programming is required, which hinders broad adoption in visual analytics. This paper proposes Gadget Arms, an interactive visualization authoring tool based on hand gestures for immersive data visualization. The proposed system provides a novel user interaction to create and place visualizations in the 3D virtual world. This simple but intuitive interaction enables users to design the entire visualization space in XR without using a host computer or low-level programming. Our user study also confirmed that the proposed interaction significantly improves the usability of the visualization authoring tool.

Performance of Human Skin Detection in Images According to Color Spaces

  • Kim, Jun-Yup;Do, Yong-Tae
    • Proceedings of the Korea Society of Information Technology Applications Conference
    • /
    • 2005.11a
    • /
    • pp.153-156
    • /
    • 2005
  • Skin region detection in images is an important process in many computer vision applications targeting humans, such as hand gesture recognition and face identification. It usually starts at the pixel level and involves a pre-processing step of color space transformation followed by a classification process. A color space transformation is assumed to increase separability between skin and non-skin classes, to increase similarity among different skin tones, and to bring robust performance under varying imaging conditions, without any complicated analysis. In this paper, we examine whether color space transformation actually brings those benefits to skin region detection, using a set of human hand images with different postures, backgrounds, people, and illuminations. Our experimental results indicate that color space transformation affects skin detection performance. Although the performance depends on the camera and surrounding conditions, the normalized [R, G, B] color space may be a good choice in general.
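The normalized [R, G, B] space recommended above simply divides each channel by the pixel's total intensity, which makes the representation largely invariant to illumination level. A minimal sketch (the sample values are hypothetical):

```python
def normalize_rgb(r, g, b):
    """Map raw [R, G, B] to normalized chromaticity [r, g, b],
    where each component is divided by R + G + B. Dividing out the
    overall intensity suppresses illumination changes, which is why
    this space is popular for skin detection."""
    s = r + g + b
    if s == 0:
        return (0.0, 0.0, 0.0)  # avoid division by zero for black pixels
    return (r / s, g / s, b / s)

# The same skin patch under bright and dim illumination (hypothetical
# values): raw RGB differs by a factor of two, chromaticity does not.
bright = normalize_rgb(200, 120, 100)
dim = normalize_rgb(100, 60, 50)
```

Classification then proceeds in the (r, g) plane, since the three normalized components sum to one and carry only two degrees of freedom.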


Combining Object Detection and Hand Gesture Recognition for Automatic Lighting System Control

  • Pham, Giao N.;Nguyen, Phong H.;Kwon, Ki-Ryong
    • Journal of Multimedia Information System
    • /
    • v.6 no.4
    • /
    • pp.329-332
    • /
    • 2019
  • Recently, smart lighting systems have combined sensors and lights. These systems turn lights on/off and adjust their brightness based on the motion of objects and the brightness of the environment. They are often applied in places such as buildings, rooms, garages, and parking lots. However, these lighting systems are controlled by lighting sensors and motion sensors that rely on environmental illumination and motion detection. In this paper, we propose an automatic lighting control system using a single camera for buildings, rooms, and garages. The proposed system integrates the results of digital image processing, namely motion detection and hand gesture detection, to control and dim the lighting system. The experimental results showed that the proposed system works very well and could be considered for application to automatic lighting spaces.
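The motion-detection component of such a camera-driven system is commonly built on frame differencing: compare consecutive greyscale frames and trigger when enough pixels change. A minimal sketch under that assumption (the thresholds and frame sizes below are illustrative, not the paper's values):

```python
import numpy as np

def motion_detected(prev_frame, frame, diff_thresh=25, pixel_frac=0.01):
    """Frame-differencing motion detector for lighting control.

    Returns True when more than pixel_frac of the pixels changed by
    more than diff_thresh grey levels between consecutive frames.
    """
    # Widen to int16 so the subtraction of uint8 frames cannot wrap.
    diff = np.abs(frame.astype(np.int16) - prev_frame.astype(np.int16))
    changed = np.count_nonzero(diff > diff_thresh)
    return changed > pixel_frac * frame.size

# Static scene -> no motion; a bright blob entering -> motion.
prev_frame = np.zeros((120, 160), dtype=np.uint8)
still = motion_detected(prev_frame, prev_frame.copy())
moved = prev_frame.copy()
moved[40:80, 60:100] = 200   # a "hand" entering the view
motion = motion_detected(prev_frame, moved)
```

In a full system, a positive detection would switch the lights on, and a subsequent gesture-recognition stage would handle dimming commands.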

On-line dynamic hand gesture recognition system for virtual reality using elementary component classifiers (기본 요소분류기를 이용한 가상현실용 실시간 동적 손 제스처 인식 시스템의 구현에 관한 연구)

  • 김종성;이찬수
    • Journal of the Korean Institute of Telematics and Electronics C
    • /
    • v.34C no.9
    • /
    • pp.68-76
    • /
    • 1997
  • This paper presents a system that recognizes dynamic hand gestures for virtual reality (VR). A dynamic hand gesture is a method of communication between a computer and a human who uses gestures, especially with both hands and fingers. Since human hands and fingers differ in physical dimensions, the same form of gesture produced by two persons' hands may not yield the same numerical values when measured through electronic sensors. In this paper, we apply a fuzzy min-max neural network and a feature analysis method using fuzzy logic for on-line pattern recognition.
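Fuzzy min-max networks of the kind applied above handle this person-to-person variation by giving each gesture class hyperboxes with graded membership: sensor readings inside a box match fully, and readings nearby still match partially. A minimal sketch in the spirit of Simpson's fuzzy min-max membership function (the box bounds and sensitivity below are illustrative; the paper's full classifier also learns and adjusts hyperboxes from training data):

```python
def hyperbox_membership(x, vmin, vmax, gamma=4.0):
    """Fuzzy membership of feature vector x in the hyperbox [vmin, vmax].

    Inside the box the membership is 1; outside it decays linearly with
    distance, at a rate set by the sensitivity gamma, so slightly
    different hand dimensions still yield a partial match.
    """
    total = 0.0
    for xi, lo, hi in zip(x, vmin, vmax):
        below = max(0.0, min(1.0, gamma * (lo - xi)))   # penalty under lo
        above = max(0.0, min(1.0, gamma * (xi - hi)))   # penalty over hi
        total += max(0.0, 1.0 - below - above)
    return total / len(x)

# A reading inside the box matches fully; one slightly outside
# still earns partial membership instead of a hard rejection.
inside = hyperbox_membership((0.5, 0.5), (0.0, 0.0), (1.0, 1.0))
outside = hyperbox_membership((1.2, 0.5), (0.0, 0.0), (1.0, 1.0))
```

Classification assigns a gesture to the class whose hyperbox gives the highest membership, which is what makes the scheme tolerant of differing hand sizes.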


COARTICULATION AND GESTURE OVERLAP BETWEEN SYLLABLES

  • Cao, Jianfen
    • Proceedings of the KSPS conference
    • /
    • 1996.10a
    • /
    • pp.208-213
    • /
    • 1996
  • This paper reports a preliminary investigation of the time course of intersyllabic coarticulation in Standard Chinese. In this investigation, around 3,800 phonetically compact C1V1-C2V2 disyllabic structures are employed to observe the acoustic effect of coarticulation in general, and about 400 disyllabic words are designed as materials to examine: (1) how the articulators move from one syllable to the next; (2) to what extent the syllables overlap; and (3) in what sense the syllables are produced in parallel, and in what sense in sequence. For convenience of description, we take the offset transition of V1 and the onset transition of C2 as rough representations of the anticipatory and carryover effects, respectively, and make durational measurements accordingly. To evaluate the possible influence of stress contrast and constituent differences of the syllables on gestural overlap, analyses of variance are conducted as well. Based on this study, some impressions about the general nature of the coarticulation behind intersyllabic gesture overlapping in this language are discussed.
