• Title/Summary/Keyword: Touch gesture

73 results

A Study on Smart Touch Projector System Technology Using Infrared (IR) Imaging Sensor (적외선 영상센서를 이용한 스마트 터치 프로젝터 시스템 기술 연구)

  • Lee, Kuk-Seon;Oh, Sang-Heon;Jeon, Kuk-Hui;Kang, Seong-Soo;Ryu, Dong-Hee;Kim, Byung-Gyu
    • Journal of Korea Multimedia Society / v.15 no.7 / pp.870-878 / 2012
  • Recently, the rapid development of computer and sensor technologies has given rise to a variety of user interface (UI) technologies grounded in user experience (UX). In this study, we investigate and develop a smart touch projector system based on an IR sensor and image processing. In the proposed system, a user controls the computer with an IR pen as the input device: the system extracts the movement of the pen from the IR image and tracks it to recognize gesture patterns, which are interpreted as control events. In addition, to correct the error between the coordinate systems of the input image sensor and the display device (projector), we propose a coordinate correction algorithm that improves the accuracy of operation. With this technology, a form of next-generation human-computer interaction, users can trigger events on the attached computer from the projected screen itself, without manipulating the computer directly.
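The abstract does not spell out the coordinate correction algorithm itself. A common way to map touch coordinates from an IR camera to a projector on a flat screen is a planar homography fitted from four calibration correspondences; the sketch below (Python with NumPy, all names illustrative rather than taken from the paper) shows that approach:

```python
import numpy as np

def fit_homography(src, dst):
    """Estimate the 3x3 homography H mapping src -> dst (e.g. camera -> projector
    coordinates) from four point correspondences via the DLT method."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    # The homography is the null vector of A, i.e. the singular vector
    # belonging to the smallest singular value.
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]

def map_point(H, x, y):
    """Apply the homography to one camera-space touch point."""
    p = H @ np.array([x, y, 1.0])
    return p[0] / p[2], p[1] / p[2]
```

In practice the four calibration points would be projected markers touched once by the user; afterwards every tracked pen position is pushed through `map_point` before being delivered as a pointer event.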

Automatic Gesture Recognition for Human-Machine Interaction: An Overview

  • Nataliia, Konkina
    • International Journal of Computer Science & Network Security / v.22 no.1 / pp.129-138 / 2022
  • With the increasing reliance on computing systems in our everyday lives, there is a constant need to make interaction with such systems more natural, effective, and convenient. In the initial computing revolution, interaction between humans and machines was limited, and machines were not necessarily meant to be intelligent. This created the need for systems that can automatically identify and interpret our actions. Automatic gesture recognition is one of the most popular ways for users to control systems, encompassing tracking of the whole body, hands, head, face, and more. We also touch upon related lines of work, including Brain-Computer Interfaces (BCI) and Electromyography (EMG), as potential additions to the gesture recognition regime. In this work, we present an overview of several applications of automated gesture recognition systems and a brief look at the popular methods employed.

Interacting with Touchless Gestures: Taxonomy and Requirements

  • Kim, Huhn
    • Journal of the Ergonomics Society of Korea / v.31 no.4 / pp.475-481 / 2012
  • Objective: The aim of this study is to build a taxonomy for classifying diverse touchless gestures and to establish the design requirements that should be considered when selecting suitable gestures in gesture-based interaction design. Background: The applicability of touchless gestures is steadily increasing as the relevant technologies advance. Before touchless gestures can be widely applied to various devices and systems, however, an understanding of the nature of human gestures and their standardization is a prerequisite. Method: Gesture types from the literature were collected and, based on these, a new taxonomy for classifying touchless gestures was proposed; many gesture-based interaction design cases and studies were then analyzed against it. Results: The proposed taxonomy consists of two dimensions: shape (deictic, manipulative, semantic, or descriptive) and motion (static or dynamic). The case analysis based on the taxonomy showed that manipulative and dynamic gestures are widely applied. Conclusion: Four core requirements for valuable touchless gestures are intuitiveness, learnability, convenience, and discriminability. Application: The gesture taxonomy can be used to generate alternatives for applicable touchless gestures, and the four design requirements can serve as criteria for evaluating those alternatives.

Guidelines for Satisfactory Flick Performances in Touch Screen Mobile Phone (풀터치 휴대폰의 플릭(Flick) 성능에 대한 평가 및 가이드라인)

  • Kim, Huhn
    • Journal of the Ergonomics Society of Korea / v.29 no.4 / pp.541-546 / 2010
  • The flick is the most fundamental and important gesture for efficient interaction on the touch screens now extensively applied to mobile phones. This study investigated users' satisfaction with the flick operation on representative touch phones and measured flick performance with three established measures: the gap between the finger and the initial cursor, the number of lists moved per 0.2 seconds, and the number of lists moved after ten continuous flicks. The measurements were performed with a high-speed camera and motion analysis software. On the phones with high user satisfaction, the gap between finger and cursor positions was smaller, and the scrolling speed rose quickly within 0.6 seconds and then slowed down drastically. The maximal and typical time intervals between continuous flicks were also measured experimentally. Based on the evaluation and measurements, several design guidelines for efficient flick performance are suggested.
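The paper's measurement pipeline is not given in the abstract, but its "lists moved per 0.2 seconds" measure can be illustrated with a small sketch: given timestamped list indices extracted from high-speed video, bucket the motion into 0.2-second windows and count how many items scrolled past in each. The function and its inputs below are hypothetical, not the authors' code:

```python
def lists_moved_per_window(samples, window=0.2):
    """samples: chronological (t_seconds, list_index) pairs from motion tracking.
    Returns, for each consecutive `window`-second interval, how many list
    items the view scrolled past in that interval."""
    if not samples:
        return []
    t0 = samples[0][0]
    counts = {}
    prev_idx = samples[0][1]
    for t, idx in samples[1:]:
        bucket = int((t - t0) / window)          # which 0.2 s window this sample falls in
        counts[bucket] = counts.get(bucket, 0) + abs(idx - prev_idx)
        prev_idx = idx
    return [counts.get(i, 0) for i in range(max(counts) + 1)]
```

Plotting these per-window counts over time would reproduce the velocity profile the study describes: a fast rise within about 0.6 seconds followed by a sharp slowdown.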

A Study on the Automata for Hangul Input of Gesture Type (제스처 형태의 한글입력을 위한 오토마타에 관한 연구)

  • Lim, Yang-Won;Lim, Han-Kyu
    • Journal of Korea Society of Industrial Information Systems / v.16 no.2 / pp.49-58 / 2011
  • Owing to the spread of smart devices with touch screens, Korean-language (Hangul) input methods have diversified. In this paper, we survey and analyze existing approaches to find the Hangul input method best suited to smart devices. Using automata theory, we propose a simple and efficient automaton that can be used for gesture-type Hangul input.
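The paper's actual automaton is not reproduced in the abstract. As a rough illustration of the kind of state machine involved, a Hangul syllable is composed as initial consonant (choseong), vowel (jungseong), and optional final consonant (jongseong); a minimal automaton over consonant/vowel input classes might look like the following sketch (states and transitions are simplified, e.g. compound jamo are ignored):

```python
# States of a minimal Hangul composition automaton.
EMPTY, CHO, CHO_JUNG, CHO_JUNG_JONG = range(4)

def step(state, jamo_kind):
    """Advance the automaton on one input jamo ('C' consonant, 'V' vowel).
    Returns (new_state, commit); commit=True means the syllable in progress
    is finished and the new jamo begins the next one."""
    if jamo_kind == 'C':
        if state == EMPTY:
            return CHO, False
        if state == CHO_JUNG:
            return CHO_JUNG_JONG, False       # tentatively a final consonant
        return CHO, True                      # syllable ends; consonant starts the next
    # vowel input
    if state == CHO:
        return CHO_JUNG, False
    if state == CHO_JUNG_JONG:
        # the tentative final consonant actually begins the next syllable
        return CHO_JUNG, True
    # EMPTY or CHO_JUNG: the vowel starts a fresh (possibly standalone) syllable
    return CHO_JUNG, state == CHO_JUNG
```

A gesture-type input method would feed recognized jamo through `step` one at a time, emitting a composed syllable whenever `commit` is raised.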

The Design of Efficient User Environment on the FTIR Multi Touch (FTIR 멀티터치 테이블에서 효율적인 사용자 환경 개발)

  • Park, Sang-Bong;Ahn, Jung-Seo
    • The Journal of the Institute of Internet, Broadcasting and Communication / v.12 no.2 / pp.85-94 / 2012
  • In this paper, we develop new finger gestures for screen control on an FTIR multi-touch table, and describe the recognition of mobile devices placed on the table using an infrared camera. The FTIR multi-touch table was inconvenient to use with the existing Bluetooth connection because it does not provide an HID (Human Input Device) interface. The proposed data transmission method using mobile devices relieves this inconvenience and makes data transfer more effective. The new gestures and the data transmission method are verified on an FTIR multi-touch table implemented for testing.

A Comparison of the Characteristics between Single and Double Finger Gestures for Web Browsers

  • Park, Jae-Kyu;Lim, Young-Jae;Jung, Eui-S.
    • Journal of the Ergonomics Society of Korea / v.31 no.5 / pp.629-636 / 2012
  • Objective: The purpose of this study is to compare the characteristics of single- and double-finger gestures for web browsers and to extract appropriate finger gestures. Background: As electronic equipment is miniaturized to improve portability, various interfaces are being developed as input devices. Because devices are becoming smaller, gesture recognition through touch-based interfaces is favored for easy editing. In addition, users value the simplicity of intuitive interfaces, which motivates further research on gesture-based interaction; finger gestures in particular are simple, fast, and user-friendly. Single- and double-finger gestures are becoming more popular and more applications for them are being developed, but the systems and software that employ such gestures lack consistency as well as clear standards and guidelines. Method: To study the application of these gestures, we used the sketch map method, a technique for memory elicitation, and the MIMA (Meaning in Mediated Action) method to evaluate the gesture interface. Results: We created gestures suited to intuitive judgment and conducted a usability test of single- and double-finger gestures. The results showed that double-finger gestures had shorter performance times than single-finger gestures. Single-finger gestures showed a wide satisfaction gap between similar and different types: they can be judged intuitively for similar types, but it is difficult to associate them with functions for different types. Conclusion: Double-finger gestures were found to be effective for associating functions in web navigation, especially for complex forms such as curve-shaped gestures. Application: This study aims to facilitate the design of products that utilize finger and hand gestures.

Study on Gesture and Voice-based Interaction in Perspective of a Presentation Support Tool

  • Ha, Sang-Ho;Park, So-Young;Hong, Hye-Soo;Kim, Nam-Hun
    • Journal of the Ergonomics Society of Korea / v.31 no.4 / pp.593-599 / 2012
  • Objective: This study aims to implement a non-contact, gesture-based interface for presentations and to analyze its effect as an information-transfer support device. Background: With rapid technological growth in the UI/UX area and the appearance of smart products requiring new human-machine interfaces, research on control devices using gesture or speech recognition is advancing quickly. However, relatively few quantitative studies of the practical effects of these new interface types have been done, even though work on system implementation is very popular. Method: The system presented in this study is implemented with the KINECT® sensor offered by Microsoft Corporation. To investigate whether the proposed system is effective as a presentation support tool, we gave several lectures to 40 participants in both a traditional lecture room (keyboard-based presentation control) and a non-contact, gesture-based lecture room (KINECT-based presentation control), evaluated their interest and immersion with respect to the lecture content and lecturing method, and analyzed their understanding of the content. Result: Using ANOVA, we examine whether the gesture-based presentation system can play an effective role as a presentation support tool depending on the difficulty of the content. Conclusion: A non-contact, gesture-based interface is a meaningful supportive tool when delivering easy and simple information, but its effect can vary with the content and the difficulty of the information provided. Application: The results presented in this paper may help in designing new human-computer interfaces for communication support tools.

Implementation of Gesture Interface for Projected Surfaces

  • Park, Yong-Suk;Park, Se-Ho;Kim, Tae-Gon;Chung, Jong-Moon
    • KSII Transactions on Internet and Information Systems (TIIS) / v.9 no.1 / pp.378-390 / 2015
  • Image projectors can turn any surface into a display. Integrating a surface projection with a user interface transforms it into an interactive display with many possible applications. Hand gesture interfaces are often used with projector-camera systems. Hand detection through color image processing is affected by the surrounding environment. The lack of illumination and color details greatly influences the detection process and drops the recognition success rate. In addition, there can be interference from the projection system itself due to image projection. In order to overcome these problems, a gesture interface based on depth images is proposed for projected surfaces. In this paper, a depth camera is used for hand recognition and for effectively extracting the area of the hand from the scene. A hand detection and finger tracking method based on depth images is proposed. Based on the proposed method, a touch interface for the projected surface is implemented and evaluated.
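The paper's detection pipeline is not detailed in the abstract, but the core idea of depth-based touch detection on a projected surface can be sketched as follows: subtract the current depth frame from a background model of the empty surface, keep pixels within a few millimetres above the surface, and take the centroid of each sufficiently large blob as a touch point. The function below is an illustrative simplification (names, thresholds, and 4-connected flood fill are assumptions, not the authors' method):

```python
import numpy as np
from collections import deque

def touch_points(depth, surface, touch_mm=10, min_pixels=4):
    """Return (x, y) centroids of touch blobs in a depth frame.
    depth: HxW distances (mm) from the depth camera in the current frame.
    surface: HxW background model of the empty projected surface.
    A pixel counts as touching when it lies within touch_mm above the surface."""
    diff = surface - depth                       # height of objects above the surface
    mask = (diff > 0) & (diff < touch_mm)
    visited = np.zeros_like(mask, dtype=bool)
    h, w = mask.shape
    points = []
    for sy, sx in zip(*np.nonzero(mask)):
        if visited[sy, sx]:
            continue
        # BFS over the 4-connected blob containing (sy, sx)
        blob, queue = [], deque([(sy, sx)])
        visited[sy, sx] = True
        while queue:
            y, x = queue.popleft()
            blob.append((y, x))
            for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                if 0 <= ny < h and 0 <= nx < w and mask[ny, nx] and not visited[ny, nx]:
                    visited[ny, nx] = True
                    queue.append((ny, nx))
        if len(blob) >= min_pixels:
            ys, xs = zip(*blob)
            points.append((sum(xs) / len(xs), sum(ys) / len(ys)))
    return points
```

Unlike color-based hand detection, this depth thresholding is unaffected by the projector's own light falling on the hand, which is the motivation the abstract gives for using a depth camera.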

Smart Factory Platform based on Multi-Touch and Image Recognition Technologies (멀티터치 기술과 영상인식 기술 기반의 스마트 팩토리 플랫폼)

  • Hong, Yo-Hoon;Song, Seung-June;Jang, Kwang-Mun;Rho, Jungkyu
    • The Journal of the Institute of Internet, Broadcasting and Communication / v.18 no.1 / pp.23-28 / 2018
  • In this work, we developed a platform that can monitor the status of factory workplaces and manage events using the events and data collected from various multi-touch sensors installed in the workplace. Using image recognition, the faces of people in the workplace are recognized and customized contents are provided to each worker; content security is enhanced by authenticating individual workers through face recognition. A content control function based on gesture recognition allows workers to search documents easily, and implementing face recognition in mobile devices makes it possible to deliver contents to workers there as well. The results of this work can be used to improve workplace safety, worker convenience, and content security, and can serve as a base technology for future smart factory construction.