• Title/Summary/Keyword: Gesture-based interaction


A Study on Smart Touch Projector System Technology Using Infrared (IR) Imaging Sensor (적외선 영상센서를 이용한 스마트 터치 프로젝터 시스템 기술 연구)

  • Lee, Kuk-Seon;Oh, Sang-Heon;Jeon, Kuk-Hui;Kang, Seong-Soo;Ryu, Dong-Hee;Kim, Byung-Gyu
    • Journal of Korea Multimedia Society / v.15 no.7 / pp.870-878 / 2012
  • The rapid development of computer and sensor technologies has given rise to various user interface (UI) technologies based on user experience (UX). In this study, we investigate and develop a smart touch projector system based on an IR sensor and image processing. In the proposed system, the user controls the computer through control events derived from the gestures of an IR pen used as an input device. From the IR image, we extract the movement of the devised pen and track it to recognize gesture patterns. To correct the error between the coordinates of the input image sensor and those of the display device (projector), we also propose a coordinate correction algorithm that improves operational accuracy. With this next-generation human-computer interaction technology, users can control the computer directly on the projected screen without manipulating the computer itself.
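The coordinate correction the abstract mentions, aligning the IR camera's view with the projector's display, is commonly done with a planar homography fitted from a few calibration points. The paper does not publish its algorithm, so this numpy sketch (the names `fit_homography` and `map_point` are ours) only illustrates the general idea:

```python
import numpy as np

def fit_homography(src, dst):
    """Fit the 3x3 homography H mapping src points to dst points (4+ pairs)."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    # The homography is the null vector of A: the right singular vector
    # belonging to the smallest singular value.
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    return Vt[-1].reshape(3, 3)

def map_point(H, x, y):
    """Apply H to (x, y) in camera coordinates; returns projector coordinates."""
    u, v, w = H @ np.array([x, y, 1.0])
    return u / w, v / w
```

Tapping four known projector corners with the IR pen yields the correspondences; any later pen position can then be mapped into projector coordinates with `map_point`.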

Real-time Finger Gesture Recognition (실시간 손가락 제스처 인식)

  • Park, Jae-Wan;Song, Dae-Hyun;Lee, Chil-Woo
    • Proceedings of the HCI Society of Korea Conference / 2008.02a / pp.847-850 / 2008
  • Today, humans increasingly develop machines through mutual communication with them. In vision-based HCI (Human-Computer Interaction), techniques for recognizing and tracking fingers are important components of HCI systems. To segment the finger, this paper uses background subtraction, which separates foreground from background and works for both limited and cluttered backgrounds. The fingertip is then recognized by template matching against identified fingertip images, and the tracked gesture is compared with the identified gesture templates. After obtaining a region of interest, both subtraction and template matching are performed only within that region, which reduces processing and response time; we thus propose a technique that recognizes gestures more effectively.
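The pipeline described above, background subtraction followed by template matching over a region of interest, might be sketched as follows. This is an assumed minimal reimplementation, not the authors' code; the brute-force SSD search stands in for whatever matcher they used:

```python
import numpy as np

def subtract_background(frame, background, thresh=30):
    """Foreground mask: pixels differing from the background model by > thresh."""
    return np.abs(frame.astype(int) - background.astype(int)) > thresh

def match_template(image, template):
    """Brute-force SSD template match; returns (row, col) of the best window."""
    ih, iw = image.shape
    th, tw = template.shape
    best, best_pos = None, (0, 0)
    for y in range(ih - th + 1):
        for x in range(iw - tw + 1):
            ssd = np.sum((image[y:y + th, x:x + tw].astype(float) - template) ** 2)
            if best is None or ssd < best:
                best, best_pos = ssd, (y, x)
    return best_pos
```

Restricting `image` to the region of interest before calling `match_template` is what gives the speed-up the abstract emphasizes: the quadratic search then runs over a small window instead of the whole frame.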


Novel User Interaction Technologies in 3D Display Systems

  • Hopf, Klaus;Chojecki, Paul;Neumann, Frank
    • Proceedings of the Korean Information Display Society Conference / 2007.08b / pp.1227-1230 / 2007
  • This paper describes recent advances in R&D work at Fraunhofer HHI (Germany) that is believed to provide key technologies for the development of future human-machine interfaces. The paper focuses on vision-based interaction technologies, which will be an essential component of future three-dimensional display systems.


Robot Gesture Recognition System based on PCA algorithm (PCA 알고리즘 기반의 로봇 제스처 인식 시스템)

  • Youk, Yui-Su;Kim, Seung-Young;Kim, Sung-Ho
    • Proceedings of the Korean Institute of Intelligent Systems Conference / 2008.04a / pp.400-402 / 2008
  • Human-computer interaction (HCI) technology, which plays an important role in the exchange of information between humans and computers, is a key field of information technology. Recently, studies on controlling robots and other devices using the movements of a person's body or hands, without conventional input devices such as a keyboard and mouse, have been carried out in diverse forms, and their importance has steadily increased. This study proposes a method for recognizing user gestures by applying measurements from an acceleration sensor to the PCA algorithm.
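A minimal sketch of the described approach, projecting acceleration-sensor samples onto PCA components and classifying them, might look like this. The function names and the nearest-prototype matching step are our assumptions, since the abstract only names PCA:

```python
import numpy as np

def pca_fit(X, k):
    """Mean and top-k principal components of an (n_samples, n_features) matrix."""
    mean = X.mean(axis=0)
    Xc = X - mean
    cov = Xc.T @ Xc / (len(X) - 1)
    vals, vecs = np.linalg.eigh(cov)    # eigenvalues in ascending order
    order = np.argsort(vals)[::-1][:k]  # keep the k largest
    return mean, vecs[:, order]

def pca_project(x, mean, comps):
    """Project a raw acceleration sample into the PCA subspace."""
    return (x - mean) @ comps

def classify(x, mean, comps, prototypes, labels):
    """Nearest-prototype gesture classification in the PCA subspace."""
    f = pca_project(x, mean, comps)
    dists = [np.linalg.norm(f - p) for p in prototypes]
    return labels[int(np.argmin(dists))]
```

In practice each gesture would contribute a whole window of accelerometer readings flattened into one feature vector; the toy three-feature vectors below only exercise the mechanics.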


Effect of Tactile Feedback for Button GUI on Mobile Touch Devices

  • Shin, Heesook;Lim, Jeong-Mook;Lee, Jong-Uk;Lee, Geehyuk;Kyung, Ki-Uk
    • ETRI Journal / v.36 no.6 / pp.979-987 / 2014
  • This paper describes new tactile feedback patterns and their effect on input performance for a button GUI activated by a tap gesture on mobile touch devices. Based on an analysis of touch interaction and informal user tests, several tactile feedback patterns were designed. Using these patterns, three user experiments were performed to investigate appropriate tactile feedback patterns and their input performance during interaction with a touch button. The results showed that a tactile pattern responding to each touch and release gesture, with a rapid response time and a short falling time, provides the feeling of physically clicking a button. Compared to providing no feedback, the suggested tactile feedback pattern significantly reduces both the number of typing errors and the typing task completion time.
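The winning pattern's "rapid response time and short falling time" suggests a click-like vibration envelope: an instant attack followed by a quick decay. A hypothetical sketch, where the timing constants are illustrative and not the paper's measured values:

```python
import numpy as np

def click_pulse(duration_ms=20, fall_ms=8, rate=8000):
    """Click-like haptic envelope: instant attack, short exponential fall.
    duration_ms and fall_ms are illustrative, not the paper's values."""
    n = int(rate * duration_ms / 1000)
    t = np.arange(n) / rate                  # time axis in seconds
    return np.exp(-t / (fall_ms / 1000.0))   # amplitude envelope in (0, 1]
```

Such an envelope would modulate the actuator's drive signal once on touch and once on release, mimicking the two mechanical transients of a physical button.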

DEVS Modeling for Interactive Motion-based Mobile Contents Authoring Tool (모바일 기기 환경의 인터렉티브 모션 기반 콘텐츠 개발 도구와 DEVS 모델링)

  • Ju, Seunghwan;Choi, Yohan;Lim, Yongsoo;Seo, Heesuk
    • Journal of Korea Society of Digital Industry and Information Management / v.11 no.2 / pp.121-129 / 2015
  • Interactive media is a form of communication in which the media's output responds to input from its users, letting the user interact back with the media. Interactive media works through the user's participation: the media keeps its original purpose, but the user's input adds interaction and brings engaging features to the system. Digital content is needed that exploits the dynamic motion and gestures of the mobile device. We built an authoring tool that lets content producers easily create interactive content, taking advantage of interaction through the touch screen and gravity sensor of the mobile device. This interaction leads the user to participate in the content and can serve as a key means of supporting engagement. Furthermore, our authoring tool can be applied to various fields of publishing content.

Gadget Arms: Interactive Data Visualization using Hand Gesture in Extended Reality (가젯암: 확장현실을 위한 손 제스처 기반 대화형 데이터 시각화 시스템)

  • Choi, JunYoung;Jeong, HaeJin;Jeong, Won-Ki
    • Journal of the Korea Computer Graphics Society / v.25 no.2 / pp.31-41 / 2019
  • Extended Reality (XR), such as virtual and augmented reality, has huge potential for immersive data visualization and analysis. In XR, users can interact with data and with other users realistically by navigating a shared virtual space, allowing for more intuitive data analysis. However, creating a visualization in XR also poses a challenge because complicated, low-level programming is required, which hinders broad adoption in visual analytics. This paper proposes Gadget Arms, an interactive visualization authoring tool based on hand gestures for immersive data visualization. The proposed system provides a novel user interaction for creating and placing visualizations in the 3D virtual world. This simple but intuitive interaction enables the user to design the entire visualization space in XR without a host computer or low-level programming. Our user study confirmed that the proposed interaction significantly improves the usability of the visualization authoring tool.

A Study on Vision Based Gesture Recognition Interface Design for Digital TV (동작인식기반 Digital TV인터페이스를 위한 지시동작에 관한 연구)

  • Kim, Hyun-Suk;Hwang, Sung-Won;Moon, Hyun-Jung
    • Archives of design research / v.20 no.3 s.71 / pp.257-268 / 2007
  • The development of human-computer interfaces has relied on the development of technology. Mice and keyboards are the most popular HCI devices for personal computing; however, such device-based interfaces are quite different from human-to-human interaction and are rather artificial. Developing more intuitive interfaces that mimic human-to-human interaction has been a major research topic among HCI researchers and engineers. Meanwhile, TV technology has advanced rapidly, the market penetration of large-screen TVs has increased quickly, and HDTV and digital TV broadcasting are being tested. These changes in the TV environment call for changes in the human-to-TV interface. A gesture recognition interface based on computer vision can replace the remote-control interface because of its immediacy and intuitiveness. This research focuses on how people use their hands or arms for command gestures. A set of gestures for controlling TV settings is sampled through focus group interviews and surveys. The results of this paper can be used as a reference for designing a computer-vision-based TV interface.


Hand Shape Classification using Contour Distribution (윤곽 분포를 이용한 이미지 기반의 손모양 인식 기술)

  • Lee, Changmin;Kim, DaeEun
    • Journal of Institute of Control, Robotics and Systems / v.20 no.6 / pp.593-598 / 2014
  • Vision-based hand gesture recognition is a challenging task in human-robot interaction. The sign language of finger-spelling alphabets has been tested as a kind of hand gesture. In this paper, we test hand gesture recognition by detecting the contour shape and orientation of the hand in a visual image. The method has three stages: finding the hand component separated from the background image, extracting the contour feature of the hand component, and comparing the feature with reference features in a database. Finger-spelling alphabets are used to verify the performance of our system, and our method shows good performance in discriminating finger alphabets.
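The three stages can be illustrated with a toy contour-distribution feature: collect the boundary pixels of the segmented hand mask, histogram their angles about the centroid, and match against reference signatures by L1 distance. This is a simplified stand-in for the paper's actual feature, written for illustration only:

```python
import numpy as np

def contour_points(mask):
    """Boundary pixels: foreground pixels with a background 4-neighbour."""
    h, w = mask.shape
    pts = []
    for y in range(h):
        for x in range(w):
            if mask[y, x]:
                nbrs = [mask[y - 1, x] if y > 0 else 0,
                        mask[y + 1, x] if y < h - 1 else 0,
                        mask[y, x - 1] if x > 0 else 0,
                        mask[y, x + 1] if x < w - 1 else 0]
                if not all(nbrs):
                    pts.append((y, x))
    return np.array(pts, dtype=float)

def shape_signature(mask, bins=16):
    """Histogram of contour-point angles about the centroid: a simple
    translation-invariant contour-distribution feature."""
    pts = contour_points(mask)
    c = pts.mean(axis=0)
    angles = np.arctan2(pts[:, 0] - c[0], pts[:, 1] - c[1])
    hist, _ = np.histogram(angles, bins=bins, range=(-np.pi, np.pi))
    return hist / hist.sum()

def best_match(sig, ref_sigs, labels):
    """Stage three: compare the feature with reference features by L1 distance."""
    dists = [np.abs(sig - r).sum() for r in ref_sigs]
    return labels[int(np.argmin(dists))]
```

Because angles are measured relative to the centroid, the signature is invariant to where the hand appears in the frame, which is the property the contour-distribution idea relies on.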

3D Virtual Reality Game with Deep Learning-based Hand Gesture Recognition (딥러닝 기반 손 제스처 인식을 통한 3D 가상현실 게임)

  • Lee, Byeong-Hee;Oh, Dong-Han;Kim, Tae-Young
    • Journal of the Korea Computer Graphics Society / v.24 no.5 / pp.41-48 / 2018
  • The most natural way to increase immersion and provide free interaction in a virtual environment is to provide a gesture interface using the user's hand. However, most studies of hand gesture recognition require specialized sensors or equipment, or show low recognition rates. This paper proposes a three-dimensional DenseNet convolutional neural network that recognizes hand gestures with no sensor or equipment other than an RGB camera for hand gesture input, and introduces a virtual reality game based on it. Experimental results on 4 static and 6 dynamic hand gestures showed that the gestures could serve as a real-time user interface for virtual reality games, with an average recognition rate of 94.2% within 50 ms. The results of this research can be used as a hand gesture interface not only for games but also for education, medicine, and shopping.
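At the core of a 3D CNN such as the DenseNet variant described here is convolution over time as well as space, so a single filter can respond to motion across frames. A minimal numpy sketch of one valid-mode 3D convolution, not the paper's network:

```python
import numpy as np

def conv3d(clip, kernel):
    """Valid-mode 3D convolution over a (frames, height, width) clip.
    The temporal axis lets a filter respond to motion across frames."""
    T, H, W = clip.shape
    t, h, w = kernel.shape
    out = np.zeros((T - t + 1, H - h + 1, W - w + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            for k in range(out.shape[2]):
                out[i, j, k] = np.sum(clip[i:i + t, j:j + h, k:k + w] * kernel)
    return out
```

A kernel such as `np.array([[[-1.0]], [[1.0]]])` (shape `(2, 1, 1)`) responds only where pixel intensity changes between consecutive frames, the kind of spatiotemporal pattern a 3D CNN's learned filters capture.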