• Title/Summary/Keyword: Gesture-Based User Interface

Accelerometer-based Mobile Game Using the Gestures and Postures (제스처와 자세를 이용한 가속도센서 기반 모바일 게임)

  • Baek, Jong-Hun;Jang, Ik-Jin;Yun, Byoung-Ju
    • Proceedings of the IEEK Conference / 2006.06a / pp.379-380 / 2006
  • As a result of the growth of sensor-enabled mobile devices such as PDAs, cellular phones, and other computing devices, users can now access diverse digital content anywhere and anytime. However, the interfaces of mobile applications are often unnatural due to limited resources and miniaturized input/output; users feel this problem especially in applications such as mobile games. Therefore, novel interaction forms have been developed to complement the poor user interface of the mobile device and to increase interest in mobile games. In this paper, we describe a demonstration of gesture and posture input supported by an accelerometer. The application example we created is an AM-Fishing game on a mobile device that employs the accelerometer as the main interaction modality. The demos show the usability of gesture and posture interaction.
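
A minimal sketch of the posture-input idea described above, not taken from the paper: one 3-axis accelerometer sample is classified into a tilt posture. The axis-to-posture mapping and the 30-degree threshold are assumptions for illustration.

```python
import math

GRAVITY = 9.81  # m/s^2; a device lying flat reads roughly (0, 0, g)

def classify_posture(ax: float, ay: float, az: float) -> str:
    """Classify a static posture from one 3-axis accelerometer sample.

    Tilt angles are derived from the gravity vector; the mapping of
    axes to postures and the 30-degree cutoff are illustrative.
    """
    pitch = math.degrees(math.atan2(ax, math.hypot(ay, az)))
    roll = math.degrees(math.atan2(ay, math.hypot(ax, az)))
    if pitch > 30:
        return "tilt-left"
    if pitch < -30:
        return "tilt-right"
    if roll > 30:
        return "tilt-forward"
    if roll < -30:
        return "tilt-backward"
    return "flat"

print(classify_posture(0.0, 0.0, GRAVITY))  # -> "flat"
print(classify_posture(7.0, 0.0, 6.9))      # -> "tilt-left"
```

A game loop would poll the sensor, classify each sample, and trigger the game action bound to the detected posture (for the fishing game, presumably casting or reeling motions).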

Implementation of DID interface using gesture recognition (제스쳐 인식을 이용한 DID 인터페이스 구현)

  • Lee, Sang-Hun;Kim, Dae-Jin;Choi, Hong-Sub
    • Journal of Digital Contents Society / v.13 no.3 / pp.343-352 / 2012
  • In this paper, we implemented a touchless interface for a DID (Digital Information Display) system using gesture recognition techniques that include both hand motion and hand shape recognition. This touchless interface requires no extra attachments, giving the user both easier usage and spatial convenience. For hand motion recognition, two hand-motion parameters, slope and velocity, were measured in a direction-based recognition scheme. For hand shape recognition, the hand area was extracted from the image using the YCbCr color model and several image processing methods. These recognition methods are combined to generate various commands, such as next-page, previous-page, screen-up, screen-down, and mouse-click, in order to control the DID system. Finally, experimental results showed a command recognition rate of 93%, which is sufficient to confirm possible application to commercial products.
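
The hand-area extraction step mentioned above is commonly realized with OpenCV roughly as below; this is a sketch, and the Cr/Cb threshold ranges are typical skin-tone values from the literature, not values reported by the paper.

```python
import cv2
import numpy as np

def extract_hand_mask(frame_bgr: np.ndarray) -> np.ndarray:
    """Segment skin-colored pixels using the YCbCr color model."""
    # OpenCV orders the converted channels Y, Cr, Cb.
    ycrcb = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2YCrCb)
    lower = np.array([0, 133, 77], dtype=np.uint8)    # assumed thresholds
    upper = np.array([255, 173, 127], dtype=np.uint8)
    mask = cv2.inRange(ycrcb, lower, upper)
    # Morphological opening removes small speckles from the mask.
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
    return cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)
```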

The Development of the Writing Software for the Electronic Blackboard Supporting the User Action Recognition Functions (사용자 동작 인식 기능을 지원하는 판서 소프트웨어 개발)

  • Choi, Yun-Su;Jung, Jin-Uk;Hwang, Min-Tae;Jin, Kyo-Hong
    • Journal of the Korea Institute of Information and Communication Engineering / v.19 no.5 / pp.1213-1220 / 2015
  • With the dissemination of electronic blackboard systems, smart devices, and digital content, the Korean government is conducting a project that replaces classic education based on paper textbooks with SMART education using various devices. For SMART education to take hold, teachers in the field must be able to use the SMART education infrastructure easily. In particular, since the electronic blackboard is expected to become the most common educational device for teachers, the writing software running on this device must support a simple interface and be simple to use. In this paper, we developed writing software for the electronic blackboard that everyone can use easily. Our writing software supports a basic writing function, a gesture recognition function that recognizes the user's gesture and performs the work corresponding to that gesture, and an automatic button alignment function based on frequency of use.
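
The automatic button alignment described above amounts to reordering toolbar buttons by usage frequency; a minimal sketch of that bookkeeping (class and method names are hypothetical, not from the paper):

```python
from collections import Counter

class Toolbar:
    """Keeps toolbar buttons ordered by how often they are used."""

    def __init__(self, buttons):
        self.buttons = list(buttons)
        self.usage = Counter()

    def press(self, button: str) -> None:
        self.usage[button] += 1
        # Stable sort: frequently used buttons move to the front,
        # ties keep their original relative order.
        self.buttons.sort(key=lambda b: -self.usage[b])

toolbar = Toolbar(["pen", "eraser", "shape", "text"])
for _ in range(3):
    toolbar.press("eraser")
toolbar.press("text")
print(toolbar.buttons)  # ['eraser', 'text', 'pen', 'shape']
```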

Robot User Control System using Hand Gesture Recognizer (수신호 인식기를 이용한 로봇 사용자 제어 시스템)

  • Shon, Su-Won;Beh, Joung-Hoon;Yang, Cheol-Jong;Wang, Han;Ko, Han-Seok
    • Journal of Institute of Control, Robotics and Systems / v.17 no.4 / pp.368-374 / 2011
  • This paper proposes a human interface for robot control using a hidden Markov model (HMM) based hand signal recognizer. The command-receiving humanoid robot sends webcam images to a client computer. The client computer then extracts the commanding human's hand motion descriptors. Upon feature acquisition, the hand signal recognizer carries out the recognition procedure. The recognition result is then sent back to the robot for responsive actions. The system performance is evaluated by measuring recognition of a '48 hand signal set' created randomly from a fundamental hand motion set. For isolated motion recognition, the '48 hand signal set' shows a 97.07% recognition rate while the 'baseline hand signal set' shows 92.4%. This result validates that the proposed hand signal recognizer is indeed highly discriminative. For connected motions of the '48 hand signal set', it shows a 97.37% recognition rate. The relevant experiments demonstrate that the proposed system is promising for real-world human-robot interface applications.
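
The paper's recognizer is HMM-based; a bare-bones per-class HMM classifier in the same spirit can be sketched with the hmmlearn package. The feature layout and state count below are placeholders, not the paper's motion descriptors.

```python
import numpy as np
from hmmlearn.hmm import GaussianHMM

def train_models(signals_by_class, n_states=4):
    """Fit one Gaussian HMM per hand-signal class.

    signals_by_class: dict mapping a label to a list of (T_i, D)
    feature arrays; the descriptors themselves are placeholders.
    """
    models = {}
    for label, sequences in signals_by_class.items():
        X = np.vstack(sequences)               # stack all sequences
        lengths = [len(s) for s in sequences]  # per-sequence lengths
        model = GaussianHMM(n_components=n_states, covariance_type="diag")
        model.fit(X, lengths)
        models[label] = model
    return models

def recognize(models, sequence):
    """Pick the class whose HMM assigns the highest log-likelihood."""
    return max(models, key=lambda label: models[label].score(sequence))
```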

A Study on Comparative Experiment of Hand-based Interface in Immersive Virtual Reality (몰입형 가상현실에서 손 기반 인터페이스의 비교 실험에 관한 연구)

  • Kim, Jinmo
    • Journal of the Korea Computer Graphics Society / v.25 no.2 / pp.1-9 / 2019
  • This study compares hand-based interfaces to improve a user's virtual reality (VR) presence by enhancing immersion in VR interactions. To provide an immersive experience in which users can directly control the virtual environment and its objects with their hands, while minimizing the device burden on users of immersive VR systems, we designed two experimental interfaces: hand motion recognition sensor-based interaction and controller-based interaction. Hand motion recognition sensor-based interaction reflects accurate hand movements, direct gestures, and motion representations in the virtual environment, and it requires no device beyond the VR head-mounted display (HMD). Controller-based interaction provides a generalized interface that maps each gesture to a key on the controller supplied with the VR HMD, for easy access. The comparative experiments in this study confirm the convenience and intuitiveness of VR interactions using the user's hands.

Web-based 3D Virtual Experience using Unity and Leap Motion (Unity와 Leap Motion을 이용한 웹 기반 3D 가상품평)

  • Jung, Ho-Kyun;Park, Hyungjun
    • Korean Journal of Computational Design and Engineering / v.21 no.2 / pp.159-169 / 2016
  • To realize virtual prototyping (VP) of digital products, it is important to provide the people involved in product development with appropriate visualization of and interaction with the products, together with vivid simulation of user interface (UI) behaviors, in an interactive 3D virtual environment. In this paper, we propose an approach to web-based 3D virtual experience using Unity and Leap Motion. We adopt Unity as the implementation platform because it implements product visualization and the design and simulation of UI behaviors easily and rapidly, and it gives remote users easy access to the virtual environment. Additionally, we combine Leap Motion with Unity to embody natural and immersive interaction using the user's hand gestures. Based on the proposed approach, we developed a testbed system for web-based 3D virtual experience and applied it to the design evaluation of various digital products. A button selection test was conducted to investigate the quality of the interaction using Leap Motion, and a preliminary user study was performed to show the usefulness of the proposed approach.

Analyzing Input Patterns of Smartphone Applications in Touch Interfaces

  • Bahn, Hyokyung;Kim, Jisun
    • International Journal of Advanced Smart Convergence / v.10 no.4 / pp.30-37 / 2021
  • The touch sensor interface has become the most useful input device on a smartphone. Unlike the keypad/keyboard interfaces used in electronic dictionaries and feature phones, a smartphone's touch interface allows for the recognition of various gestures that represent distinct features of each application's input. In this paper, we analyze application-specific input patterns that appear in smartphone touch interfaces. Specifically, we capture touch input patterns from various Android applications and analyze them. Based on this analysis, we observe certain unique characteristics of each application's touch input patterns. These can be utilized in useful areas such as user authentication, prevention of application execution by unauthorized users, and digital forensics based on logged touch patterns.
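
As a toy illustration of the per-application touch features such an analysis can draw on, the sketch below summarizes one gesture from a log of touch samples; the (timestamp, x, y) tuple format and the tap/swipe cutoff are invented for the example.

```python
import math

def gesture_features(events):
    """Summarize one touch gesture from (t, x, y) samples."""
    (t0, x0, y0) = events[0]
    (t1, x1, y1) = events[-1]
    distance = math.hypot(x1 - x0, y1 - y0)  # straight-line travel, px
    return {
        "duration_s": t1 - t0,
        "distance_px": distance,
        "kind": "tap" if distance < 10 else "swipe",  # assumed cutoff
    }

# A quick downward swipe lasting 120 ms:
swipe = [(0.00, 200, 300), (0.06, 202, 450), (0.12, 203, 600)]
print(gesture_features(swipe))
```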

Real-Time Recognition Method of Counting Fingers for Natural User Interface

  • Lee, Doyeob;Shin, Dongkyoo;Shin, Dongil
    • KSII Transactions on Internet and Information Systems (TIIS) / v.10 no.5 / pp.2363-2374 / 2016
  • Communication occurs through verbal elements, which usually involve language, as well as non-verbal elements such as facial expressions, eye contact, and gestures. Among these non-verbal elements, gestures in particular are symbolic representations of physical, vocal, and emotional behaviors. This means that gestures can be signals toward a target or expressions of internal psychological processes, rather than simply movements of the body or hands. Gestures with such properties have been the focus of much research on new interfaces in the NUI/NUX field. In this paper, we propose a method for detecting the hand region and recognizing the number of raised fingers, based on depth information and geometric features of the hand, for application to an NUI/NUX. The hand region is detected using depth information provided by the Kinect system, and the number of fingers is identified by comparing distances between the contour and the center of the hand region. The contour is detected using the Suzuki85 algorithm, and fingers are counted by detecting fingertips at points of locally maximal distance, comparing three consecutive contour points against the center point of the hand. The average recognition rate for the number of fingers is 98.6%, and the execution time of the algorithm is 0.065 ms. Although the method is fast and its complexity is low, it shows a higher recognition rate and faster recognition speed than other methods. As an application example, the paper describes a Secret Door that recognizes a password from the number of fingers a user holds up.
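
OpenCV's findContours implements the Suzuki (1985) border-following algorithm the abstract cites, so the fingertip-counting step can be approximated roughly as below. The depth-derived binary mask is assumed as input, and the peak heuristic simplifies the paper's three-point comparison.

```python
import cv2
import numpy as np

def count_fingers(hand_mask: np.ndarray) -> int:
    """Count raised fingers in a binary hand mask."""
    # cv2.findContours implements Suzuki's 1985 border following.
    contours, _ = cv2.findContours(
        hand_mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_NONE)
    if not contours:
        return 0
    contour = max(contours, key=cv2.contourArea).reshape(-1, 2)
    # Hand center from image moments.
    m = cv2.moments(contour)
    cx, cy = m["m10"] / m["m00"], m["m01"] / m["m00"]
    # Distance from the center to every contour point.
    d = np.hypot(contour[:, 0] - cx, contour[:, 1] - cy)
    # Fingertips show up as local maxima of d that stick out well
    # beyond the average (palm) radius; the 1.5x factor is assumed.
    prev, nxt = np.roll(d, 1), np.roll(d, -1)
    peaks = (d > prev) & (d > nxt) & (d > 1.5 * d.mean())
    return int(peaks.sum())
```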

NUI/NUX of the Virtual Monitor Concept using the Concentration Indicator and the User's Physical Features (사용자의 신체적 특징과 뇌파 집중 지수를 이용한 가상 모니터 개념의 NUI/NUX)

  • Jeon, Chang-hyun;Ahn, So-young;Shin, Dong-il;Shin, Dong-kyoo
    • Journal of Internet Computing and Services / v.16 no.6 / pp.11-21 / 2015
  • With growing interest in Human-Computer Interaction (HCI), research on HCI has been actively conducted, and along with it, research on Natural User Interface/Natural User eXperience (NUI/NUX), which uses the user's gestures and voice. NUI/NUX requires recognition algorithms such as gesture or voice recognition, but these algorithms have a weakness: their implementation is complex and much training time is needed, because they must go through steps including preprocessing, normalization, and feature extraction. Recently, Kinect, launched by Microsoft as an NUI/NUX development tool, has attracted attention, and studies using Kinect have been conducted. In a previous study, the authors implemented a hand-mouse interface with outstanding intuitiveness using the user's physical features. However, it had weaknesses such as unnatural mouse movement and low accuracy of the mouse functions. In this study, we designed and implemented a hand-mouse interface that introduces a new concept called the 'virtual monitor', extracting the user's physical features through Kinect in real time. A virtual monitor is a virtual space that can be controlled by the hand mouse; coordinates on the virtual monitor are accurately mapped onto coordinates on the real monitor. The hand-mouse interface based on the virtual monitor concept keeps the outstanding intuitiveness of the previous study and enhances the accuracy of the mouse functions. Furthermore, we increased the accuracy of the interface by recognizing the user's unnecessary actions using a concentration indicator derived from electroencephalogram (EEG) data. To evaluate intuitiveness and accuracy, we tested the interface with 50 people ranging from their teens to their fifties. In the intuitiveness experiment, 84% of the subjects learned how to use it within one minute. In the accuracy experiment, the mouse functions showed accuracies of 80.4% (drag), 80% (click), and 76.7% (double-click). With its intuitiveness and accuracy confirmed through these experiments, the proposed hand-mouse interface is expected to be a good example of an interface for controlling systems by hand in the future.
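
The heart of the virtual monitor concept is mapping a hand position on a calibrated virtual plane onto real-monitor pixels; a minimal sketch of such a mapping follows, with all calibration values invented for the example.

```python
def to_screen(hand_x, hand_y, virtual_rect, screen_size):
    """Map a hand position on the virtual monitor to screen pixels.

    virtual_rect: (left, top, width, height) of the calibrated
    virtual plane in sensor coordinates (assumed calibration).
    screen_size: (width, height) of the real monitor in pixels.
    """
    vx, vy, vw, vh = virtual_rect
    sw, sh = screen_size
    # Normalize into [0, 1], clamping so the cursor stays on screen.
    u = min(max((hand_x - vx) / vw, 0.0), 1.0)
    v = min(max((hand_y - vy) / vh, 0.0), 1.0)
    return round(u * (sw - 1)), round(v * (sh - 1))

# The center of a 0.6 m x 0.4 m virtual plane maps to the center
# of a 1920x1080 monitor.
print(to_screen(0.3, 0.2, (0.0, 0.0, 0.6, 0.4), (1920, 1080)))
```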

Design of OpenCV based Finger Recognition System using binary processing and histogram graph

  • Baek, Yeong-Tae;Lee, Se-Hoon;Kim, Ji-Seong
    • Journal of the Korea Society of Computer and Information / v.21 no.2 / pp.17-23 / 2016
  • NUI is a motion interface that controls a device using the user's body, without HID devices such as a mouse or keyboard. In this paper, we use a Pi Camera and sensors connected to the Raspberry Pi, a small embedded board. Using OpenCV algorithms optimized for image recognition and computer vision, we implement an NUI device with a more human-friendly and intuitive interface than traditional HID equipment. Motion is detected through comparison operations, and we propose a more advanced motion recognition system that fuses the sensors connected to the Raspberry Pi.
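
A rough rendition of the "binary processing and histogram" pipeline named in the title: threshold the frame to a binary image, then use a per-column histogram of white pixels to count finger-like peaks. The thresholds here are guesses for illustration, not the paper's values.

```python
import cv2
import numpy as np

def finger_histogram(frame_gray: np.ndarray):
    """Binarize a grayscale frame and build a column histogram."""
    # Otsu's method picks the binarization cutoff automatically.
    _, binary = cv2.threshold(
        frame_gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    column_counts = (binary > 0).sum(axis=0)  # white pixels per column
    return binary, column_counts

def count_peaks(column_counts, min_height):
    """Count separated runs of tall columns as candidate fingers."""
    tall = column_counts > min_height
    # Each False -> True transition starts a new candidate finger.
    rising = tall[1:] & ~tall[:-1]
    return int(rising.sum()) + int(tall[0])
```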