• Title/Summary/Keyword: Hand user interface


Real-Time Recognition Method of Counting Fingers for Natural User Interface

  • Lee, Doyeob;Shin, Dongkyoo;Shin, Dongil
    • KSII Transactions on Internet and Information Systems (TIIS)
    • /
    • v.10 no.5
    • /
    • pp.2363-2374
    • /
    • 2016
  • Communication occurs through verbal elements, which usually involve language, as well as non-verbal elements such as facial expressions, eye contact, and gestures. In particular, among these non-verbal elements, gestures are symbolic representations of physical, vocal, and emotional behaviors. This means that gestures can be signals toward a target or expressions of internal psychological processes, rather than simply movements of the body or hands. Moreover, gestures with such properties have been the focus of much research on new interfaces in the NUI/NUX field. In this paper, we propose a method for detecting the hand region and recognizing the number of fingers, based on depth information and the geometric features of the hand, for application to an NUI/NUX. The hand region is detected using depth information provided by the Kinect system, and the number of fingers is identified by comparing distances between the contour and the center of the hand region. The contour is detected using the Suzuki85 algorithm, and the number of fingers is calculated by detecting fingertips as points of locally maximum distance, found by comparing the distances between three consecutive dots on the contour and the center point of the hand. The average recognition rate for the number of fingers is 98.6%, and the execution time of the algorithm is 0.065 ms. The method is fast and of low complexity, yet shows a higher recognition rate and faster recognition speed than other methods. As an application example, this paper describes a Secret Door that recognizes a password from the number of fingers held up by a user.
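
The fingertip test described in the abstract can be sketched as follows. This is a rough illustration, not the paper's implementation: the function name, the three-consecutive-dots local-maximum test, and the `min_dist` threshold are our own choices.

```python
# Sketch of the fingertip-counting idea: a fingertip is a contour point whose
# distance to the palm center is a local maximum relative to its two
# neighbours and exceeds a threshold (all names/thresholds are assumptions).
import math

def count_fingers(contour, center, min_dist):
    """contour: list of (x, y) points; center: (cx, cy) of the hand region."""
    def dist(p):
        return math.hypot(p[0] - center[0], p[1] - center[1])
    n = len(contour)
    tips = 0
    for i in range(n):
        prev_d = dist(contour[(i - 1) % n])
        cur_d = dist(contour[i])
        next_d = dist(contour[(i + 1) % n])
        # compare three consecutive contour dots against the center point
        if cur_d > prev_d and cur_d > next_d and cur_d > min_dist:
            tips += 1
    return tips
```

On a star-shaped test contour (alternating far and near points around the center), each far point is counted as one fingertip.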

Clinical outcomes of a low-cost single-channel myoelectric-interface three-dimensional hand prosthesis

  • Ku, Inhoe;Lee, Gordon K.;Park, Chan Yong;Lee, Janghyuk;Jeong, Euicheol
    • Archives of Plastic Surgery
    • /
    • v.46 no.4
    • /
    • pp.303-310
    • /
    • 2019
  • Background Prosthetic hands with a myoelectric interface have recently received interest within the broader category of hand prostheses, but their high cost is a major barrier to use. Modern three-dimensional (3D) printing technology has enabled more widespread development and cost-effectiveness in the field of prostheses. The objective of the present study was to evaluate the clinical impact of a low-cost 3D-printed myoelectric-interface prosthetic hand on patients' daily life. Methods A prospective review of all transradial amputees who used 3D-printed myoelectric-interface prostheses (Mark V) between January 2016 and August 2017 was conducted. The functional outcomes of prosthesis usage over a 3-month follow-up period were measured using a validated method (Orthotics Prosthetics User Survey-Upper Extremity Functional Status [OPUS-UEFS]). In addition, the correlation between the length of the amputated radius and changes in OPUS-UEFS scores was analyzed. Results Ten patients were included in the study. After use of the 3D-printed single-channel myoelectric prosthesis for 3 months, the average OPUS-UEFS score significantly increased from 45.50 to 60.10. The Spearman correlation coefficient (r) between radius length and OPUS-UEFS at the 3rd month of prosthetic use was 0.815. Conclusions This low-cost 3D-printed myoelectric-interface prosthetic hand with a single reliable myoelectric signal shows the potential to positively impact amputees' quality of life through daily usage. The emergence of a low-cost 3D-printed myoelectric prosthesis could lead to new market trends, with such a device gaining popularity via reduced production costs and increased market demand.

Hand Gesture Interface Using Mobile Camera Devices (모바일 카메라 기기를 이용한 손 제스처 인터페이스)

  • Lee, Chan-Su;Chun, Sung-Yong;Sohn, Myoung-Gyu;Lee, Sang-Heon
    • Journal of KIISE:Computing Practices and Letters
    • /
    • v.16 no.5
    • /
    • pp.621-625
    • /
    • 2010
  • This paper presents a hand motion tracking method for a hand gesture interface using the camera in mobile devices such as smartphones and PDAs. When the camera moves according to the user's hand gesture, global optical flows are generated. Therefore, robust hand movement estimation is possible by considering the dominant optical flow, based on a histogram analysis of motion directions. A continuous hand gesture is segmented into unit gestures by motion state estimation using motion phase, which is determined by the velocity and acceleration of the estimated hand motion. Feature vectors are extracted during movement states, and hand gestures are recognized at the end state of each gesture. A support vector machine (SVM), a k-nearest neighbor classifier, and a normal Bayes classifier are used for classification. The SVM shows an 82% recognition rate for 14 hand gestures.
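
The histogram-based dominant-motion step can be sketched as below. This is a minimal illustration under our own assumptions (8 direction bins, modal-bin voting); the paper's actual binning and weighting may differ.

```python
# Sketch of dominant-direction estimation: quantize per-pixel optical-flow
# directions into a histogram and take the modal bin as the global motion
# direction (bin count and tie-breaking are assumptions, not from the paper).
import math

def dominant_direction(flows, bins=8):
    """flows: list of (dx, dy) optical-flow vectors; returns an angle in radians."""
    hist = [0] * bins
    for dx, dy in flows:
        angle = math.atan2(dy, dx) % (2 * math.pi)
        hist[int(angle / (2 * math.pi) * bins) % bins] += 1
    best = max(range(bins), key=lambda b: hist[b])
    # return the center angle of the modal bin
    return (best + 0.5) * 2 * math.pi / bins
```

With mostly rightward flow vectors and a few outliers, the estimate stays in the rightward bin, which is what makes the approach robust to local motion noise.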

A New Eye Tracking Method as a Smartphone Interface

  • Lee, Eui Chul;Park, Min Woo
    • KSII Transactions on Internet and Information Systems (TIIS)
    • /
    • v.7 no.4
    • /
    • pp.834-848
    • /
    • 2013
  • To use smartphone functions effectively, many kinds of human-phone interfaces are used, such as touch, voice, and gesture. However, the touch interface, the most important of these, cannot be used by people with hand disabilities or when both hands are busy. Although eye tracking is a superb human-computer interface method, it has not been applied to smartphones because of the small screen size, the frequently changing geometric relation between the user's face and the phone screen, and the low resolution of frontal cameras. In this paper, a new eye tracking method is proposed to act as a smartphone user interface. To maximize eye image resolution, a zoom lens and three infrared LEDs are adopted. The proposed method has the following novelties. First, appropriate camera specifications and image resolution are analyzed for a smartphone-based gaze tracking method. Second, facial movement is allowed as long as one eye region is included in the image. Third, the method can operate in both landscape and portrait screen modes. Fourth, only two LED reflection positions are used to calculate the gaze position, based on the 2D geometric relation between the reflective rectangle and the screen. Fifth, a prototype mock-up design module was made to confirm the feasibility of applying the method to an actual smartphone. Experimental results showed that the gaze estimation error was about 31 pixels at a screen resolution of 480×800, and the average hit ratio on a 5×4 icon grid was 94.6%.
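
The fourth point, mapping a pupil position through the glint-defined rectangle onto the screen, can be sketched as a proportional 2D mapping. This is only an illustrative reading of the abstract: the function, the corner convention, and the linear interpolation are our assumptions, not the paper's calibration model.

```python
# Sketch of a 2-D geometric gaze mapping: two LED glints define opposite
# corners of a reflective rectangle in the eye image, and the pupil center
# is mapped proportionally onto the screen (assumed linear model).
def gaze_point(pupil, glint_a, glint_b, screen_w, screen_h):
    """pupil, glint_a, glint_b: (x, y) image coordinates; returns screen (x, y)."""
    left, right = min(glint_a[0], glint_b[0]), max(glint_a[0], glint_b[0])
    top, bottom = min(glint_a[1], glint_b[1]), max(glint_a[1], glint_b[1])
    u = (pupil[0] - left) / (right - left)    # 0..1 across the rectangle
    v = (pupil[1] - top) / (bottom - top)     # 0..1 down the rectangle
    return u * screen_w, v * screen_h
```

A pupil at the rectangle's center thus maps to the screen's center, regardless of how the phone is held, which is what makes the glint rectangle useful as a moving reference frame.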

Visual Feedback System for Manipulating Objects Using Hand Motions in Virtual Reality Environment (가상 환경에서의 손동작을 사용한 물체 조작에 대한 시각적 피드백 시스템)

  • Seo, Woong;Kwon, Sangmo;Ihm, Insung
    • Journal of the Korea Computer Graphics Society
    • /
    • v.26 no.3
    • /
    • pp.9-19
    • /
    • 2020
  • With the recent development of various kinds of virtual reality devices, there has been an active research effort to increase the sense of reality by recognizing the physical behavior of users rather than relying on classical user input methods. Among such devices, the Leap Motion controller recognizes the user's hand gestures and can realistically trace the user's hand in a virtual reality environment. However, manipulating an object in virtual reality with the recognized hand often causes the hand to pass through the object, which cannot occur in the real world. This study presents a way to build a visual feedback system that enhances the user's sense of interaction between hands and objects in virtual reality. The user's hands are examined precisely using a ray tracing method to determine whether a virtual object collides with the hand, and when a collision occurs, visual feedback is given by reconstructing the user's hand: the positions of the fingertips that penetrate the object are moved back out using a signed distance field and inverse kinematics. This enables realistic interaction in virtual reality in real time.
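
The signed-distance correction step can be sketched as follows. A sphere SDF stands in for the real object here, and the single-step projection along the gradient is our simplification; the paper combines this with inverse kinematics over the whole hand.

```python
# Sketch of SDF-based fingertip correction: if a fingertip penetrates the
# object (negative signed distance), push it back along the distance-field
# gradient to the surface. Sphere SDF and single-step projection are
# illustrative assumptions.
import math

def sphere_sdf(p, center, radius):
    # negative inside the sphere, zero on the surface, positive outside
    return math.dist(p, center) - radius

def correct_fingertip(tip, center, radius):
    d = sphere_sdf(tip, center, radius)
    if d >= 0:
        return tip            # no collision: leave the fingertip untouched
    # the gradient of a sphere SDF points radially outward from the center
    v = [t - c for t, c in zip(tip, center)]
    norm = math.hypot(*v) or 1.0
    return tuple(t - d * (vi / norm) for t, vi in zip(tip, v))
```

A penetrating fingertip is moved exactly onto the surface, while fingertips outside the object are returned unchanged, so only colliding parts of the hand are redrawn.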

Development of educational software for beam loading analysis using pen-based user interfaces

  • Suh, Yong S.
    • Journal of Computational Design and Engineering
    • /
    • v.1 no.1
    • /
    • pp.67-77
    • /
    • 2014
  • Most engineering software tools use typical menu-based user interfaces, and they may not be suitable for learning tools because the solution processes are hidden and students can only see the results. An educational tool for simple beam analyses is developed using a pen-based user interface with a computer so students can write and sketch by hand. The geometry of beam sections is sketched, and a shape matching technique is used to recognize the sketch. Various beam loads are added by sketching gestures or writing singularity functions. Students sketch the distributions of the loadings by sketching the graphs, and they are automatically checked and the system provides aids in grading the graphs. Students receive interactive graphical feedback for better learning experiences while they are working on solving the problems.
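
The singularity functions that students write for beam loads can be evaluated with Macaulay brackets. The sketch below is our own illustration of that notation, not the paper's software: the function names and the shear example are assumptions.

```python
# Sketch of singularity-function (Macaulay bracket) evaluation for beam
# loading: <x - a>^n is zero for x < a and (x - a)^n otherwise, so a shear
# diagram is a sum of such terms (names and the shear example are assumed).
def macaulay(x, a, n):
    return (x - a) ** n if x >= a and n >= 0 else 0.0

def shear(x, loads):
    """loads: list of (P, a) point loads; V(x) = sum of P * <x - a>^0."""
    return sum(p * macaulay(x, a, 0) for p, a in loads)
```

For a 10-unit upward load at x = 2 and a 4-unit downward load at x = 5, the shear is 0 before the first load, 10 between them, and 6 after both, which matches the step-diagram a student would sketch.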

HOG-HOD Algorithm for Recognition of Multi-cultural Hand Gestures (다문화 손동작 인식을 위한 HOG-HOD 알고리즘)

  • Kim, Jiye;Park, Jong-Il
    • Journal of Korea Multimedia Society
    • /
    • v.20 no.8
    • /
    • pp.1187-1199
    • /
    • 2017
  • In recent years, research on Natural User Interfaces (NUI) has attracted attention because an NUI system can give users a natural feeling in virtual reality. The most important aspect of an NUI system is how users communicate with the computer. There are many modalities for interaction, such as speech, hand gestures, and body actions. Among them, hand gestures are well suited to the purpose of an NUI because people use them with relatively high frequency in daily life, and a hand gesture can carry meaning by itself. Gestures whose meanings differ across cultures are called multi-cultural hand gestures, and we propose a method to recognize this kind of hand gesture. The proposed method is composed of Histograms of Oriented Gradients (HOG), used for hand shape recognition, and Histograms of Oriented Displacements (HOD), used for recognizing the trajectory of the hand center point.
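
The HOD half of the pipeline can be sketched as a direction histogram over the trajectory of the hand center. This is a generic illustration of the descriptor's idea; the bin count, magnitude weighting, and normalization are our assumptions, not the paper's exact parameters.

```python
# Sketch of an HOD (Histogram of Oriented Displacements) descriptor:
# successive displacements of the hand-center trajectory are binned by
# direction, weighted by magnitude, then L1-normalized (parameters assumed).
import math

def hod_descriptor(trajectory, bins=8):
    """trajectory: list of (x, y) hand-center positions over time."""
    hist = [0.0] * bins
    for (x0, y0), (x1, y1) in zip(trajectory, trajectory[1:]):
        dx, dy = x1 - x0, y1 - y0
        mag = math.hypot(dx, dy)
        if mag == 0:
            continue  # skip frames where the hand did not move
        angle = math.atan2(dy, dx) % (2 * math.pi)
        hist[int(angle / (2 * math.pi) * bins) % bins] += mag
    total = sum(hist) or 1.0
    return [h / total for h in hist]  # L1-normalized histogram
```

A purely rightward trajectory puts all its weight in the first bin; two gestures can then be compared by the distance between their histograms.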

The Modified Block Matching Algorithm for a Hand Tracking of an HCI system (HCI 시스템의 손 추적을 위한 수정 블록 정합 알고리즘)

  • Kim Jin-Ok
    • Journal of Internet Computing and Services
    • /
    • v.4 no.4
    • /
    • pp.9-14
    • /
    • 2003
  • The GUI (graphical user interface) has been the dominant platform for HCI (human-computer interaction). GUI-based interaction has made computers simpler and easier to use. It does not, however, easily support the range of interaction necessary to meet users' needs for interfaces that are natural, intuitive, and adaptive. In this paper, a modified BMA (block matching algorithm) is proposed to track a hand in an image sequence and recognize it in each video frame, in order to replace a mouse as a pointing device for virtual reality. An HCI system running at 30 frames per second is realized in this paper. The modified BMA estimates the position of the hand and performs segmentation using the orientation of motion and the color distribution of the hand region for real-time processing. The experimental results show that the modified BMA with the YCbCr (luminance Y, chrominance blue, chrominance red) color space guarantees real-time processing and a good recognition rate. Hand tracking by the modified BMA can be applied to virtual reality, games, or HCI systems for the disabled.
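
Plain block matching, which the paper modifies, can be sketched as a sum-of-absolute-differences (SAD) search. This is the textbook baseline, not the paper's modified algorithm; the grayscale frames and exhaustive candidate list are our simplifications.

```python
# Sketch of baseline block matching: the hand block from the previous frame
# is located in the current frame by minimizing the sum of absolute
# differences (SAD) over candidate positions (exhaustive search assumed).
def sad(frame, top, left, block):
    """SAD between `block` and the same-sized region of `frame` at (top, left)."""
    return sum(
        abs(frame[top + i][left + j] - block[i][j])
        for i in range(len(block)) for j in range(len(block[0]))
    )

def block_match(frame, block, search):
    """search: list of candidate (top, left) positions; returns the best one."""
    return min(search, key=lambda pos: sad(frame, pos[0], pos[1], block))
```

The paper's modification prunes this search with motion orientation and skin-color cues so it runs at 30 frames per second; the baseline above searches every candidate.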


The Study of Usability Evaluation in the GUI of Mobile Computing - Based on Benchmark Testing in the interface design of WIPI (Mobile Computing의 GUI 개발에 있어 사용성 평가 연구 - WIPI 인터페이스 디자인을 위한 Benchmark Testing을 중심으로 -)

  • 정봉금;송연승
    • Archives of design research
    • /
    • v.17 no.1
    • /
    • pp.49-62
    • /
    • 2004
  • Due to the recent surge of the wireless Internet and the concurrent development of end-user terminal devices with standardized graphical user interfaces (GUI) and unified operation mechanisms for better interactivity in information representation and ease of use, improvement of the GUI is widely recognized as one of the key factors that will usher in the next stage of the wireless Internet for users. In particular, improved usability along with unique visual effects is considered a key element of GUI design, given the rapid improvement of resolution and color on end-user handsets; thus, study and research on GUIs is expected to increase along with wireless Internet use on smartphones. The user interface of wireless Internet handsets has a definite and significant effect on user interaction as well as productivity. Domestically, wireless Internet service providers and GUI design companies are making various efforts to produce common GUI models for standardized operation schemes and improved graphical display capabilities of mobile phones, PDAs, and smartphones. In this study, the Nokia 3650 and the Microsoft Orange SPV were chosen as test devices for usability comparison and data collection, in order to gather directional benchmark data for developing a next-generation smartphone user interface integrating PDAs and phones. The main purpose of this study is to achieve the most efficient user accessibility to the WAP menu, with an intensive focus on developing a WIPI WAP menu with the most effective usability for users in their twenties and thirties. The results of this study can also be used as base research material for WAP service development, VM browser development, and PDA browser development.
The results of this study, along with the evaluation model, are expected to provide effective analysis material on the subject of user interfaces to developers of wireless Internet devices, GUI designers, and service planners, while shortlisting the key factors to consider in developing smartphones, thereby serving as a GUI guideline for WIPI phones.


Hand Region Detection and hand shape classification using Hu moment and Back Projection (역 투영과 휴 모멘트를 이용한 손영역 검출 및 모양 분류)

  • Shin, Jae-Sun;Jang, Dae-Sik
    • Proceedings of the Korean Institute of Information and Communication Sciences Conference
    • /
    • 2011.10a
    • /
    • pp.911-914
    • /
    • 2011
  • Detecting the hand region is an essential technology for providing user-based interfaces, and much research on it has been conducted. In this paper, we propose a hand region detection method using the HSV color space with back projection, and a hand shape recognition method using Hu moments. Using back projection improved the reliability of hand region detection, and we confirmed that hand shapes could be recognized through Hu moments.
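
The Hu-moment side of the pipeline can be sketched by computing the first Hu invariant from a binary hand mask. This is the standard textbook construction, not the paper's code; the pure-Python loops and the restriction to the first invariant are our simplifications.

```python
# Sketch of the first Hu moment for shape description: compute central
# moments of a binary mask, normalize for scale, and combine them into
# h1 = n20 + n02, which is invariant to translation and rotation.
def first_hu_moment(mask):
    """mask: 2-D list of 0/1 pixel values; returns the first Hu invariant."""
    m00 = m10 = m01 = 0.0
    for y, row in enumerate(mask):
        for x, v in enumerate(row):
            m00 += v; m10 += x * v; m01 += y * v
    cx, cy = m10 / m00, m01 / m00        # centroid (translation invariance)
    mu20 = mu02 = 0.0
    for y, row in enumerate(mask):
        for x, v in enumerate(row):
            mu20 += (x - cx) ** 2 * v
            mu02 += (y - cy) ** 2 * v
    # normalized central moments (scale invariance), then h1 = n20 + n02
    n20, n02 = mu20 / m00 ** 2, mu02 / m00 ** 2
    return n20 + n02
```

Two hand masks can then be compared by the distance between their Hu invariants, independent of where in the frame the hand appears.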
