• Title/Summary/Keyword: Mouse Interface (마우스 인터페이스)

194 search results

Efficient Fingertip Tracking and Mouse Pointer Control for Implementation of a Human Mouse (휴먼마우스 구현을 위한 효율적인 손끝좌표 추적 및 마우스 포인트 제어기법)

  • 박지영;이준호
    • Journal of KIISE:Software and Applications
    • /
    • v.29 no.11
    • /
    • pp.851-859
    • /
    • 2002
  • This paper discusses the design of a working system that visually recognizes hand gestures for the control of a window-based user interface. We present a method for tracking the fingertip of the index finger using a single camera. Our method is based on the CAMSHIFT algorithm and improves on it in that it reliably tracks the particular hand poses used in the system against complex backgrounds. We describe how the location of the fingertip is mapped to a location on the monitor, and how it is both necessary and possible to smooth the path of the fingertip location using a physical model of a mouse pointer. Our method tracks in real time without absorbing a major share of computational resources. The performance of our system shows great promise that this methodology can be used to control computers in the near future.
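The pointer-smoothing idea mentioned in the abstract can be illustrated with a simple spring-damper model: treat the pointer as a mass pulled toward each measured fingertip position. This is only a sketch of the general technique, not the authors' implementation, and the constants are illustrative.

```python
# Sketch of pointer smoothing with a spring-damper "physical model".
# Constants are illustrative, not taken from the paper.

def smooth_path(targets, k=0.3, damping=0.7, dt=1.0):
    """Pull a virtual pointer toward each measured fingertip position."""
    x, v = targets[0], 0.0
    path = [x]
    for target in targets[1:]:
        a = k * (target - x)        # spring force toward the measurement
        v = damping * v + a * dt    # damped velocity update
        x = x + v * dt
        path.append(x)
    return path

# Noisy 1-D fingertip measurements -> much smoother pointer positions
print(smooth_path([0.0, 10.0, 0.0, 10.0, 0.0]))
```

The damping keeps measurement jitter from jerking the pointer, at the cost of a small lag behind the fingertip.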

A Development of the Next-generation Interface System Based on the Finger Gesture Recognizing in Use of Image Process Techniques (영상처리를 이용한 지화인식 기반의 차세대 인터페이스 시스템 개발)

  • Kim, Nam-Ho
    • Journal of the Korea Institute of Information and Communication Engineering
    • /
    • v.15 no.4
    • /
    • pp.935-942
    • /
    • 2011
  • This study designs and implements a finger-gesture recognition system that automatically recognizes finger gestures input through a camera and uses them to control the computer. Common CCD cameras were redesigned as infrared cameras to acquire the images. The captured images are pre-processed to find the hand features, the finger gestures are read accordingly, and an event is generated for subsequent mouse control and presentation, suggesting a new way to control computers. The finger-gesture recognition system presented in this study has been verified as a next-generation interface that can replace the mouse and keyboard in future information devices.

Design and Implementation of e-Commerce User Authentication Interface using the Mouse Gesture (마우스 제스처를 이용한 전자상거래 사용자 인증 인터페이스)

  • 김은영;정옥란;조동섭
    • Journal of Korea Multimedia Society
    • /
    • v.6 no.3
    • /
    • pp.469-480
    • /
    • 2003
  • Accurate user authentication is emerging as one of the most important technologies in today's information society. Most authentication technology identifies users by their special characteristics. This paper builds an e-commerce shopping mall on a conventional e-commerce system, and proposes and implements a user authentication interface based on the mouse gesture, a new form of authentication using what users have. The interface shows the recognition status directly on the screen by comparing the stored pattern values with the unique pattern values that users enter. When users purchase products through the shopping mall and enter this additional signature information together with payment information, security can be further increased. Experimental results show that our mouse gesture interface can provide additional security for an e-commerce server.
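One way to picture the pattern comparison the abstract describes is to encode a mouse gesture as a sequence of quantized stroke directions and match it against the stored pattern. This is a hypothetical sketch, not the paper's algorithm; the threshold and encoding are invented here.

```python
# Illustrative sketch: a mouse gesture as quantized stroke directions,
# compared against a stored pattern to accept or reject the user.

def quantize(dx, dy):
    """Map one stroke to a direction: R, L, U, or D."""
    if abs(dx) >= abs(dy):
        return "R" if dx >= 0 else "L"
    return "U" if dy >= 0 else "D"

def gesture_pattern(points):
    """Encode a mouse trace as a string of stroke directions."""
    return "".join(quantize(x2 - x1, y2 - y1)
                   for (x1, y1), (x2, y2) in zip(points, points[1:]))

def authenticate(stored_pattern, points, min_match=0.8):
    """Accept when enough stroke directions agree with the stored pattern."""
    entered = gesture_pattern(points)
    if len(entered) != len(stored_pattern):
        return False
    matches = sum(a == b for a, b in zip(stored_pattern, entered))
    return matches / len(stored_pattern) >= min_match

stored = "RRUUL"
trace = [(0, 0), (5, 1), (9, 0), (10, 4), (11, 9), (7, 10)]
print(authenticate(stored, trace))
```

A tolerance below 100% leaves room for the small variations a person produces when redrawing the same gesture.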


NUI/NUX of the Virtual Monitor Concept using the Concentration Indicator and the User's Physical Features (사용자의 신체적 특징과 뇌파 집중 지수를 이용한 가상 모니터 개념의 NUI/NUX)

  • Jeon, Chang-hyun;Ahn, So-young;Shin, Dong-il;Shin, Dong-kyoo
    • Journal of Internet Computing and Services
    • /
    • v.16 no.6
    • /
    • pp.11-21
    • /
    • 2015
  • As interest in Human-Computer Interaction (HCI) grows, research on HCI has been actively conducted, including research on the Natural User Interface/Natural User eXperience (NUI/NUX), which uses the user's gestures and voice. NUI/NUX requires recognition algorithms such as gesture recognition or voice recognition, but these algorithms are complex to implement and take a long time to train because they must go through preprocessing, normalization, and feature extraction steps. Recently, Microsoft released Kinect as an NUI/NUX development tool, which has attracted attention and prompted a number of studies. In a previous study, the authors implemented a highly intuitive hand-mouse interface using the physical features of the user, but it suffered from unnatural mouse movement and low accuracy of the mouse functions. In this study, we designed and implemented a hand-mouse interface that introduces a new concept called the 'virtual monitor', extracting the user's physical features through Kinect in real time. The virtual monitor is a virtual space that can be controlled by the hand mouse, in which a coordinate on the virtual monitor is accurately mapped onto a coordinate on the real monitor. The hand-mouse interface based on the virtual monitor concept keeps the outstanding intuitiveness of the previous study while improving the accuracy of the mouse functions. We further increased accuracy by recognizing the user's unnecessary actions using a concentration indicator derived from the user's electroencephalogram (EEG) data. To evaluate intuitiveness and accuracy, we tested the interface on 50 people ranging in age from their teens to their fifties. In the intuitiveness experiment, 84% of subjects learned how to use it within one minute; in the accuracy experiment, the mouse functions achieved drag 80.4%, click 80%, and double-click 76.7%. With its intuitiveness and accuracy confirmed by experiment, this interface is expected to be a good example of controlling systems by hand in the future.
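The mapping from the virtual monitor onto the real monitor can be sketched as a clamped linear transform. This is a minimal illustration under assumed units and a fixed virtual rectangle; the paper sizes the virtual monitor from the user's physical features, which is not reproduced here.

```python
# Minimal sketch of the "virtual monitor" idea: a rectangle in front of
# the user (here assumed fixed) is mapped linearly onto the monitor's
# pixel grid, with positions outside the rectangle clamped to its edge.

def to_monitor(hand_x, hand_y, virtual, monitor):
    """Map a hand position inside the virtual rectangle to pixels.

    virtual: (left, top, width, height), e.g. in metres
    monitor: (width_px, height_px)
    """
    vx, vy, vw, vh = virtual
    mw, mh = monitor
    u = min(max((hand_x - vx) / vw, 0.0), 1.0)  # clamp to the rectangle
    v = min(max((hand_y - vy) / vh, 0.0), 1.0)
    return round(u * (mw - 1)), round(v * (mh - 1))

# Hand at the centre of a 0.4 m x 0.3 m virtual monitor -> screen centre
print(to_monitor(0.2, 0.15, (0.0, 0.0, 0.4, 0.3), (1920, 1080)))
```

Clamping keeps the pointer on-screen when the hand drifts outside the calibrated rectangle.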

Implementing user interface through everyday gesture (일상적 행동양식을 통한 인터페이스의 구현)

  • Ahn, Jong-Yoon;Lee, Kyung-Won
    • 한국HCI학회:학술대회논문집
    • /
    • 2006.02b
    • /
    • pp.409-415
    • /
    • 2006
  • Various input devices that can replace the conventional keyboard and mouse are being developed for smooth communication and interaction between humans and computers. When exploring and accessing information, however, existing devices accept only limited operations such as clicking as input, which feels unnatural to users unfamiliar with this style of interaction. If an interface that recognizes user gestures could carry over the behavior patterns of everyday object use, users could communicate with the computer more intuitively and conveniently when accessing digital content. Because gestures allow a high degree of freedom of movement and their meaning is sometimes ambiguous, the motions must be recognized and distinguished accurately. On this basis, this paper examines what is needed to implement an effective gesture interface and demonstrates interaction with digital content through a technical implementation. Using the metaphor of the book, the most familiar and traditional means of accessing information, we develop an interface that recognizes the page-turning gesture and use it as an input device. The system recognizes and interprets the user's motions so that the user can flip back and forth through the book and browse to the desired information, realizing smooth communication with the computer through a hand-gesture interface.


Development of a 3D User Interface Technique with Java3D (Java3D를 이용한 3D 사용자 인터페이스 기법의 개발)

  • 오태철;고명철;최윤철
    • Proceedings of the Korean Information Science Society Conference
    • /
    • 2001.10b
    • /
    • pp.223-225
    • /
    • 2001
  • Interest in 3D cyberspace based on the Internet and non-immersive virtual reality technology has recently been growing. While much progress has been made in areas such as 3D graphics processing and network data transmission, research on handling the various interactions inside virtual environments remains insufficient. In particular, no successful study has been reported on 3D user interfaces, the means by which participants make decisions and interact in 3D cyberspace. For today's experimental 3D cyberspace to become more widespread and commonplace, research is needed on familiar and natural 3D user interfaces that support the user's point of view. This paper proposes a 3D manipulation interface that lets users naturally manipulate various objects in 3D cyberspace with a 2D mouse, currently the most common input device. Applying the proposed 3D manipulation interface in a real-time environment confirmed that it supports interaction in 3D cyberspace very effectively.
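The first step in manipulating 3D objects with a 2D mouse is usually to turn a click into a pick ray through the camera. The paper works in Java3D; the following is an independent Python sketch of that standard step with an assumed pinhole camera, not the paper's code.

```python
# Sketch: convert a 2-D mouse click into a 3-D pick ray in camera space
# for a simple pinhole camera looking down the -Z axis.
import math

def mouse_to_ray(mx, my, width, height, fov_y_deg=60.0):
    """Return a unit direction vector in camera space for pixel (mx, my)."""
    aspect = width / height
    half_h = math.tan(math.radians(fov_y_deg) / 2.0)
    # Normalized device coordinates in [-1, 1], y up
    ndc_x = 2.0 * (mx + 0.5) / width - 1.0
    ndc_y = 1.0 - 2.0 * (my + 0.5) / height
    dx = ndc_x * half_h * aspect
    dy = ndc_y * half_h
    dz = -1.0                       # camera looks down -Z
    length = math.sqrt(dx * dx + dy * dy + dz * dz)
    return (dx / length, dy / length, dz / length)

# A click at the screen centre looks straight down the view axis
print(mouse_to_ray(399.5, 299.5, 800, 600))
```

Intersecting this ray with scene geometry then selects the object under the cursor, the basis of most 2D-mouse 3D manipulation schemes.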


NUI/NUX framework based on intuitive hand motion (직관적인 핸드 모션에 기반한 NUI/NUX 프레임워크)

  • Lee, Gwanghyung;Shin, Dongkyoo;Shin, Dongil
    • Journal of Internet Computing and Services
    • /
    • v.15 no.3
    • /
    • pp.11-19
    • /
    • 2014
  • The natural user interface/experience (NUI/NUX) provides a natural motion interface without devices or tools such as mice, keyboards, pens, and markers. Until now, typical motion recognition methods have used markers, receiving the coordinates of each marker as relative input values and storing each coordinate value in a database. However, recognizing motion accurately requires more markers, and much time is spent attaching the markers and processing the data. Moreover, because NUI/NUX frameworks have been developed without the most important property, intuitiveness, usability problems arise and users are forced to learn the conventions of many different frameworks. To compensate for this problem, in this paper we dispense with markers and implement the system so that anyone can operate it. We design a multi-modal NUI/NUX framework that controls voice, body motion, and facial expression simultaneously, and propose a new mouse-operation algorithm that recognizes intuitive hand gestures and maps them onto the monitor. The resulting "hand mouse" can be operated easily and intuitively.
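Mapping recognized hand gestures to mouse operations, as the abstract describes, amounts to a small state machine over hand states. The state names and rules below are invented for illustration; the paper does not specify its gesture vocabulary here.

```python
# Hypothetical sketch: translate a stream of recognized hand states
# into mouse events ('open' moves the pointer; 'pinch' holds the
# button down until the pinch ends). State names are invented.

def hand_mouse_events(states):
    """Turn a sequence of hand states into a sequence of mouse events."""
    events, pressed = [], False
    for state in states:
        if state == "pinch" and not pressed:
            events.append("press")
            pressed = True
        elif state != "pinch" and pressed:
            events.append("release")
            pressed = False
        elif state == "open":
            events.append("move")
    return events

print(hand_mouse_events(["open", "open", "pinch", "pinch", "open"]))
```

Tracking the pressed/released state explicitly is what lets a sustained pinch become a drag rather than a burst of clicks.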

Vibrotactile Space Mouse (진동촉각 공간 마우스)

  • Park, Jun-Hyung;Choi, Ye-Rim;Lee, Kwang-Hyung;Back, Jong-Won;Jang, Tae-Jeong
    • 한국HCI학회:학술대회논문집
    • /
    • 2008.02a
    • /
    • pp.337-341
    • /
    • 2008
  • This paper presents a vibrotactile space mouse that uses pin-type vibrotactile display modules and a gyroscope chip. The mouse is a new interface device that is not only an input device, like an ordinary space mouse, but also a tactile output device. It consists of a space mouse built around a gyroscope chip and vibrotactile display modules developed in our own laboratory. Recently, the development of small, low-power vibrotactile display modules has made vibrotactile displays available in small embedded systems such as wireless mice and mobile devices. Likewise, new sensors such as miniature gyroscopes built with MEMS technology enable the manufacture of a small space mouse that can be used in the air rather than on a surface. The vibrotactile space mouse proposed in this paper recognizes the motion of a hand using the gyroscope chip and transmits the data to a PC over Bluetooth, where a PC application receives the data and moves the pointer. In addition, a 2-by-3 array of pin-type vibrotactile actuators is mounted on the front side of the mouse where the fingers of the user's hand rest; these actuators can represent various information such as the gray scale of an image or Braille patterns for visually impaired persons.
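Representing image gray scale on the 2-by-3 pin array can be pictured as quantizing gray levels into discrete vibration intensities, one per actuator. This is an illustrative sketch only; the device's actual drive levels and protocol are not described in the abstract.

```python
# Sketch: quantize a 2x3 patch of 0-255 gray values into discrete
# vibration intensity levels for the 2-by-3 pin-type actuator array.

def to_actuator_levels(gray2x3, levels=8):
    """Map each gray value to one of `levels` vibration intensities."""
    return [[min(g * levels // 256, levels - 1) for g in row]
            for row in gray2x3]

image_patch = [[0, 128, 255],
               [64, 192, 32]]
print(to_actuator_levels(image_patch))
```

Brighter pixels vibrate harder; the same array could instead be driven as on/off dots to raise Braille cell patterns.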


A Method For Utilizing Voice Interface in Web Environment Using VoiceXML (웹 환경에서 VoiceXML을 이용한 음성 인터페이스 활용방안)

  • 장민석;방초균
    • Proceedings of the Korean Information Science Society Conference
    • /
    • 2002.04a
    • /
    • pp.451-453
    • /
    • 2002
  • The current Web environment is built on HTML, so the dominant mode of interaction is a GUI in which users follow hyperlinks by clicking the mouse. Compared with voice, the means of communication humans use most naturally, this method is quite inconvenient. To address this, we use today's mature speech recognition technology together with VoiceXML, an XML derivative designed to deliver information over the telephone, to provide a voice interface for the currently HTML-dominated Web environment.
