• Title/Summary/Keyword: touch interface


User Interface Experiment Model Design for Touch-Screen Based on Navigation System (터치스크린 기반 항해 시스템을 위한 사용자 인터페이스 실험 모델 설계)

  • Jeon, Hyun-Min;An, Jae-Yong;Oh, Seung-Yup;Park, Peom
    • KIPS Transactions on Software and Data Engineering
    • /
    • v.3 no.11
    • /
    • pp.503-510
    • /
    • 2014
  • With the development of electronic communication technology, ships' navigational equipment is being digitized, and touch-screen-based navigation user interfaces are being studied. However, because of environmental factors such as waves, there is a potential hazard that incorrect operation causes a marine accident, and systematic research taking this into account has not yet been done. In this paper, we present a user interface experimental model for verifying the stability of touch-screen input while taking the external environment into account. Furthermore, we ran simulations with the proposed experimental model to verify the effect of applying an input delay time and varying the button size on the touch-screen interface. By analyzing user errors with the proposed experimental model, this work can contribute to research on touch-screen interfaces that are robust to user error and improve the stability of the overall ship system.
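The two factors varied in the experimental model, input delay time and button size, can be sketched as a simple touch handler. This is a hypothetical illustration, not the paper's code: a touch must stay inside a button of a given size for the full delay before it is accepted, which rejects brief accidental contacts caused by ship motion.

```python
# Hypothetical sketch: a touch-screen button that applies an input delay time
# and a configurable size, the two parameters studied in the experimental model.

class DelayedTouchButton:
    def __init__(self, x, y, size, delay_s):
        self.x, self.y, self.size = x, y, size   # square button, side length in px
        self.delay_s = delay_s                   # touch must persist this long
        self._pressed_at = None

    def contains(self, tx, ty):
        half = self.size / 2
        return abs(tx - self.x) <= half and abs(ty - self.y) <= half

    def update(self, touch, t):
        """touch is (x, y) or None; t is the current time in seconds.
        Returns True once, when a touch has stayed on the button for delay_s."""
        if touch is not None and self.contains(*touch):
            if self._pressed_at is None:
                self._pressed_at = t             # touch began: start the timer
            elif t - self._pressed_at >= self.delay_s:
                self._pressed_at = None
                return True                      # accepted: survived the delay
        else:
            self._pressed_at = None              # finger slipped off: reject
        return False
```

A larger `size` and longer `delay_s` trade accidental-touch rejection against responsiveness, which is the trade-off the experimental model is built to measure.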

The Proposal of the Conceptual Model for Cognitive Action of Smart Device (스마트 디바이스의 인지적 행동에 대한 개념모델 제안)

  • Song, Seung-Keun;Kim, Tae-Wan;Kim, Chee-Yong
    • Journal of Digital Contents Society
    • /
    • v.11 no.4
    • /
    • pp.529-536
    • /
    • 2010
  • Currently, many people in the domestic and foreign mobile markets are strongly interested in smart devices, and demand for them has increased rapidly. Unlike a feature phone, a smart device provides an intuitive interface that is easy to control and enables smooth interaction between user and device. Despite the strong market outlook, there is a lack of empirical research on touch-screen user interfaces for smart devices. In this paper, we propose a user-centered touch interface conceptual model based on the results of previous research. The materials for this research were three kinds of smart devices on the market at the time. Through in-depth expert interviews and observation of users, users' cognitive actions on smart devices were defined. Having derived the touch interface methods suitable for each action, we propose a conceptual model of the user's cognitive actions. This research offers a design guideline for implementing touch interfaces that optimize the user experience on future touch-screen smart devices.

Dial Menu User Interface Using Touch Screen (터치스크린을 이용한 다이얼 메뉴 유저 인터페이스)

  • Choi, Jung-Hwan;Kim, Youn-Woo;Jang, Hyun-Su;Eom, Young-Ik
    • The HCI Society of Korea: Conference Proceedings
    • /
    • 2008.02a
    • /
    • pp.584-589
    • /
    • 2008
  • An input system using a touch screen generates input signals directly from contact on the screen, without peripherals such as a pen. Input with the hands maximizes the flexibility and intuitiveness of input compared with systems that move a cursor or type with a keyboard or mouse. However, using the hands for input can lead to control mistakes, and few interfaces fully exploit the touch screen; imprecision and an insufficient interface are the weak points of touch-screen systems. In this paper, we propose a dial menu user interface for efficient input on mobile devices with touch screens. The interface consists of 2 states (Inactive, Active) and 4 actions (Rotation, Zoom in, Zoom out, Click). The intuitive control of the proposed method overcomes imprecise pointing, a weak point of touch-screen systems, and speeds up menu browsing by using the touch screen's drag function.
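The two-state, four-action structure described in the abstract can be sketched as a small state machine. The class and method names below are my own, not the paper's API:

```python
# Sketch of a dial menu with Inactive/Active states and
# Rotation / Zoom in / Zoom out / Click actions.

class DialMenu:
    def __init__(self, items):
        self.items = list(items)
        self.active = False      # Inactive state until the dial is touched
        self.index = 0           # currently highlighted menu item
        self.zoom = 1.0

    def touch(self):
        self.active = True       # Inactive -> Active

    def release(self):
        self.active = False      # Active -> Inactive

    def rotate(self, steps):
        """Drag along the dial: move the highlight, wrapping around."""
        if self.active:
            self.index = (self.index + steps) % len(self.items)

    def zoom_in(self):
        if self.active:
            self.zoom = min(self.zoom * 1.25, 4.0)

    def zoom_out(self):
        if self.active:
            self.zoom = max(self.zoom / 1.25, 0.25)

    def click(self):
        """Select the highlighted item; ignored in the Inactive state."""
        return self.items[self.index] if self.active else None
```

Gating every action on the Active state is what lets the dial ignore stray contacts, the "mistake in control" the abstract identifies.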


Comparing Elder Users' Interaction Behavior to the Younger: Focusing on Tap, Move and Flick Tasks on a Mobile Touch Screen Device

  • Lim, Ji-Hyoun;Ryu, Tae-Beum
    • Journal of the Ergonomics Society of Korea
    • /
    • v.31 no.2
    • /
    • pp.413-419
    • /
    • 2012
  • Objective: This study presents an observation and analysis of the behavioral characteristics of older users, compared with younger users, when operating controls on a display interface. Background: A touch interface, which allows users to operate controls directly on the display, is considered a pleasant and easy mode of human-computer interaction. Because of this advantage in stimulus-response compatibility, older users, who typically experience difficulty interacting with computers, would be expected to have a better experience using such devices. Method: Twenty-nine participants over 50 years old and 14 participants in their 20s took part in this study. Users performed the three primary touch-interface tasks: tap, move, and flick. For the tap task, the response time and touch point were collected and the response bias was calculated for each trial. For the move task, the delivery time and the distance of finger movement were recorded for each trial. For the flick task, the task completion time and flick distance were recorded. Results: From the collected behavioral data, temporal and spatial differences between young and old users' behavior were analyzed. The older users showed difficulty completing the move task, which requires eye-hand coordination.
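The per-trial response bias in the tap task can be illustrated as the offset between the intended target and the actual touch point. This is my own sketch of a plausible computation, not the authors' analysis code:

```python
import math

def response_bias(target, touch):
    """Euclidean distance between the target centre and the actual touch
    point for one tap trial; both arguments are (x, y) tuples."""
    return math.hypot(touch[0] - target[0], touch[1] - target[1])

def mean_bias(trials):
    """Average bias over a list of (target, touch) pairs, one per trial."""
    return sum(response_bias(t, p) for t, p in trials) / len(trials)
```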

Development of K-Touch haptic API(Application Programming Interface) (역/촉감 제시 "K-Touch" 햅틱 API 개발)

  • Lee, Beom-Chan;Kim, Jong-Phil;Ryu, Je-Ha
    • The HCI Society of Korea: Conference Proceedings
    • /
    • 2006.02a
    • /
    • pp.266-274
    • /
    • 2006
  • This paper describes the development of K-Touch, a new haptic API. Built on core force-rendering algorithms based on graphics hardware, the K-Touch API enables haptic interaction with the various data formats that make up a virtual environment (3D polygon models, volume data, 2.5D depth images), provides the software extensibility needed to develop new haptic algorithms and devices, and is designed so that users can develop haptic applications quickly and easily. In addition, unlike existing haptic SDKs and APIs, an algorithm was developed that can render kinesthetic and tactile feedback simultaneously, both key elements of the haptic sense. To verify the usefulness of the proposed API, examples from various application areas were implemented. The new K-Touch haptic API is expected to play an important role as a tool that helps users and researchers conduct haptic research more efficiently.


Improvement of Smartphone Interface Using AR Marker (AR 마커를 이용한 스마트폰 인터페이스의 개선)

  • Kang, Yun-A;Han, Soon-Hung
    • Korean Journal of Computational Design and Engineering
    • /
    • v.16 no.5
    • /
    • pp.361-369
    • /
    • 2011
  • As smartphones have recently come into wide use, they have become increasingly popular not only among young people but among the middle-aged as well. Most smartphones use a capacitive full touch screen, so touch commands are made with the fingers, unlike the PDAs of the past that used touch pens. As a result, a significant portion of the smartphone's screen is blocked by the finger, the area around the touching finger cannot be seen, and precise control of small buttons, such as a QWERTY keyboard, is difficult. To solve this problem, this research proposes a method of using simple AR markers to improve the smartphone interface. A sticker-type marker is attached to the fingernail and placed in front of the smartphone camera. The camera image of the marker is then analyzed to determine its orientation, interpreting the marker's rotation angle as onPress() or onRelease() of a mouse and using its position as the mouse cursor position. This method supports click, double-click, and drag-and-drop as used on PCs, as well as smartphone touch, slide, and long-touch input. This research makes smartphone input more precise and simple, and shows the possibility of a new concept of smartphone interface.
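The decision rule the abstract describes, marker rotation angle mapped to onPress()/onRelease() and marker position mapped to the cursor, can be sketched per camera frame. The threshold value and names here are assumptions for illustration, not the paper's:

```python
# Hypothetical mapping from one analysed camera frame to a mouse state:
# a fingernail marker tilted past a threshold angle reads as a press,
# and the marker's image position drives the cursor.

PRESS_ANGLE_DEG = 30.0   # illustrative threshold, not the paper's value

def marker_to_mouse(angle_deg, marker_xy):
    """Return (cursor position, event) for one frame, given the marker's
    rotation angle in degrees and its (x, y) position in the image."""
    event = "onPress" if abs(angle_deg) >= PRESS_ANGLE_DEG else "onRelease"
    return marker_xy, event
```

Holding the press state across consecutive frames while the cursor moves is what yields drag-and-drop; a quick tilt-and-release becomes a click.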

General-Purpose Multi-Touch Interaction System for Multi-I/O Content Control (다중 입출력 컨텐츠 제어를 위한 범용 멀티 터치 인터렉션 시스템)

  • Bae, Ki-Tae;Kwon, Doo-Young
    • Journal of the Korea Academia-Industrial cooperation Society
    • /
    • v.12 no.4
    • /
    • pp.1933-1939
    • /
    • 2011
  • People who made musical instruments long ago already used touch devices for sound control. Since the first multi-touch system appeared in 1982, system performance has improved rapidly through many studies. In spite of this improvement, popularizing the multi-touch interface looked difficult. In 2007, however, multi-touch interfaces became popular with the Apple iPhone, and people could easily experience multi-touch interfaces on smartphones. In this paper, we propose a general-purpose multi-touch interaction system for multi-touch content producers and for invigorating the multi-touch interface market. We show through real field tests that the proposed method has advantages in price and performance compared with other techniques.

Tactile Sensation Display with Electrotactile Interface

  • Yarimaga, Oktay;Lee, Jun-Hun;Lee, Beom-Chan;Ryu, Je-Ha
    • Institute of Control, Robotics and Systems: Conference Proceedings
    • /
    • 2005.06a
    • /
    • pp.145-150
    • /
    • 2005
  • This paper presents an Electrotactile Display System (ETCS). One of the most important human sensory systems for human-computer interaction is the sense of touch, which can be conveyed to humans through tactile output devices. To realize the sense of touch, an electrotactile display produces controlled, localized touch sensations on the skin by passing small electric currents through it. In electrotactile stimulation, the mechanoreceptors in the skin can be stimulated individually to display sensations of vibration, touch, itch, tingle, pressure, etc. on the finger, palm, arm, or any suitable location of the body, using appropriate electrodes and waveforms. We developed an ETCS and investigated its effectiveness in terms of the perception of surface roughness when stimulating the palmar side of the hand with different waveforms, and the perception of direction and location information through the forearm. Positive and negative pulse trains were tested with different current intensities and electrode switching times on the forearm or finger, using an electrode-embedded armband, to investigate how subjects recognize displayed patterns and directions of stimulation.
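The positive and negative pulse trains used as stimuli can be sketched as a waveform generator. All parameters below are illustrative, not the paper's settings:

```python
# Sketch of a rectangular pulse train of a given polarity and current,
# repeated at a fixed period, sampled at 1 ms resolution.

def pulse_train(current_ma, pulse_width_ms, period_ms, n_pulses, positive=True):
    """Return the waveform as a list of (time_ms, current_ma) samples;
    the current is non-zero only during each pulse."""
    amp = current_ma if positive else -current_ma
    samples = []
    for t in range(int(n_pulses * period_ms)):
        in_pulse = (t % period_ms) < pulse_width_ms   # inside a pulse?
        samples.append((t, amp if in_pulse else 0.0))
    return samples
```

Varying `current_ma` (intensity) and switching the polarity flag correspond to the positive/negative trains and current levels tested in the study.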


Experimental investigation of TD characteristics of a flying head slider in the near-contact region (근 접촉 영역에서 부상중인 슬라이더의 Touch-Down특성의 실험적 해석)

  • Lee, Yong-Eun;Lee, Sang-Jik;Lim, Geon-Yup;Park, Kyoung-Su
    • Transactions of the Society of Information Storage Systems
    • /
    • v.7 no.2
    • /
    • pp.65-69
    • /
    • 2011
  • The head-disk interface (HDI) spacing in a hard disk drive (HDD) has decreased to achieve high areal density, so contact between the slider and the disk has become more important. Such contact can cause severe wear and damage to both the slider and the disk. Touch-down (TD), in which contact occurs continuously and repeatedly, is especially dangerous. It is therefore necessary to analyze the unstable bouncing vibration of the slider at the head-disk interface. In this paper, we investigate the characteristics and causes of touch-down.

Improving Eye-gaze Mouse System Using Mouth Open Detection and Pop Up Menu (입 벌림 인식과 팝업 메뉴를 이용한 시선추적 마우스 시스템 성능 개선)

  • Byeon, Ju Yeong;Jung, Keechul
    • Journal of Korea Multimedia Society
    • /
    • v.23 no.12
    • /
    • pp.1454-1463
    • /
    • 2020
  • An important factor in an eye-tracking PC interface for paralyzed patients is the implementation of the mouse interface used to manipulate the GUI. With a successfully implemented mouse interface, users can generate mouse events exactly at the point of their choosing. However, defining this interaction in an eye-tracking interface is difficult. This problem, known as the Midas touch problem, has been a major focus of eye-tracking research. There have been many attempts to solve it using blinking, voice input, etc., but these are unsuitable for some paralyzed patients who cannot wink or speak. In this paper, we propose a mouth-triggered pop-up eye-tracking mouse interface that solves the Midas touch problem and suits such patients, using a common RGB camera. The interface detects the opening and closing of the mouth to activate a pop-up menu from which the user selects the mouse event to generate. After implementation, a performance experiment was conducted; both the number of malfunctions and the time to perform tasks were reduced compared with the existing method.
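The mouth-open trigger described above is commonly computed as a mouth aspect ratio over facial landmarks. The landmark choice and threshold below are assumptions for illustration, not the authors' values:

```python
# Minimal sketch: the ratio of vertical mouth opening to mouth width,
# thresholded to decide whether the pop-up menu should open.

import math

MAR_THRESHOLD = 0.6  # illustrative: above this, the mouth counts as open

def mouth_aspect_ratio(top, bottom, left, right):
    """Vertical mouth opening divided by mouth width; each argument is an
    (x, y) landmark from a face detector."""
    vertical = math.dist(top, bottom)
    horizontal = math.dist(left, right)
    return vertical / horizontal

def popup_should_open(top, bottom, left, right):
    return mouth_aspect_ratio(top, bottom, left, right) > MAR_THRESHOLD
```

Using a ratio rather than a raw pixel distance keeps the trigger stable as the user moves toward or away from the RGB camera.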