• Title/Summary/Keyword: touchscreen interface


A Study on the Navigation Menu Structure with Screen Size (Screen Size를 고려한 최적 Menu Structure에 관한 연구)

  • Kim, Seong-Min;Choe, Jae-Ho;Jung, Eui-S.;Choi, Kwang-Soo;Jeon, Myoung-Hoon;Park, Jun-Ho;Ahn, Jeong-Hee
    • 한국HCI학회:학술대회논문집 / 2008.02b / pp.380-385 / 2008
  • To perform navigation functions more efficiently, the navigation menu structure should be easy for the driver to understand in the vehicle environment, which is constrained by driving workload. Under these conditions, designing a better navigation interface requires studying how the menu structure depends on the screen size and on the width and depth of the information hierarchy. In this study, we therefore presented different menu structures to drivers on a 7-inch and a 4-inch touchscreen LCD in a driving simulator, and compared driver preference for each menu structure on the two screens.


ACT-R Predictive Model of Korean Text Entry on Touchscreen

  • Lim, Soo-Yong;Jo, Seong-Sik;Myung, Ro-Hae;Kim, Sang-Hyeob;Jang, Eun-Hye;Park, Byoung-Jun
    • Journal of the Ergonomics Society of Korea / v.31 no.2 / pp.291-298 / 2012
  • Objective: The aim of this study is to predict Korean text entry on touchscreens using the ACT-R cognitive architecture. Background: Touchscreen adoption in devices such as satellite navigation devices, PDAs, and mobile phones has been increasing, and the market is expanding. Accordingly, there is growing interest in developing and evaluating interfaces that enhance the user experience and increase satisfaction in the touchscreen environment. Method: Korean text entry performance in the touchscreen environment was analyzed using ACT-R. An ACT-R model was established that considers the characteristics of the Korean language, whose syllables are composed of vowels and consonants. The study then analyzed whether Korean text entry can be predicted with this ACT-R cognitive model. Results: The analysis found no significant difference in performance time between the model's predictions and the empirical data. Conclusion: The proposed model can accurately predict physical movement time as well as cognitive processing time. Application: This study is useful for conducting model-based evaluation of touchscreen text entry interfaces and enables quantitative, effective evaluation of diverse types of Korean text input interfaces through cognitive models.
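The abstract above decomposes entry time into motor and cognitive components. As a rough illustration of that idea (this is a hypothetical keystroke-level-style sketch, not the paper's actual ACT-R model; the constants and layout are made up), one can estimate soft-keyboard entry time by summing a Fitts' law pointing time plus a fixed cognitive step per keypress:

```python
import math

# Assumed (hypothetical) model constants, not from the paper
FITTS_A = 0.1          # Fitts' law intercept (s)
FITTS_B = 0.15         # Fitts' law slope (s/bit)
COGNITIVE_STEP = 0.05  # time to retrieve/encode the next key (s)

def fitts_time(distance, width):
    """Pointing time for one keypress (Fitts' law, Shannon form)."""
    return FITTS_A + FITTS_B * math.log2(distance / width + 1)

def predict_entry_time(key_positions, text, key_width=1.0):
    """Sum motor + cognitive time over the key sequence for `text`."""
    total = 0.0
    x, y = key_positions[text[0]]  # finger starts on the first key
    for ch in text:
        kx, ky = key_positions[ch]
        dist = math.hypot(kx - x, ky - y)
        total += COGNITIVE_STEP          # cognitive step for every key
        if dist > 0:
            total += fitts_time(dist, key_width)  # motor step if the finger moves
        x, y = kx, ky
    return total

# Toy three-key layout on a unit grid
layout = {"a": (0, 0), "b": (1, 0), "c": (2, 0)}
t = predict_entry_time(layout, "abcab")
```

A full ACT-R model additionally schedules perceptual, declarative-retrieval, and motor modules in parallel; this sketch only shows the serial decomposition the abstract alludes to.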

A Comparison of Visual Occlusion Methods: Touch Screen Device vs. PLATO Goggles

  • Park, Jung-Chul
    • Journal of the Ergonomics Society of Korea / v.30 no.5 / pp.589-595 / 2011
  • Objective: This study compares two visual occlusion methods for the evaluation of in-vehicle interfaces. Background: Visual occlusion is a visual demand measurement technique that uses periodic vision/occlusion cycles to simulate a driving (or mobile) environment. It has been widely used for the evaluation of in-vehicle interfaces. There are two major ways to implement the technique: (1) occlusion using PLATO (portable liquid crystal apparatus for tachistoscopic occlusion) goggles; (2) occlusion using a software application on a touchscreen device. Method: An experiment was conducted to examine the visual demand of an in-vehicle interface prototype using both the goggle-based and the touchscreen-based occlusion methods. Address input and radio tuning tasks were evaluated. Results: For the radio tuning task, there were no significant differences in total shutter open time or resumability ratio between the two occlusion conditions. However, participants took longer to input addresses with the touchscreen-based occlusion. Conclusion & Application: The results suggest that the touchscreen-based method can be used as an alternative to traditional goggle-based visual occlusion, especially in less visually demanding tasks such as radio tuning.

Touch TT: Scene Text Extractor Using Touchscreen Interface

  • Jung, Je-Hyun;Lee, Seong-Hun;Cho, Min-Su;Kim, Jin-Hyung
    • ETRI Journal / v.33 no.1 / pp.78-88 / 2011
  • In this paper, we present the Touch Text exTractor (Touch TT), an interactive text segmentation tool for extracting scene text from camera-based images. Touch TT provides a natural interface in which the user simply indicates the location of text regions with a touchline. Touch TT then automatically estimates the text color and roughly locates the text regions. By inferring text characteristics from the estimated text color and text region, Touch TT can extract text components. It can also handle partially drawn lines that cover only a small section of the text area. The proposed system achieves reasonable text extraction accuracy on moderately difficult examples from the ICDAR 2003 database and our own database.
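The core touchline idea (sample the text color under the user's stroke, then segment by color similarity) can be sketched very simply. This is an illustrative simplification under assumed parameters, not the actual Touch TT algorithm, which infers richer text characteristics:

```python
import numpy as np

def estimate_text_color(image, stroke_points):
    """Median color of the pixels the touchline passes through."""
    samples = np.array([image[y, x] for x, y in stroke_points], dtype=float)
    return np.median(samples, axis=0)

def extract_text_mask(image, stroke_points, tolerance=40.0):
    """Binary mask of pixels within `tolerance` (Euclidean RGB distance)
    of the estimated text color. `tolerance` is an assumed parameter."""
    color = estimate_text_color(image, stroke_points)
    dist = np.linalg.norm(image.astype(float) - color, axis=-1)
    return dist <= tolerance

# Toy 4x4 RGB image: three dark "text" pixels on a white background
img = np.full((4, 4, 3), 255, dtype=np.uint8)
img[1, 1] = img[1, 2] = img[2, 2] = (20, 20, 20)  # text pixels
stroke = [(1, 1), (2, 1)]  # touchline drawn over part of the text
mask = extract_text_mask(img, stroke)
```

Note that the stroke covers only two of the three text pixels, yet the color-similarity step recovers all three, which mirrors how Touch TT handles partially drawn lines.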

Development of Finger Gestures for Touchscreen-based Web Browser Operation (터치스크린 기반 웹브라우저 조작을 위한 손가락 제스처 개발)

  • Nam, Jong-Yong;Choe, Jae-Ho;Jung, Eui-S.
    • Journal of the Ergonomics Society of Korea / v.27 no.4 / pp.109-117 / 2008
  • Compared to the existing PC, which uses a mouse and a keyboard, the touchscreen-based portable PC lets the user operate it with the fingers, requiring new operation methods. However, current touchscreen-based web browser operations in many cases merely have the fingers move and click like a mouse, or correspond poorly to the user's sensibility and the structure of the index finger, making them difficult to use while walking. The goal of this study is therefore to develop finger gestures that facilitate interaction between the interface and the user and make operation easier. First, the top eight functions were extracted based on frequency of use in the web browser and on preference. Then, the users' structural knowledge was visualized through sketch maps, and finger gestures applicable to touchscreens were derived through the Meaning in Mediated Action method. For the forward/back page and up/down scroll functions, directional gestures were derived; for the window close, refresh, home, and print functions, letter-type and icon-type gestures were derived. A validation experiment compared the existing operation methods and the proposed gestures in terms of execution time, error rate, and preference; directional gestures and letter-type gestures outperformed the existing methods. These results suggest that the new gestures can make operation easier and faster, not only for touchscreen-based web browsers on portable PCs but also for telematics functions in automobiles, PDAs, and similar devices.
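A directional gesture of the kind described above can be recognized with very little machinery. The sketch below is hypothetical (the paper does not specify its recognizer, and the function-to-direction mapping here is assumed for illustration): classify a stroke by its dominant axis and dispatch to a browser function.

```python
def classify_direction(start, end):
    """Return 'left'/'right'/'up'/'down' for the dominant stroke axis.
    Screen coordinates are assumed: y grows downward."""
    dx = end[0] - start[0]
    dy = end[1] - start[1]
    if abs(dx) >= abs(dy):
        return "right" if dx > 0 else "left"
    return "down" if dy > 0 else "up"

# Assumed mapping from direction to browser function (illustrative only)
ACTIONS = {"left": "back page", "right": "forward page",
           "up": "scroll up", "down": "scroll down"}

gesture = classify_direction((10, 100), (200, 110))  # mostly rightward stroke
action = ACTIONS[gesture]
```

Letter-type gestures (e.g. for refresh or print) would instead require template matching over the whole stroke path, not just its endpoints.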

Design of an Infrared Multi-touch Screen Controller using Stereo Vision (스테레오 비전을 이용한 저전력 적외선 멀티 터치스크린 컨트롤러의 설계)

  • Jung, Sung-Wan;Kwon, Oh-Jun;Jeong, Yong-Jin
    • Journal of the Institute of Electronics Engineers of Korea SD / v.47 no.2 / pp.68-76 / 2010
  • Touch-enabled technology is increasingly accepted as a main communication interface between humans and computers. However, conventional touchscreen technologies, such as resistive overlay, capacitive overlay, and SAW (surface acoustic wave), are not cost-effective for large screens. As an alternative to the conventional methods, we introduce a newly emerging method, the optical imaging touchscreen, which is much simpler and more cost-effective. Despite its attractive benefits, the optical imaging touchscreen has to overcome some problems, such as heavy computational complexity, intermittent ghost points, and over-sensitivity, before it can be used commercially. We therefore designed a hardware controller for signal processing and multi-coordinate computation, and proposed infrared-blocked DA (dark area) manipulation as a solution. While the entire optical touch control took 34ms on a 32-bit microprocessor, the designed hardware controller can manage two valid coordinates at 200fps and also reduces the energy consumption of the infrared diodes from 1.8Wh to 0.0072Wh.
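The geometry behind this kind of optical touch sensing can be illustrated with a small triangulation sketch (an assumption for illustration, not the paper's controller design): two cameras at the top corners of the screen each report the angle to the finger's shadow, and intersecting the two rays gives the touch coordinate.

```python
import math

def triangulate(width, angle_left, angle_right):
    """Touch point (x, y) on a screen of `width`, given each corner
    camera's angle (radians) measured downward from the top edge.
    Derived from y = x*tan(a) and y = (width - x)*tan(b)."""
    ta, tb = math.tan(angle_left), math.tan(angle_right)
    x = width * tb / (ta + tb)
    y = x * ta
    return x, y

# A touch in the middle of a 100-unit-wide screen, 50 units down:
# both cameras see it at 45 degrees.
x, y = triangulate(100, math.radians(45), math.radians(45))
```

The ghost-point problem the abstract mentions arises when two fingers touch simultaneously: each camera then sees two angles, the rays produce four candidate intersections, and two of them are spurious and must be filtered out.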

A Study of Korean Soft-keyboard Layout for One Finger Text Entry (한 손가락 문자 입력을 위한 한글 Soft-keyboard 배열에 관한 연구)

  • Kong, Byung-Don;Hong, Seung-Kweon;Jo, Seong-Sik;Myung, Ro-Hae
    • IE interfaces / v.22 no.4 / pp.329-335 / 2009
  • Recently, the use of soft keyboards has become widespread and continues to grow, as various handheld devices with capable touchscreens, such as PDAs, navigation systems, and mobile phones, have been developed. Using a soft keyboard differs from using a traditional hard keyboard such as the QWERTY keyboard: there is no standard character layout, entry is done with one finger, and cognitive processing time matters. In this study, therefore, the optimal soft-keyboard layout for one-finger text entry in a touchscreen environment was investigated among six keyboard layouts, developed based on traditional characteristics of Korean text and the usage frequency of vowels and consonants. As a result, the interface following the traditional order of the Korean alphabet, such as 'ㄱㄴㄷㄹ' or 'ㅏㅑㅓㅕ', was found to be better than the interface with a frequency-based arrangement. The vowels were most efficient when separated into two groups, located at the right-hand side and directly below the consonants. In conclusion, the keyboard layout that respects the characteristics and traditional order of Korean text was more effective, as it minimized cognitive processing time.
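One common way to compare candidate layouts like these, before running a user study, is a frequency-weighted travel-distance metric. The sketch below is purely illustrative (toy layouts and made-up bigram frequencies, not the paper's six layouts or its Korean corpus data):

```python
import math

def expected_travel(layout, bigram_freq):
    """Average fingertip travel distance per keystroke, weighted by how
    often each character pair occurs in the corpus."""
    total_w = sum(bigram_freq.values())
    dist = 0.0
    for (a, b), w in bigram_freq.items():
        ax, ay = layout[a]
        bx, by = layout[b]
        dist += w * math.hypot(bx - ax, by - ay)
    return dist / total_w

# Two toy three-key layouts and assumed bigram frequencies
row_layout = {"a": (0, 0), "b": (1, 0), "c": (2, 0)}
swapped    = {"a": (0, 0), "b": (2, 0), "c": (1, 0)}
bigrams = {("a", "b"): 10, ("b", "c"): 5, ("a", "c"): 1}

better = expected_travel(row_layout, bigrams)
worse = expected_travel(swapped, bigrams)
```

Note that this metric captures only motor cost; the study's finding is that for one-finger entry, the cognitive cost of visually searching an unfamiliar arrangement can outweigh the motor savings of a frequency-optimized one.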

Multimodal Interface Based on Novel HMI UI/UX for In-Vehicle Infotainment System

  • Kim, Jinwoo;Ryu, Jae Hong;Han, Tae Man
    • ETRI Journal / v.37 no.4 / pp.793-803 / 2015
  • We propose a novel HMI UI/UX for an in-vehicle infotainment system. Our proposed HMI UI comprises multimodal interfaces that allow a driver to safely and intuitively manipulate an infotainment system while driving. Our analysis of a touchscreen interface-based HMI UI/UX reveals that a driver's use of such an interface while driving can cause the driver to be seriously distracted. Our proposed HMI UI/UX is a novel manipulation mechanism for a vehicle infotainment service. It consists of several interfaces that incorporate a variety of modalities, such as speech recognition, a manipulating device, and hand gesture recognition. In addition, we provide an HMI UI framework designed to be manipulated using a simple method based on four directions and one selection motion. Extensive quantitative and qualitative in-vehicle experiments demonstrate that the proposed HMI UI/UX is an efficient mechanism through which to manipulate an infotainment system while driving.

Evaluation of Correlations in Copier's Button and Usability (복사기 조작버튼에 따른 사용성 상관관계 연구)

  • Ha, Kwang Soo
    • The Journal of the Korea Contents Association / v.13 no.12 / pp.595-603 / 2013
  • Thanks to advances in digital technology, touchscreens are now installed in the majority of information devices. In the case of copiers, although the touchscreen was adopted earlier than in almost any other information device, the current user interface shows that conventional methods have been maintained without any real re-validation. Since common office devices such as copiers are used briefly and for clear, straightforward purposes, the user experience between the user and the device is an important element. Nevertheless, the conservativeness of the market and the fact that users and buyers are not always the same person are stumbling blocks to actively changing the UI. Recently, however, as hardware buttons have largely moved onto the touchscreen, most notably on smartphones, and as the UX of common and public devices has changed rapidly around touchscreens, demand has grown for changes in the UX of copiers as well. To address these issues, a literature review of existing studies was conducted, and a user evaluation based on a survey and a prototype was performed to examine the copier's UI elements and the associated changes in usage. In particular, a new direction for copier UX was suggested by analyzing how the differences between a software-button-oriented UI and a hardware-button-oriented UI affect users and usability. These findings could serve as basic data for improving the usability and UX design of future common devices, including copiers.

Performance Comparison of Manual and Touch Interface using Video-based Behavior Analysis

  • Lee, Chai-Woo;Bahn, Sang-Woo;Kim, Ga-Won;Yun, Myung-Hwan
    • Journal of the Ergonomics Society of Korea / v.29 no.4 / pp.655-659 / 2010
  • The objective of this study is to quantitatively incorporate user observation into the usability evaluation of mobile interfaces, using monitoring techniques from first- and third-person points of view. An experiment was conducted to monitor and record users' behavior using Ergoneers Dikablis, a gaze tracking device. The experiment used two mobile phones, one with a button keypad interface and one with a touchscreen interface, for comparative analysis. The subjects were 20 people with similar experience and proficiency in using mobile devices. Data from the video recordings were coded with Noldus Observer XT to find usage patterns and to gather quantitative data for analysis in terms of effectiveness, efficiency, and satisfaction. Results showed that the button keypad interface was generally better than the touchscreen interface: finger and gaze movements were much simpler when performing the given tasks on the button keypad. While previous studies have mostly evaluated usability with performance measures, looking only at task results, this study contributes a method in which the behavioral patterns of interaction are evaluated.