• Title/Summary/Keyword: Touch screen Interaction

Search results: 40

A Study on User Behavior of Input Method for Touch Screen Mobile Phone (터치스크린 휴대폰 입력 방식에 따른 사용자 행태에 관한 연구)

  • Jun, Hye-Sun;Choi, Woo-Sik;Pan, Young-Hwan
    • 한국HCI학회:학술대회논문집 / 2008.02b / pp.173-178 / 2008
  • Due to a rapid increase in demand for larger-screen mobile phones in recent years, many major manufacturers have been releasing touch-screen-enabled devices. In this paper, the various touch-screen input methods are summarized into six categories. By tracing the path of each user's finger, users' input patterns and behavior were recorded and analyzed. Based on this analysis, the considerations to be addressed before designing a UI are presented in detail.

Two camera based touch screen system for human computer interaction (인간과 컴퓨터 상호 작용을 위한 2개의 카메라 기반의 터치 스크린 시스템)

  • Kim, Jin-Kuk;Min, Kyung-Won;Ko, Han-Seok
    • Proceedings of the IEEK Conference / 2006.06a / pp.319-320 / 2006
  • In this paper, we propose a vision-based system employing two cameras to provide an effective touch-screen function. Two main processes are essential for enabling the touch-screen function: determining touch (or no touch) and locating the contact point on the screen plane. First, the region of interest is found using color characteristics and a histogram to determine the contact mode. Second, if the hand touches the mirror, the fingertip point in the image is found using a correlation coefficient based on the mirror attribute. Subsequently, the fingertip coordinate in the image is transformed to a location in the mirror plane using four predefined points (the four-point method) and a bilinear transform; a sketch of this mapping follows this entry. Representative experimental results show that the proposed system is well suited to touch-screen use.

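A minimal sketch of the four-point bilinear mapping mentioned in the abstract above, assuming the common calibration form X = a0 + a1·x + a2·y + a3·x·y (and likewise for Y); the calibration values, function names, and NumPy implementation are illustrative, not the authors' code:

```python
# Hypothetical illustration of four-point bilinear calibration: four image points
# with known mirror-plane targets fix the coefficients of a bilinear map, which
# then converts any detected fingertip pixel into a plane coordinate.
import numpy as np

def fit_bilinear(image_pts, plane_pts):
    """Solve the exact 4x4 system given four (x, y) image points and their targets."""
    image_pts = np.asarray(image_pts, dtype=float)
    plane_pts = np.asarray(plane_pts, dtype=float)
    x, y = image_pts[:, 0], image_pts[:, 1]
    A = np.column_stack([np.ones(4), x, y, x * y])   # basis [1, x, y, xy]
    coef_x = np.linalg.solve(A, plane_pts[:, 0])     # coefficients for plane X
    coef_y = np.linalg.solve(A, plane_pts[:, 1])     # coefficients for plane Y
    return coef_x, coef_y

def image_to_plane(pt, coef_x, coef_y):
    """Map one fingertip pixel (x, y) onto the plane with the fitted coefficients."""
    x, y = pt
    basis = np.array([1.0, x, y, x * y])
    return float(basis @ coef_x), float(basis @ coef_y)

# Hypothetical calibration: the four screen corners as seen by one camera.
image_corners = [(102, 88), (548, 95), (110, 410), (556, 402)]
plane_corners = [(0, 0), (1920, 0), (0, 1080), (1920, 1080)]
cx, cy = fit_bilinear(image_corners, plane_corners)
print(image_to_plane((330, 250), cx, cy))   # fingertip pixel -> plane coordinate
```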

The Effects of Swiping Orientation on Preference and Willingness to Pay: The Interaction Between Touch Interface and Need-For-Touch

  • Ren, Han;Kang, Hyunmin;Ryu, Soohyun;Han, Kwanghee
    • Science of Emotion and Sensibility / v.20 no.4 / pp.65-78 / 2017
  • The current study examined the influence of an individual trait, Need-For-Touch level (NFT; high vs. low), and swiping orientation (vertical vs. horizontal) on product evaluation and preference when using a touch-screen interface such as a smartphone or tablet. Swiping is one of the most common interaction techniques for changing pages or browsing aligned pictures on a touch-screen interface, and it can be performed in vertical or horizontal orientations. The experiment revealed a significant interaction between swiping orientation and NFT on preference; however, the interaction effect on the change in price of the given products was only marginally significant. Specifically, high-NFT participants reported a higher preference for horizontally swiped than vertically swiped products, whereas no such difference occurred for low-NFT participants. The current study illustrates the influence of swiping orientation and NFT on product preference and provides a new perspective on design principles, especially for online shopping websites.

Large scale interactive display system for touch interaction in stereopsis (입체 영상에서 터치 인터랙션을 위한 대규모 인터랙티브 디스플레이 시스템)

  • Kang, Maeng-Kwan;Kim, Jung-Hoon;Jo, Sung-Hyun;Joo, Woo-Suck;Yoon, Tae-Soo;Lee, Dong-Hoon
    • Proceedings of the Korean Institute of Information and Communication Sciences Conference / 2010.10a / pp.252-255 / 2010
  • This paper proposes a large-scale interactive display system, based on an infrared LED bar, that supports various touch interactions with stereoscopic (3D) content. The proposed IR LED bar forms an interaction layer in the space in front of the screen where the 3D effect is perceived. An infrared camera with a band-pass filter captures the interaction region in real time, and an image-processing module extracts touch-interaction coordinates from the captured images and stores them as packets. Each packet is sent to a server over the network, where a metaphor-analysis module parses it, stores it as a metaphor event, and forwards it to the content; the content then executes the metaphor event in real time. As a result, touch interaction with stereoscopic content is possible without the user having to touch the screen directly. A sketch of this packet flow follows this entry.

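The packet flow described in the abstract above can be sketched roughly as follows; the binary packet layout (touch id, x, y, timestamp), the UDP transport, and the placeholder metaphor rule are assumptions for illustration and not the system's actual protocol:

```python
# Hypothetical sketch: the image-processing side packs a detected touch coordinate
# into a small packet and sends it to the server; the server-side step unpacks it
# and maps it to a simple "metaphor event" for the content layer.
import socket
import struct
import time

PACKET_FMT = "!Iffd"                # touch id, x, y, timestamp (network byte order)
SERVER_ADDR = ("127.0.0.1", 9999)   # hypothetical server address

def send_touch(sock, touch_id, x, y):
    """Client side: pack one touch coordinate and send it to the server."""
    packet = struct.pack(PACKET_FMT, touch_id, x, y, time.time())
    sock.sendto(packet, SERVER_ADDR)

def analyze_metaphor(packet):
    """Server side: unpack a packet and turn it into a metaphor event."""
    touch_id, x, y, ts = struct.unpack(PACKET_FMT, packet)
    # Placeholder rule standing in for the paper's metaphor-analysis module.
    event = "press" if touch_id == 0 else "multi-touch"
    return {"event": event, "x": x, "y": y, "time": ts}

if __name__ == "__main__":
    client = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    send_touch(client, 0, 640.0, 360.0)             # fire-and-forget send
    sample = struct.pack(PACKET_FMT, 0, 640.0, 360.0, time.time())
    print(analyze_metaphor(sample))                 # server-side parsing, shown locally
```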

Manipulation of the Windows Interface Based on Haptic Feedback (촉각 기반 윈도우 인터페이스)

  • Lee, Jun-Young;Kyung, Ki-Uk;Park, Jun-Seok
    • 한국HCI학회:학술대회논문집 / 2008.02a / pp.366-371 / 2008
  • In this paper, we propose a haptic interface and a framework for interacting with the Windows graphical user interface (GUI) with haptic feedback on a computing device with a touch screen. The events that occur while a user interacts with Windows interfaces through the touch screen are filtered by the Windows Interface Message Filter (WIMF) and converted into appropriate haptic feedback information by the Haptic Information Provider (HIP); a sketch of this filter-to-provider pipeline follows this entry. The haptic information is conveyed to users through a stylus-like haptic interface interacting with the touch screen. Major Windows interaction schemes, including button click, menu selection/pop-up, window selection/movement, icon selection/drag & drop, and scroll, have been implemented, and user tests show improved usability, since the haptic feedback supports intuitive and precise manipulation.

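A minimal sketch of the WIMF/HIP pipeline described above; the event names, pattern values, and class layout are assumptions for illustration rather than the paper's implementation:

```python
# Hypothetical sketch: UI events intercepted from the window system are filtered
# (WIMF) and mapped (HIP) to simple vibration patterns that a stylus-like haptic
# device could render.
from dataclasses import dataclass

@dataclass
class HapticPattern:
    amplitude: float    # 0.0-1.0 drive strength
    duration_ms: int    # pulse length

class WindowsInterfaceMessageFilter:
    """Keeps only the interaction events that should produce haptic feedback."""
    HAPTIC_EVENTS = {"button_click", "menu_popup", "menu_select",
                     "window_move", "icon_drag_drop", "scroll"}

    def filter(self, event_type):
        return event_type if event_type in self.HAPTIC_EVENTS else None

class HapticInformationProvider:
    """Converts a filtered UI event into a haptic pattern for the stylus."""
    PATTERNS = {
        "button_click":   HapticPattern(0.8, 30),   # short, strong tick
        "menu_popup":     HapticPattern(0.5, 60),
        "menu_select":    HapticPattern(0.6, 25),
        "window_move":    HapticPattern(0.3, 15),   # light pulse while dragging
        "icon_drag_drop": HapticPattern(0.7, 40),
        "scroll":         HapticPattern(0.4, 10),   # detent-like pulse per step
    }

    def provide(self, event_type):
        return self.PATTERNS.get(event_type)

# Usage: route a touch-screen UI event through the filter and the provider.
wimf, hip = WindowsInterfaceMessageFilter(), HapticInformationProvider()
filtered = wimf.filter("button_click")
if filtered:
    print(hip.provide(filtered))    # HapticPattern(amplitude=0.8, duration_ms=30)
```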

The Mouse & Keyboard Control Application based on Smart Phone (스마트 폰 기반의 마우스와 키보드 제어 어플리케이션)

  • Lim, Yang Mi
    • Journal of Korea Multimedia Society / v.20 no.2 / pp.396-403 / 2017
  • In recent years, the use of touch screens has expanded, and devices such as remote controllers have been developed in various ways to control and access content at a distance. Wireless touch-screen control is used in classrooms, seminar rooms, and remote video conversations, in addition to TV remote control. The purpose of this study is to design an intuitive smartphone-based interface that can act as a wireless mouse and keyboard over Bluetooth, and to develop an application that integrates the functions of both. First, a touch-interaction model for controlling software such as PowerPoint on a general PC from a smartphone was studied; a sketch of this touch-to-command mapping follows this entry. The simplest touch-operation interface is used to reproduce the functions of the existing devices with a simpler design. Extending interfaces with various functions is important, but developing interfaces optimized for users will become even more important in the future; in this sense, this study is valuable.
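
A rough sketch of the touch-to-command mapping referred to above; the event fields, the gesture set, and the newline-delimited JSON message format are assumptions for illustration, and the actual application streams its commands over Bluetooth:

```python
# Hypothetical sketch: phone-side touch events are translated into mouse/keyboard
# command messages that a PC-side receiver would execute.
import json

def touch_to_command(event):
    """Translate a simple phone-side touch event into a remote-control command."""
    if event["type"] == "tap":
        return {"device": "mouse", "action": "left_click"}
    if event["type"] == "drag":
        # Relative pointer movement, like a laptop touchpad.
        return {"device": "mouse", "action": "move", "dx": event["dx"], "dy": event["dy"]}
    if event["type"] == "key":
        return {"device": "keyboard", "action": "press", "key": event["key"]}
    return None

def encode(command):
    """Serialize one command as a newline-delimited JSON message for the link."""
    return (json.dumps(command) + "\n").encode("utf-8")

# Usage: a drag of (12, -3) pixels becomes one mouse-move message.
print(encode(touch_to_command({"type": "drag", "dx": 12, "dy": -3})))
```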

An alternative method for smartphone input using AR markers

  • Kang, Yuna;Han, Soonhung
    • Journal of Computational Design and Engineering / v.1 no.3 / pp.153-160 / 2014
  • As smartphones have recently come into wide use, they have become increasingly popular not only among young people but among middle-aged people as well. Most smartphones adopt capacitive full touch screens, so touch commands are made with fingers, unlike the PDAs of the past that used touch pens. A significant portion of the smartphone's screen is then blocked by the finger, so the area around the touch point cannot be seen, which makes precise input difficult. To solve this problem, this research proposes a method of using simple AR markers to improve the smartphone interface. A marker is placed in front of the smartphone camera, and the camera image is analyzed to determine the marker's position, which is used as the position of a mouse cursor; a sketch of this marker-to-cursor mapping follows this entry. This method can enable the click, double-click, and drag-and-drop used on PCs as well as the touch, slide, and long-touch input used on smartphones. With this approach, smartphone input can be made more precise and simple, showing the possibility of a new concept of smartphone interface.
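
A minimal sketch of the marker-to-cursor mapping described above, assuming the marker centre has already been detected in the camera frame; the frame and screen sizes, the horizontal mirroring, and the smoothing factor are illustrative assumptions:

```python
# Hypothetical sketch: a detected AR-marker centre in the camera frame is scaled
# to a cursor position on the phone screen, with simple smoothing to reduce jitter.
CAM_W, CAM_H = 640, 480        # camera frame size (assumed)
SCR_W, SCR_H = 1080, 1920      # phone screen size (assumed)

def marker_to_cursor(marker_x, marker_y, prev=None, alpha=0.3):
    """Scale a marker centre from camera space to screen space, with smoothing."""
    # Mirror horizontally so moving the marker to the right moves the cursor right.
    x = (1.0 - marker_x / CAM_W) * SCR_W
    y = (marker_y / CAM_H) * SCR_H
    if prev is not None:
        # Exponential smoothing to suppress hand jitter.
        x = alpha * x + (1 - alpha) * prev[0]
        y = alpha * y + (1 - alpha) * prev[1]
    return x, y

# Usage: successive marker detections drive the cursor.
cursor = marker_to_cursor(320, 240)               # first frame
cursor = marker_to_cursor(330, 236, prev=cursor)  # second frame, smoothed
print(cursor)
```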

Effect of Touch-key Sizes on Usability of Driver Information Systems and Driving Safety (터치키 크기가 운전자 정보 시스템의 사용성과 운전의 안전성에 미치는 영향 분석)

  • Kim, Hee-Hin;Kwon, Sung-Hyuk;Heo, Ji-Yoon;Chung, Min-K.
    • Journal of Korean Institute of Industrial Engineers / v.37 no.1 / pp.30-40 / 2011
  • In recent years, driver information systems (DISs) have become popular and their use has increased significantly. Most DISs provide touch-screen interfaces because of the intuitiveness of the interaction and the flexibility of interface design. Touch-screen interfaces are mainly operated with fingers, so investigating the effect of touch-key size on usability is one of the most important research issues, and many studies address touch-key size for mobile devices or kiosks; however, few studies address DISs. Touch-key size is particularly important for DISs because it is closely related to safety as well as usability. In this study, we investigated the effect of DIS touch-key size during simulated driving (0, 50, and 100 km/h), considering driving safety (lateral deviation, velocity deviation, total glance time, mean glance time, total time between glances, mean number of glances) and DIS usability (task completion time, error rate, subjective preference, NASA TLX) simultaneously. As a result, both driving safety and DIS usability increased as driving speed decreased and touch-key size increased. However, there were no significant differences once the touch-key size exceeded a certain level (17.5 mm in this study).

Validating one-handed interaction modes for supporting touch dead-zone in large screen smartphones (대화면 스마트폰의 한 손 조작 시 터치 사각영역 지원 인터랙션의 유용성)

  • Park, Minji;Kim, Huhn
    • Journal of the HCI Society of Korea / v.12 no.1 / pp.25-32 / 2017
  • The purpose of this study is to evaluate the effectiveness of one-handed interaction modes that support the dead zone, the screen area that is difficult to reach when manipulating a touch screen with only one hand. To this end, this study analyzed two existing one-handed modes on iPhone and Android smartphones, and proposed and implemented two additional one-handed modes. To investigate the effectiveness of the one-handed modes, we performed an experiment comparing the normal touch mode with the four one-handed modes. The experimental results showed that all one-handed modes required more time than the normal touch mode because of the time needed for both mode switching and recognition. However, participants had difficulty performing continuous touches in the dead-zone area with the normal touch mode alone. Moreover, subjective satisfaction was high for the one-handed modes thanks to touch convenience and smooth transition effects during mode change. In particular, the iPhone one-handed mode was the most effective of the tested modes.

Design of Contactless Gesture-based Rhythm Action Game Interface for Smart Mobile Devices

  • Ju, Da-Young
    • Journal of the Ergonomics Society of Korea / v.31 no.4 / pp.585-591 / 2012
  • Objective: The aim of this study is to propose a contactless gesture-based interface on smart mobile devices, especially for rhythm action games. Background: Most existing interactions in smart mobile games are taps on the touch screen. However, this is undesirable for some users and in some situations, for example for users with disabilities or because of the inconvenience of having to touch or tap a specific device. More importantly, a new interaction style can open new possibilities for a stagnant game genre. Method: This paper presents a smart mobile game with contactless gesture-based interaction and interfaces using computer vision technology. As preliminary studies, gestures that are easy to recognize were identified and an interaction system suited to games on smart mobile devices was investigated. A combination of augmented reality techniques and contactless gesture interaction is also explored. Results: The rhythm game allows a user to interact with the smart mobile device using hand gestures, without touching or tapping the screen, and users find the game as enjoyable as other games. Conclusion: Evaluation results show that users make few errors and that the game recognizes gestures with high precision in real time; therefore, contactless gesture-based interaction has potential for smart mobile games. Application: The results are applied to a commercial game application. (A sketch of a simple vision-based gesture detector in this spirit follows this entry.)
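
A minimal sketch of a vision-based contactless gesture detector in the spirit of the abstract above; the frame-differencing approach, the thresholds, and the synthetic test frames are illustrative assumptions rather than the game's actual recognizer:

```python
# Hypothetical sketch: a horizontal hand swipe is detected by frame differencing on
# grayscale camera frames and by tracking how the motion centroid drifts over time.
import numpy as np

def motion_centroid(prev_frame, frame, thresh=25):
    """Return the centroid (x, y) of pixels that changed between two gray frames."""
    diff = np.abs(frame.astype(int) - prev_frame.astype(int)) > thresh
    ys, xs = np.nonzero(diff)
    if xs.size < 50:               # too little motion to be a hand
        return None
    return float(xs.mean()), float(ys.mean())

def detect_swipe(centroids, min_dx=60):
    """Classify a horizontal swipe from a short history of motion centroids."""
    xs = [c[0] for c in centroids if c is not None]
    if len(xs) < 2:
        return None
    dx = xs[-1] - xs[0]
    if dx > min_dx:
        return "swipe_right"
    if dx < -min_dx:
        return "swipe_left"
    return None

# Usage with synthetic frames: a bright blob moving right across the image.
h, w = 120, 160
frames = []
for cx in (30, 70, 110, 150):
    f = np.zeros((h, w), dtype=np.uint8)
    f[50:70, max(0, cx - 10):cx + 10] = 255
    frames.append(f)
cents = [motion_centroid(frames[i], frames[i + 1]) for i in range(len(frames) - 1)]
print(detect_swipe(cents))   # expected: "swipe_right"
```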