• Title/Summary/Keyword: touch-screen


A Study on the Cooling Block Design for a Large Touch Screen Panel (TSP) Cover Glass Molding System (대형 Touch Screen Panel(TSP) 덮개유리 성형기의 냉각 블록 설계에 관한 연구)

  • Lee, Jun Kyoung
    • Journal of the Korean Society of Manufacturing Process Engineers
    • /
    • v.19 no.6
    • /
    • pp.36-42
    • /
    • 2020
  • Touch screen panel (TSP) cover glass for mobile smart devices is increasingly being developed with a curved shape to meet various design requirements. As the sizes of mobile smart devices continue to increase, demand has also grown greatly for large-area curved glass of more than 20 inches. In this study, heat and fluid flow analysis using CFD was performed to optimize the heating-surface temperature distribution of a large curved-glass forming system. Five cooling-water flow paths in the cooling block were designed and analyzed. A function that quantitatively evaluates the temperature uniformity of the heating surface was proposed, and its values were obtained for the five models. The temperature distributions of the heating surface and the energy consumption of the heating system were also compared and comprehensively analyzed. Based on the analysis results of the five cooling-channel path models, an optimal path design could be presented.
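The abstract mentions a function for quantitatively evaluating heating-surface temperature uniformity but does not give its form. The sketch below is a hypothetical stand-in using a simple range-based index; the published study may define its metric differently, and the sample temperatures are illustrative only.

```python
# Hypothetical sketch: a range-based uniformity index for comparing
# cooling-channel designs by their heating-surface temperature samples.

def temperature_uniformity(temps):
    """Return a uniformity index in (0, 1]; 1.0 means perfectly uniform.

    temps: iterable of surface temperatures (e.g. sampled from CFD nodes).
    Uniformity here is 1 - (T_max - T_min) / T_mean, a simple spread-based
    measure; the paper's actual function is not given in the abstract.
    """
    temps = list(temps)
    t_mean = sum(temps) / len(temps)
    return 1.0 - (max(temps) - min(temps)) / t_mean

# Compare two candidate cooling-channel designs (toy values):
design_a = [698, 700, 702, 701, 699]   # tighter spread
design_b = [680, 700, 720, 710, 690]   # wider spread
better = max([design_a, design_b], key=temperature_uniformity)
```

A metric like this lets the five channel-path models be ranked on a single number, alongside the energy-consumption comparison the study describes.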

Manipulation of the Windows Interface Based on Haptic Feedback (촉각 기반 윈도우 인터페이스)

  • Lee, Jun-Young;Kyung, Ki-Uk;Park, Jun-Seok
    • Proceedings of the HCI Society of Korea Conference
    • /
    • 2008.02a
    • /
    • pp.366-371
    • /
    • 2008
  • In this paper, we propose a haptic interface and an interaction framework that adds haptic feedback to the Windows graphical user interface (GUI) on a computing device with a touch screen. Events that occur while a user interacts with Windows interfaces through a touch screen are filtered out by the Windows Interface Message Filter (WIMF) and converted into appropriate haptic feedback information by the Haptic Information Provider (HIP). The haptic information is conveyed to the user through a stylus-like haptic interface that interacts with the touch screen. Major Windows interaction schemes, including button click, menu selection/pop-up, window selection/movement, icon selection/drag & drop, and scroll, have been implemented, and user tests show improved usability, since the haptic feedback supports intuitive and precise manipulation.
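The WIMF/HIP pipeline above can be pictured as a filter followed by a lookup. The sketch below is hypothetical: the abstract does not describe the components' interfaces, so the event names, pattern fields, and values are illustrative only.

```python
# Hypothetical sketch of the WIMF -> HIP pipeline: filter window messages,
# then map each remaining event type to haptic feedback information.

HAPTIC_PATTERNS = {            # HIP: event type -> haptic feedback pattern
    "button_click": {"waveform": "pulse", "amplitude": 0.8, "ms": 20},
    "menu_popup":   {"waveform": "ramp",  "amplitude": 0.5, "ms": 40},
    "icon_drag":    {"waveform": "buzz",  "amplitude": 0.3, "ms": 60},
    "scroll":       {"waveform": "tick",  "amplitude": 0.2, "ms": 10},
}

def filter_window_message(msg):
    """WIMF role: keep only messages that should produce haptic feedback."""
    return msg.get("type") in HAPTIC_PATTERNS

def to_haptic(msg):
    """HIP role: convert a filtered window message to haptic information."""
    return HAPTIC_PATTERNS[msg["type"]]

# A mouse-move passes through silently; clicks and scrolls produce feedback.
events = [{"type": "button_click"}, {"type": "mouse_move"}, {"type": "scroll"}]
feedback = [to_haptic(m) for m in events if filter_window_message(m)]
```

The separation mirrors the paper's architecture: filtering is independent of how each event is rendered on the stylus-like haptic device.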


Extracting Flick Operator for Predicting Performance by GOMS Model in Small Touch Screen

  • Choi, Mikyung;Lee, Bong Geun;Oh, Hyungseok;Myung, Rohae
    • Journal of the Ergonomics Society of Korea
    • /
    • v.32 no.2
    • /
    • pp.179-187
    • /
    • 2013
  • Objective: The purpose of this study is to extract a GOMS manual operator without an experiment with participants. Background: The GOMS model has the advantage of rapid modeling, which suits a technology-development environment of short product life cycles and a fast pace. However, the GOMS model was originally designed for the desktop environment, so it is not adequate for the latest HCI environments such as small touch screen devices. Therefore, this research proposes a GOMS manual-operator extraction methodology that does not require experiments, and the flick gesture was selected to demonstrate how the proposed methodology extracts a new operator. Method: The hand gesture to be extracted as an operator is divided into steps, from start to finish, through gesture task analysis. The original GOMS operators are then applied to each similar step of the gesture, and the operators for the implementation stage are modified based on existing Fitts' law research; steps that require movement are modified based on a version of Fitts' law developed for touch screen devices. Finally, the new operator is derived from these stages, and a validation experiment was performed to verify the validity of the new operator and methodology by comparison with human performance. Results: The average movement times of the participants' performance and of the operator extracted in the case study did not differ significantly; neither did the average movement times across view types. Conclusion: The result of the proposed methodology for extracting a new operator is similar to the result of the experiment with participants. Furthermore, a GOMS model that includes an operator obtained by the proposed methodology can be applied successfully to predict user performance. Application: This methodology can be used to develop new finger gestures on the touch screen, and to rapidly evaluate the usability of a system that includes new finger-gesture performance.
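The movement steps of an operator like flick are predicted with Fitts' law, as the method above describes. A minimal sketch follows; the Shannon formulation used here is standard, but the coefficients and the fixed touch-down/lift times are illustrative assumptions, not values fitted in the study.

```python
import math

def fitts_mt(a, b, distance, width):
    """Predicted movement time (s), Shannon formulation of Fitts' law.

    distance: movement amplitude toward the target; width: target size;
    a, b: empirically fitted coefficients (device-dependent).
    """
    index_of_difficulty = math.log2(distance / width + 1)  # in bits
    return a + b * index_of_difficulty

def flick_operator_time(a, b, distance, width, touch_down=0.10, lift=0.10):
    """Hypothetical composition of a flick operator from task-analysis steps:
    a fixed touch-down step, a Fitts-law movement step, and a lift step.
    The fixed step durations are assumed placeholders."""
    return touch_down + fitts_mt(a, b, distance, width) + lift
```

With fitted coefficients, such an operator can be summed with standard GOMS operators to predict total task time without running participants.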

N-gram based Language Model for the QWERTY Keyboard Input Errors in a Touch Screen Environment (터치스크린 환경에서 쿼티 자판 오타 교정을 위한 n-gram 언어 모델)

  • Ong, Yoon Gee;Kang, Seung Shik
    • Smart Media Journal
    • /
    • v.7 no.2
    • /
    • pp.54-59
    • /
    • 2018
  • With the increasing use of touch-enabled mobile devices such as smartphones and tablet PCs, work that used to be done on desktop computers is now done on smartphones, and tablet PCs take the place of laptops. However, because smart devices must remain portable, the QWERTY keyboard is densely arranged on a small screen. This causes typographical errors different from those made on a mechanical QWERTY keyboard. Unlike a mechanical QWERTY keyboard, which has enough space for each button, the QWERTY keyboard on a touch screen assigns only a small area to each button, so a neighboring button is often input instead of the button the user intended to press. In this paper, we propose a method to automatically correct input errors of the QWERTY keyboard in the touch screen environment by using an n-gram language model based on word unigram and bigram probabilities.
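The correction idea above can be sketched as: generate candidate words by substituting adjacent keys, then rank them with an interpolated unigram/bigram score. This is a hypothetical toy sketch, not the paper's trained model: the neighbor table, counts, and interpolation weight are illustrative.

```python
# Toy sketch of adjacent-key candidate generation plus n-gram scoring.
# All tables below are illustrative stand-ins for a trained model.

NEIGHBORS = {"q": "wa", "w": "qes", "e": "wrd", "r": "etf", "t": "ryg"}

UNIGRAM = {"we": 10, "qe": 1, "wr": 2}                    # word counts
BIGRAM = {("so", "we"): 5, ("so", "qe"): 0, ("so", "wr"): 1}

def candidates(word):
    """Words reachable by replacing one letter with an adjacent key,
    plus the typed word itself."""
    out = {word}
    for i, ch in enumerate(word):
        for rep in NEIGHBORS.get(ch, ""):
            out.add(word[:i] + rep + word[i + 1:])
    return out

def correct(prev_word, word, alpha=0.5):
    """Pick the candidate maximizing an interpolated unigram/bigram score."""
    def score(w):
        uni = UNIGRAM.get(w, 0)
        bi = BIGRAM.get((prev_word, w), 0)
        return alpha * uni + (1 - alpha) * bi
    return max(candidates(word), key=score)
```

For example, after "so" the mistyped "qe" (q is adjacent to w) would be corrected toward the higher-probability "we".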

An alternative method for smartphone input using AR markers

  • Kang, Yuna;Han, Soonhung
    • Journal of Computational Design and Engineering
    • /
    • v.1 no.3
    • /
    • pp.153-160
    • /
    • 2014
  • As smartphones have recently come into wide use, they have become popular not only among young people but among middle-aged people as well. Most smartphones adopt a capacitive full touch screen, so touch commands are made with fingers, unlike the PDAs of the past that used touch pens. In this case, a significant portion of the smartphone's screen is blocked by the finger, making it impossible to see the area around the finger touching the screen; this causes difficulties in making precise inputs. To solve this problem, this research proposes a method of using simple AR markers to improve the smartphone interface. A marker is placed in front of the smartphone camera. The camera image of the marker is then analyzed, and the position of the marker is used as the position of the mouse cursor. This method can support the click, double-click, and drag-and-drop used on PCs as well as the touch, slide, and long-touch input used on smartphones. Through this approach, smartphone input can be made more precise and simple, showing the possibility of a new concept of smartphone interface.
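The core mapping above, from the marker's position in the camera frame to a cursor position, can be sketched as a simple scaling. This is hypothetical: the paper's actual marker detector and calibration are not given in the abstract, so the linear scaling below is illustrative only.

```python
# Hypothetical sketch: scale a detected marker's camera-frame position
# to screen coordinates, used as the mouse-cursor position.

def marker_to_cursor(marker_x, marker_y, frame_w, frame_h,
                     screen_w, screen_h):
    """Map marker pixel coordinates in a frame of size (frame_w, frame_h)
    to cursor coordinates on a screen of size (screen_w, screen_h)."""
    return (marker_x / frame_w * screen_w,
            marker_y / frame_h * screen_h)
```

Because the marker, not the finger, drives the cursor, the target area on screen is no longer occluded by the touching finger.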

Improvement of Smartphone Interface Using AR Marker (AR 마커를 이용한 스마트폰 인터페이스의 개선)

  • Kang, Yun-A;Han, Soon-Hung
    • Korean Journal of Computational Design and Engineering
    • /
    • v.16 no.5
    • /
    • pp.361-369
    • /
    • 2011
  • As smartphones have recently come into wide use, they have become popular not only among young people but among middle-aged people as well. Most smartphones use a capacitive full touch screen, so touch commands are made with fingers, unlike the PDAs of the past that used touch pens. In this case, a significant portion of the smartphone's screen is blocked by the finger, making it impossible to see the area around the finger touching the screen, and precise control of small buttons such as a QWERTY keyboard becomes difficult. To solve this problem, this research proposes a method of using simple AR markers to improve the smartphone interface. A sticker-type marker is attached to the fingernail and placed in front of the smartphone camera. The camera image of the marker is then analyzed to determine the marker's orientation: depending on the marker's angle of rotation, the state is perceived as a mouse onPress() or onRelease(), and the marker's position is used as the position of the mouse cursor. This method can support the click, double-click, and drag-and-drop used on PCs as well as the touch, slide, and long-touch input used on smartphones. Through this approach, smartphone input can be made more precise and simple, showing the possibility of a new concept of smartphone interface.
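The angle-based press/release decision described above can be sketched as a threshold on the marker's rotation angle. The threshold value below is an assumed illustrative number; the abstract does not state the angle the paper uses.

```python
# Hypothetical sketch: interpret the marker's rotation angle as a mouse
# event. A tilted marker maps to onPress(), a near-flat one to onRelease().

PRESS_THRESHOLD_DEG = 30.0  # assumed threshold, not from the paper

def press_state(roll_deg):
    """Return the mouse event name implied by the marker's rotation angle."""
    return "onPress" if abs(roll_deg) >= PRESS_THRESHOLD_DEG else "onRelease"
```

Drag-and-drop then emerges naturally: the marker stays rotated (onPress) while it moves, and rotating it back releases (onRelease) at the drop position.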

Two-Point Touch Enabled 3D Touch Pad (2개의 터치인식이 가능한 3D 터치패드)

  • Lee, Yong-Min;Han, Chang Ho
    • Journal of the Korea Academia-Industrial cooperation Society
    • /
    • v.18 no.4
    • /
    • pp.578-583
    • /
    • 2017
  • This paper presents a 3D touch pad technology that uses force touch sensors as a next-generation input method for mobile applications. 3D touch technology requires detecting the location and pressure of touches simultaneously, as well as supporting multi-touch. We used metal foil strain gauges as the touch recognition sensors and detected the weak touch signals with a Wheatstone bridge circuit at each strain gauge. We also developed a touch recognition system that amplifies the touch signals, converts them to digital data through a microprocessor, and displays the data on a screen. In software, we designed a touch recognition algorithm in C that is capable of recognizing a two-point touch and differentiating touch pressures. We carried out a successful experiment displaying two touch signals with different forces and locations on a screen.
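The signal chain above can be sketched end to end: the small-strain quarter-bridge approximation is standard, but the excitation voltage, gauge factor, and noise threshold below are illustrative assumptions rather than the paper's values (the paper's algorithm is written in C; Python is used here for brevity).

```python
# Sketch of reading touch forces from per-sensor strain gauges in a
# Wheatstone quarter-bridge. Constants are illustrative assumptions.

V_EXCITATION = 5.0   # bridge excitation voltage (V), assumed
GAUGE_FACTOR = 2.0   # typical value for metal foil gauges

def bridge_output(strain):
    """Differential bridge voltage, small-strain quarter-bridge approximation:
    V_out = V_ex * GF * strain / 4."""
    return V_EXCITATION * GAUGE_FACTOR * strain / 4.0

def strain_from_voltage(v_out):
    """Invert the bridge equation to recover strain from the measured signal."""
    return 4.0 * v_out / (V_EXCITATION * GAUGE_FACTOR)

def classify_touches(strains, threshold=1e-6):
    """Return (sensor_index, strain) pairs above the noise threshold; two
    entries indicate a two-point touch, and the strain magnitudes
    differentiate touch pressures."""
    return [(i, s) for i, s in enumerate(strains) if s > threshold]

# Example: sensors 1 and 3 are pressed, sensor 3 harder than sensor 1.
touches = classify_touches([0.0, 2e-4, 0.0, 5e-4])
```

Per-sensor bridges make the two touch points independent, which is what allows simultaneous location and pressure readout.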

Input Device of Non-Touch Screen Using Vision (비전을 이용한 비접촉 스크린 입력장치)

  • Seo, Hyo-Dong;Joo, Young-Hoon
    • The Transactions of The Korean Institute of Electrical Engineers
    • /
    • v.60 no.10
    • /
    • pp.1946-1950
    • /
    • 2011
  • This paper deals with an input device that requires no physical touch. Existing touch screens have problems such as weak durability caused by frequent contact and high cost caused by a complex hardware configuration. In this paper, a non-touch input device is proposed to overcome these problems. The proposed method detects skin color using the HCbCr color model and obtains the hand region with a labeling technique. In addition, a skeleton model is employed to improve the recognition performance for hand motion. Finally, the experimental results show the applicability of the proposed method.
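The skin-detection and labeling steps above can be sketched as a per-pixel chroma threshold followed by connected-component extraction. This is a hypothetical sketch: the paper's HCbCr thresholds are not given in the abstract, so the Cb/Cr ranges below are common illustrative values for skin chroma.

```python
from collections import deque

# Illustrative Cb/Cr skin ranges; the paper's HCbCr thresholds are unknown.
def is_skin(cb, cr):
    """Classify a pixel as skin by its chroma components."""
    return 77 <= cb <= 127 and 133 <= cr <= 173

def largest_region(mask):
    """4-connected labeling: return the pixel set of the largest True region,
    a stand-in for the hand-region extraction step."""
    h, w = len(mask), len(mask[0])
    seen, best = set(), set()
    for y in range(h):
        for x in range(w):
            if mask[y][x] and (y, x) not in seen:
                region, queue = set(), deque([(y, x)])
                seen.add((y, x))
                while queue:  # breadth-first flood fill
                    cy, cx = queue.popleft()
                    region.add((cy, cx))
                    for ny, nx in ((cy-1, cx), (cy+1, cx),
                                   (cy, cx-1), (cy, cx+1)):
                        if (0 <= ny < h and 0 <= nx < w
                                and mask[ny][nx] and (ny, nx) not in seen):
                            seen.add((ny, nx))
                            queue.append((ny, nx))
                if len(region) > len(best):
                    best = region
    return best
```

Keeping only the largest component suppresses small skin-colored false positives before the skeleton model is applied.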

The Mouse & Keyboard Control Application based on Smart Phone (스마트 폰 기반의 마우스와 키보드 제어 어플리케이션)

  • Lim, Yang Mi
    • Journal of Korea Multimedia Society
    • /
    • v.20 no.2
    • /
    • pp.396-403
    • /
    • 2017
  • In recent years, the use of touch screens has expanded, and devices such as remote controllers have been developed in various ways to control and access content at a distance. Beyond TV remote control, wireless touch-screen control is used in classrooms, seminar rooms, and remote video conversation. The purpose of this study is to design a smartphone-based intuitive interface that can serve as a wireless mouse and wireless keyboard over Bluetooth, and to develop an application that integrates the functions of a mouse and a keyboard. First, a touch interaction model was studied for controlling software such as PowerPoint by connecting a smartphone to an ordinary PC. The simplest touch operation interface is used so as to reproduce the functions of existing devices with a simpler design. While extending interfaces with various functions is important, the development of interfaces optimized for users will become more important in the future; in this sense, this study is valuable.
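An application like the one above needs a compact message format between the phone and the PC. The sketch below is hypothetical: the paper does not specify its wire protocol, so the opcodes and field layout are illustrative only.

```python
import struct

# Hypothetical 5-byte message format for a Bluetooth serial channel:
# one opcode byte followed by two signed 16-bit arguments (little-endian).
OP_MOUSE_MOVE, OP_MOUSE_CLICK, OP_KEY = 1, 2, 3
_FMT = "<Bhh"

def pack_event(opcode, arg1=0, arg2=0):
    """Phone side: encode a touch gesture or key press as a message."""
    return struct.pack(_FMT, opcode, arg1, arg2)

def unpack_event(payload):
    """PC side: decode a message and feed it to the local cursor/keyboard."""
    return struct.unpack(_FMT, payload)

# A touch-move of (dx, dy) = (12, -7) becomes one 5-byte message.
msg = pack_event(OP_MOUSE_MOVE, 12, -7)
```

Fixed-size messages keep framing trivial on the serial link, which suits low-latency cursor updates.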