• Title/Summary/Keyword: Touch screen Interaction

Formation of Metal Mesh Electrodes via Laser Plasmonic Annealing of Metal Nanoparticles for Application in Flexible Touch Sensors (금속 나노 파티클의 레이저 플라즈모닉 어닐링을 통한 메탈메쉬 전극 형성과 이를 활용한 유연 터치 센서)

  • Seongmin Jeong;Yun Sik Hwang;Yu Mi Woo;Yong Jun Cho;Chan Hyeok Kim;Min Gi An;Ho Seok Seo;Chan Hyeon Yang;Kwi-Il Park;Jung Hwan Park
    • Journal of the Korean Institute of Electrical and Electronic Material Engineers / v.37 no.2 / pp.223-229 / 2024
  • Laser-induced plasmonic sintering of metal nanoparticles (NPs) holds significant promise as a technology for producing flexible conducting electrodes. This method offers an immediate, straightforward, and scalable manufacturing approach, eliminating the need for expensive facilities and intricate processes. Nevertheless, the metal NPs themselves are costly because of the intricate synthesis procedures required to ensure long-term reliability, namely chemical stability and the prevention of NP aggregation. Herein, we induced the self-generation of metal nanoparticles from an Ag organometallic ink and fabricated highly conductive electrodes on flexible substrates through laser-assisted plasmonic annealing. To demonstrate the practicality of the fabricated flexible electrode, it was patterned as a metal mesh, realizing a multi-touchable flexible touch screen panel.

Interaction Technique in Smoke Simulations using Mouth-Wind on Mobile Devices (모바일 디바이스에서 사용자의 입 바람을 이용한 연기 시뮬레이션의 상호작용 방법)

  • Kim, Jong-Hyun
    • Journal of the Korea Computer Graphics Society / v.24 no.4 / pp.21-27 / 2018
  • In this paper, we propose a real-time interaction method that uses the user's mouth wind on a mobile device. User interaction technology is important in mobile and virtual reality, yet the variety of available user interface methods is still limited; most existing interaction techniques rely on touch-screen input or motion recognition. In this study, we propose an interface technology that enables real-time interaction using the user's mouth wind. The direction of the wind is determined from the angle and position between the user and the mobile device, and its strength is determined from the intensity of the user's blowing. To show the effectiveness of the proposed technique, we visualize the flow of the vector field in real time by integrating the mouth-wind interface into a Navier-Stokes smoke simulation. We demonstrate the results on mobile devices, but the method can also be applied in augmented reality (AR) and virtual reality (VR) applications that require such interface technology.
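
The abstract does not give implementation details, but the interaction it describes, deriving a wind direction from the user/device geometry and a wind strength from the blowing intensity, then injecting that as an external force into the smoke solver, can be sketched roughly as below. All names (`estimate_wind`, the use of microphone amplitude as a proxy for blowing strength, and the grid-based force injection) are illustrative assumptions, not the authors' code.

```python
import math
import numpy as np

def estimate_wind(user_pos, device_pos, mic_amplitude, max_speed=2.0):
    """Hypothetical mapping from user/device geometry and blowing intensity to a 2D wind vector.

    user_pos, device_pos: (x, y) positions used to derive the blowing direction.
    mic_amplitude: 0..1 proxy for how hard the user blows (assumed to come from the microphone).
    """
    dx, dy = device_pos[0] - user_pos[0], device_pos[1] - user_pos[1]
    angle = math.atan2(dy, dx)                      # direction from the user toward the device
    speed = max_speed * min(max(mic_amplitude, 0.0), 1.0)
    return speed * math.cos(angle), speed * math.sin(angle)

def inject_wind(velocity, wind, region, dt):
    """Add the wind as an external (body) force over a rectangular region of the velocity grid,
    the role the mouth wind would play in a Navier-Stokes update."""
    (x0, x1), (y0, y1) = region
    velocity[y0:y1, x0:x1, 0] += wind[0] * dt
    velocity[y0:y1, x0:x1, 1] += wind[1] * dt
    return velocity

# Example: a 64x64 velocity field nudged by a blow detected at amplitude 0.8.
v = np.zeros((64, 64, 2), dtype=np.float32)
wind = estimate_wind(user_pos=(0.0, -0.3), device_pos=(0.0, 0.0), mic_amplitude=0.8)
v = inject_wind(v, wind, region=((20, 44), (0, 8)), dt=1 / 60)
```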

Using Interaction for an Experiential Story 'The Three Little Pigs and Wolf' - for ipad - (인터랙션을 활용한 체험형 동화 '아기 돼지 삼형제와 늑대' - ipad를 중심으로 -)

  • Kim, Hyunhee
    • Design Convergence Study / v.14 no.3 / pp.1-15 / 2015
  • Storytelling, which is part of human nature, has changed over millions of years. The development of technology and media has shaped storytelling into various forms, and with the recent spread of smart devices the influence of interactive storytelling has grown significantly. Technologies that allow diverse and natural user input have transformed the listener into a user and allowed that user to 'experience' a story rather than simply 'hear' it. In line with these developments, this study designs and implements an interactive tale for children on the iPad platform. Focusing on the interaction aspect, the story is aimed mainly at children aged 3-7 and contains various multimedia and interaction elements that use built-in capabilities, such as multi-touch and the microphone, to accept user input that aligns with the context of the story. Centered on children's experience and empathy with the characters, 'Three Little Pigs and the Wolf' consists of 22 steps and was published in the iTunes Store.
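
The abstract mentions microphone input that fits the story context (for example, blowing along with the wolf). A minimal sketch of how such a blow event might be detected from microphone samples is given below; the RMS-threshold approach and all names are assumptions for illustration, not the published implementation.

```python
import numpy as np

def is_blowing(samples, threshold=0.15):
    """Return True when the RMS level of a microphone buffer exceeds a threshold,
    a simple (assumed) way to treat sustained loud, noisy input as 'blowing'."""
    samples = np.asarray(samples, dtype=np.float32)
    rms = float(np.sqrt(np.mean(samples ** 2)))
    return rms > threshold

# Example: a buffer of noise loud enough to count as a blow.
buffer = 0.3 * np.random.randn(2048).astype(np.float32)
print(is_blowing(buffer))
```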

A Study on Smart Touch Projector System Technology Using Infrared (IR) Imaging Sensor (적외선 영상센서를 이용한 스마트 터치 프로젝터 시스템 기술 연구)

  • Lee, Kuk-Seon;Oh, Sang-Heon;Jeon, Kuk-Hui;Kang, Seong-Soo;Ryu, Dong-Hee;Kim, Byung-Gyu
    • Journal of Korea Multimedia Society / v.15 no.7 / pp.870-878 / 2012
  • The recent rapid development of computer and sensor technologies has produced various kinds of user interface (UI) technologies based on user experience (UX). In this study, we investigate and develop a smart touch projector system based on an IR sensor and image processing. In the proposed system, the user controls the computer through control events derived from the gestures of an IR pen used as an input device. From the IR image, we extract the movement (or gesture) of the pen and track it to recognize the gesture pattern. In addition, to correct the error between the coordinate systems of the image sensor and the display device (projector), we propose a coordinate correction algorithm that improves the accuracy of operation. With this system, a next-generation form of human-computer interaction, the user can trigger events on the connected computer directly on the projected screen without manipulating the computer itself.
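
The abstract names a coordinate correction algorithm between the IR camera and the projector but does not spell it out. A common way to implement such a correction is a planar homography estimated from a few calibration points; the sketch below, using OpenCV's `findHomography` and `perspectiveTransform`, is an assumed illustration of that idea, not the authors' algorithm.

```python
import cv2
import numpy as np

# Calibration: the user touches known projector-space targets with the IR pen,
# and we record where each touch appears in the IR camera image.
camera_pts = np.array([[102, 85], [530, 92], [518, 410], [96, 402]], dtype=np.float32)
projector_pts = np.array([[0, 0], [1280, 0], [1280, 800], [0, 800]], dtype=np.float32)

# Estimate the planar mapping from camera coordinates to projector coordinates.
H, _ = cv2.findHomography(camera_pts, projector_pts)

def camera_to_projector(point, H):
    """Map an IR-pen detection from camera coordinates into projector (screen) coordinates."""
    src = np.array([[point]], dtype=np.float32)          # shape (1, 1, 2) as OpenCV expects
    dst = cv2.perspectiveTransform(src, H)
    return float(dst[0, 0, 0]), float(dst[0, 0, 1])

print(camera_to_projector((300, 250), H))
```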

DEVS Modeling for Interactive Motion-based Mobile Contents Authoring Tool (모바일 기기 환경의 인터렉티브 모션 기반 콘텐츠 개발 도구와 DEVS 모델링)

  • Ju, Seunghwan;Choi, Yohan;Lim, Yongsoo;Seo, Heesuk
    • Journal of Korea Society of Digital Industry and Information Management / v.11 no.2 / pp.121-129 / 2015
  • Interactive media is a form of communication in which the media's output depends on the user's input; it responds to the user and works through the user's participation. The media still serves the same purpose, but the user's input adds interaction and brings engaging features to the system for greater enjoyment. Digital content that exploits the dynamic motion and gestures of a mobile device is therefore needed. We built an authoring tool that lets content producers easily create such interactive content, taking advantage of interaction through the touch screen and the gravity sensor of the mobile device. This interaction encourages the user to participate in the content and can serve as a key mechanism for supporting engagement. Furthermore, our authoring tool can be applied to various fields of content publishing.
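
The abstract describes interactive content driven by the touch screen and the gravity sensor but gives no implementation detail. The sketch below shows one assumed way an authoring tool's runtime could map gravity-sensor tilt and touch input to the motion of a content object; the class and parameter names are purely illustrative.

```python
from dataclasses import dataclass

@dataclass
class MotionObject:
    """A content element whose position reacts to device tilt and touch."""
    x: float = 0.0
    y: float = 0.0
    sensitivity: float = 120.0   # pixels per second per unit of tilt (assumed scale)

    def update(self, gravity_x: float, gravity_y: float, dt: float) -> None:
        # Gravity readings are assumed normalized to [-1, 1] per axis (device tilt).
        self.x += self.sensitivity * gravity_x * dt
        self.y += self.sensitivity * gravity_y * dt

    def on_touch(self, touch_x: float, touch_y: float) -> None:
        # A touch snaps the object to the touched position.
        self.x, self.y = touch_x, touch_y

# Example: 0.5 s of tilting to the right, then a touch repositions the object.
obj = MotionObject()
for _ in range(30):
    obj.update(gravity_x=0.4, gravity_y=0.0, dt=1 / 60)
obj.on_touch(200.0, 150.0)
```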

Trends and Implications of Digital Transformation in Vehicle Experience and Audio User Interface (차내 경험의 디지털 트랜스포메이션과 오디오 기반 인터페이스의 동향 및 시사점)

  • Kim, Kihyun;Kwon, Seong-Geun
    • Journal of Korea Multimedia Society / v.25 no.2 / pp.166-175 / 2022
  • Digital transformation is driving many changes in daily life and industry, and the automobile industry is no exception. In some cases, element technologies from areas referred to as the metaverse are also being adopted, such as 3D-animated digital cockpits, around-view monitors, and voice AI. With the growth of the mobile market, the norm of human-computer interaction (HCI) has evolved from keyboard-and-mouse interaction to the touch screen. The core area has been the graphical user interface (GUI), and recently the audio user interface (AUI) has partially replaced it. Because an AUI is easy to access and intuitive for the user, it is quickly becoming a common element of the in-vehicle experience (IVE) in particular. The benefits of an AUI include freeing the driver's eyes and hands, requiring fewer screens, lowering interaction costs, feeling more emotional and personal, and being effective for people with low vision. Nevertheless, deciding when and where to apply a GUI or an AUI calls for different approaches, because some information is easier to process visually, while in other cases an AUI may be more suitable. This study proposes actively applying AUIs in the near future, based on the context of the various scenes that occur in the vehicle, in order to improve the IVE.

Development of a Tiled Display GOCI Observation Satellite Imagery Visualization System (타일드 디스플레이 천리안 해양관측 위성 영상 가시화 시스템 개발)

  • Park, Chan-sol;Lee, Kwan-ju;Kim, Nak-hoon;Lee, Sang-ho;Seo, Ki-young;Park, Kyoung Shin
    • Proceedings of the Korean Institute of Information and Communication Sciences Conference / 2013.10a / pp.641-642 / 2013
  • This research implemented a Geostationary Ocean Color Imager (GOCI) satellite imagery visualization system on a large high-resolution tiled display. The system is designed to help users observe and analyze satellite imagery more effectively on the tiled display using multi-touch and Kinect motion gesture recognition. We developed a multi-scale image loading and rendering technique that manages the large memory footprint of GOCI satellite imagery and keeps rendering smooth on the tiled display. In this system, the satellite images for the selected date are displayed sequentially on the screen in time order. Users can zoom in, zoom out, pan the imagery, and select buttons to trigger functions using either multi-touch or Kinect gesture interaction.
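
The abstract mentions a multi-scale image loading and rendering technique for managing memory on the tiled display but does not detail it. The sketch below illustrates the general idea of choosing a pyramid level from the zoom factor and decoding only the tiles visible on a given display node; the tile size, file layout, and use of Pillow are assumptions, not the paper's implementation.

```python
import math
from PIL import Image

TILE = 512  # assumed tile size in pixels at every pyramid level

def choose_level(zoom, max_level):
    """Pick the coarsest pyramid level that still matches the current zoom factor."""
    return max(0, min(max_level, max_level - int(math.floor(math.log2(max(zoom, 1e-6))))))

def visible_tiles(viewport, level):
    """Yield (level, col, row) for tiles overlapping this node's viewport (x, y, w, h), in level pixels."""
    x, y, w, h = viewport
    for row in range(y // TILE, (y + h) // TILE + 1):
        for col in range(x // TILE, (x + w) // TILE + 1):
            yield level, col, row

def load_tile(level, col, row):
    """Assumed on-disk layout tiles/{level}/{col}_{row}.png; only visible tiles are decoded."""
    return Image.open(f"tiles/{level}/{col}_{row}.png")

# Example: one tiled-display node showing a 1920x1080 window at 4x zoom.
level = choose_level(zoom=4.0, max_level=6)
tiles = list(visible_tiles((2048, 1024, 1920, 1080), level))
```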

Spatial Augmented Reality for Educational Content Display System (공간 증강 현실 기반 교육용 콘텐츠 전시 시스템)

  • Kim, Jung-Hoon;Lee, Young-Bo;Kim, Ki-Hyun;Yun, Tae-Soo;Lee, Dong-Hoon
    • Proceedings of the HCI Society of Korea Conference / 2008.02a / pp.790-794 / 2008
  • This paper proposes an educational content display system based on spatial augmented reality with a multi-touch screen, aiming at effective learning through intuitive interaction. Existing display systems can hardly be expected to produce major learning effects because user-system interaction is achieved only through buttons or switches. In contrast, the proposed system enables more effective interaction through the user's movement rather than buttons or switches. It uses a spatial augmented reality method to display images, thereby attracting users' attention, and it supports the effective delivery of information by providing more realistic visual images.

Development and Evaluation of the V-Catch Vision System

  • Kim, Dong Keun;Cho, Yongjoo;Park, Kyoung Shin
    • Journal of the Korea Society of Computer and Information / v.27 no.3 / pp.45-52 / 2022
  • A tangible sports game is an exercise game that uses sensors or cameras to track the user's body movements and give a sense of reality. Recently, VR indoor sports room systems have been installed in schools to use tangible sports games for physical activity; however, these systems primarily rely on screen-touch interaction. In this research, we developed the V-Catch Vision system, which uses AI image recognition to track user movements in three-dimensional space rather than relying on two-dimensional wall-touch interaction. We also conducted a usability evaluation to investigate the exercise effects of the system, quantifying them by measuring blood oxygen saturation, real-time ECG heart rate variability, and the user's body movement and joint-angle changes from the Kinect skeleton. The results showed a statistically significant increase in heart rate and an increase in the amount of body movement when using the V-Catch Vision system. In the subjective evaluation, most subjects found exercising with the system fun and satisfying.
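
The abstract reports quantifying exercise by the amount of body movement and joint-angle change computed from the Kinect skeleton. One assumed way to compute such metrics is to sum the frame-to-frame displacement of the tracked joints and measure angles between joint triplets, as sketched below; the joint layout and aggregation are illustrative, not the paper's exact measures.

```python
import numpy as np

def movement_amount(skeleton_frames):
    """Sum of per-joint displacement between consecutive frames.

    skeleton_frames: array of shape (num_frames, num_joints, 3) of Kinect joint positions (meters).
    Returns total movement in meters, a simple (assumed) proxy for physical activity.
    """
    frames = np.asarray(skeleton_frames, dtype=np.float32)
    step = np.linalg.norm(np.diff(frames, axis=0), axis=2)   # (num_frames - 1, num_joints)
    return float(step.sum())

def joint_angle(a, b, c):
    """Angle at joint b (degrees) formed by joints a-b-c, e.g. shoulder-elbow-wrist."""
    v1, v2 = np.asarray(a) - np.asarray(b), np.asarray(c) - np.asarray(b)
    cosang = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2) + 1e-9)
    return float(np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0))))

# Example: 3 frames of a 25-joint skeleton with small random motion.
frames = np.cumsum(0.01 * np.random.randn(3, 25, 3), axis=0)
print(movement_amount(frames), joint_angle([0, 0, 0], [0.3, 0, 0], [0.3, 0.3, 0]))
```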

A Gesture Interface based on Hologram and Haptics Environments for Interactive and Immersive Experiences (상호작용과 몰입 향상을 위한 홀로그램과 햅틱 환경 기반의 동작 인터페이스)

  • Pyun, Hae-Gul;An, Haeng-A;Yuk, Seongmin;Park, Jinho
    • Journal of Korea Game Society / v.15 no.1 / pp.27-34 / 2015
  • This paper proposes a user interface that enhances immersiveness and usability by combining a hologram and a haptic device with a common Leap Motion controller. While the Leap Motion delivers the physical motion of the user's hand to control a virtual environment, it is limited to manipulating virtual hands on a screen and interacting with the virtual environment in only one direction. In our system, a hologram is coupled with the Leap Motion to improve immersiveness by placing the real and virtual hands in the same location. Moreover, we provide a prototype of sensory interaction by designing a haptic device that conveys the touch sensations of the virtual environment to the user's hand.
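
The abstract describes coupling Leap Motion hand tracking with a haptic device that conveys touch when the virtual hand contacts a virtual object. A rough sketch of that coupling logic is given below; the contact test, the pulse-intensity mapping, and the `HapticDevice` interface are all assumed for illustration and do not describe the authors' hardware.

```python
from dataclasses import dataclass

@dataclass
class Sphere:
    cx: float
    cy: float
    cz: float
    radius: float

class HapticDevice:
    """Placeholder for a device driver; a real system would send the pulse to hardware."""
    def pulse(self, intensity: float) -> None:
        print(f"haptic pulse: {intensity:.2f}")

def penetration_depth(finger_tip, obj: Sphere) -> float:
    """How far (in mm) the fingertip has entered the virtual sphere; 0 if not touching."""
    dx, dy, dz = (finger_tip[0] - obj.cx, finger_tip[1] - obj.cy, finger_tip[2] - obj.cz)
    dist = (dx * dx + dy * dy + dz * dz) ** 0.5
    return max(0.0, obj.radius - dist)

def update_haptics(finger_tip, obj: Sphere, device: HapticDevice, max_depth=20.0) -> None:
    depth = penetration_depth(finger_tip, obj)
    if depth > 0.0:
        device.pulse(min(depth / max_depth, 1.0))   # deeper contact -> stronger feedback

# Example: a fingertip position (mm, Leap Motion-style coordinates) touching a virtual ball.
update_haptics((10.0, 195.0, 5.0), Sphere(0.0, 200.0, 0.0, 30.0), HapticDevice())
```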