• Title/Summary/Keyword: User-Computer Interface

Trends and Implications of Digital Transformation in Vehicle Experience and Audio User Interface (차내 경험의 디지털 트랜스포메이션과 오디오 기반 인터페이스의 동향 및 시사점)

  • Kim, Kihyun;Kwon, Seong-Geun
    • Journal of Korea Multimedia Society / v.25 no.2 / pp.166-175 / 2022
  • Digital transformation is driving many changes in daily life and industry, and the automobile industry is no exception. Element technologies associated with the metaverse are already being adopted, such as the 3D-animated digital cockpit, around view, and voice AI. Through the growth of the mobile market, the norm of human-computer interaction (HCI) has evolved from keyboard-and-mouse interaction to the touch screen. The core area has been the graphical user interface (GUI), and recently the audio user interface (AUI) has partially replaced the GUI. Because it is easy to access and intuitive for the user, the AUI is quickly becoming a common part of the in-vehicle experience (IVE) in particular. The benefits of an AUI include freeing the driver's eyes and hands, requiring fewer screens, lowering interaction cost, being more emotional and personal, and being effective for people with low vision. Nevertheless, when and where to apply a GUI or an AUI calls for different approaches, because some information is easier to process visually, while in other cases an AUI may be more suitable. This study proposes actively applying an AUI in the near future, based on the context of the various scenes that occur in the vehicle, in order to improve the IVE.
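As a minimal sketch of the context-dependent GUI/AUI choice the abstract argues for, something like the following rule could route a message to one modality or the other (the scene attributes, thresholds, and function name are illustrative assumptions, not taken from the paper):

```python
def choose_modality(scene: dict) -> str:
    """Pick GUI or AUI for an in-vehicle message from a hypothetical scene context."""
    # Spatial information (a map, an around-view image) is easier to see than to hear.
    if scene.get("info_is_spatial") and not scene.get("driving"):
        return "GUI"
    # While the vehicle is moving fast, keep the driver's eyes and hands free.
    if scene.get("driving") and scene.get("speed_kmh", 0) > 30:
        return "AUI"
    # Accessibility: prefer audio for low-vision users.
    if scene.get("driver_low_vision"):
        return "AUI"
    return "GUI"


if __name__ == "__main__":
    print(choose_modality({"driving": True, "speed_kmh": 80}))           # AUI
    print(choose_modality({"driving": False, "info_is_spatial": True}))  # GUI
```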

A Study on Gesture Interface through User Experience (사용자 경험을 통한 제스처 인터페이스에 관한 연구)

  • Yoon, Ki Tae;Cho, Eel Hea;Lee, Jooyoup
    • Asia-pacific Journal of Multimedia Services Convergent with Art, Humanities, and Sociology / v.7 no.6 / pp.839-849 / 2017
  • Recently, the role of the kitchen has evolved from a space merely for survival into a space that reflects present-day life and culture. Along with these changes, the use of IoT technology is spreading, and as a result new smart devices for the kitchen are being developed and diffused. The user experience of using these smart devices is also becoming important. For natural interaction between a user and a computer, better interactions can be expected based on context awareness. This paper examines a Natural User Interface (NUI) that requires no touching of the device, based on the user interface (UI) of smart devices used in the kitchen. In this method, image processing technology is used to recognize the user's hand gesture through the camera attached to the device, and the recognized hand shape is applied to the interface. The gestures used in this study are proposed according to the user's context and situation, and five kinds of gestures are classified and used in the interface.
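The abstract describes recognizing the user's hand shape from the device camera but gives no implementation details; a rough sketch of one common approach, skin-color segmentation followed by convexity-defect counting, is shown below (the color range, defect threshold, and gesture labels are assumptions, and the OpenCV 4 return signature of findContours is assumed):

```python
import cv2
import numpy as np

# Hypothetical mapping from counted finger gaps to interface gestures.
GESTURES = {0: "fist", 1: "point", 2: "two-fingers", 4: "open-hand", 5: "open-hand"}

def classify_hand(frame: np.ndarray) -> str:
    """Very rough gesture classifier: skin mask -> largest contour -> convexity defects."""
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, (0, 30, 60), (20, 150, 255))      # naive skin-color range
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return "none"
    hand = max(contours, key=cv2.contourArea)                 # assume the hand is the largest blob
    hull = cv2.convexHull(hand, returnPoints=False)
    defects = cv2.convexityDefects(hand, hull)
    fingers = 0
    if defects is not None:
        for s, e, f, depth in defects[:, 0]:
            if depth > 10000:                                  # deep defects roughly mark finger gaps
                fingers += 1
    return GESTURES.get(fingers, "unknown")

if __name__ == "__main__":
    cap = cv2.VideoCapture(0)          # camera attached to the (kitchen) device
    ok, frame = cap.read()
    if ok:
        print(classify_hand(frame))
    cap.release()
```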

Multiple Object-Based Design Model for Quality Improvement of User Interface (사용자 인터페이스 품질 향상을 위한 다중 객체 기반 설계 모델)

  • Kim Jeong-Ok;Lee Sang-Young
    • The KIPS Transactions:PartD / v.12D no.7 s.103 / pp.957-964 / 2005
  • With the rapid growth of the web environment, user interface design needs to support complex interactions between human and computer. In this paper we suggest an object modeling method for quality improvement of the user interface. We propose four business-event object modeling phases, namely business event object modeling, task object modeling, transaction object modeling, and form object modeling, to enhance the visual cohesion of the UI. These four phases allow us to enhance the visual cohesion of a user interface prototype. We have found that the visual cohesion of business events becomes strong and that even an unskilled designer can develop a quality user interface prototype. The method also improves understanding of the business task and reduces iterations of prototype system development.
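The abstract names four modeling phases without showing their artifacts; purely as a hypothetical illustration, the phase outputs could be represented as nested objects like the sketch below (all class and field names are assumptions, not the paper's notation):

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class BusinessEvent:            # phase 1: business event object modeling
    name: str
    tasks: List["Task"] = field(default_factory=list)

@dataclass
class Task:                     # phase 2: task object modeling
    name: str
    transactions: List["Transaction"] = field(default_factory=list)

@dataclass
class Transaction:              # phase 3: transaction object modeling
    name: str
    fields: List[str] = field(default_factory=list)

@dataclass
class Form:                     # phase 4: form object modeling
    title: str
    transactions: List[Transaction] = field(default_factory=list)

    def visual_groups(self) -> List[List[str]]:
        """Group fields by transaction so related fields appear together on the form
        (one crude reading of 'visual cohesion')."""
        return [t.fields for t in self.transactions]

if __name__ == "__main__":
    order = Transaction("PlaceOrder", ["item", "quantity", "price"])
    pay = Transaction("Pay", ["card number", "expiry"])
    print(Form("Checkout", [order, pay]).visual_groups())
```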

Stereo-Vision-Based Human-Computer Interaction with Tactile Stimulation

  • Yong, Ho-Joong;Back, Jong-Won;Jang, Tae-Jeong
    • ETRI Journal / v.29 no.3 / pp.305-310 / 2007
  • If a virtual object in a virtual environment represented by a stereo vision system could be touched by a user with some tactile feeling on his/her fingertip, the sense of reality would be heightened. To create a visual impression as if the user were directly pointing to a desired point on a virtual object with his/her own finger, we need to align virtual space coordinates and physical space coordinates. Also, if there is no tactile feeling when the user touches a virtual object, the virtual object would seem to be a ghost. Therefore, a haptic interface device is required to give some tactile sensation to the user. We have constructed such a human-computer interaction system in the form of a simple virtual reality game using a stereo vision system, a vibro-tactile device module, and two position/orientation sensors.
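The paper does not publish code; a minimal sketch of the two ingredients the abstract describes, mapping a tracked fingertip into virtual-space coordinates and deriving a vibro-tactile intensity from contact, might look like the following (the calibration transform, the spherical virtual object, and the intensity rule are assumptions):

```python
import numpy as np

def align_physical_to_virtual(p_phys: np.ndarray, T: np.ndarray) -> np.ndarray:
    """Map a fingertip position from physical-sensor coordinates into virtual-space
    coordinates using a calibrated 4x4 homogeneous transform T (assumed known)."""
    p = np.append(p_phys, 1.0)
    return (T @ p)[:3]

def tactile_intensity(fingertip_v: np.ndarray, obj_center: np.ndarray, obj_radius: float) -> float:
    """Return a vibration intensity in [0, 1]: zero outside the (spherical) virtual
    object, growing with penetration depth once the fingertip is inside."""
    depth = obj_radius - np.linalg.norm(fingertip_v - obj_center)
    return float(np.clip(depth / obj_radius, 0.0, 1.0))

if __name__ == "__main__":
    T = np.eye(4)                       # identity stands in for a real calibration result
    fingertip = align_physical_to_virtual(np.array([0.02, 0.0, 0.48]), T)
    print(tactile_intensity(fingertip, obj_center=np.array([0.0, 0.0, 0.5]), obj_radius=0.05))
```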

Design and Implementation of 3D Visualization System for Real-time Environmental Sensor Data (실시간 환경 센서 데이터의 3차원 시각화 시스템 설계 및 구현)

  • Kim, KyeongOg;Ban, KyeongJin;Ryu, NamHoon;Kim, EungKon
    • Proceedings of the Korea Contents Association Conference / 2007.11a / pp.783-787 / 2007
  • Although data analysis in earlier days could be done sufficiently with a character user interface, today's users are more accustomed to the graphical user interface, and the requirements for the user interface are gradually becoming more varied and demanding. In order to meet users' various wants and needs and to develop a well-equipped interface, not only software developers but also professional designers who can complement the developers' techniques are needed. In reality, however, there are many restrictions and difficulties when developers and designers try to work together. Moreover, developing a user interface with 3D graphics and animation techniques raises the development cost. This paper designs and implements a 3D visualization of real-time sensor data collected by various environmental sensors and measuring devices, using WPF (Windows Presentation Foundation), which lets developers and designers work together cooperatively and makes it possible to implement various multimedia functions such as 2D and 3D graphics, animation, and acoustic effects.
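The system described is built with WPF/C#; purely to illustrate the same pattern of polling sensors and redrawing a 3D view, here is a small Python/matplotlib sketch with fabricated sensor names and values (it is not the paper's implementation):

```python
import random
import matplotlib.pyplot as plt
from mpl_toolkits.mplot3d import Axes3D  # noqa: F401  (registers the 3D projection)

def read_sensors() -> dict:
    """Stand-in for the real environmental sensors; returns fake readings."""
    return {"temperature": 20 + random.random() * 5,
            "humidity": 40 + random.random() * 20,
            "co2": 400 + random.random() * 100}

def draw_frame(ax, readings: dict) -> None:
    """Draw one 3D bar per sensor; redraw on each new batch of readings."""
    ax.clear()
    names = list(readings)
    xs = list(range(len(names)))
    ax.bar3d(xs, [0] * len(names), [0] * len(names),
             dx=0.5, dy=0.5, dz=list(readings.values()))
    ax.set_xticks(xs)
    ax.set_xticklabels(names)
    ax.set_zlabel("value")

if __name__ == "__main__":
    fig = plt.figure()
    ax = fig.add_subplot(projection="3d")
    for _ in range(10):                 # ten "real-time" updates
        draw_frame(ax, read_sensors())
        plt.pause(0.5)                  # brief pause stands in for the sensor polling interval
```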

A Study on Interaction of Gaze-based User Interface in Mobile Virtual Reality Environment (모바일 가상현실 환경에서의 시선기반 사용자 인터페이스 상호 작용에 관한 연구)

  • Kim, Mingyu;Lee, Jiwon;Jeon, Changyu;Kim, Jinmo
    • Journal of the Korea Computer Graphics Society / v.23 no.3 / pp.39-46 / 2017
  • This study proposes a gaze-based user interface to provide user-oriented interaction suitable for the virtual reality environment on mobile platforms. For this purpose, mobile platform-based three-dimensional interactive content is produced to test whether the proposed interface increases user satisfaction through interactions in a mobile virtual reality environment. The gaze-based interface, the most common input method for mobile virtual reality content, is designed by considering two factors: the field of view and the feedback system. The performance of the proposed gaze-based interface is analyzed by conducting experiments on whether it motivates user interest, enhances immersion, offers a format differentiated from existing ones, and is convenient for operating the content.
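The abstract says the interface is designed around the field of view and a feedback system but gives no code; a common pattern for gaze input in mobile VR is dwell-time selection with progress feedback, sketched below (the dwell threshold, class, and reticle feedback are assumptions, not necessarily the paper's design):

```python
import time

DWELL_SECONDS = 1.5          # assumed dwell time before a gazed object is "clicked"

class GazeSelector:
    """Dwell-based gaze selection: an object is selected after the user's gaze stays
    on it continuously for DWELL_SECONDS, with progress exposed for feedback
    (e.g., a filling ring reticle)."""

    def __init__(self):
        self._target = None
        self._since = 0.0

    def update(self, gazed_object, now=None):
        now = time.monotonic() if now is None else now
        if gazed_object != self._target:        # gaze moved: restart the dwell timer
            self._target, self._since = gazed_object, now
            return None, 0.0
        progress = min((now - self._since) / DWELL_SECONDS, 1.0)
        selected = self._target if progress >= 1.0 and self._target is not None else None
        return selected, progress               # progress drives the visual feedback

if __name__ == "__main__":
    sel = GazeSelector()
    print(sel.update("menu_button", now=0.0))   # (None, 0.0)   just started looking
    print(sel.update("menu_button", now=1.0))   # (None, ~0.67) feedback ring filling
    print(sel.update("menu_button", now=2.0))   # ('menu_button', 1.0) selected
```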

Design and Implementation of an Internet Auction Agent System using Scheduling for Auto-bidding Policy (자동 입찰정책 스켈줄링을 이용한 인터넷 경매 에이전트 시스템 설계 및 구현)

  • Lee, Jong-Hui;Kim, Tae-Seok;Lee, Geun-Wang;O, Hae-Seok
    • The Transactions of the Korea Information Processing Society / v.7 no.5S / pp.1620-1628 / 2000
  • Existing internet auction systems award the goods posted on the auction board to the bidder who proposes the highest price. However, they do not provide convenient one-step automatic processing: the user must continuously monitor the auction to bid and check results, and the user interface for participating in electronic bidding is inconvenient. The AuctionBot system of the University of Michigan is known as a representative internet auction system. It has the merit of allowing various kinds of bidding, but it does not support an automatic process in which an agent replaces what the user has to do. To improve this kind of user interface, we propose an automatic bidding-price strategy algorithm that composes the auction system with an agent, and we design an auto-bidding system in which the agent replaces the user's bidding activity. This mechanism gives the auction system an efficient bidding-price strategy.
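The abstract does not spell out the bidding-policy algorithm; as a hedged illustration of what a scheduled auto-bidding agent can look like, the sketch below raises the bid by a fixed increment whenever the user is outbid, up to a user-set maximum (the function names, increment policy, and scheduling loop are assumptions, not the paper's strategy):

```python
import time

def auto_bid(get_highest_bid, place_bid, my_max: float, increment: float,
             check_interval_s: float = 60.0, rounds: int = 10) -> float:
    """Simple proxy-bidding loop: on a schedule, check the current highest bid and,
    if the agent has been outbid, raise its bid by one increment, never exceeding
    the user's maximum."""
    my_bid = 0.0
    for _ in range(rounds):
        highest = get_highest_bid()
        if highest >= my_max:
            break                                   # price exceeded the user's limit
        if highest > my_bid:                        # someone outbid us: raise within the limit
            my_bid = min(highest + increment, my_max)
            place_bid(my_bid)
        time.sleep(check_interval_s)
    return my_bid

if __name__ == "__main__":
    current = {"price": 100.0}

    def place(p: float) -> None:
        current["price"] = p
        print("agent bids", p)

    final = auto_bid(lambda: current["price"], place,
                     my_max=150.0, increment=10.0, check_interval_s=0.0, rounds=5)
    print("final bid:", final)
```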

Intelligent Control Interface for Display Power Response to a User's Activity (사용자 활동 상태에 반응하는 지능형 디스플레이 전원 제어 인터페이스)

  • Baek, Jong-Hun;Yun, Byoung-Ju
    • Journal of the Institute of Electronics Engineers of Korea SP / v.47 no.2 / pp.61-68 / 2010
  • With the growth of mobile devices such as PDAs and cellular phones, a user can utilize various digital contents anywhere and anytime. However, mobile devices have limited resources and interaction mechanisms. This paper introduces a scheme for estimating the user's activity, and an application of it, in order to overcome the poor user interface and the limited resources. The user activity estimation proposed in this paper supplements the limited user interface of mobile devices, and its application is an intelligent control interface that switches the display power on or off and can thereby use the battery of the mobile device effectively.
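The estimation scheme itself is not detailed in the abstract; as a rough sketch of the overall idea, user activity inferred from touch and motion gating the display power, see below (the timeout, motion threshold, and platform callback are assumptions):

```python
import time

IDLE_TIMEOUT_S = 10.0        # assumed idle time before the display is switched off
MOTION_THRESHOLD = 0.3       # assumed motion magnitude treated as "active"

class DisplayPowerController:
    """Toggle display power from a crude user-activity estimate: the user counts as
    active whenever a touch event arrives or device motion exceeds a threshold, and
    the display is powered off after IDLE_TIMEOUT_S without activity."""

    def __init__(self, set_display_power):
        self._set_power = set_display_power           # callback into the platform
        self._last_active = time.monotonic()
        self._on = True

    def report(self, touched: bool, motion_magnitude: float, now: float = None) -> None:
        now = time.monotonic() if now is None else now
        if touched or motion_magnitude > MOTION_THRESHOLD:
            self._last_active = now
            if not self._on:
                self._on = True
                self._set_power(True)                 # user came back: wake the display
        elif self._on and now - self._last_active > IDLE_TIMEOUT_S:
            self._on = False
            self._set_power(False)                    # idle: save battery

if __name__ == "__main__":
    ctrl = DisplayPowerController(lambda on: print("display", "on" if on else "off"))
    ctrl.report(touched=True, motion_magnitude=0.0, now=0.0)    # active: display stays on
    ctrl.report(touched=False, motion_magnitude=0.0, now=20.0)  # idle for 20 s -> "display off"
    ctrl.report(touched=False, motion_magnitude=0.8, now=21.0)  # device picked up -> "display on"
```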

Interaction Technique in Smoke Simulations using Mouth-Wind on Mobile Devices (모바일 디바이스에서 사용자의 입 바람을 이용한 연기 시뮬레이션의 상호작용 방법)

  • Kim, Jong-Hyun
    • Journal of the Korea Computer Graphics Society / v.24 no.4 / pp.21-27 / 2018
  • In this paper, we propose a real-time interaction method that uses the user's mouth wind on a mobile device. In mobile and virtual reality, user interaction technology is important, but the variety of user interface methods is still lacking. Most interaction technologies rely on hand touch, screen touch, or motion recognition. In this study, we propose an interface technology that allows real-time interaction using the wind blown from the user's mouth. The direction of the wind is determined from the angle and position between the user and the mobile device, and the strength of the wind is calculated from the magnitude of the user's mouth wind. To show the superiority of the proposed technique, we present results that visualize the flow of the vector field in real time by coupling the mouth-wind interface to the Navier-Stokes equations. We show the results on mobile devices, but the method can also be applied in the augmented reality (AR) and virtual reality (VR) fields, which require such interface technology.
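A full Navier-Stokes solver is beyond the scope here; the sketch below only illustrates the coupling step the abstract describes, turning the user/device angle and blow strength into a force and adding it to a grid velocity field (grid size, affected region, and scaling are assumptions):

```python
import numpy as np

def mouth_wind_force(angle_rad: float, blow_amplitude: float, max_force: float = 5.0) -> np.ndarray:
    """Turn the user/device geometry and the measured blow strength into a 2D force
    vector: direction from the angle between user and device, magnitude from the
    microphone amplitude (both mappings are illustrative assumptions)."""
    strength = min(blow_amplitude, 1.0) * max_force
    return strength * np.array([np.cos(angle_rad), np.sin(angle_rad)])

def apply_wind(velocity: np.ndarray, force: np.ndarray, center, radius: int, dt: float) -> None:
    """Add the wind force to the grid velocity field inside a small circular region,
    the usual way an external force enters a grid-based fluid solver."""
    h, w, _ = velocity.shape
    cy, cx = center
    ys, xs = np.ogrid[:h, :w]
    inside = (ys - cy) ** 2 + (xs - cx) ** 2 <= radius ** 2
    velocity[inside] += force * dt

if __name__ == "__main__":
    vel = np.zeros((64, 64, 2))                       # toy 64x64 velocity grid
    f = mouth_wind_force(angle_rad=np.pi / 4, blow_amplitude=0.8)
    apply_wind(vel, f, center=(32, 8), radius=4, dt=0.1)
    print("mean speed after one blow:", np.linalg.norm(vel, axis=2).mean())
```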