• Title/Summary/Keyword: Natural User Interface

Search Results: 224

Gesture based Natural User Interface for e-Training

  • Lim, C.J.;Lee, Nam-Hee;Jeong, Yun-Guen;Heo, Seung-Il
    • Journal of the Ergonomics Society of Korea
    • /
    • v.31 no.4
    • /
    • pp.577-583
    • /
    • 2012
  • Objective: This paper describes the process and results of developing a gesture recognition-based natural user interface (NUI) for a vehicle maintenance e-Training system. Background: E-Training refers to education and training that builds and improves the capabilities needed to perform tasks by using information and communication technology (simulation, 3D virtual reality, and augmented reality), devices (PC, tablet, smartphone, and HMD), and environments (wired/wireless internet and cloud computing). Method: Palm movement captured by a depth camera is used as a pointing device, and finger movement, extracted with the OpenCV library, serves as the selection protocol. Results: The proposed NUI allows trainees to control objects, such as cars and engines, on a large screen through gesture recognition. It also includes a learning environment for understanding the procedure for assembling or disassembling certain parts. Conclusion: Future work concerns extending the gesture recognition technology to multiple trainees. Application: The results of this interface can be applied not only in e-Training systems but also in other systems, such as digital signage, tangible games, and 3D content control.
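
The depth-camera pointing step can be sketched in a few lines. This is a simplified NumPy stand-in for the paper's OpenCV pipeline: the depth band, frame size, and threshold values are illustrative assumptions, and the real system extracts finger contours for selection rather than a bare centroid.

```python
import numpy as np

def palm_pointer(depth, near=500, far=800):
    """Return the (row, col) centroid of pixels inside the hand depth band.

    Assumes `depth` is a 2-D array of millimetre readings from a depth
    camera; the band [near, far) crudely segments the hand from the
    background, standing in for the OpenCV contour step in the paper.
    """
    mask = (depth >= near) & (depth < far)
    if not mask.any():
        return None  # no hand in view
    rows, cols = np.nonzero(mask)
    return (rows.mean(), cols.mean())

# Synthetic frame: background at 2000 mm, a 'hand' patch at 600 mm.
frame = np.full((120, 160), 2000)
frame[40:60, 70:90] = 600
print(palm_pointer(frame))  # centroid of the hand patch: (49.5, 79.5)
```

Moving the centroid frame-to-frame yields the pointer trajectory; a separate finger-count change would then act as the "click".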

User Interface in Web Based Communication for Internet Robot Control

  • Sugisaka, Masanori;Hazry, Desa
    • Institute of Control, Robotics and Systems (ICROS): Conference Proceedings
    • /
    • 2005.06a
    • /
    • pp.49-51
    • /
    • 2005
  • Robot control involves advanced programming and scientific, high-technology skills. The systematic and methodological aspects of robot control often result in superficial control-design problems that can negatively affect the robot's application, usability, and appeal. A user-friendly robot control interface is therefore extremely advantageous and more attractive. For example, a medical robot is usually handled by clients who have little background in advanced programming languages, so it would be difficult for such a client to control the robot through a programming language; it is better to present the robot control through a meaningful interface, which makes the robot application more natural and user friendly. This paper describes a method of developing the user interface for web-based communication to control an internet robot named Tarou. The web-based communication involves three levels: the client sending commands to the robot through the internet, the robot receiving the commands sent by the client, and the robot sending feedback on the status of the commands back to the client. The methodology can be elaborated in four hierarchical steps: identify user needs and robot tasks, identify the enhancing tag reference used by the server, embed the tags into HTML, and present the HTML as an attractive user interface serving as the client control panel.
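
The three communication levels can be modeled as a toy round trip. The robot name Tarou comes from the paper, but the command vocabulary and class layout here are illustrative assumptions; the actual system runs over HTTP with an HTML control panel.

```python
from dataclasses import dataclass, field

@dataclass
class WebRobot:
    """Toy model of the three communication levels: the client sends a
    command, the robot receives and executes it, and feedback on the
    command's status is returned to the client."""
    log: list = field(default_factory=list)

    def receive(self, command):                 # level 2: robot receives
        self.log.append(command)
        return self.execute(command)

    def execute(self, command):
        known = {"forward", "back", "left", "right", "stop"}
        status = "ok" if command in known else "unknown command"
        return {"command": command, "status": status}  # level 3: feedback

def client_send(robot, command):                # level 1: client sends
    return robot.receive(command)

robot = WebRobot()
print(client_send(robot, "forward"))  # {'command': 'forward', 'status': 'ok'}
```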

A Study of Process of Interaction Design on User Interface Development (사용자 인터페이스 개발을 위한 인터렉션 디자인 프로세스에 관한 연구)

  • 양승무
    • Archives of design research
    • /
    • v.14
    • /
    • pp.193-207
    • /
    • 1996
  • Until recently, industrial designers designed the product packaging of electronic media, and user-interface designers focused primarily on software design. But as computer-related technology continues to evolve, this distinction has become ambiguous. As a natural consequence of these trends, industrial designers are getting more involved in user-interface design to develop interactive objects. Moreover, in the future, the whole information of a product (product packaging, marketing, training materials, and support, as well as hardware and software) will be regarded as the product, and the user's entire experience, including the material elements of the product, will be taken as a factor of the user interface, expanding the category of user-interface design. The importance of interface design and its applications within industrial design has been stressed in recent years; however, research on practical processes in interface-design development has rarely been established. This paper aims to develop an interaction design process for useful user-interface design through theoretical study and analysis of actual user-interface designs.

Game Analysis for Next Generation User Interface based on Storytelling (차세대 스토리텔링기반 유저 인터페이스를 위한 게임분석)

  • Lee, Dae-Young;Kim, Seon-Ju;Yu, Hui-Beom;Won, Sun-Jin;Sung, Jung-Hwan
    • The HCI Society of Korea: Conference Proceedings
    • /
    • 2008.02b
    • /
    • pp.534-539
    • /
    • 2008
  • Nowadays, the main target of UI (user interface) design is broadening from convenience-oriented interfaces to satisfaction-oriented interfaces. In this study, we attempted a 'storytelling user interface' to fulfill user satisfaction. We analyzed UI in games and clarified its effectiveness in conveying main content and special features to its audience. The most important element in a storytelling UI is 'inherent drama'. This was demonstrated through analysis of Blizzard's 'Diablo', Sierra's 'Homeworld', Eden Games' 'Test Drive Unlimited', EA's 'Black And White', and Sony Entertainment's 'Eye of Judgment'. We divided the storytelling elements into three parts: mood design, which shows the story's background and traits; natural control, which lets the user feel like the hero of the content; and flow directing, which uses animation, visual design, sound, and so on. Via these elements, we can make the best use of backgrounds, icons, animations, and every structure and flow in games. Finally, we embodied the three elements in practice to improve the interface effectively.

A Study on Implementing Kinect-Based Control for LCD Display Contents (LCD Display 설비 Contents의 Kinect기반 동작제어 기술 구현에 관한 연구)

  • Rho, Jungkyu
    • The Transactions of The Korean Institute of Electrical Engineers
    • /
    • v.63 no.4
    • /
    • pp.565-569
    • /
    • 2014
  • Recently, various kinds of new computer-controlled devices have been introduced in a wide range of areas, and convenient user interfaces for controlling these devices are strongly needed. To implement natural user interfaces (NUIs) on top of such devices, new technologies like the touch screen, Wii Remote, wearable interfaces, and Microsoft Kinect have been presented. This paper presents a natural and intuitive gesture-based model for controlling the contents of an LCD display. The Microsoft Kinect sensor and its SDK are used to recognize human gestures, and the gestures are interpreted into corresponding commands to be executed. A command dispatch model is also proposed in order to handle the commands more naturally. The proposed interface is expected to be usable in various fields, including display content control.
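
A minimal sketch of a command dispatch model of the kind described, assuming gesture and command names that are purely illustrative (the paper does not publish its command set):

```python
class CommandDispatcher:
    """Maps recognized gesture names to display-control commands and
    executes them in order, keeping a history for later inspection."""

    def __init__(self):
        self.handlers = {}
        self.history = []

    def register(self, gesture, handler):
        self.handlers[gesture] = handler

    def dispatch(self, gesture):
        handler = self.handlers.get(gesture)
        if handler is None:
            return None            # unrecognized gesture: ignore quietly
        result = handler()
        self.history.append((gesture, result))
        return result

d = CommandDispatcher()
d.register("swipe_left", lambda: "next_slide")
d.register("swipe_right", lambda: "prev_slide")
print(d.dispatch("swipe_left"))  # next_slide
```

Decoupling recognition from execution this way is what lets the same gesture set drive different display contents.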

Human-Computer Natural User Interface Based on Hand Motion Detection and Tracking

  • Xu, Wenkai;Lee, Eung-Joo
    • Journal of Korea Multimedia Society
    • /
    • v.15 no.4
    • /
    • pp.501-507
    • /
    • 2012
  • Human body motion is a non-verbal form of interaction or movement that can involve both the real world and the virtual world. In this paper, we describe a study on a natural user interface (NUI) for human hand motion recognition using RGB color information and depth information from Microsoft's Kinect camera. So that hand tracking and gesture recognition have no major dependency on the work environment, lighting, or the user's skin color, libraries designed for natural interaction were used together with the Kinect device, which provides RGB images of the environment and a depth map of the scene. An improved CamShift tracking algorithm is used to track hand motion; experimental results show that it performs better than the original CamShift algorithm, with higher stability and accuracy.
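
The core of CamShift is a mean-shift step over a back-projection image. The sketch below implements only that step with NumPy on synthetic data; the color-histogram back-projection and the adaptive window resizing of full CamShift (and of the paper's improved variant) are omitted.

```python
import numpy as np

def mean_shift(weights, window, iters=20):
    """Shift a search window to the centroid of the back-projection
    weights under it, repeating until the window stops moving.
    `window` is (row, col, height, width)."""
    r, c, h, w = window
    for _ in range(iters):
        patch = weights[r:r + h, c:c + w]
        total = patch.sum()
        if total == 0:
            break  # nothing to track under the window
        rows, cols = np.mgrid[0:patch.shape[0], 0:patch.shape[1]]
        # Offset from the window centre to the weighted centroid.
        dr = int(np.floor((rows * patch).sum() / total - (h - 1) / 2 + 0.5))
        dc = int(np.floor((cols * patch).sum() / total - (w - 1) / 2 + 0.5))
        if dr == 0 and dc == 0:
            break  # converged
        r, c = r + dr, c + dc
    return (r, c, h, w)

# A bright 'hand' blob at rows 30-39, cols 50-59; window starts nearby.
weights = np.zeros((100, 100))
weights[30:40, 50:60] = 1.0
print(mean_shift(weights, (25, 45, 10, 10)))  # (30, 50, 10, 10): centered on the blob
```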

Motion-based Controlling 4D Special Effect Devices to Activate Immersive Contents (실감형 콘텐츠 작동을 위한 모션 기반 4D 특수효과 장치 제어)

  • Kim, Kwang Jin;Lee, Chil Woo
    • Smart Media Journal
    • /
    • v.8 no.1
    • /
    • pp.51-58
    • /
    • 2019
  • This paper describes a gesture application for controlling the special effects of physical devices for 4D contents using the PWM (pulse width modulation) method. User operations recognized by an infrared sensor are interpreted as commands for 3D content control, several of which manipulate the devices that generate special effects, delivering physical stimuli to the user. With content controlled through an NUI (natural user interface) technique, the user can be directly drawn into an immersive experience, which provides a higher degree of interest and attention. To measure the efficiency of the proposed method, we implemented a PWM-based real-time linear control system that manages the parameters of the motion recognition and animation controllers using the infrared sensor and transmits the events.
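
The PWM idea reduces to mapping a motion parameter onto a duty cycle. A minimal sketch, with the speed range and the 20 ms period as assumptions not taken from the paper:

```python
def motion_to_duty(speed, max_speed=2.0, period_ms=20):
    """Map a recognized motion speed to a PWM duty cycle and pulse width.

    The output strength of a 4D effect device (fan, vibration motor,
    etc.) follows the duty cycle: the fraction of each period the
    control signal is held high.
    """
    duty = max(0.0, min(1.0, speed / max_speed))  # clamp to [0, 1]
    return duty, duty * period_ms                 # fraction, high-time in ms

print(motion_to_duty(1.0))  # (0.5, 10.0): half-power effect
print(motion_to_duty(5.0))  # (1.0, 20.0): clamped to full power
```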

Speech Recognition Interface in the Communication Environment (통신환경에서 음성인식 인터페이스)

  • Han, Tai-Kun;Kim, Jong-Keun;Lee, Dong-Wook
    • Proceedings of the KIEE Conference
    • /
    • 2001.07d
    • /
    • pp.2610-2612
    • /
    • 2001
  • This study examines the recognition of the user's spoken commands based on speech recognition and natural language processing, and develops a natural language interface agent that can analyze the recognized commands. The natural language interface agent consists of a speech recognizer and a semantic interpreter. The speech recognizer understands a spoken command and transforms it into character strings. The semantic interpreter analyzes the character strings and creates the commands and questions to be transferred to the application program. We also consider problems related to the speech recognizer and the semantic interpreter, such as the ambiguity of natural language and the ambiguity and errors arising from the speech recognizer. This kind of natural language interface agent can be applied to telephony environments involving all kinds of communication media, such as telephone, fax, e-mail, and so on.
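
The two-stage pipeline (recognizer, then interpreter) can be illustrated with a toy semantic interpreter. The vocabulary and command format here are invented for illustration; as the abstract notes, a real interpreter must also cope with recognizer errors and natural-language ambiguity.

```python
def semantic_interpret(text):
    """Turn a recognized character string into an application command.

    Keyword spotting stands in for real semantic analysis; an
    unmatched utterance falls back to a clarifying question rather
    than guessing, one simple way to handle ambiguity.
    """
    words = text.lower().split()
    if "read" in words and "mail" in words:
        return {"action": "read", "target": "email"}
    if "send" in words and "fax" in words:
        return {"action": "send", "target": "fax"}
    return {"action": "clarify", "target": None}  # ambiguous: ask the user

# The speech recognizer stage would produce these strings from audio.
print(semantic_interpret("please read my new mail"))
print(semantic_interpret("do the thing"))  # falls back to a clarifying question
```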

Tangible Space Initiative

  • Ahn, Chong-Keun;Kim, Lae-Hyun;Ha, Sung-Do
    • Proceedings of the Korean Society of Precision Engineering Conference
    • /
    • 2004.10a
    • /
    • pp.1053-1056
    • /
    • 2004
  • Research in human-computer interaction (HCI) is moving toward an application environment that can make interactions between humans and computers more intuitive and efficient. This can be achieved by bridging the gap between the synthetic virtual environment and the natural physical environment. To this end, a project called the Tangible Space Initiative (TSI) has been launched by KIST. TSI is subdivided into the Tangible Interface (TI), which controls 3D cyberspace from the user's perspective; the Responsive Cyber Space (RCS), which creates and controls the virtual environment; and the Tangible Agent (TA), which senses and acts upon the physical interface environment on behalf of any component of TSI or the user. This paper is a brief introduction to a new generation of human-computer interfaces that will bring users to a new era of interaction with computers.

Design of Gesture based Interfaces for Controlling GUI Applications (GUI 어플리케이션 제어를 위한 제스처 인터페이스 모델 설계)

  • Park, Ki-Chang;Seo, Seong-Chae;Jeong, Seung-Moon;Kang, Im-Cheol;Kim, Byung-Gi
    • The Journal of the Korea Contents Association
    • /
    • v.13 no.1
    • /
    • pp.55-63
    • /
    • 2013
  • NUIs (natural user interfaces) have developed through CLIs (command line interfaces) and GUIs (graphical user interfaces). An NUI uses many different input modalities, including multi-touch, motion tracking, voice, and stylus. To adopt an NUI in a legacy GUI application, a developer must add device libraries, modify the relevant source code, and debug it. In this paper, we propose a gesture-based interface model that can be applied without modification of existing event-based GUI applications, and we also present an XML schema for the specification of the proposed model. The paper shows how to use the proposed model through a prototype.
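
The approach of routing gestures to ordinary GUI events via an XML specification can be illustrated with a hypothetical instance. The element and attribute names below are assumptions, not the paper's published schema.

```python
import xml.etree.ElementTree as ET

# Hypothetical instance of a gesture-mapping specification: each
# <gesture> is routed to an ordinary GUI event, so the legacy
# application itself never has to change.
SPEC = """
<gestureMap>
  <gesture name="swipe_left" event="KEY_PAGE_DOWN"/>
  <gesture name="swipe_right" event="KEY_PAGE_UP"/>
  <gesture name="push" event="MOUSE_CLICK"/>
</gestureMap>
"""

def load_gesture_map(xml_text):
    """Parse the specification into a gesture-name -> GUI-event dict."""
    root = ET.fromstring(xml_text)
    return {g.get("name"): g.get("event") for g in root.iter("gesture")}

mapping = load_gesture_map(SPEC)
print(mapping["push"])  # MOUSE_CLICK
```

At runtime, a recognizer would look up each detected gesture in the mapping and inject the corresponding event into the application's existing event queue.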