• Title/Summary/Keyword: human and computer interaction

Search Result: 602

Stereo-Vision-Based Human-Computer Interaction with Tactile Stimulation

  • Yong, Ho-Joong;Back, Jong-Won;Jang, Tae-Jeong
    • ETRI Journal
    • /
    • v.29 no.3
    • /
    • pp.305-310
    • /
    • 2007
  • If a virtual object in a virtual environment represented by a stereo vision system could be touched by a user with some tactile feeling on his/her fingertip, the sense of reality would be heightened. To create a visual impression as if the user were directly pointing to a desired point on a virtual object with his/her own finger, we need to align virtual space coordinates and physical space coordinates. Also, if there is no tactile feeling when the user touches a virtual object, the virtual object would seem to be a ghost. Therefore, a haptic interface device is required to give some tactile sensation to the user. We have constructed such a human-computer interaction system in the form of a simple virtual reality game using a stereo vision system, a vibro-tactile device module, and two position/orientation sensors.

  • PDF
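The abstract above hinges on aligning virtual space coordinates with physical space coordinates so the user's real fingertip lands on the virtual object. A minimal sketch of such an alignment is a rigid transform; the yaw-only rotation and the specific offsets below are illustrative assumptions, not the paper's actual calibration from its position/orientation sensors:

```python
import math

def rigid_transform(point, yaw_deg, translation):
    """Map a physical-space point into virtual-space coordinates via a
    rotation about the vertical axis plus a translation. A real
    calibration would estimate the full rotation and offset from the
    position/orientation sensors."""
    x, y, z = point
    c = math.cos(math.radians(yaw_deg))
    s = math.sin(math.radians(yaw_deg))
    xr = c * x - s * z          # rotate in the horizontal plane
    zr = s * x + c * z
    tx, ty, tz = translation
    return (xr + tx, y + ty, zr + tz)

# A fingertip at the physical origin maps to the virtual-space offset.
print(rigid_transform((0.0, 0.0, 0.0), 30.0, (0.1, 0.2, 0.3)))
```

Once every tracked fingertip position is pushed through this transform, the stereo renderer and the vibro-tactile module can agree on when a "touch" occurs.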

Human-Computer Interaction System for the disabled using Recognition of Face Direction (얼굴 주시방향 인식을 이용한 장애자용 의사 전달 시스템)

  • 정상현;문인혁
    • Proceedings of the IEEK Conference
    • /
    • 2001.06d
    • /
    • pp.175-178
    • /
    • 2001
  • This paper proposes a novel human-computer interaction system for the disabled using recognition of face direction. Face direction is recognized by comparing the positions of the centers of gravity of the face region and of facial features such as the eyes and eyebrows. The face region is first selected using color information, and the facial features are then extracted by applying a separation filter to the face region. The processing speed for recognition of face direction is 6.57 frames/sec with a success rate of 92.9%, without any special hardware for image processing. We implement a human-computer interaction system using a screen menu, and show the validity of the proposed method with experimental results.

  • PDF
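The centroid comparison this abstract describes can be sketched in a few lines. The threshold and the left/right convention below are illustrative assumptions; the paper's separation filter and actual decision rule are not reproduced here:

```python
def centroid(pixels):
    """Center of gravity of a set of (x, y) pixel coordinates."""
    xs = [p[0] for p in pixels]
    ys = [p[1] for p in pixels]
    return (sum(xs) / len(pixels), sum(ys) / len(pixels))

def face_direction(face_pixels, feature_pixels, threshold=5.0):
    """Classify gaze direction from the horizontal offset between the
    face-region centroid and the eye/eyebrow centroid. Threshold and
    direction labels are toy choices, not the paper's."""
    fx, _ = centroid(face_pixels)
    ex, _ = centroid(feature_pixels)
    offset = ex - fx
    if offset > threshold:
        return "right"
    if offset < -threshold:
        return "left"
    return "center"
```

In the actual system the face region would come from color segmentation and the feature pixels from the separation filter; here both are passed in directly.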

Human-Computer Interaction Survey for Intelligent Robot (지능형 로봇을 위한 인간-컴퓨터 상호작용(HCI) 연구동향)

  • Hong, Seok-Ju;Lee, Chil-Woo
    • Proceedings of the Korea Contents Association Conference
    • /
    • 2006.11a
    • /
    • pp.507-511
    • /
    • 2006
  • An intelligent robot is defined as a system that makes autonomous judgments based on sensory organs such as sight and hearing, analogously to a human. Humans communicate using nonverbal means such as gesture in addition to language. If a robot understands such nonverbal communication means, it may become more familiar to humans. HCI (Human-Computer Interaction) technologies, including face recognition and gesture recognition, are being studied vigorously, but many problems remain to be solved under real-world conditions. In this paper, we introduce the importance of this field and give application examples of the technology, focusing on recent research results in gesture recognition as one of the most natural means of communication with humans.

  • PDF

Human-Computer Interaction Based Only on Auditory and Visual Information

  • Sha, Hui;Agah, Arvin
    • Transactions on Control, Automation and Systems Engineering
    • /
    • v.2 no.4
    • /
    • pp.285-297
    • /
    • 2000
  • One of the research objectives in the area of multimedia human-computer interaction is the application of artificial intelligence and robotics technologies to the development of computer interfaces. This involves utilizing many forms of media, integrating speech input, natural language, graphics, hand-pointing gestures, and other methods for interactive dialogues. Although current human-computer communication methods include computer keyboards, mice, and other traditional devices, the two basic ways by which people communicate with each other are voice and gesture. This paper reports on research focusing on the development of an intelligent multimedia interface system modeled on the manner in which people communicate. The work explores interaction between humans and computers based only on the processing of speech (words uttered by the person) and the processing of images (hand-pointing gestures). The purpose of the interface is to control a pan/tilt camera to point it to a location specified by the user through the utterance of words and the pointing of the hand. The system utilizes another stationary camera to capture images of the user's hand and a microphone to capture the user's words. Upon processing the images and sounds, the system responds by pointing the camera. Initially, the interface uses hand pointing to locate the general position the user is referring to; it then uses voice commands provided by the user to fine-tune the location and to change the zoom of the camera, if requested. The image of the location is captured by the pan/tilt camera and sent to a color TV monitor to be displayed. This type of system has applications in tele-conferencing and other remote operations, where the system must respond to the user's commands in a manner similar to how the user would communicate with another person. The advantage of this approach is the elimination of the traditional input devices that the user must utilize in order to control a pan/tilt camera, replacing them with more "natural" means of interaction. A number of experiments were performed to evaluate the interface system with respect to its accuracy, efficiency, reliability, and limitations.

  • PDF
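The two-stage control loop in the abstract (hand pointing for the coarse target, voice for fine adjustment) can be sketched as a small state machine. The command vocabulary and step sizes below are assumptions for illustration, not the paper's actual recognizer output:

```python
class PanTiltController:
    """Toy two-stage controller: a hand-pointing estimate sets the
    coarse pan/tilt, then spoken commands nudge the camera and adjust
    zoom. Command names and step sizes are illustrative."""
    STEP = 2.0  # degrees moved per voice command (assumed)

    def __init__(self):
        self.pan, self.tilt, self.zoom = 0.0, 0.0, 1.0

    def point(self, pan_deg, tilt_deg):
        """Coarse stage: aim where the hand-pointing estimate says."""
        self.pan, self.tilt = pan_deg, tilt_deg

    def voice(self, command):
        """Fine stage: small corrections from recognized words."""
        if command == "left":
            self.pan -= self.STEP
        elif command == "right":
            self.pan += self.STEP
        elif command == "up":
            self.tilt += self.STEP
        elif command == "down":
            self.tilt -= self.STEP
        elif command == "zoom in":
            self.zoom *= 1.5

cam = PanTiltController()
cam.point(30.0, -10.0)   # coarse: hand pointing
cam.voice("left")        # fine: spoken adjustment
print(cam.pan, cam.tilt) # 28.0 -10.0
```

The real system would feed `point()` from the stationary camera's gesture tracker and `voice()` from the speech recognizer; only the control policy is shown here.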

Intelligent Emotional Interface for Personal Robot and Its Application to a Humanoid Robot, AMIET

  • Seo, Yong-Ho;Jeong, Il-Woong;Jung, Hye-Won;Yang, Hyun-S.
    • Proceedings of the Institute of Control, Robotics and Systems (ICROS) Conference
    • /
    • 2004.08a
    • /
    • pp.1764-1768
    • /
    • 2004
  • In the near future, robots will be used for personal purposes. To provide useful services to humans, it will be necessary for robots to understand human intentions. Consequently, the development of emotional interfaces for robots is an important expansion of human-robot interaction. We designed and developed an intelligent emotional interface for the robot and applied the interface to our humanoid robot, AMIET. Subsequent human-robot interaction demonstrated that our intelligent emotional interface is very intuitive and friendly.

  • PDF

Emotional Model Focused on Robot's Familiarity to Human

  • Choi, Tae-Yong;Kim, Chang-Hyun;Lee, Ju-Jang
    • Proceedings of the Institute of Control, Robotics and Systems (ICROS) Conference
    • /
    • 2005.06a
    • /
    • pp.1025-1030
    • /
    • 2005
  • This paper deals with the emotional model of a software robot. The software robot requires several capabilities, such as sensing, perceiving, acting, communicating, and surviving. There are already many studies of emotional models, such as those of KISMET and AIBO. A new emotional model using a modified friendship scheme is proposed in this paper. Quite often, the available emotional models have time-invariant human-response architectures. Conventional emotional models make the sociable robot get along with humans and obey human commands during robot operation. This behavior makes the robot very different from real pets. Similar to real pets, the proposed emotional model with the modified friendship capability has a time-varying property that depends on the interaction between human and robot.

  • PDF
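The time-varying property the abstract contrasts with time-invariant models can be sketched as a familiarity state that rises with positive interaction and decays with neglect. The update rule, rates, and clamping below are a simple assumption for illustration, not the paper's modified friendship scheme:

```python
def update_familiarity(f, interaction, rate=0.1, decay=0.02):
    """One step of a toy time-varying familiarity model: positive
    interaction raises familiarity, neglect lets it decay, and the
    value is clamped to [0, 1]. Rates are illustrative assumptions."""
    f = f + rate * interaction - decay
    return max(0.0, min(1.0, f))

f = 0.5
for _ in range(3):                        # three friendly interactions
    f = update_familiarity(f, interaction=1.0)
print(round(f, 2))                        # 0.74
```

Because the same command elicits different behavior at different familiarity levels, the robot's responses change over time, unlike the conventional obey-on-command models the paper criticizes.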

Human-Computer Natural User Interface Based on Hand Motion Detection and Tracking

  • Xu, Wenkai;Lee, Eung-Joo
    • Journal of Korea Multimedia Society
    • /
    • v.15 no.4
    • /
    • pp.501-507
    • /
    • 2012
  • Human body motion is a nonverbal means of interaction or movement that can be used to connect the real world and the virtual world. In this paper, we describe a study on a natural user interface (NUI) for human hand motion recognition using RGB color information and depth information from a Kinect camera by Microsoft. So that hand tracking and gesture recognition would have no major dependence on the work environment, lighting, or the user's skin color, libraries for natural interaction were used together with the Kinect device, which provides RGB images of the environment and a depth map of the scene. An improved Camshift tracking algorithm is used to track hand motion; experimental results show that it performs better than the standard Camshift algorithm, with higher stability and accuracy.
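Camshift is built on mean shift: a search window is repeatedly moved to the centroid of the probability mass (e.g. skin/depth likelihood) it covers, and Camshift then additionally adapts the window's size and orientation. A minimal pure-Python sketch of the mean-shift step on a probability grid (not the paper's improved variant, and without the size adaptation):

```python
def mean_shift(prob, window, iters=10):
    """Move a fixed-size window (x, y, w, h) to the centroid of the
    probability mass it covers, repeating until it stops moving.
    This is the core of Camshift minus the window adaptation."""
    x, y, w, h = window
    for _ in range(iters):
        total = sx = sy = 0.0
        for j in range(y, y + h):
            for i in range(x, x + w):
                p = prob[j][i]
                total += p
                sx += p * i
                sy += p * j
        if total == 0:
            break                      # no mass under the window
        nx = int(sx / total) - w // 2  # recenter on the centroid
        ny = int(sy / total) - h // 2
        if (nx, ny) == (x, y):
            break                      # converged
        x, y = nx, ny
    return (x, y, w, h)

# A skin-probability blob at (12, 8) on a 20x20 grid:
grid = [[0.0] * 20 for _ in range(20)]
grid[8][12] = 1.0
print(mean_shift(grid, (5, 5, 8, 8)))  # window converges onto the blob
```

In a real tracker the grid would be the per-frame back-projected hand-color/depth histogram (e.g. from OpenCV), updated every frame, with bounds checking at the image edges.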

Evaluating the Effectiveness of Nielsen's Usability Heuristics for Computer Engineers and Designers without Human Computer Interaction Background (비 HCI 전공자들을 대상으로 한 Nielsen의 Usability Heuristics에 대한 이해 정도 평가)

  • Jeong, YoungJoo;Sim, InSook;Jeong, GooCheol
    • The Journal of Korean Institute for Practical Engineering Education
    • /
    • v.2 no.2
    • /
    • pp.165-171
    • /
    • 2010
  • Usability heuristics ("heuristics") are general principles for usability evaluation during user interface design. Our ultimate goal is to extend the practice of usability evaluation methods to a wider audience (e.g. user interface designers and engineers) than Human-Computer Interaction (HCI) professionals. To this end, we explored the degree to which Jakob Nielsen's ten usability heuristics are understood by professors and students in design and computer engineering. None of the subjects received formal training in HCI, though some may have had an awareness of some HCI principles. The study identified easy-to-understand heuristics, examined the reasons for the ambiguities in others, and discovered differences between the responses of professors and students to the heuristics. In the course of the study, the subjects showed an increased tendency to think in terms of user-centric design. Furthermore, the findings in this study offer suggestions for improving these heuristics to resolve ambiguities and to extend their practice for user interface designers and engineers.

  • PDF

PERSONAL SPACE-BASED MODELING OF RELATIONSHIPS BETWEEN PEOPLE FOR NEW HUMAN-COMPUTER INTERACTION

  • Amaoka, Toshitaka;Laga, Hamid;Saito, Suguru;Nakajima, Masayuki
    • Proceedings of the Korean Society of Broadcast Engineers Conference
    • /
    • 2009.01a
    • /
    • pp.746-750
    • /
    • 2009
  • In this paper we focus on the Personal Space (PS) as a nonverbal communication concept with which to build a new Human-Computer Interaction. The analysis of people's positions with respect to their PS gives an idea of the nature of their relationship. We propose to analyze and model the PS using Computer Vision (CV), and to visualize it using Computer Graphics. For this purpose, we define the PS based on four parameters: the distance between people, their face orientations, age, and gender. We automatically estimate the first two parameters from image sequences using CV technology, while the other two parameters are set manually. Finally, we calculate the two-dimensional relationship of multiple persons and visualize it as 3D contours in real time. Our method can sense and visualize invisible and unconscious PS distributions and convey the spatial relationship of users through an intuitive visual representation. The results of this paper can be applied to Human-Computer Interaction in public spaces.

  • PDF
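A four-parameter PS model of the kind the abstract describes can be sketched as a per-person radius modulated by age and gender, with two people "related" when their distance and mutual facing fall inside the combined space. All factors, thresholds, and the overlap rule below are assumptions for illustration; the paper's actual parameterization is not reproduced here:

```python
def personal_space_radius(age, gender, base=0.45):
    """Toy PS radius in meters, grown with age (capped at 60) and
    slightly larger for women. Factors are illustrative assumptions."""
    factor = 1.0 + min(age, 60) / 100.0
    if gender == "female":
        factor *= 1.1
    return base * factor

def ps_overlap(dist, facing_angle_deg, a, b):
    """Two people 'interact' when their distance is within the sum of
    their PS radii and they roughly face each other (toy rule).
    a and b are (age, gender) tuples."""
    r = personal_space_radius(*a) + personal_space_radius(*b)
    return dist <= r and abs(facing_angle_deg) <= 30.0
```

In the paper's pipeline, distance and face orientation would come from the CV front end and the resulting distribution would be rendered as 3D contours; this sketch only covers the scalar relationship test.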

Automatic Gesture Recognition for Human-Machine Interaction: An Overview

  • Nataliia, Konkina
    • International Journal of Computer Science & Network Security
    • /
    • v.22 no.1
    • /
    • pp.129-138
    • /
    • 2022
  • With the increasing reliance on computing systems in our everyday life, there is a constant need to improve the ways users can interact with such systems in a more natural, effective, and convenient manner. In the initial computing revolution, interaction between humans and machines was limited; the machines were not necessarily meant to be intelligent. This created the need to develop systems that could automatically identify and interpret our actions. Automatic gesture recognition is one of the popular methods by which users can control systems with their gestures. This includes various kinds of tracking, covering the whole body, hands, head, face, etc. We also touch upon a different line of work, including Brain-Computer Interface (BCI) and Electromyography (EMG), as potential additions to the gesture recognition regime. In this work, we present an overview of several applications of automated gesture recognition systems and a brief look at the popular methods employed.