• Title/Summary/Keyword: Human Interaction Device

Search results: 130

An Interactive Robotic Cane

  • Yoon, Joongsun
    • International Journal of Precision Engineering and Manufacturing / v.5 no.1 / pp.5-12 / 2004
  • A human-friendly interactive system based on the harmonious, symbiotic coexistence of humans and robots is explored. Based on this interactive technology paradigm, a robotic cane is proposed for blind or visually impaired travelers to navigate safely and quickly through obstacles and other hazards faced by blind pedestrians. The proposed robotic cane, "RoJi," consists of a long handle with a button-operated interface and a sensor head unit attached at the distal end of the handle. A series of sensors mounted on the sensor head unit detect obstacles and steer the device around them. The user feels the steering command as a very noticeable physical force through the handle and is able to follow the path of the robotic cane easily and without conscious effort. The issues discussed include methodologies for human-robot interaction, design issues of an interactive robotic cane, and hardware requirements for efficient human-robot interaction.

A Review of Haptic Perception: Focused on Sensation and Application

  • Song, Joobong;Lim, Ji Hyoun;Yun, Myung Hwan
    • Journal of the Ergonomics Society of Korea / v.31 no.6 / pp.715-723 / 2012
  • Objective: The aim of this study is to investigate haptic perception research from three perspectives: cutaneous & proprioceptive sensation, active & passive touch, and cognition & emotion, and then to identify issues for implementing haptic interactions. Background: Although haptic technologies have improved and become practical, more research on methods of application is still needed to realize multimodal interaction technology. A systematic approach to exploring haptic perception is required to understand emotional experience and social messages, as well as tactile feedback. Method: Content analysis was conducted to analyze trends in haptic-related research. Changes in issues and topics were investigated using sensational dimensions and the different contents delivered via tactile perception. Result: The identified research opportunities were haptic perception in various body segments and emotion-related proprioceptive sensation. Conclusion: Understanding the mechanism of how users perceive haptic stimuli will help to develop effective haptic interaction, and this study provides insight into what to focus on for the future of haptic interaction. Application: This research is expected to bring presence and emotional response conveyed by haptic perception to fields such as human-robot, human-device, and telecommunication interaction.

A Qualitative Research on the Structure and Determinants of Personal Device Network in the Ubiquitous Computing Context (유비쿼터스 컴퓨팅 환경에서 PDN의 구조와 결정 요인에 대한 정성적 연구)

  • Jeon Seok-Won;Jang Youn-Sun;Kim Jin-Woo
    • Journal of Information Technology Applications and Management / v.13 no.3 / pp.1-28 / 2006
  • In ubiquitous computing environments, people usually carry multiple information technology devices with them. A personal device network (PDN) refers to the way people connect multiple IT devices for personal as well as professional purposes. Even though it has become quite popular to construct PDNs in the ubiquitous computing context, not much research has been conducted on how people actually connect multiple devices and what influences their methods of connection. In this paper, we conducted a content analysis of community bulletin boards for IT devices and a contextual inquiry with expert users of PDNs to investigate the configurations with which users connect multiple IT devices. Based on the results of the two related studies, we identified three major types of PDN configurations and the key factors that influence them. We conclude this research with guidelines for designing a set of devices for each of the three configuration types.


Development of Co-Interaction Model for Bus Auto-Payment with Beacon based on MDD (모델 주도 개발(MDD) 기반 비콘 사용 버스 요금 자동 결제를 위한 상호작용 모델 개발)

  • Oh, Jung Won;Kim, Hangkon
    • Smart Media Journal / v.5 no.3 / pp.42-48 / 2016
  • Recently, most people have come to carry a second mobile device. Mobile devices are affecting all areas of human life: consumer electronics, transportation, manufacturing, and finance. In this paper, we propose a model-driven development (MDD) based interaction model that can be used in developing mobile payment systems in the financial technology (FinTech) sector, one of the most active application fields of mobile devices. Using an MDD-based platform-independent model (PIM), we propose a model for interaction between devices that can be reused when developing mobile payment apps. To demonstrate reuse of the interaction model, we analyzed a beacon-based automatic bus-fare payment application developed with the proposed approach.

Control and VR Navigation of a Gait Rehabilitation Robot with Upper and Lower Limbs Connections (상하지가 연동된 보행재활 로봇의 제어 및 VR 네비게이션)

  • Novandy, Bondhan;Yoon, Jung-Won
    • Journal of Institute of Control, Robotics and Systems / v.15 no.3 / pp.315-322 / 2009
  • This paper explains a control and navigation algorithm for a 6-DOF gait rehabilitation robot that allows a patient to navigate in virtual reality (VR) through upper and lower limb interactions. In gait rehabilitation robots, an important concern is that the patient should not only follow the robot's motions passively, but also be able to walk by his or her own intention. Thus, this robot updates the walking velocity automatically by estimating the interaction torques between the human and the upper limb device, and by synchronizing the upper limb device to the lower limb device. In addition, the upper limb device acts as a user-friendly input device for navigating in virtual reality. By pushing the switches located at the right and left handles of the upper limb device, a patient is able to perform turning motions while navigating in virtual reality. Through experimental results with a healthy subject, we showed that rehabilitation training can be combined more effectively with virtual environments using upper and lower limb connections. The suggested navigation scheme will allow varied and effective rehabilitation training modes for gait rehabilitation robots.
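The velocity-update idea in this abstract — walking speed adapting to the interaction torque estimated at the upper limb device — can be sketched as a simple admittance-style rule. The gain, limits, and function name below are illustrative assumptions, not the paper's actual controller:

```python
def update_walking_velocity(v, tau_interaction, gain=0.05, v_min=0.0, v_max=1.2):
    """Admittance-style update (illustrative): forward interaction torque
    on the upper-limb handles speeds the gait device up, backward torque
    slows it down, clamped to a safe velocity range [v_min, v_max] (m/s)."""
    v_new = v + gain * tau_interaction
    return max(v_min, min(v_max, v_new))
```

Clamping matters in practice: a patient leaning hard on the handles should saturate the command rather than drive the device past a safe speed.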

Toward Transparent Virtual Coupling for Haptic Interaction during Contact Tasks (컨택트 작업 시 햅틱 인터렉션의 투명성 향상을 위한 Virtual Coupling 기법의 설계)

  • Kim, Myungsin;Lee, Dongjun
    • The Journal of Korea Robotics Society / v.8 no.3 / pp.186-196 / 2013
  • Since its introduction (e.g., [4, 6]), the virtual coupling technique has been the de facto way to connect a haptic device with a virtual proxy for haptic rendering and control. However, because it depends solely on a spring-damper feedback action, virtual coupling suffers from degraded transparency, particularly during contact tasks when large device/proxy forces are involved. In this paper, we propose a novel virtual coupling technique which, by utilizing passive decomposition, reduces device-proxy position deviation even during contact tasks while also scaling down (or up) the apparent inertia of the coordinated device-proxy system. By doing so, we can significantly improve transparency between a multiple-degree-of-freedom (possibly nonlinear) haptic device and a virtual proxy. In order to use passive decomposition, the disturbance observer of [3] is adopted to estimate the human force, with a dead-zone modification to avoid "winding-up" of the force estimate in the presence of device torque saturation. Some preliminary experimental results are given to illustrate the efficacy of the proposed technique.
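The classical spring-damper virtual coupling this paper improves upon can be sketched in a few lines; the gain values and function name here are illustrative, not taken from the paper:

```python
import numpy as np

def virtual_coupling_force(x_dev, v_dev, x_proxy, v_proxy, k=500.0, b=5.0):
    """Classic spring-damper virtual coupling: the force pulling the virtual
    proxy toward the haptic device pose (and, with opposite sign, the feedback
    force rendered to the user). k is stiffness, b is damping."""
    x_dev, v_dev = np.asarray(x_dev, float), np.asarray(v_dev, float)
    x_proxy, v_proxy = np.asarray(x_proxy, float), np.asarray(v_proxy, float)
    return k * (x_dev - x_proxy) + b * (v_dev - v_proxy)

# In free motion the proxy tracks the device, so the deviation (and force)
# stays small; during contact the proxy is blocked by the virtual surface,
# the deviation grows, and transparency degrades -- the issue the paper
# addresses with passive decomposition.
f = virtual_coupling_force([0.10, 0.0], [0.0, 0.0], [0.08, 0.0], [0.0, 0.0])
```

With a 2 cm device-proxy deviation and k = 500 N/m, the rendered force is already 10 N along that axis, which illustrates why large contact deviations dominate the feel of the device.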

Stereo Vision Based 3D Input Device (스테레오 비전을 기반으로 한 3차원 입력 장치)

  • Yoon, Sang-Min;Kim, Ig-Jae;Ahn, Sang-Chul;Ko, Han-Seok;Kim, Hyoung-Gon
    • Journal of the Institute of Electronics Engineers of Korea SP / v.39 no.4 / pp.429-441 / 2002
  • This paper concerns extracting 3D motion information from a 3D input device in real time, with a focus on enabling effective human-computer interaction. In particular, we develop a novel algorithm for extracting 6-degrees-of-freedom motion information from a 3D input device by employing the epipolar geometry of a stereo camera together with color, motion, and structure information, without requiring a camera calibration object. To extract 3D motion, we first determine the epipolar geometry of the stereo camera by computing the perspective projection matrix and the perspective distortion matrix. We then apply the proposed Motion Adaptive Weighted Unmatched Pixel Count algorithm, which performs color transformation, unmatched pixel counting, discrete Kalman filtering, and principal component analysis. The extracted 3D motion information can be applied to controlling virtual objects or to aiding a navigation device that controls the user's viewpoint in a virtual reality setting. Since the stereo vision-based 3D input device is wireless, it provides users with a more natural and efficient interface, effectively realizing a feeling of immersion.
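The discrete Kalman filtering step named in this abstract is a standard smoothing stage for a tracked position; a minimal per-axis constant-velocity version might look like the following (the matrices, noise values, and function name are generic assumptions, not the paper's implementation):

```python
import numpy as np

def kalman_smooth(measurements, dt=1/30, q=1e-3, r=1e-2):
    """Minimal constant-velocity discrete Kalman filter, the kind of
    smoothing applied per axis to a noisy tracked position.
    State = [position, velocity]; only position is observed."""
    F = np.array([[1.0, dt], [0.0, 1.0]])   # state transition (constant velocity)
    H = np.array([[1.0, 0.0]])              # measurement model: observe position
    Q = q * np.eye(2)                       # process noise covariance
    R = np.array([[r]])                     # measurement noise covariance
    x = np.array([[measurements[0]], [0.0]])
    P = np.eye(2)                           # initial state covariance
    out = []
    for z in measurements:
        # predict
        x = F @ x
        P = F @ P @ F.T + Q
        # update with measurement z
        K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)
        x = x + K @ (np.array([[z]]) - H @ x)
        P = (np.eye(2) - K @ H) @ P
        out.append(float(x[0, 0]))
    return out

smoothed = kalman_smooth([0.0, 1.0, 2.0, 3.0, 4.0])
```

The velocity component of the state is what lets the filter track steady motion without lag, which is why constant-velocity models are common in real-time input tracking.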

Implementation of interactive 3D floating image pointing device (인터렉티브 3D 플로팅 영상 포인팅 장치의 구현)

  • Shin, Dong-Hak;Kim, Eun-Soo
    • Journal of the Korea Institute of Information and Communication Engineering / v.12 no.8 / pp.1481-1487 / 2008
  • In this paper, we propose a novel interactive 3D floating image pointing device for use in 3D environments. The proposed system consists of a 3D floating image generation system using a floating lens array and a user interface based on real-time finger detection. In the proposed system, a user selects a single image among the floating images, and the interaction function is performed by triggering a button event through finger recognition using two cameras. To show the usefulness of the proposed system, we carried out experiments, and the preliminary results are presented.

A Study on Vision Based Gesture Recognition Interface Design for Digital TV (동작인식기반 Digital TV인터페이스를 위한 지시동작에 관한 연구)

  • Kim, Hyun-Suk;Hwang, Sung-Won;Moon, Hyun-Jung
    • Archives of design research / v.20 no.3 s.71 / pp.257-268 / 2007
  • The development of human-computer interfaces has relied on the development of technology. Mice and keyboards are the most popular HCI devices for personal computing. However, device-based interfaces are quite different from human-to-human interaction and are very artificial. Developing more intuitive interfaces that mimic human-to-human interaction has been a major research topic among HCI researchers and engineers. Meanwhile, technology in the TV industry has developed rapidly, and the market penetration rate of large-screen TVs has increased quickly. HDTV and digital TV broadcasting are being tested. These changes in the TV environment require changes in the human-to-TV interface. A gesture recognition-based interface with a computer vision system can replace the remote control-based interface because of its immediacy and intuitiveness. This research focuses on how people use their hands or arms for command gestures. A set of gestures for controlling TV settings was sampled through focus group interviews and surveys. The results of this paper can be used as a reference for designing a computer vision-based TV interface.


Simulation of Shot Impact by a Wearable Smart Individual Weapon Mounted on a Forearm (하박 장착용 스마트 개인무장의 발사충격에 의한 인체거동 해석)

  • Koo, Sungchan;Kim, Taekyung;Choi, Minki;Kim, Sanghyun;Choi, Sungho;Lee, Yongsun;Kim, Jay J.
    • Journal of the Korea Institute of Military Science and Technology / v.22 no.6 / pp.806-814 / 2019
  • One of the future weapon systems is the individual smart weapon, which has a structure mounted on a soldier's forearm. The structure may cause injuries or affect the accuracy of fire due to its impact on the joints when shooting. This paper proposes human-impact interaction modeling and a verification methodology to estimate the impact of fire applied to the forearm. For this purpose, a human musculoskeletal model was constructed and the joints' behavior in various shooting positions was simulated. To verify the simulation results, an impact testing device substituting for the smart weapon was made and an experiment was performed on a real human body. This paper compares the simulation results obtained under various impact conditions with the experimental values in terms of accuracy and introduces methods to complement them. The results of the study are expected to be a basis for reliable human-impact interaction modeling and smart individual weapon development.