• Title/Summary/Keyword: Human Interaction Device

130 search results (processing time: 0.032 seconds)

A research on man-robot cooperative interaction system

  • Ishii, Masaru
    • Institute of Control, Robotics and Systems: Conference Proceedings / 1992.10b / pp.555-557 / 1992
  • Realization of an intelligent cooperative interaction system between a human and robot systems has recently become necessary. In this paper, HyperCard with voice control is used for such a system because of its easy handling and excellent human interface. Clicking a button in the HyperCard stack with a mouse, or issuing the corresponding voice command, controls an individual joint of the robot system (a rough sketch of such a command mapping is given after this entry). A robot teaching operation of grasping a bin and pouring the liquid in it into a cup is carried out. This HyperCard-based teaching method provides a foundation for realizing a user-friendly cooperative interaction system.

  • PDF
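
The abstract gives no implementation details, but the basic idea of mapping button clicks or recognized voice commands onto per-joint jog commands can be sketched as follows. This is a minimal illustration only: the command names, jog step, and `send_joint_command` stub are hypothetical and not taken from the original HyperCard system.

```python
# Hypothetical sketch: mapping UI button clicks / voice commands to joint jogs.
# Command names and the robot interface are illustrative, not from the paper.

JOG_STEP_DEG = 5.0  # how far one click or voice command moves a joint

# Each command (button label or recognized voice word) maps to (joint, direction).
COMMANDS = {
    "joint1_plus": (0, +1), "joint1_minus": (0, -1),
    "joint2_plus": (1, +1), "joint2_minus": (1, -1),
    "grasp": ("gripper", +1), "release": ("gripper", -1),
}

def send_joint_command(joint, delta_deg):
    """Stand-in for the serial/network call that would drive the real robot."""
    print(f"move {joint} by {delta_deg:+.1f} deg")

def handle_event(command_name):
    """Called when a button is clicked or a voice command is recognized."""
    if command_name not in COMMANDS:
        print(f"unknown command: {command_name}")
        return
    joint, direction = COMMANDS[command_name]
    send_joint_command(joint, direction * JOG_STEP_DEG)

if __name__ == "__main__":
    # A teaching sequence would simply be a recorded list of such events.
    for cmd in ["joint1_plus", "joint2_minus", "grasp"]:
        handle_event(cmd)
```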

Development of Low-cost 3D Printing Bi-axial Pressure Sensor (저가형 3D프린팅 2축 압력 센서 개발)

  • Choi, Heonsoo;Yeo, Joonseong;Seong, Jihun;Choi, Hyunjin
    • The Journal of Korea Robotics Society / v.17 no.2 / pp.152-158 / 2022
  • As various mobile robots and manipulator robots have been commercialized, robots that individuals can use in their daily lives have begun to appear. With the development of robots that support daily life, the interaction between robots and humans is becoming more important. Manipulator robots that support daily life must perform tasks such as pressing buttons or picking up objects safely, and in many cases this requires expensive multi-axis force/torque sensors to measure the interaction. In this study, we introduce a low-cost two-axis pressure sensor that can be applied to manipulators for education or research. The proposed sensor uses three force-sensitive resistor (FSR) elements, and its structure was fabricated by 3D printing. An experimental device using a load cell was constructed to measure the biaxial pressure. The manufactured prototype was able to distinguish pressures along the ±x and ±y axes.
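
The abstract does not state how the three FSR readings are combined into two axes. One plausible scheme, assumed here purely for illustration, places the sensing elements at known angles around the probe and projects each reading onto the x and y axes; the 120-degree layout and the linear calibration below are assumptions, not the paper's design.

```python
# Hypothetical sketch: resolving a 2-axis pressure direction from three FSR readings.
# The 120-degree sensor layout and calibration constant are assumptions.
import math

SENSOR_ANGLES_DEG = [0.0, 120.0, 240.0]  # assumed placement of the three FSR elements

def fsr_to_force(raw):
    """Convert a raw FSR reading to a force estimate (placeholder linear calibration)."""
    return 0.01 * raw  # N per count, illustrative only

def biaxial_pressure(raw_readings):
    """Project each sensor's force onto the x and y axes and sum the components."""
    fx = fy = 0.0
    for raw, angle_deg in zip(raw_readings, SENSOR_ANGLES_DEG):
        f = fsr_to_force(raw)
        fx += f * math.cos(math.radians(angle_deg))
        fy += f * math.sin(math.radians(angle_deg))
    return fx, fy

if __name__ == "__main__":
    fx, fy = biaxial_pressure([512, 100, 80])
    print(f"estimated load direction: fx={fx:+.2f} N, fy={fy:+.2f} N")
```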

A Study of Hand Gesture Recognition for Human Computer Interface (컴퓨터 인터페이스를 위한 Hand Gesture 인식에 관한 연구)

  • Chang, Ho-Jung;Baek, Han-Wook;Chung, Chin-Hyun
    • Proceedings of the KIEE Conference / 2000.07d / pp.3041-3043 / 2000
  • The GUI (graphical user interface) has been the dominant platform for HCI (human-computer interaction). The GUI-based style of interaction has made computers simpler and easier to use. However, the GUI cannot easily support the natural, intuitive, and adaptive range of interaction that users need. In this paper we study an approach that tracks a hand through an image sequence and recognizes it in each video frame, so that the hand can replace the mouse as a pointing device for virtual reality. An algorithm for real-time processing is proposed that estimates the position of the hand and segments it, considering the orientation of motion and the color distribution of the hand region (a minimal sketch of this kind of pipeline follows this entry).

  • PDF
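
The abstract sketches segmentation by the color distribution of the hand region followed by position estimation. A minimal OpenCV-style version of that idea is shown below; the HSV skin thresholds, camera index, and morphological clean-up are assumptions, not the authors' algorithm.

```python
# Minimal sketch: skin-color segmentation and hand-centroid tracking per frame.
# HSV thresholds and camera index are assumptions; the paper's exact algorithm differs.
import cv2
import numpy as np

LOWER_SKIN = np.array([0, 40, 60], dtype=np.uint8)
UPPER_SKIN = np.array([25, 180, 255], dtype=np.uint8)

def hand_position(frame_bgr):
    """Return the (x, y) centroid of the largest skin-colored region, or None."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, LOWER_SKIN, UPPER_SKIN)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    hand = max(contours, key=cv2.contourArea)
    m = cv2.moments(hand)
    if m["m00"] == 0:
        return None
    return int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"])

if __name__ == "__main__":
    cap = cv2.VideoCapture(0)
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        pos = hand_position(frame)
        if pos is not None:
            cv2.circle(frame, pos, 8, (0, 255, 0), -1)  # would drive the on-screen pointer
        cv2.imshow("hand tracking", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
    cap.release()
    cv2.destroyAllWindows()
```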

Real-time Multi-device Control System Implementation for Natural User Interactive Platform

  • Kim, Myoung-Jin;Hwang, Tae-min;Chae, Sung-Hun;Kim, Min-Joon;Moon, Yeon-Kug;Kim, SeungJun
    • Journal of Internet Computing and Services / v.23 no.1 / pp.19-29 / 2022
  • A natural user interface (NUI) supports natural, motion-based interaction without a specific device or tool such as a mouse, keyboard, or pen. Recently, as non-contact sensor-based interaction technologies for recognizing human motion, gestures, voice, and gaze have been actively studied, an environment has emerged that can provide more diverse content through various interaction methods than existing approaches. However, as the number of sensor devices rapidly increases, a system that uses many sensors can suffer from a lack of computational resources. To address this problem, we propose a real-time multi-device control system for a natural interactive platform. In the proposed system, devices are classified into two types: HC devices, such as high-end commercial sensors, and LC devices, such as traditional low-cost monitoring sensors; a dedicated device manager is adopted for each type so that they can be controlled efficiently. We demonstrate that the proposed system works properly with user behaviors such as gestures, motions, gaze, and voice.
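
The abstract names the HC/LC device split and per-class device managers but does not give an interface. A rough sketch of that split is shown below, with class names, polling rates, and example devices chosen only for illustration.

```python
# Rough sketch of per-class device managers; class names and rates are assumptions.
from dataclasses import dataclass, field

@dataclass
class Device:
    name: str
    kind: str  # "HC" (high-end commercial sensor) or "LC" (low-cost monitoring sensor)

@dataclass
class DeviceManager:
    """Controls all devices of one class at a rate matched to their cost/complexity."""
    kind: str
    poll_hz: float
    devices: list = field(default_factory=list)

    def register(self, device: Device):
        if device.kind == self.kind:
            self.devices.append(device)

    def poll_once(self):
        for d in self.devices:
            print(f"[{self.kind} manager] reading {d.name}")

if __name__ == "__main__":
    hc = DeviceManager(kind="HC", poll_hz=60.0)  # e.g., depth camera, gaze tracker
    lc = DeviceManager(kind="LC", poll_hz=5.0)   # e.g., simple environmental sensors
    for dev in [Device("depth_cam", "HC"), Device("mic_array", "HC"), Device("pir_sensor", "LC")]:
        hc.register(dev)
        lc.register(dev)
    # One scheduling step: the HC manager would run far more often than the LC manager.
    hc.poll_once()
    lc.poll_once()
```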

Techniques of Editing and Reproducing Robot Operation Data for Direct Teaching (직접 교시 작업을 위한 로봇 작업 정보 편집 및 재생산 기법)

  • Kim, Han-Joon;Wang, Young-Jin;Kim, Jin-Oh;Back, Ju-Hoon
    • Journal of the Korean Society for Precision Engineering / v.30 no.1 / pp.96-104 / 2013
  • The study of human-robot interaction is receiving more and more attention as a way to extend robot applications to tasks that are difficult for a robot alone. Developed countries are preparing for a new market by introducing the 'co-robot' model of human-robot interaction. Our research on direct teaching concerns a way of instructing a robot's trajectory by having a human handle its end device. This method is more intuitive than other existing methods. Its benefits include easy and fast teaching even by non-professional workers, and it can enhance the utilization of robots for small-quantity batch production in small and medium-sized enterprises. In this study, we developed algorithms for creating an accurate trajectory from repeated, inaccurate direct teaching, together with a GUI for direct teaching. We also propose a basic framework for direct teaching.
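
The abstract mentions creating an accurate trajectory from repeated, inaccurate teaching but does not state the method. One simple scheme, assumed here for illustration only, resamples each demonstration to a common length and averages pointwise:

```python
# Illustrative sketch: average several noisy demonstrations of the same trajectory.
# Time-normalized resampling plus pointwise averaging is an assumption, not the paper's method.
import numpy as np

def resample(traj, n_points):
    """Resample an (m, d) trajectory to n_points samples by linear interpolation."""
    traj = np.asarray(traj, dtype=float)
    t_old = np.linspace(0.0, 1.0, len(traj))
    t_new = np.linspace(0.0, 1.0, n_points)
    return np.column_stack([np.interp(t_new, t_old, traj[:, k]) for k in range(traj.shape[1])])

def average_demonstrations(demos, n_points=100):
    """Return the pointwise mean of time-normalized demonstrations."""
    return np.mean([resample(d, n_points) for d in demos], axis=0)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    t = np.linspace(0.0, 1.0, 120)
    clean = np.column_stack([t, np.sin(np.pi * t)])          # an ideal 2-D path
    demos = [clean + rng.normal(0.0, 0.01, clean.shape) for _ in range(3)]  # noisy teachings
    smooth = average_demonstrations(demos)
    print("averaged trajectory shape:", smooth.shape)
```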

Tactile Sensation Display with Electrotactile Interface

  • Yarimaga, Oktay;Lee, Jun-Hun;Lee, Beom-Chan;Ryu, Je-Ha
    • Institute of Control, Robotics and Systems: Conference Proceedings / 2005.06a / pp.145-150 / 2005
  • This paper presents an Electrotactile Display System (ETCS). One of the most important human sensory channels for human-computer interaction is the sense of touch, which can be conveyed to humans through tactile output devices. To realize the sense of touch, an electrotactile display produces controlled, localized touch sensations on the skin by passing a small electric current. In electrotactile stimulation, the mechanoreceptors in the skin may be stimulated individually in order to display sensations of vibration, touch, itch, tingle, pressure, etc. on the finger, palm, arm, or any other suitable location of the body by using appropriate electrodes and waveforms. We developed an ETCS and investigated the effectiveness of the proposed system in terms of the perception of surface roughness when stimulating the palmar side of the hand with different waveforms, and the perception of direction and location information through the forearm. Positive and negative pulse trains were tested with different current intensities and electrode switching times on the forearm or finger of the user with an electrode-embedded armband, in order to investigate how subjects recognize displayed patterns and directions of stimulation (a sketch of such a pulse train appears after this entry).

  • PDF
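
The abstract tests positive and negative pulse trains with different current intensities and electrode switching times. A sketch of how such a biphasic waveform could be generated in software is given below; the pulse width, rate, and amplitude values are illustrative assumptions, not the stimulation parameters used in the study.

```python
# Sketch of a biphasic (positive/negative) pulse-train generator for electrotactile tests.
# Pulse width, rate, and amplitude values are illustrative assumptions only.
import numpy as np

def pulse_train(duration_s, rate_hz, pulse_width_s, amplitude_ma,
                sample_rate=100_000, biphasic=True):
    """Return (t, i) arrays: time in seconds and commanded current in mA."""
    t = np.arange(0.0, duration_s, 1.0 / sample_rate)
    i = np.zeros_like(t)
    period = 1.0 / rate_hz
    for k in range(int(duration_s * rate_hz)):
        start = k * period
        pos = (t >= start) & (t < start + pulse_width_s)
        i[pos] = amplitude_ma
        if biphasic:
            neg = (t >= start + pulse_width_s) & (t < start + 2 * pulse_width_s)
            i[neg] = -amplitude_ma  # charge-balancing negative phase
    return t, i

if __name__ == "__main__":
    t, i = pulse_train(duration_s=0.05, rate_hz=200, pulse_width_s=200e-6, amplitude_ma=2.0)
    print(f"{len(t)} samples, peak {i.max():.1f} mA, min {i.min():.1f} mA")
```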

Human-Object Interaction Framework Using RGB-D Camera (RGB-D 카메라를 사용한 사용자-실사물 상호작용 프레임워크)

  • Baeka, Yong-Hwan;Lim, Changmin;Park, Jong-Il
    • Journal of Broadcast Engineering / v.21 no.1 / pp.11-23 / 2016
  • These days, the touch interface is the most widely used way of communicating with digital devices. Because of its usability, touch technology is applied almost everywhere, from watches to advertising boards, and its use keeps growing. However, this technology has a critical weakness: a touch input device normally needs a contact surface with touch sensors embedded in it, so touch interaction with ordinary objects like books or documents is still unavailable. In this paper, a human-object interaction framework based on an RGB-D camera is proposed to overcome this limitation. The proposed framework can deal with occluded situations, such as a hand hovering on top of an object, as well as objects being moved by hand; in such situations, object recognition and hand gesture algorithms alone may fail. Our framework handles these complicated circumstances without performance loss. The framework determines the status of each region with a fast and robust object recognition algorithm, deciding whether it is an object or a human hand, and the hand gesture recognition algorithm then controls the context of each object through gestures almost simultaneously.
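
The abstract describes deciding whether a foreground region is an object or a human hand before dispatching gesture control, but gives no algorithmic detail. The skeleton below illustrates one way such a decision step could look; the depth threshold and crude RGB skin test are assumptions, not the authors' recognition algorithm.

```python
# Skeleton sketch of the hand-vs-object decision step; thresholds and skin test are assumptions.
import numpy as np

SKIN_RATIO_THRESHOLD = 0.4  # fraction of skin-colored pixels above which a region counts as a hand

def skin_mask(rgb_region):
    """Very rough skin test in RGB space (placeholder for a proper color model)."""
    r = rgb_region[..., 0].astype(int)
    g = rgb_region[..., 1].astype(int)
    b = rgb_region[..., 2].astype(int)
    return (r > 95) & (g > 40) & (b > 20) & (r > g) & (r > b)

def classify_region(rgb_region, depth_region, table_depth_mm):
    """Label a region as 'hand', 'object', or 'background' from RGB-D data."""
    above_table = depth_region < table_depth_mm - 10  # closer to the camera than the table
    if above_table.mean() < 0.2:
        return "background"
    ratio = skin_mask(rgb_region)[above_table].mean()
    return "hand" if ratio > SKIN_RATIO_THRESHOLD else "object"

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    rgb = rng.integers(0, 255, size=(64, 64, 3), dtype=np.uint8)   # stand-in color patch
    depth = np.full((64, 64), 800, dtype=np.int32)                 # mm; closer than the table
    print(classify_region(rgb, depth, table_depth_mm=900))
```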

Behavior recognition system based fog cloud computing

  • Lee, Seok-Woo;Lee, Jong-Yong;Jung, Kye-Dong
    • International Journal of Advanced Smart Convergence / v.6 no.3 / pp.29-37 / 2017
  • Current behavior recognition systems do not reconcile the differing data formats of the sensor data measured by each user's sensor module or device. Therefore, data processing, sharing, and collaboration services between users and the behavior recognition system are needed in order to process large volumes of sensor data in heterogeneous formats, and real-time interaction between users and the behavior recognition system is also required. To solve this problem, we propose a fog-cloud-based behavior recognition system for processing human body sensor data. The system resolves the heterogeneity of the sensor data measured by users' sensor modules or devices by having a fog cloud normalize it into the standard format of the DBaaS (Database as a Service) cloud. In addition, placing the fog cloud between users and the central cloud increases the proximity between users and servers, allowing real-time interaction. On this basis, we propose a behavior recognition system that recognizes users' behavior and delivers the results to observers in a collaborative environment. The proposed system addresses server overload caused by large amounts of sensor data and the lack of real-time interaction caused by the distance between users and servers, and it demonstrates a process for delivering behavior recognition services that are consistent and capable of real-time interaction.
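
The abstract's key step is the fog node normalizing heterogeneous sensor records into a standard format before they reach the DBaaS cloud. A minimal sketch of such normalization is shown below; the vendor field names and the standard schema are assumptions made for illustration.

```python
# Sketch of fog-node normalization of heterogeneous sensor records into one standard format.
# The vendor field names and the standard schema below are assumptions for illustration.
from datetime import datetime, timezone

STANDARD_FIELDS = ("device_id", "timestamp", "sensor_type", "value", "unit")

def normalize(record, vendor):
    """Map a vendor-specific record into the standard schema before forwarding to the cloud."""
    if vendor == "vendor_a":  # e.g. {"id": ..., "ts": epoch_ms, "acc": ...}
        return {
            "device_id": record["id"],
            "timestamp": datetime.fromtimestamp(record["ts"] / 1000, tz=timezone.utc).isoformat(),
            "sensor_type": "acceleration",
            "value": record["acc"],
            "unit": "m/s^2",
        }
    if vendor == "vendor_b":  # e.g. {"dev": ..., "time": iso_string, "hr": ...}
        return {
            "device_id": record["dev"],
            "timestamp": record["time"],
            "sensor_type": "heart_rate",
            "value": record["hr"],
            "unit": "bpm",
        }
    raise ValueError(f"unknown vendor: {vendor}")

if __name__ == "__main__":
    raw = {"id": "band-01", "ts": 1700000000000, "acc": 9.81}
    print(normalize(raw, "vendor_a"))  # forwarded to the DBaaS cloud in the common format
```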

Design of Next-Generation Ship Simulator System Using Virtual Reality (가상현실을 이용한 차세대 선박 시뮬레이터의 시스템 설계)

  • 임정빈;박계각
    • Journal of the Korean Society of Marine Environment & Safety / v.6 no.1 / pp.1-9 / 2000
  • This paper describes the system design of a next-generation ship simulator using virtual reality (VRSS), a well-known form of human-computer interaction. A VRSS system must satisfy the special condition of supporting multiple participants, such as the captain, officer, pilot, and quartermaster. To cope with this condition, core technologies were explored and a multi-networking system with a broker server was proposed (a minimal relay sketch follows this entry). The proposed system was evaluated with a PC-based immersive VR device consisting of an HMD (head-mounted display), head tracking sensor, puck, headphones, and microphone. Using the VR device, an assessment test was carried out in a virtual bridge populated with 3D objects created with a VRML (Virtual Reality Modeling Language) program. The tests showed that the virtual 3D objects behaved as if they were real objects on a real ship's bridge, so engaging interaction with the participants can be obtained in the system. Thus, we found that the proposed system architecture is applicable to VRSS system construction.

  • PDF
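
The abstract proposes a multi-networking system with a broker server for the bridge roles but gives no protocol details. The in-memory sketch below shows only the fan-out idea; the topic name and message shape are assumptions.

```python
# In-memory sketch of a broker that relays state updates among simulator participants.
# The topic name and message shape are assumptions; the paper's networking details are not given.
from collections import defaultdict

class Broker:
    def __init__(self):
        self.subscribers = defaultdict(list)  # topic -> list of callback functions

    def subscribe(self, topic, callback):
        self.subscribers[topic].append(callback)

    def publish(self, topic, message):
        """Forward a participant's update to every subscriber of the topic."""
        for callback in self.subscribers[topic]:
            callback(message)

if __name__ == "__main__":
    broker = Broker()
    for role in ("captain", "officer", "pilot", "quartermaster"):
        broker.subscribe("ship_state", lambda msg, who=role: print(f"{who} sees {msg}"))
    # The helm publishes a rudder change; the broker fans it out to all bridge stations.
    broker.publish("ship_state", {"rudder_deg": -10, "speed_kts": 12})
```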

Development of exoskeletal type tendon driven haptic device (텐던 구동방식의 장착형 역/촉감 제시기구의 개발에 관한 연구)

  • 이규훈;최혁렬
    • Institute of Control, Robotics and Systems: Conference Proceedings / 1997.10a / pp.1285-1288 / 1997
  • The basic technology of virtual reality can be described as making the user perceive changes in the virtual world by stimulating the visual, auditory, kinesthetic, and tactile senses. Among these, kinesthetic and tactile sensation is one of the most important for recognizing the interaction. This paper addresses a haptic device that helps the human operator feel such sensations; it is designed in a modular fashion so that it can later be expanded to five fingers. The haptic device is driven by tendons and ultrasonic motors located at the wrist. Each joint is actuated by coupled tendons, and the device adopts one more actuator than the number of joints, the so-called 'N+1 type' (a tension-distribution sketch of this idea follows this entry). The haptic device adopts a metamorphic four-bar linkage structure, and the linkage lengths, shape, and the location of the joint displacement sensor are optimized through analysis.

  • PDF
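
In an 'N+1 type' arrangement the joint torques do not determine the tendon tensions uniquely; the extra tendon provides a bias direction that keeps every tension positive. The numerical sketch below illustrates that idea; the moment-arm matrix, minimum tension, and torque values are assumptions, not the device's actual parameters.

```python
# Numerical sketch of N+1 tendon tension distribution: N joint torques, N+1 tendons.
# The moment-arm matrix R, minimum tension, and torques are illustrative assumptions.
import numpy as np

def tendon_tensions(R, tau, t_min=1.0):
    """Find tensions t >= t_min with R @ t = tau, using the null space of R for bias."""
    t_particular = np.linalg.pinv(R) @ tau   # minimum-norm solution (may have negative entries)
    _, _, vt = np.linalg.svd(R)
    null = vt[-1]                            # null-space direction (rank N, N+1 tendons)
    if null.sum() < 0:
        null = -null                         # orient the bias direction
    # Assumes an all-positive bias direction exists (standard N+1 tendon-coupling condition).
    assert np.all(null > 0), "coupling must admit an all-positive bias direction"
    # Smallest scaling that pushes every tension up to at least t_min.
    lam = max(0.0, np.max((t_min - t_particular) / null))
    return t_particular + lam * null

if __name__ == "__main__":
    # Two joints driven by three coupled tendons ("N+1 type" with N = 2).
    R = np.array([[0.01, -0.01, 0.005],
                  [0.00,  0.01, -0.010]])    # moment arms in meters (assumed)
    tau = np.array([0.05, -0.02])            # desired joint torques in N*m (assumed)
    t = tendon_tensions(R, tau)
    print("tensions (N):", np.round(t, 2), "| reproduced torques:", np.round(R @ t, 4))
```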