• Title/Summary/Keyword: Human-Robot Interaction (인간 로봇 상호작용)

Search results: 169

Zero Accident, Connected Autonomous Driving Vehicle (사고제로, 커넥티드 자율이동체)

  • Choi, J.D.;Min, K.W.;Kim, J.H.;Seo, B.S.;Kim, D.H.;Yoo, D.S.;Cho, J.I.
    • Electronics and Telecommunications Trends
    • /
    • v.36 no.1
    • /
    • pp.22-31
    • /
    • 2021
  • In this paper, we examine the development status of autonomous mobility services that use various artificial intelligence algorithms and propose a solution that combines edge and cloud computing to overcome the associated technical difficulties. A fully autonomous vehicle with enhanced safety and ethics can be implemented using the proposed solution. In addition, looking ahead to 2035, we present a new concept that enables two- and three-dimensional movement through cooperation among eco-friendly, low-noise, modular, fully autonomous vehicles. The zero-error autonomous driving system will transport people, goods, and services safely and conveniently, free of time and space constraints, and will contribute to autonomous mobility services that connect seamlessly with various forms of mobility.
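The abstract does not detail how edge and cloud computing are combined; the sketch below is only a minimal illustration, assuming that latency-critical perception stays on an edge node while heavy, delay-tolerant work is offloaded to the cloud. The task names and the 50 ms threshold are illustrative assumptions, not taken from the paper.

```python
from dataclasses import dataclass

@dataclass
class Task:
    name: str
    deadline_ms: float      # how quickly the vehicle needs the result
    compute_load: float     # relative processing cost

EDGE_DEADLINE_MS = 50.0     # assumed bound for latency-critical work

def place_task(task: Task) -> str:
    """Route latency-critical tasks to the edge node near the vehicle,
    and heavy, delay-tolerant tasks to the cloud."""
    return "edge" if task.deadline_ms <= EDGE_DEADLINE_MS else "cloud"

for t in [Task("pedestrian_detection", 20, 1.0),
          Task("hd_map_update", 5000, 10.0),
          Task("route_replanning", 500, 3.0)]:
    print(t.name, "->", place_task(t))
```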

The Behavioral Patterns of Neutral Affective State for Service Robot Using Video Ethnography (비디오 에스노그래피를 이용한 서비스 로봇의 대기상태 행동패턴 연구)

  • Song, Hyun-Soo;Kim, Min-Joong;Jeong, Sang-Hoon;Suk, Hyeon-Jeong;Kwon, Dong-Soo;Kim, Myung-Suk
    • Science of Emotion and Sensibility
    • /
    • v.11 no.4
    • /
    • pp.629-636
    • /
    • 2008
  • In recent years, a large number of robots have been developed in several countries, and these robots are built to appeal to users through well-designed human-robot interaction. However, the robots developed so far show proper reactions only when given a certain input; they do not act in standby mode, when there is no input. In other words, if a robot makes no motion in standby mode, users may feel that it is turned off or even out of order. Social service robots in particular remain in a standby state after finishing a task. If, during this period, a robot can show human-like behavioral patterns, like a person at a help desk, it is expected to make people feel that it is alive and make them more willing to interact with it. Even when there is no interaction with others or with the environment, people normally react to internal or external stimuli of their own making, such as moving their eyes or bodies. To create robotic behavioral patterns for the standby mode, we analyze the actual facial expressions and behaviors of people in a neutral affective state based on an ethnographic methodology and apply the extracted characteristics to our robots. Using robots that can display these series of expressions and actions, our research then examines whether people feel that the robots are alive.

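As a rough illustration of how such extracted standby behaviors might be scheduled on a robot, here is a minimal sketch; the behavior names and mean intervals are invented placeholders, not the patterns the study actually extracted.

```python
import random
import time

# Hypothetical standby behaviors with mean trigger interval in seconds;
# the patterns extracted by the video ethnography are not reproduced here.
IDLE_BEHAVIORS = {
    "blink": 4.0,
    "shift_gaze": 7.0,
    "small_posture_change": 15.0,
}

def run_standby(perform, has_new_input, tick=0.1):
    """Trigger small self-initiated behaviors while no task input arrives.
    `perform(name)` and `has_new_input()` stand in for robot-specific code."""
    next_time = {name: random.expovariate(1.0 / mean)
                 for name, mean in IDLE_BEHAVIORS.items()}
    t = 0.0
    while not has_new_input():              # leave standby once a task arrives
        for name, mean in IDLE_BEHAVIORS.items():
            if t >= next_time[name]:
                perform(name)               # e.g. blink, glance sideways
                next_time[name] = t + random.expovariate(1.0 / mean)
        time.sleep(tick)
        t += tick
```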

Passivity Based Adaptive Control and Its Optimization for Upper Limb Assist Exoskeleton Robot (상지 근력 보조용 착용형 외골격 로봇의 수동성 기반 적응 제어와 최적화 기법)

  • Khan, Abdul Manan;Ji, Young Hoon;Ali, Mian Ashfaq;Han, Jung Soo;Han, Chang Soo
    • Journal of the Korean Society for Precision Engineering
    • /
    • v.32 no.10
    • /
    • pp.857-863
    • /
    • 2015
  • The need for robots that assist human body posture has led researchers to develop dexterous exoskeleton robot designs. Quantitative techniques are required to assess human motor function and to generate commands for such robots. In this paper, we present a passivity-based adaptive control algorithm for an upper-limb assist exoskeleton. The proposed algorithm can adapt to different subject parameters and responds efficiently to the biomechanical variations caused by differences between subjects. Furthermore, we employ the Particle Swarm Optimization technique to tune the controller gains. The efficacy of the proposed algorithm is demonstrated experimentally on a seven-degree-of-freedom upper-limb assist exoskeleton robot. The algorithm was found to estimate the desired motion and to assist accordingly. In conjunction with an upper-limb assist exoskeleton robot, this algorithm may be very useful in helping elderly people perform daily tasks.
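As a rough sketch of the gain-tuning step, the code below runs a plain Particle Swarm Optimization over PD-style gains against a simulated single-joint tracking cost. The cost model, gain bounds, and PSO hyperparameters are illustrative assumptions; the paper's passivity-based controller and seven-DOF exoskeleton dynamics are not reproduced.

```python
import numpy as np

def tracking_cost(gains):
    """Illustrative cost: integrated squared tracking error of a simulated
    1-DOF joint under PD control (a stand-in for the exoskeleton controller)."""
    kp, kd = gains
    dt, q, dq, cost = 0.01, 0.0, 0.0, 0.0
    for k in range(500):
        q_des = np.sin(2 * np.pi * 0.5 * k * dt)   # desired trajectory
        tau = kp * (q_des - q) - kd * dq            # PD control torque
        ddq = tau - 0.5 * dq                        # unit-inertia joint with damping
        dq += ddq * dt
        q += dq * dt
        cost += (q_des - q) ** 2 * dt
    return cost

def pso_tune(cost_fn, bounds, n_particles=20, n_iter=50, w=0.7, c1=1.5, c2=1.5):
    """Plain PSO over the gain space, tracking personal and global bests."""
    dim = len(bounds)
    lo = np.array([b[0] for b in bounds])
    hi = np.array([b[1] for b in bounds])
    x = lo + np.random.rand(n_particles, dim) * (hi - lo)
    v = np.zeros_like(x)
    pbest = x.copy()
    pbest_cost = np.array([cost_fn(p) for p in x])
    gbest = pbest[np.argmin(pbest_cost)].copy()
    for _ in range(n_iter):
        r1, r2 = np.random.rand(2, n_particles, dim)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        costs = np.array([cost_fn(p) for p in x])
        improved = costs < pbest_cost
        pbest[improved], pbest_cost[improved] = x[improved], costs[improved]
        gbest = pbest[np.argmin(pbest_cost)].copy()
    return gbest, pbest_cost.min()

if __name__ == "__main__":
    gains, cost = pso_tune(tracking_cost, bounds=[(1.0, 200.0), (0.1, 50.0)])
    print("tuned (Kp, Kd):", gains, "cost:", cost)
```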

The Cognition of Non-Rigid Objects Using Linguistic Cognitive System for Human-Robot Interaction (인간로봇 상호작용을 위한 언어적 인지시스템 기반의 비강체 인지)

  • Ahn, Hyun-Sik
    • Journal of Institute of Control, Robotics and Systems
    • /
    • v.15 no.11
    • /
    • pp.1115-1121
    • /
    • 2009
  • For HRI (Human-Robot Interaction) in daily life, robots need to recognize non-rigid objects such as clothes and blankets. Recognizing non-rigid objects is challenging, however, because their shapes vary with where and how they are laid. In this paper, the cognition of non-rigid objects based on a cognitive system is presented. The characteristics of non-rigid objects are analyzed from the perspective of HRI and used to design a framework for their cognition. We adopt a linguistic cognitive system that describes all events that happen to the robot. When an event related to a non-rigid object occurs, the cognitive system describes the event in sentential form, stores it in a sentential memory, and depicts the object with a spatial model for later reference. The cognitive system parses each sentence syntactically and semantically, connecting the nouns that denote objects to their models. To answer human questions, sentences are retrieved by searching temporal information in the sentential memory and by spatial reasoning over a schematic imagery. Experiments show the feasibility of the cognitive system for cognizing non-rigid objects in HRI.
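A minimal sketch of the sentential-memory idea, under the assumption that each event is stored as a timestamped sentence plus a simple object-to-spatial-model mapping; the class and field names are hypothetical, and the paper's parser and schematic imagery are not reproduced.

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class Event:
    time: datetime
    sentence: str                                  # sentential description of the event
    objects: dict = field(default_factory=dict)    # noun -> simple spatial model

class SententialMemory:
    """Stores events as sentences and answers simple temporal queries."""

    def __init__(self):
        self.events = []                           # chronological list of Event

    def describe(self, sentence, objects):
        """Record a new event in sentential form."""
        self.events.append(Event(datetime.now(), sentence, objects))

    def last_about(self, noun):
        """Retrieve the most recent sentence that mentions the given noun."""
        for ev in reversed(self.events):
            if noun in ev.objects or noun in ev.sentence:
                return ev
        return None

memory = SententialMemory()
memory.describe("the robot placed the blanket on the sofa",
                {"blanket": {"on": "sofa", "state": "folded"}})
answer = memory.last_about("blanket")
print(answer.sentence if answer else "no record")
```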

Face and Facial Feature Detection under Pose Variation of User Face for Human-Robot Interaction (인간-로봇 상호작용을 위한 자세가 변하는 사용자 얼굴검출 및 얼굴요소 위치추정)

  • Park, Sung-Kee;Park, Mignon;Lee, Taigun
    • Journal of Institute of Control, Robotics and Systems
    • /
    • v.11 no.1
    • /
    • pp.50-57
    • /
    • 2005
  • We present a simple and effective method for detecting a face and its facial features under pose variation of the user's face against a complex background, for human-robot interaction. Our approach is flexible in that it works on both color and gray facial images and can detect facial features in quasi real time. Based on the intensity characteristics of the neighborhoods of facial features, a new directional template for facial features is defined. Applying this template to an input facial image yields a novel edge-like blob map (EBM) with multiple intensity strengths. Regardless of the color information of the input image, we show that, using this map together with conditions derived from facial characteristics, the locations of the face and its features (two eyes and a mouth) can be successfully estimated. Without information about the facial area boundary, the final candidate face region is determined from both the obtained facial feature locations and weighted correlation values with standard facial templates. Experimental results on many color images and on well-known gray-level face database images confirm the usefulness of the proposed algorithm.
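The following is a minimal sketch of the blob-map idea only: a hand-made directional template is convolved with a grayscale image, and local maxima are taken as feature candidates. The template shape and peak-picking parameters are assumptions; the paper's actual template definition, facial-characteristic conditions, and weighted template correlation are not reproduced.

```python
import numpy as np
from scipy.ndimage import convolve, maximum_filter

def directional_template(height=5, width=11):
    """Illustrative horizontal dark-bar template: eyes and mouths tend to appear
    as dark horizontal blobs with brighter rows above and below."""
    t = np.ones((height, width))
    t[height // 2, :] = -2.0          # dark center row
    return t - t.mean()               # zero-mean so flat regions give no response

def edge_like_blob_map(gray):
    """Convolve the intensity image with the directional template."""
    return convolve(gray.astype(float), directional_template(), mode="nearest")

def feature_candidates(ebm, n=10):
    """Pick the strongest local maxima of the blob map as eye/mouth candidates."""
    peaks = (ebm == maximum_filter(ebm, size=9))
    ys, xs = np.where(peaks)
    order = np.argsort(ebm[ys, xs])[::-1][:n]
    return list(zip(ys[order], xs[order]))

# usage with any 2-D grayscale array (e.g. loaded with OpenCV or PIL)
gray = np.random.rand(120, 100)       # stand-in image
print(feature_candidates(edge_like_blob_map(gray))[:3])
```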

A Study on Interaction Design of Companion Robots Based on Emotional State (감정 상태에 따른 컴패니언 로봇의 인터랙션 디자인 : 공감 인터랙션을 중심으로)

  • Oh, Ye-Jeon;Shin, Yoon-Soo;Lee, Jee-Hang;Kim, Jin-Woo
    • Journal of Digital Contents Society
    • /
    • v.18 no.7
    • /
    • pp.1293-1301
    • /
    • 2017
  • Recent changes in social structure, such as nuclear families and individualization, are leading to personal and social problems, and the resulting amplification of negative emotions can cause further difficulties. The absence of family members who once provided psychological stability can be considered a major cause of the emotional difficulties of modern people. These personal and social problems can be addressed through empathic interaction between users and a companion robot that communicates with them in daily life. In this study, we developed a refined empathic interaction design by prototyping emotional robots. The results confirmed that facial interaction strongly affects the emotional interaction of an emotional robot and that the robot's interaction improves its perceived emotional quality. This study has theoretical and practical significance in that it makes the emotional robot's interaction more sophisticated and, based on the experimental results, presents guidelines for empathic interaction design.

Autonomous Mobile Robot Control using the Wearable Devices Based on EMG Signal for detecting fire (EMG 신호 기반의 웨어러블 기기를 통한 화재감지 자율 주행 로봇 제어)

  • Kim, Jin-Woo;Lee, Woo-Young;Yu, Je-Hun;Sim, Kwee-Bo
    • Journal of the Korean Institute of Intelligent Systems
    • /
    • v.26 no.3
    • /
    • pp.176-181
    • /
    • 2016
  • In this paper, an autonomous mobile robot control system for detecting fire is proposed, using a wearable device based on the EMG (electromyogram) signal. A Myo armband is used to detect the user's EMG signal. Gestures are classified after the EMG data are sent to a computer over Bluetooth. A robot named 'uBrain' was then implemented to move according to the data received over Bluetooth in our experiment. 'Move front', 'Turn right', 'Turn left', and 'Stop' are the controllable commands for the robot. If the robot cannot receive the Bluetooth signal from the user, or if the user wants to switch from manual to autonomous mode, the robot enters autonomous mode. The robot flashes an LED when its IR sensor detects fire while moving.
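A minimal sketch of the described control flow, assuming hypothetical gesture labels and callback functions for the Myo/Bluetooth and robot interfaces; the actual Myo SDK calls and the 'uBrain' command protocol are not shown.

```python
import time

# Hypothetical gesture labels mapped to the commands named in the abstract.
GESTURE_TO_COMMAND = {
    "fist": "Move front",
    "wave_in": "Turn left",
    "wave_out": "Turn right",
    "fingers_spread": "Stop",
}

BLUETOOTH_TIMEOUT_S = 2.0   # assumed timeout before falling back to autonomous mode

def control_loop(read_gesture, send_command, drive_autonomously,
                 ir_fire_detected, flash_led):
    """Manual control while gestures arrive over Bluetooth; otherwise autonomous.
    All five callbacks are placeholders for hardware-specific code."""
    last_rx = time.time()
    while True:
        gesture = read_gesture()                 # None if no Bluetooth packet arrived
        if gesture is not None:
            last_rx = time.time()
            command = GESTURE_TO_COMMAND.get(gesture)
            if command:
                send_command(command)            # manual mode
        elif time.time() - last_rx > BLUETOOTH_TIMEOUT_S:
            drive_autonomously()                 # autonomous-mode fallback
        if ir_fire_detected():
            flash_led()                          # signal fire while moving
        time.sleep(0.05)
```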

User Preference for the Personification of Public Service Robot (공공서비스 로봇의 의인화에 관한 사용자 선호)

  • Kim, Ban-Seok;Kim, Seung-In
    • Journal of Digital Convergence
    • /
    • v.18 no.2
    • /
    • pp.361-366
    • /
    • 2020
  • The purpose of this study is to identify user preferences regarding the personification of public service robots. Public service robots are increasingly deployed in public places, and this trend is expected to continue. Appropriate anthropomorphism of a robot has a positive effect on user experience; on the other hand, when the degree of human likeness exceeds a certain point, it provokes strangeness and a sense of unease. Therefore, standards are needed for the level of anthropomorphism required of public service robots. To establish them, a survey and in-depth interviews were conducted. According to the analysis, people prefer verbal interaction with the robot, and the preferred apparent age of its voice is in the 20s and 30s. It is also preferable that the robot display no biological signals, and personalized services are needed. This research is expected to contribute to the design of public service robots that enhance user experience.

A Study on Human-Robot Interface based on Imitative Learning using Computational Model of Mirror Neuron System (Mirror Neuron System 계산 모델을 이용한 모방학습 기반 인간-로봇 인터페이스에 관한 연구)

  • Ko, Kwang-Enu;Sim, Kwee-Bo
    • Journal of the Korean Institute of Intelligent Systems
    • /
    • v.23 no.6
    • /
    • pp.565-570
    • /
    • 2013
  • The mirror neuron regions distributed across the cortical area handle intention recognition through imitative learning of an observed action, acquired from visual information about a goal-directed action. In this paper, an automated intention recognition system is proposed by applying a computational model of the mirror neuron system to a human-robot interaction system. The computational model is designed using dynamic neural networks whose input is a sequential feature vector set derived from the behaviors of the target object and the actor, and whose output is motor data that can be used to perform the corresponding intentional action, through the imitative learning and estimation procedures of the proposed model. The intention recognition framework takes its input from a KINECT sensor and produces the corresponding motor data within a virtual robot simulation environment, on the basis of intention-related scenarios with a limited experimental space and a specified target object.
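To make the input-to-output mapping concrete, here is a minimal recurrent-network sketch that maps a sequence of observed feature vectors (e.g. skeleton features from a KINECT-like sensor) to per-frame motor outputs. The plain Elman structure, the layer sizes, and the omission of training are all simplifying assumptions; the paper's dynamic neural network model is not reproduced.

```python
import numpy as np

class SimpleRNN:
    """Minimal recurrent network mapping an observed feature sequence
    to motor outputs, one output vector per input frame."""

    def __init__(self, n_in, n_hidden, n_out, seed=0):
        rng = np.random.default_rng(seed)
        self.W_in = rng.normal(0, 0.1, (n_hidden, n_in))
        self.W_rec = rng.normal(0, 0.1, (n_hidden, n_hidden))
        self.W_out = rng.normal(0, 0.1, (n_out, n_hidden))

    def forward(self, sequence):
        h = np.zeros(self.W_rec.shape[0])
        outputs = []
        for x in sequence:                      # one feature vector per frame
            h = np.tanh(self.W_in @ x + self.W_rec @ h)
            outputs.append(self.W_out @ h)      # motor command for this frame
        return np.array(outputs)

# usage: 30 frames of 15 skeleton features -> 7 motor values per frame
observed = np.random.rand(30, 15)
motor_sequence = SimpleRNN(n_in=15, n_hidden=32, n_out=7).forward(observed)
print(motor_sequence.shape)                     # (30, 7)
```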

Development of Humanoid Joint Module for Safe Human-Robot Interaction (인간과의 안전한 상호 작용을 고려한 휴머노이드 조인트 모듈 개발)

  • Oh, Yeon Taek
    • The Journal of Korea Robotics Society
    • /
    • v.9 no.4
    • /
    • pp.264-271
    • /
    • 2014
  • In this study, we developed humanoid joint modules for robots that will provide a variety of services while living with people in the future home environment. The most important requirement for a robot system that collaborates with people and provides physical services in a dynamically changing environment is ensuring human safety. The mechanism and control system must therefore allow each joint of the robot to respond sensitively and rapidly. We analyzed the characteristics of the joints that constitute humanoid motion, developed an optimal actuator system that can be controlled according to each joint's characteristics, and developed a control system that can control a multi-joint system at high speed. In particular, in the joint design, we defined back-drivability from the safety perspective and developed an actuator unit that maximizes it. This work establishes foundational element technology for the future commercialization of intelligent service robots.