• Title/Abstract/Keyword: Multi-Floor Navigation

Search results: 6 (processing time: 0.02 s)

이동로봇의 안전한 엘리베이터 탑승을 위한 RGB-D 센서 기반의 엘리베이터 인식 및 위치추정 (Elevator Recognition and Position Estimation based on RGB-D Sensor for Safe Elevator Boarding)

  • 장민경;조현준;송재복
    • 로봇학회논문지 / Vol.15 No.1 / pp.70-76 / 2020
  • Multi-floor navigation of a mobile robot requires technology that allows the robot to safely get on and off an elevator. In this study, we propose a method of recognizing the elevator from the robot's current position and locally estimating its location, so that the robot can board safely regardless of the position error accumulated during autonomous navigation. The proposed method uses a deep-learning-based image classifier to identify the elevator in the image information obtained from an RGB-D sensor, and extracts the boundary points between the elevator and the surrounding wall from the point cloud. This enables the robot to estimate a reliable position and boarding direction for general elevators in real time. Various experiments demonstrate the effectiveness and accuracy of the proposed method.
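The geometric step of the abstract above — recovering a boarding pose from the boundary points between the elevator and the surrounding wall — can be sketched in a few lines. This is an illustrative 2D simplification, not the authors' implementation: the function name `elevator_pose_from_boundary` and the sign convention of the boarding normal are assumptions.

```python
import math

def elevator_pose_from_boundary(left, right):
    """Estimate an elevator door pose from the two boundary points
    between the door and the surrounding wall, given as (x, y)
    coordinates in the robot's frame (meters).

    Returns the door center, a unit normal to the door segment
    (a proxy for the boarding direction), and the door width."""
    cx = (left[0] + right[0]) / 2.0
    cy = (left[1] + right[1]) / 2.0
    dx, dy = right[0] - left[0], right[1] - left[1]
    width = math.hypot(dx, dy)
    # Unit normal to the left-to-right door segment; the sign
    # convention (which side faces the robot) is an assumption.
    nx, ny = -dy / width, dx / width
    return (cx, cy), (nx, ny), width
```

In practice the boundary points would come from the RGB-D point cloud after the image classifier has confirmed that an elevator is in view.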

RGB-D 센서를 이용한 이동로봇의 안전한 엘리베이터 승하차 (Getting On and Off an Elevator Safely for a Mobile Robot Using RGB-D Sensors)

  • 김지환;정민국;송재복
    • 로봇학회논문지 / Vol.15 No.1 / pp.55-61 / 2020
  • Getting on and off an elevator is one of the most important capabilities for multi-floor navigation of a mobile robot. In this study, we propose a method for recognizing the pose of elevator doors, planning a safe path, and estimating the motion of a robot using RGB-D sensors, so that the robot can safely get on and off the elevator. The accurate pose of the elevator doors is recognized using a particle filter algorithm. After the elevator door opens, the robot builds an occupancy grid map that includes the interior of the elevator and generates a safe path that avoids collision with obstacles inside. While getting on and off, the robot applies an optical flow algorithm to the floor image to detect the state in which it cannot move because of the elevator door sill. Results from various experiments show that the proposed method enables the robot to get on and off the elevator safely.
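The door-sill stuck detection described above can be illustrated with a minimal check that compares the commanded speed against the speed implied by the optical flow of the floor image. The function name, the pixels-per-meter scale, and the ratio threshold are hypothetical placeholders, not values from the paper:

```python
def stuck_at_sill(commanded_speed, flow_px_per_s, px_per_m, ratio_thresh=0.2):
    """Flag the 'cannot move' state: the robot is commanded to move,
    but the optical flow of the downward-facing floor image shows
    almost no actual motion (e.g. a track caught on the door sill).

    commanded_speed  commanded forward speed (m/s)
    flow_px_per_s    magnitude of floor-image optical flow (px/s)
    px_per_m         camera scale relating floor motion to image motion
    ratio_thresh     observed/commanded ratio below which we call it stuck
    """
    if commanded_speed <= 1e-6:
        return False                      # not trying to move at all
    observed_speed = flow_px_per_s / px_per_m
    return observed_speed / commanded_speed < ratio_thresh
```

A real implementation would compute `flow_px_per_s` with a dense or sparse optical-flow routine over consecutive floor images and filter it over a short window before deciding.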

다층 실내 환경에서 계단 극복이 가능한 궤도형 로봇의 신뢰성 있는 자율 주행 정찰 시스템 (Reliable Autonomous Reconnaissance System for a Tracked Robot in Multi-floor Indoor Environments with Stairs)

  • 노주형;김보성;김도경;김지혁;심현철
    • 로봇학회논문지 / Vol.19 No.2 / pp.149-158 / 2024
  • This paper presents a robust autonomous navigation and reconnaissance system for tracked robots, designed to handle complex multi-floor indoor environments with stairs. We introduce a localization algorithm that adjusts scan-matching parameters to robustly estimate positions and build maps in feature-scarce environments such as narrow rooms and staircases. The system also features a path-planning algorithm that computes distance costs from surrounding obstacles, integrated with a PID controller tuned to the robot's differential kinematics for collision-free navigation in confined spaces. The perception module leverages multi-image fusion and camera-LiDAR fusion to detect objects around the robot and map their 3D positions accurately in real time. Practical tests in real settings verified that the system performs reliably. Based on this reliability, we expect our autonomous reconnaissance system to make a significant practical contribution in actual disaster situations and in environments that are difficult for humans to access.
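A PID controller mapped onto differential (track) kinematics, as mentioned above, can be sketched as follows. This is a generic textbook-style sketch: the gains, the speed-mixing convention, and the class name are assumptions, not the authors' tuned controller.

```python
class HeadingPID:
    """PID on heading error for a differential-drive (tracked) robot.
    A positive heading error (target to the left) slows the left track
    and speeds up the right track, turning the robot left."""

    def __init__(self, kp, ki, kd, base_speed):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.base = base_speed        # nominal forward speed (m/s)
        self.integral = 0.0
        self.prev = None              # previous error, for the D term

    def step(self, err, dt):
        """One control tick: return (left, right) track speeds."""
        self.integral += err * dt
        deriv = 0.0 if self.prev is None else (err - self.prev) / dt
        self.prev = err
        turn = self.kp * err + self.ki * self.integral + self.kd * deriv
        return self.base - turn, self.base + turn
```

In the paper's setting the heading error would come from the distance-cost path planner, and the integral term would typically be clamped to avoid windup in tight spaces.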

실내용 서비스 로봇을 위한 거리 센서 기반의 통합 자율 주행 시스템 개발 (Development of Range Sensor Based Integrated Navigation System for Indoor Service Robots)

  • 김건희;김문상;정우진
    • 제어로봇시스템학회논문지 / Vol.10 No.9 / pp.785-798 / 2004
  • This paper introduces the development of a range-sensor-based integrated navigation system for a multi-functional indoor service robot called PSR (Public Service Robot System). The proposed navigation system includes hardware integration for sensors and actuators, crucial navigation algorithms such as mapping, localization, and path planning, and a planning scheme for error/fault handling. The major advantages of the proposed system are as follows: 1) a range-sensor-based generalized navigation system; 2) no need to modify the environment; 3) intelligent navigation-related components; 4) a framework supporting the selection of multiple behaviors and error/fault-handling schemes. Experimental results are presented to show the feasibility of the proposed navigation system. The results of this research have been successfully applied to our three service robots in a variety of task domains, including delivery, patrol, guide, and floor-cleaning tasks.
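The framework idea above — selecting among multiple behaviors, each with an attached error/fault handler — can be sketched as a small registry. This is a hypothetical illustration of the pattern, not the PSR software architecture; all names are invented:

```python
class NavigationSystem:
    """Minimal behavior-selection framework with per-behavior
    error/fault handling, in the spirit described in the abstract."""

    def __init__(self):
        self.behaviors = {}       # name -> callable behavior
        self.fault_handlers = {}  # name -> callable(exception)

    def register(self, name, behavior, on_fault=None):
        self.behaviors[name] = behavior
        if on_fault is not None:
            self.fault_handlers[name] = on_fault

    def execute(self, name, *args):
        """Run the selected behavior; on failure, fall back to its
        registered fault handler (or re-raise if there is none)."""
        try:
            return self.behaviors[name](*args)
        except Exception as exc:
            handler = self.fault_handlers.get(name)
            if handler is None:
                raise
            return handler(exc)
```

Task domains such as delivery or patrol would each register a behavior plus a recovery routine (e.g. re-planning or retreating to a safe pose).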

해상 다중경로 페이딩 극복을 위한 선박중심 직접통신(MX-S2X) 물리계층 설계 및 성능 분석 (Design of Physical Layer and Performance Analysis for MX-S2X, Ship Centric Direct Communication with the Mitigation of Multi-path Fading on Sea Environment)

  • 류형직;유해선;김원용;김부영;심우성
    • 한국항해항만학회지 / Vol.45 No.6 / pp.352-359 / 2021
  • In response to a new ship-safety paradigm that demands autonomous and unmanned operation, this paper defines the term ship-centric direct communication and its importance, and defines the MX-S2X communication concept, which utilizes high-frequency broadband communication technology. We present the design of the MX-S2X physical layer, which accounts for the maritime multi-path fading that must be overcome when high-speed communication technology is applied at sea, along with verification results obtained through simulation. The physical layer of the proposed MX-S2X communication was designed to overcome the error floor that multi-path fading causes even when a sufficiently strong signal is received. To demonstrate this, we analyzed the performance of the physical layer using a channel model of the actual maritime communication environment. The performance analysis showed that the BER error floor observed in the VDE physical layer is overcome and that the layer operates within 2 dB of SNR degradation relative to the AWGN channel. This performance is suitable for short-range ship-centric direct communication, and we expect it to be applicable to direct communication for autonomous ships, unmanned ships, and swarm vessels.
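The notion of a BER error floor caused by multipath fading, versus the ideal AWGN curve, can be illustrated with a toy model. The BPSK-over-AWGN formula is the standard one; the `floor` parameter is a hypothetical stand-in for an irreducible multipath error rate, not a figure from the MX-S2X design:

```python
import math

def ber_bpsk_awgn(ebno_db):
    """Theoretical BPSK bit-error rate over an AWGN channel:
    BER = 0.5 * erfc(sqrt(Eb/N0))."""
    ebno = 10.0 ** (ebno_db / 10.0)
    return 0.5 * math.erfc(math.sqrt(ebno))

def ber_with_error_floor(ebno_db, floor=1e-4):
    """Crude model of a multipath error floor: however large the SNR,
    the BER cannot drop below an irreducible floor."""
    return max(ber_bpsk_awgn(ebno_db), floor)
```

Plotting both curves over a range of Eb/N0 values shows the flattening that the paper's VDE baseline exhibits and that the MX-S2X design aims to eliminate.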

이동 로봇의 강인 위치 추정을 위한 단안 비젼 센서와 레이저 구조광 센서의 베이시안 센서융합 (Bayesian Sensor Fusion of Monocular Vision and Laser Structured Light Sensor for Robust Localization of a Mobile Robot)

  • 김민영;안상태;조형석
    • 제어로봇시스템학회논문지 / Vol.16 No.4 / pp.381-390 / 2010
  • This paper describes a procedure for map-based localization of mobile robots using a sensor-fusion technique in structured environments. A combination of sensors with different characteristics and limited sensing capabilities has advantages in terms of complementarity and cooperation, yielding better information about the environment. For robust self-localization of a mobile robot equipped with a monocular camera and a laser structured light sensor, environment information acquired from the two sensors is combined and fused by a Bayesian sensor-fusion technique based on a probabilistic reliability function of each sensor, predefined through experiments. For self-localization using monocular vision, the robot utilizes image features consisting of vertical edge lines extracted from camera images, which serve as natural landmark points. The laser structured light sensor, in contrast, utilizes geometrical features composed of corners and planes as natural landmark shapes, extracted from range data at a constant height above the floor. Although either feature group alone is sometimes sufficient to localize the robot, the features from both sensors are used and fused simultaneously for reliable localization under various environmental conditions. A series of experiments is performed to verify the advantage of multi-sensor fusion, and the results are discussed in detail.
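The core fusion step — weighting each sensor's estimate by its reliability — can be illustrated for the Gaussian case, where Bayesian fusion reduces to inverse-variance weighting. This is a one-dimensional sketch under a Gaussian assumption, not the paper's full formulation with experimentally predefined reliability functions:

```python
def fuse_gaussian(mu1, var1, mu2, var2):
    """Bayesian fusion of two independent Gaussian estimates of the
    same quantity (e.g. vision-based and laser-based position):
    the posterior is again Gaussian, with inverse-variance weights."""
    w1, w2 = 1.0 / var1, 1.0 / var2
    mu = (w1 * mu1 + w2 * mu2) / (w1 + w2)
    var = 1.0 / (w1 + w2)          # always tighter than either input
    return mu, var
```

The fused variance is smaller than either sensor's alone, which is the formal sense in which the two feature groups "cooperate"; the less reliable sensor simply contributes less weight.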