• Title/Summary/Keyword: mobile vision system

Path Planning of an Autonomous Mobile Robot with Vision System Using Fuzzy Rules (비전 시스템을 가지는 자율주행 이동로봇을 위한 퍼지 규칙을 이용한 경로 계획)

  • Kim, Jae-Hoon;Kang, Geun-Taek;Lee, Won-Chang
    • Journal of the Korean Institute of Intelligent Systems / v.13 no.1 / pp.18-23 / 2003
  • This paper presents new path-planning and obstacle-avoidance algorithms that allow an autonomous mobile robot to navigate unknown environments in real time. Temporary targets are set up by a distance-variation method, and the trajectory-planning and obstacle-avoidance algorithms are then designed using fuzzy rules. Computer simulations show that the algorithms work well. Furthermore, an autonomous mobile robot was built to implement and test the algorithms in the field, and the experimental results agree with those of the simulations. (A minimal sketch of a fuzzy obstacle-avoidance rule base is given below.)
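
The abstract does not spell out the rule base itself, so the following is a minimal, hypothetical sketch of how fuzzy rules of the kind described (obstacle distance and bearing in, steering command out) are commonly written; the membership functions, rule table, and numeric ranges are illustrative assumptions, not the authors' design.

```python
# Minimal fuzzy obstacle-avoidance sketch (illustrative only, not the paper's rule base).
# Inputs: distance to the nearest obstacle [m] and its bearing [deg, negative = left].
# Output: steering command [deg/s, positive = turn right].

def tri(x, a, b, c):
    """Triangular membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzy_steer(distance, bearing):
    # Fuzzify the inputs (membership grades in [0, 1]).
    near = tri(distance, -0.5, 0.0, 1.5)
    far = tri(distance, 1.0, 3.0, 10.0)
    left = tri(bearing, -90.0, -45.0, 0.0)
    front = tri(bearing, -30.0, 0.0, 30.0)
    right = tri(bearing, 0.0, 45.0, 90.0)

    # Rule base: (firing strength, crisp steering consequent in deg/s).
    rules = [
        (min(near, left),  +30.0),   # obstacle near on the left  -> turn right
        (min(near, right), -30.0),   # obstacle near on the right -> turn left
        (min(near, front), +45.0),   # obstacle near straight ahead -> turn right hard
        (far,                0.0),   # obstacle far away          -> keep heading
    ]

    # Defuzzify with a weighted average (zero-order Sugeno style).
    total = sum(w for w, _ in rules)
    return sum(w * u for w, u in rules) / total if total > 0 else 0.0

if __name__ == "__main__":
    print(fuzzy_steer(distance=0.8, bearing=-20.0))  # obstacle close, slightly left
```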

Lane Marking Detection of Mobile Robot with Single Laser Rangefinder (레이저 거리 센서만을 이용한 자율 주행 모바일 로봇의 도로 위 정보 획득)

  • Jung, Byung-Jin;Park, Jun-Hyung;Kim, Taek-Young;Kim, Deuk-Young;Moon, Hyung-Pil
    • Journal of Institute of Control, Robotics and Systems / v.17 no.6 / pp.521-525 / 2011
  • Lane-marking detection is one of the important issues in the field of autonomous mobile robots. In urban environments, such as downtown paved roads or the tour tracks of a science park, where the road surface carries continuous patterns, lane-marking detection becomes an even more important capability. Although there has been much research on lane detection and lane tracing, most of it relies mainly on vision sensors. In this paper, we obtain two-dimensional data of intensity and distance using only one laser rangefinder, and we design a simple classifier and filtering algorithm for lane detection that uses a single LRF (laser rangefinder). By extending the use of the LRF from range finding to lane detection, this work gives mobile robots more functionality from the same sensor and should help robot developers design simpler, more efficient autonomous driving systems. (A sketch of such an intensity/distance classifier is given below.)
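
The classifier itself is not given in the abstract; below is a minimal sketch of the underlying idea, separating painted markings from asphalt by their reflection intensity at road-surface range, followed by a simple filtering step. The threshold values and field names are assumptions, not the paper's calibration.

```python
# Sketch of a lane-marking classifier over single-LRF scans (thresholds are assumptions).
# Painted lane markings typically return a higher reflection intensity than the
# surrounding asphalt at a similar range.
from dataclasses import dataclass

@dataclass
class ScanPoint:
    distance: float   # range reading in metres
    intensity: float  # reflection intensity (sensor-specific units)

def classify_lane_points(scan, intensity_thresh=0.7, max_range=10.0):
    """Keep points whose intensity stands out at road-surface range."""
    return [p for p in scan
            if p.distance < max_range and p.intensity > intensity_thresh]

def median_filter(values, window=3):
    """Simple filtering step to suppress isolated spurious returns."""
    half = window // 2
    out = []
    for i in range(len(values)):
        chunk = sorted(values[max(0, i - half): i + half + 1])
        out.append(chunk[len(chunk) // 2])
    return out

if __name__ == "__main__":
    scan = [ScanPoint(d, i) for d, i in
            [(2.0, 0.2), (2.1, 0.9), (2.1, 0.85), (2.2, 0.3), (12.0, 0.95)]]
    print(classify_lane_points(scan))           # the two high-intensity, in-range points
    print(median_filter([0.2, 0.9, 0.85, 0.3])) # smoothed intensity sequence
```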

Vision-based Real-Time Traffic Emission Monitoring System (비전 기반의 실시간 대기오염 모니터링 시스템 개발)

  • Shin, Yunhee;Jung, Jinwoo;Yoo, Daewon;Park, Dongsoo;Kim, Eun Yi;Woo, Jung-Hun;Lim, Sang-Beom;Ju, Jin-Seon
    • Annual Conference of KIPS / 2010.04a / pp.439-442 / 2010
  • This paper proposes a vision-based real-time traffic emission monitoring system. The proposed system first analyzes video streamed in real time to extract traffic parameters such as the vehicle count per class and the average speed, and from these it estimates the concentration of pollutants such as CO and NO2 to monitor air pollution by time of day. To do this, the system extracts vehicles with a background model, recognizes vehicle classes with a template-based method using class-specific contour and size information, and tracks the vehicles to obtain counts and speeds. For evaluation, the system was installed and tested at a traffic-dense site; compared against ground truth, it achieved 83.3% accuracy for vehicle speed and 86.98% accuracy for vehicle-class recognition. These results show that the proposed system can be applied to estimating real-time traffic emissions in a variety of areas. (A minimal sketch of the emission-estimation step is given below.)
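
The abstract describes estimating CO and NO2 from per-class counts and average speeds; one common form of that calculation multiplies counts by per-class, speed-dependent emission factors, as sketched below. The factor values, class names, and speed regimes are placeholders, not the paper's data.

```python
# Sketch of emission estimation from vision-derived traffic parameters.
# The emission factors below are illustrative placeholders, not measured values.

# g/km emission factors per vehicle class for a "low" and a "free-flow" average speed.
EMISSION_FACTORS = {               # (CO g/km, NO2 g/km)
    "passenger_car": {"low": (4.0, 0.08), "free": (2.0, 0.05)},
    "bus":           {"low": (9.0, 1.20), "free": (5.0, 0.80)},
    "truck":         {"low": (7.0, 1.00), "free": (4.0, 0.60)},
}

def hourly_emissions(counts, avg_speed_kmh, link_length_km=1.0):
    """Total CO and NO2 (grams per hour) emitted on one road link.

    counts: vehicles observed per class during one hour,
            e.g. {"passenger_car": 820, "bus": 35, "truck": 60}
    """
    regime = "low" if avg_speed_kmh < 30.0 else "free"
    co = no2 = 0.0
    for cls, n in counts.items():
        co_f, no2_f = EMISSION_FACTORS[cls][regime]
        co += n * co_f * link_length_km
        no2 += n * no2_f * link_length_km
    return co, no2

if __name__ == "__main__":
    print(hourly_emissions({"passenger_car": 820, "bus": 35, "truck": 60}, 24.0))
```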

Development of Evaluation Technique of Mobility and Navigation Performance for Personal Robots (퍼스널 로봇을 위한 운동과 이동 성능평가 기술의 개발)

  • Ahn Chang-hyun;Kim Jin-Oh;Yi Keon Young;Lee Ho Gil;Kim Kyu-ro
    • The Transactions of the Korean Institute of Electrical Engineers D / v.52 no.2 / pp.85-92 / 2003
  • In this paper, we propose a method for evaluating the performance of mobile personal robots. A set of performance measures is proposed and the corresponding evaluation methods are developed. Unlike industrial manipulators, personal robots need to be evaluated on their mobility, navigation, task, and intelligent-behavior performance in environments where human beings are present. The proposed measures comprise mobility measures, including vibration, repeatability, and path accuracy, as well as navigation measures, including wall following, doorsill crossing, obstacle avoidance, and localization. Task and intelligent-behavior performance, such as cleaning capability and high-level decision-making, are not considered in this paper. To measure the proposed performances through a series of tests, we designed a test environment and developed measurement systems including a 3D laser tracking system, a vision monitoring system, and a vibration measurement system. As an example, we measured the proposed performances on a mobile robot. The developed systems, installed at the Korea Agency for Technology and Standards, will be used by many robot companies in Korea. (A sketch of how path accuracy and repeatability are typically computed is given below.)
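
Path accuracy and repeatability of the kind listed above are usually computed from externally tracked positions (the paper uses a 3D laser tracking system); the sketch below gives one common formulation, which may differ in detail from the procedure actually used in the paper.

```python
# Common definitions of path accuracy and position repeatability from tracked
# robot positions (a sketch; the paper's measurement procedure may differ).
import math

def path_accuracy(measured, reference):
    """Mean deviation between measured path points and the commanded path."""
    devs = [math.dist(m, r) for m, r in zip(measured, reference)]
    return sum(devs) / len(devs)

def repeatability(end_positions):
    """Spread of repeated stop positions around their mean (mean + 3*std of radii)."""
    n = len(end_positions)
    cx = sum(p[0] for p in end_positions) / n
    cy = sum(p[1] for p in end_positions) / n
    radii = [math.hypot(x - cx, y - cy) for x, y in end_positions]
    mean_r = sum(radii) / n
    std_r = math.sqrt(sum((r - mean_r) ** 2 for r in radii) / (n - 1)) if n > 1 else 0.0
    return mean_r + 3.0 * std_r

if __name__ == "__main__":
    ref = [(0.0, 0.0), (1.0, 0.0), (2.0, 0.0)]
    meas = [(0.01, 0.02), (1.02, -0.01), (1.98, 0.03)]
    print(path_accuracy(meas, ref))
    print(repeatability([(2.00, 0.01), (2.01, -0.02), (1.99, 0.00), (2.02, 0.01)]))
```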

Mobile Robots for the Concrete Crack Search and Sealing (콘크리트 크랙 탐색 및 실링을 위한 다수의 자율주행로봇)

  • Jin, Sung-Hun;Cho, Cheol-Joo;Lim, Kye-Young
    • The Journal of Korea Robotics Society / v.11 no.2 / pp.60-72 / 2016
  • This study proposes a multi-robot system that uses multiple autonomous robots to explore concrete structures and to assist in their maintenance by sealing any cracks present. The proposed system employs a new self-localization method, essential for autonomous robots, along with a visualization system for recognizing the external environment and for detecting and exploring cracks efficiently. A more efficient crack search in an unknown environment is made possible by assigning the robots to search areas divided according to the surrounding situation (see the sketch below). Increased operational efficiency is also achieved by replacing an impractically complex logical behavior model with only six basic behavioral strategies based on distributed control, one of the methods used to control swarm robots. Finally, the study evaluates the efficiency of the proposed multi-robot system through basic sensor testing and simulation.
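
The abstract mentions dividing the search region among the robots according to the situation; the sketch below shows only the simplest version of that idea (equal-width strips per robot) and is an illustrative assumption, not the paper's allocation scheme.

```python
# Sketch of splitting a rectangular search region into one strip per robot.
# (Illustrative only; the paper divides areas according to the surrounding situation.)

def divide_search_area(x_min, x_max, y_min, y_max, n_robots):
    """Return one (x0, x1, y_min, y_max) strip per robot, all of equal width."""
    width = (x_max - x_min) / n_robots
    return [(x_min + i * width, x_min + (i + 1) * width, y_min, y_max)
            for i in range(n_robots)]

if __name__ == "__main__":
    for robot_id, strip in enumerate(divide_search_area(0.0, 12.0, 0.0, 4.0, 3)):
        print(f"robot {robot_id} searches strip {strip}")
```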

Path Planning of Internet based Mobile Robot with Vision System Using Fuzzy Rules (비젼시스템과 인터넷 기반 이동로봇을 위한 퍼지규칙의 경로 계획)

  • 김상헌;이동명;정재영;오선문;노관승;김관형
    • Proceedings of the Korean Institute of Intelligent Systems Conference / 2003.09b / pp.9-12 / 2003
  • This paper introduces algorithms for path planning and obstacle avoidance, based on a vision system and fuzzy rules, that allow an Internet-based mobile robot to drive autonomously in an unknown environment. A remote-operation capability over the Internet was also added so that the robot's motion can be observed from a remote site, enabling efficient control of the robot. Remote control is usually performed over the Internet or a wireless link together with real-time monitoring, and the situation is reproduced in simulation. Methods commonly used for mobile-robot control include wireless LAN sockets based on IEEE 802.11b, TCP/IP, RF, and Bluetooth. Among these, this paper uses the TCP/IP protocol over the Internet. The overall system consists of the mobile robot, a server, and a client; the robot can be controlled over the Internet or, when necessary, take over control itself and drive autonomously (a minimal TCP/IP command-server sketch is given below). Path-planning and obstacle-avoidance algorithms were generated using fuzzy rules, and their effectiveness was verified through experiments. Experiments with an actual mobile robot also confirmed that the proposed algorithms perform well.
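
The abstract says the robot, a server, and a client communicate over TCP/IP; below is a minimal command-server sketch of that kind using Python's standard socket module. The port number and command strings are assumptions, not the paper's protocol.

```python
# Minimal TCP/IP command server sketch for remote robot operation
# (port number and command names are assumptions, not the paper's protocol).
import socket

HOST, PORT = "0.0.0.0", 9000

def handle_command(cmd: str) -> str:
    """Map a text command from the client to a robot action (stubbed here)."""
    if cmd in ("FORWARD", "BACK", "LEFT", "RIGHT", "STOP"):
        # A real robot would drive its motors here.
        return f"OK {cmd}"
    if cmd == "AUTO":
        # Hand control back to the on-board autonomous navigation.
        return "OK AUTONOMOUS"
    return "ERR unknown command"

def serve():
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        srv.bind((HOST, PORT))
        srv.listen(1)
        conn, addr = srv.accept()
        with conn:
            # One text command per line; reply with an acknowledgement.
            for line in conn.makefile("r"):
                conn.sendall((handle_command(line.strip()) + "\n").encode())

if __name__ == "__main__":
    serve()
```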

Design of an Autonomous Mobile Robot Using Vision System and Odor Sensors to Search for an Odor Source (비젼 시스템과 후각 센서를 이용한 자율 이동 로봇의 냄새 발생지 탐색)

  • Ji Dongmin;Joo Moon G;Kang Geuntaek;Lee Wonchang
    • Proceedings of the Korean Institute of Intelligent Systems Conference / 2005.11a / pp.309-312 / 2005
  • In this paper, an odor-source search capability for a mobile robot is implemented using a vision system and odor sensors. Vision is a sensing modality that has been studied extensively for mobile robots, whereas olfaction is newly attracting attention as a sensing modality for intelligent systems. Going beyond the gas-sensing functions implemented on embedded systems in previous work, this paper implements an olfactory capability that can distinguish odors using a neural-network algorithm, and presents a method for locating an odor source through a combined vision and odor-sensor algorithm. To test this, AMOR (Autonomous Mobile Olfactory Robot) was built, and the effectiveness of the odor-source search algorithm was demonstrated. (A simple gradient-following sketch is given below.)
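
The combined vision/olfaction search algorithm is not detailed in the abstract; the sketch below shows only a simple gas-concentration gradient-following step that a robot of this kind could use. The sensor layout and readings are hypothetical, not AMOR's actual configuration.

```python
# Sketch of a gas-gradient-following step toward an odor source
# (hypothetical sensor layout; not AMOR's actual search algorithm).
import math

def gradient_heading(readings):
    """Estimate the direction of increasing concentration from sensors
    mounted around the robot.

    readings: list of (angle_rad, concentration) pairs, angles in the robot frame.
    """
    # Treat each sensor as a unit vector along its mounting angle, weighted by
    # its concentration, and head along the resultant vector.
    gx = sum(c * math.cos(a) for a, c in readings)
    gy = sum(c * math.sin(a) for a, c in readings)
    return math.atan2(gy, gx)   # heading toward (estimated) higher concentration

if __name__ == "__main__":
    # Front, left, right, and rear sensors; the left sensor smells the most gas.
    readings = [(0.0, 0.20), (math.pi / 2, 0.55), (-math.pi / 2, 0.15), (math.pi, 0.10)]
    print(math.degrees(gradient_heading(readings)))   # roughly toward the left
```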

The Wireless World Research Forum(WWRF) Towards Systems Beyond 3G

  • Mohr, Werner
    • Information and Communications Magazine / v.19 no.7 / pp.56-71 / 2002
  • Third-generation mobile radio systems (3G) are currently being deployed in different regions of the world. Future systems beyond 3G are already under discussion in international bodies and forums such as the ITU and WWRF and in R&D programs of the European Union and other regions. These systems will shape research and standardization activities in mobile and wireless communication over the coming years. Based on the experience with 3G, future systems will be developed mainly from the user perspective, with respect to potential services and applications including traffic demands. To this end, the Wireless World Research Forum (WWRF) was launched in 2001 as a global and open initiative of manufacturers, network operators, SMEs, R&D centers, and academia. WWRF focuses on the vision of such systems, the Wireless World, and on potential key technologies. This paper describes the international context of activities on systems beyond the third generation; the goals, objectives, and structure of WWRF; the user perspective as the starting point for a future system design; and the key enabling technologies for the Wireless World.

Real time tracking of multiple humans for mobile robot application

  • Park, Joon-Hyuk;Park, Byung-Soo;Lee, Seok;Park, Sung-Kee;Kim, Munsang
    • Proceedings of the Institute of Control, Robotics and Systems Conference (제어로봇시스템학회 학술대회논문집) / 2002.10a / pp.100.3-100 / 2002
  • This paper presents a method for robustly detecting and tracking multiple humans from a mobile platform. Humans are perceived in real time by processing images acquired from a moving stereo vision system. We integrate multiple cues, such as human shape, skin color, and depth information, to detect and track each person against a moving background. Human shape is measured by edge-based template matching on a distance-transformed image, and to improve detection robustness we use facial skin color in HSV color space (a sketch of these two cues is given below). Accuracy and robustness in both detection and tracking are further increased by applying random-sampling stochastic estimati...
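
Two of the cues named above, HSV skin-color masking and a distance transform of the edge map for chamfer-style template matching, can be sketched with OpenCV as follows; the HSV bounds and edge thresholds are assumptions, not the paper's calibration.

```python
# Sketch of two cues for human detection: HSV skin-colour masking and a distance
# transform of the edge map (for chamfer-style shape matching). Requires OpenCV.
# Threshold values are assumptions, not the paper's calibration.
import cv2
import numpy as np

def skin_mask(bgr):
    """Binary mask of skin-coloured pixels in HSV space (illustrative bounds)."""
    hsv = cv2.cvtColor(bgr, cv2.COLOR_BGR2HSV)
    lower = np.array([0, 40, 60], dtype=np.uint8)
    upper = np.array([25, 180, 255], dtype=np.uint8)
    return cv2.inRange(hsv, lower, upper)

def edge_distance_transform(gray):
    """Distance of every pixel to the nearest edge, used for template matching."""
    edges = cv2.Canny(gray, 80, 160)
    # distanceTransform measures distance to the nearest zero pixel,
    # so invert the edge map first (edges become zero).
    return cv2.distanceTransform(cv2.bitwise_not(edges), cv2.DIST_L2, 3)

if __name__ == "__main__":
    frame = np.zeros((120, 160, 3), dtype=np.uint8)            # stand-in camera frame
    cv2.rectangle(frame, (40, 30), (100, 90), (90, 120, 200), -1)
    print(skin_mask(frame).sum() / 255, "skin-like pixels")
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    print(edge_distance_transform(gray).mean(), "mean distance to an edge")
```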

Laser Range Finder for an Autonomous Mobile Robot (자율주행 로봇을 위한 Laser Range Finder)

  • 차영엽;권대갑
    • Proceedings of the Korean Society of Precision Engineering Conference / 1992.10a / pp.266-270 / 1992
  • In this study, an active vision system using a laser range finder is proposed for the navigation of a mobile robot in an unknown environment. The laser range finder consists of a slit-laser beam generator, a scanning mechanism, a CCD camera, and a signal processing unit. A laser beam from the laser source is spread into a slit by a set of cylindrical lenses; the slit beam is swept up and down and rotated around the robot by the scanning mechanism. The image of the laser beam reflected from the surface of an object is formed on the CCD array. A high-speed image processing algorithm is proposed for real-time navigation of the mobile robot. Experiments show that accurate, real-time recognition of the environment can be achieved with the proposed laser range finder. (A worked triangulation sketch is given below.)
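
The range computation for a slit-laser plus CCD arrangement like the one described is standard triangulation; the sketch below gives the usual law-of-sines form, with the baseline, focal length, and principal point chosen as assumed example values rather than the paper's parameters.

```python
# Worked triangulation sketch for a slit-laser + CCD range finder
# (baseline, focal length, and principal point are assumed values, not the paper's).
import math

BASELINE_M = 0.10        # distance between the laser source and the camera centre
FOCAL_PX = 800.0         # camera focal length expressed in pixels
IMAGE_CENTRE_PX = 320.0  # principal point column

def range_from_pixel(laser_angle_rad, spot_column_px):
    """Distance from the camera to the lit surface point.

    The camera's optical axis is assumed perpendicular to the baseline, so the
    bearing of the spot from the baseline is 90 deg minus the in-image angle.
    """
    in_image = math.atan((spot_column_px - IMAGE_CENTRE_PX) / FOCAL_PX)
    camera_bearing = math.pi / 2.0 - in_image
    # Law of sines in the laser-camera-spot triangle:
    # range / sin(laser angle) = baseline / sin(angle at the spot).
    return BASELINE_M * math.sin(laser_angle_rad) / math.sin(laser_angle_rad + camera_bearing)

if __name__ == "__main__":
    # Laser plane tilted 80 deg from the baseline, stripe seen on the optical axis.
    print(range_from_pixel(math.radians(80.0), 320.0))   # ~0.57 m
```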