• Title/Summary/Keyword: sensor fusion

Search Results: 822

Virtual Environment Building and Navigation of Mobile Robot using Command Fusion and Fuzzy Inference

  • Jin, Taeseok
    • Journal of the Korean Society of Industry Convergence
    • /
    • v.22 no.4
    • /
    • pp.427-433
    • /
    • 2019
  • This paper proposes a fuzzy inference model for map building and navigation of a mobile robot with an active camera, which navigates intelligently to a goal location in unknown environments using sensor fusion based on situational commands. The active camera enables the robot to estimate and track feature images over a hallway field of view. Instead of a "physical sensor fusion" method, which generates the robot's trajectory from an environment model and raw sensory data, a command fusion method is used to govern navigation. The navigation strategy combines fuzzy rules tuned for both goal approach and obstacle avoidance. To identify the environment, a command fusion technique is introduced in which the active camera's sensory data from navigation experiments are fused into the identification process. Navigation performance improves on that achieved using fuzzy inference alone and shows significant advantages over conventional command fusion techniques. Experimental evidence demonstrates that the proposed method can be used reliably over a wide range of relative positions between the active camera and the feature images.
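The command-fusion idea above — blending the steering commands proposed by a goal-approach behavior and an obstacle-avoidance behavior, rather than fusing raw sensor data — can be sketched as follows. This is a minimal illustration, not the paper's controller; the membership shape and the distance thresholds `d_near` and `d_far` are assumptions.

```python
def clamp(x, lo, hi):
    """Limit x to the interval [lo, hi]."""
    return max(lo, min(hi, x))

def fuse_commands(goal_cmd, avoid_cmd, obstacle_dist, d_near=0.3, d_far=1.0):
    """Blend the steering command proposed by a goal-approach behavior
    with the one proposed by an obstacle-avoidance behavior. The fuzzy
    weight of the avoidance command rises linearly as the obstacle gets
    closer ('near' membership), so commands, not sensor data, are fused."""
    mu_near = clamp((d_far - obstacle_dist) / (d_far - d_near), 0.0, 1.0)
    return mu_near * avoid_cmd + (1.0 - mu_near) * goal_cmd
```

With the obstacle far away the goal-approach command passes through unchanged; close to the obstacle the avoidance command dominates; in between the two are interpolated.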

A Study on Movement of a Wheeled Mobile Robot via Sensor Fusion (센서융합에 의한 이동로봇의 주행성 연구)

  • Shin, Hui-Seok;Hong, Suk-Kyo;Chwa, Dong-Kyoung
    • Proceedings of the KIEE Conference
    • /
    • 2005.10b
    • /
    • pp.584-586
    • /
    • 2005
  • In this paper, a low-cost inertial sensor and a compass were used instead of encoders for the localization of a mobile robot. Movements estimated by the encoder, by the inertial sensor, and by a complementary filter combining the inertial sensor and compass were analyzed. Movement estimated by the complementary filter was worse than by the inertial sensor alone because of imperfections in the compass. For the complementary filter to perform best, the compass must be compensated for position error.

  • PDF
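The complementary filter mentioned above can be sketched in a few lines: the gyro is trusted over short horizons (integration) and the compass over long horizons (drift correction). This is a generic one-axis heading example, not the paper's implementation; the blend factor `alpha` and the sample period are assumed values.

```python
def complementary_filter(gyro_rates, compass_headings, dt, alpha=0.9):
    """Fuse gyro angular rate (rad/s) with compass heading (rad).
    Each step integrates the gyro for the short-term estimate, then
    nudges it toward the compass with weight (1 - alpha)."""
    heading = 0.0
    estimates = []
    for rate, compass in zip(gyro_rates, compass_headings):
        gyro_heading = heading + rate * dt          # short-term: integrate gyro
        heading = alpha * gyro_heading + (1 - alpha) * compass  # long-term: compass
        estimates.append(heading)
    return estimates

# Stationary robot: zero gyro rate, compass steady at 0.5 rad.
# The estimate converges from 0 toward the compass heading.
est = complementary_filter([0.0] * 100, [0.5] * 100, dt=0.02)
```

Note the abstract's caveat: if the compass itself is biased, the filter converges to the biased heading, which is why the authors found compass compensation necessary.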

Vision and force/torque sensor fusion in peg-in-hole using fuzzy logic (삽입 작업에서 퍼지추론에 의한 비젼 및 힘/토오크 센서의 퓨젼)

  • 이승호;이범희;고명삼;김대원
    • Proceedings of the Institute of Control, Robotics and Systems (ICROS) Conference
    • /
    • 1992.10a
    • /
    • pp.780-785
    • /
    • 1992
  • We present a multi-sensor fusion method for the positioning control of a robot using fuzzy logic. In general, the vision sensor is used in gross motion control and the force/torque sensor in fine motion control. We construct a fuzzy logic controller that combines the vision sensor data and the force/torque sensor data, and apply it to the peg-in-hole process. Simulation results uphold the theoretical analysis.

  • PDF
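The gross-to-fine handover described above can be illustrated with a simple fuzzy blend: far from the hole the vision correction dominates, near the hole the force/torque correction does. This is a hypothetical sketch, not the paper's controller; the ramp membership function and the distance breakpoints are illustrative assumptions.

```python
def ramp_up(x, lo, hi):
    """Fuzzy membership: 0 below lo, 1 above hi, linear in between."""
    return min(1.0, max(0.0, (x - lo) / (hi - lo)))

def fuse_corrections(distance_mm, vision_corr, ft_corr):
    """Blend a vision-based correction with a force/torque-based one.
    'Far from hole' membership weights vision (gross motion); its
    complement weights the force/torque sensor (fine motion)."""
    mu_vision = ramp_up(distance_mm, 1.0, 5.0)  # trust vision when far
    mu_ft = 1.0 - mu_vision                     # trust F/T when near
    return mu_vision * vision_corr + mu_ft * ft_corr
```

At 10 mm the output is purely the vision correction, at 0.5 mm purely the force/torque correction, and in between a weighted mix, so the controller transitions smoothly rather than switching abruptly between modes.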

A Study on the Map-Building of a Cleaning Robot Based upon the Optimal Cost Function (청소로봇의 최적비용함수를 고려한 지도 작성에 관한 연구)

  • Kang, Jin Gu
    • Journal of Korea Society of Digital Industry and Information Management
    • /
    • v.5 no.3
    • /
    • pp.39-45
    • /
    • 2009
  • In this paper we present a cleaning robot system for an autonomous mobile robot. The robot performs goal-reaching tasks in unknown indoor environments using sensor fusion. Its objective is to clean the floor or any other applicable surface and to build a map of the surrounding environment for further purposes, such as finding the shortest available path. The cleaning robot system allows an autonomous mobile robot to move in various modes and perform dexterous tasks, and it outperforms a fixed-base redundant robot in avoiding singularities and obstacles. Sensor fusion improves the performance of the robot with redundant freedom in the workspace and in map building, which is the focus of this paper. A sequence of this alternating task-execution scheme enables the robot to execute various tasks efficiently. The proposed algorithm is experimentally verified and discussed with a cleaning robot, KCCR.

Overview of sensor fusion techniques for vehicle positioning (차량정밀측위를 위한 복합측위 기술 동향)

  • Park, Jin-Won;Choi, Kae-Won
    • The Journal of the Korea institute of electronic communication sciences
    • /
    • v.11 no.2
    • /
    • pp.139-144
    • /
    • 2016
  • This paper provides an overview of recent trends in sensor fusion technologies for vehicle positioning. GNSS by itself cannot satisfy the precision and reliability required for autonomous driving. We survey sensor fusion techniques that combine the outputs of GNSS and inertial navigation sensors such as an odometer and a gyroscope. Moreover, we overview landmark-based positioning, which matches landmarks detected by a lidar or stereo vision against high-precision digital maps.

Implementation of a Real-time Data fusion Algorithm for Flight Test Computer (비행시험통제컴퓨터용 실시간 데이터 융합 알고리듬의 구현)

  • Lee, Yong-Jae;Won, Jong-Hoon;Lee, Ja-Sung
    • Journal of the Korea Institute of Military Science and Technology
    • /
    • v.8 no.4 s.23
    • /
    • pp.24-31
    • /
    • 2005
  • This paper presents an implementation of a real-time multi-sensor data fusion algorithm for a flight test computer. The sensor data consist of positional information of the target from a radar, a GPS receiver, and an INS. The data fusion algorithm is a 21st-order distributed Kalman filter based on a position-velocity-acceleration (PVA) model with sensor bias states. Fault detection and correction logic is included in the algorithm to handle bad measurements and sensor faults. The statistical parameters of the states are obtained from Monte Carlo simulations and covariance analysis using test tracking data. The designed filter is verified with real data in both post-processing and real-time processing.
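The PVA fusion idea can be sketched with a drastically reduced example: a one-dimensional position-velocity-acceleration Kalman filter that fuses position measurements from two sensors of differing accuracy by applying their updates sequentially. This is a toy stand-in for the paper's 21st-order distributed filter; the noise variances and process-noise level are assumed values.

```python
import numpy as np

def pva_kf_fuse(meas_a, meas_b, dt=0.1, var_a=4.0, var_b=1.0):
    """1-D PVA Kalman filter fusing two position sensors.
    State x = [position, velocity, acceleration]; each step predicts
    with the constant-acceleration model, then applies one measurement
    update per sensor (sequential updates = centralized fusion)."""
    F = np.array([[1.0, dt, 0.5 * dt**2],
                  [0.0, 1.0, dt],
                  [0.0, 0.0, 1.0]])          # PVA transition model
    H = np.array([[1.0, 0.0, 0.0]])          # both sensors measure position
    Q = np.eye(3) * 1e-3                     # process noise (assumed)
    x = np.zeros(3)
    P = np.eye(3) * 10.0                     # large initial uncertainty
    estimates = []
    for za, zb in zip(meas_a, meas_b):
        x = F @ x                            # predict
        P = F @ P @ F.T + Q
        for z, r in ((za, var_a), (zb, var_b)):
            S = H @ P @ H.T + r              # innovation covariance
            K = (P @ H.T) / S                # Kalman gain
            x = x + (K * (z - H @ x)).ravel()
            P = (np.eye(3) - K @ H) @ P
        estimates.append(x[0])
    return estimates

# Both sensors report a stationary target at position 5.0.
est = pva_kf_fuse([5.0] * 100, [5.0] * 100)
```

The real system adds sensor bias states and distributes the computation across sensor subsystems, but the predict/update structure is the same.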

Hierarchical Behavior Control of Mobile Robot Based on Space & Time Sensor Fusion(STSF)

  • Han, Ho-Tack
    • International Journal of Fuzzy Logic and Intelligent Systems
    • /
    • v.6 no.4
    • /
    • pp.314-320
    • /
    • 2006
  • Navigation in environments densely cluttered with obstacles is still a challenge for Autonomous Ground Vehicles (AGVs), especially when the configuration of obstacles is not known a priori. Reactive local navigation schemes that tightly couple robot actions to sensor information have proved effective in these environments, and because of the environmental uncertainties, STSF (Space and Time Sensor Fusion)-based fuzzy behavior systems have been proposed. Realizing autonomous behavior in mobile robots using STSF control based on spatial data fusion requires the formulation of rules that are collectively responsible for the necessary levels of intelligence. This collection of rules can be conveniently decomposed and efficiently implemented as a hierarchy of fuzzy behaviors. This paper describes how this can be done using a behavior-based architecture. The approach is motivated by ethological models that suggest hierarchical organizations of behavior. Experimental results show that the proposed method can smoothly and effectively guide a robot through cluttered environments such as dense forests.

Fusion of Decisions in Wireless Sensor Networks under Non-Gaussian Noise Channels at Large SNR (비 정규 분포 잡음 채널에서 높은 신호 대 잡음비를 갖는 무선 센서 네트워크의 정보 융합)

  • Park, Jin-Tae;Kim, Gi-Sung;Kim, Ki-Seon
    • Journal of the Korea Institute of Military Science and Technology
    • /
    • v.12 no.5
    • /
    • pp.577-584
    • /
    • 2009
  • Fusion of decisions in wireless sensor networks, which offer flexibility in energy efficiency, is studied in this paper. Two representative distributions, the generalized Gaussian and α-stable probability density functions, are used to model non-Gaussian noise channels. By incorporating the noise channels into the parallel fusion model, the optimal fusion rules are represented, and suboptimal fusion rules are derived using a large signal-to-noise ratio (SNR) approximation. For both distributions, the obtained suboptimal fusion rules are the same and are equivalent in form to the Chair-Varshney fusion rule (CVR). Thus, the CVR does not depend on the behavior of noise distributions belonging to the generalized Gaussian and α-stable families. The simulation results show the suboptimality of the CVR at large SNRs.
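The Chair-Varshney rule referenced above fuses binary local decisions by summing per-sensor log-likelihood weights built from each sensor's false-alarm and miss probabilities. A minimal sketch of the rule itself (the channel modeling and SNR analysis in the paper are not reproduced here):

```python
import math

def chair_varshney(decisions, p_false, p_miss, threshold=0.0):
    """Chair-Varshney fusion of local decisions u_i in {0, 1}.
    Sensor i has false-alarm probability p_false[i] and miss
    probability p_miss[i]; decide 'target present' (1) when the
    summed log-likelihood ratio exceeds the threshold."""
    llr = 0.0
    for u, pf, pm in zip(decisions, p_false, p_miss):
        if u == 1:
            llr += math.log((1 - pm) / pf)    # sensor voted "present"
        else:
            llr += math.log(pm / (1 - pf))    # sensor voted "absent"
    return 1 if llr > threshold else 0
```

With identical sensors the rule reduces to a majority vote; with unequal reliabilities, more trustworthy sensors contribute larger weights.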

The Sensory-Motor Fusion System for Object Tracking (이동 물체를 추적하기 위한 감각 운동 융합 시스템 설계)

  • Lee, Sang-Hee;Wee, Jae-Woo;Lee, Chong-Ho
    • The Transactions of the Korean Institute of Electrical Engineers D
    • /
    • v.52 no.3
    • /
    • pp.181-187
    • /
    • 2003
  • For moving objects observed with environmental sensors, such as an object-tracking mobile robot with audio and video sensors, the environmental information acquired from the sensors keeps changing as the objects move. In such cases, due to a lack of adaptability and to system complexity, conventional control schemes show limited control performance; therefore, sensory-motor systems, which can respond intuitively to various types of environmental information, are desirable. To improve system robustness, it is also desirable to fuse two or more types of sensory information simultaneously. In this paper, based on Braitenberg's model, we propose a sensory-motor fusion system that can track moving objects adaptively under environmental changes. With its directly connected structure, the sensory-motor fusion system can control each motor simultaneously, and neural networks are used to fuse information from the various types of sensors. Even if the system receives noisy information from one sensor, it still works robustly, because information from the other sensors compensates for the noise through sensor fusion. To examine its performance, the sensory-motor fusion model is applied to an object-tracking four-legged robot equipped with audio and video sensors. The experimental results show that the sensory-motor fusion system can track moving objects robustly with a simpler control mechanism than model-based control approaches.
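The Braitenberg-style direct sensor-to-motor coupling described above can be sketched as follows. Here the paper's neural-network fusion is replaced by a simple weighted sum of the audio and video readings on each side; the crossed connections and the weights are illustrative assumptions, not the authors' trained network.

```python
def braitenberg_step(left_audio, right_audio, left_vision, right_vision,
                     w_audio=0.4, w_vision=0.6, base_speed=0.2):
    """One step of a Braitenberg-style tracker with crossed excitation:
    the fused stimulus on each side drives the *opposite* motor, so the
    vehicle turns toward the stronger stimulus. Fusion here is a fixed
    weighted sum of the audio and video readings per side."""
    left_stim = w_audio * left_audio + w_vision * left_vision
    right_stim = w_audio * right_audio + w_vision * right_vision
    left_motor = base_speed + right_stim    # crossed connection
    right_motor = base_speed + left_stim    # crossed connection
    return left_motor, right_motor
```

A stimulus on the left speeds up the right motor, turning the robot leftward toward the target; if one modality is noisy, the other still contributes to the fused stimulus, which is the robustness argument in the abstract.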

Generalized IHS-Based Satellite Imagery Fusion Using Spectral Response Functions

  • Kim, Yong-Hyun;Eo, Yang-Dam;Kim, Youn-Soo;Kim, Yong-Il
    • ETRI Journal
    • /
    • v.33 no.4
    • /
    • pp.497-505
    • /
    • 2011
  • Image fusion is a technique that integrates the spatial details of a high-resolution panchromatic (HRP) image with the spectral information of low-resolution multispectral (LRM) images to produce high-resolution multispectral images. The most important point in image fusion is to enhance the spatial details of the HRP image while simultaneously maintaining the spectral information of the LRM images. This implies that the physical characteristics of the satellite sensor should be considered in the fusion process. Also, to fuse massive satellite images, the fusion method should have low computation costs. In this paper, we propose a fast and efficient satellite image fusion method. The proposed method uses the spectral response functions of the satellite sensor; thus, it rationally reflects the sensor's physical characteristics in the fused image. As a result, the proposed method provides high-quality fused images in terms of both spectral and spatial evaluation. Experimental results on IKONOS images indicate that the proposed method outperforms the intensity-hue-saturation and wavelet-based methods.
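For reference, the generic fast IHS baseline that the paper improves upon can be sketched in a few lines: the intensity component of the multispectral bands is replaced by the panchromatic band by adding the difference (pan - intensity) to every band. This is the plain baseline with intensity taken as the unweighted band mean; the paper's contribution is to weight this step with the sensor's spectral response functions, which is not shown here.

```python
import numpy as np

def ihs_fusion(ms_bands, pan):
    """Fast IHS-style pansharpening baseline. ms_bands: sequence of
    equally sized 2-D multispectral bands; pan: 2-D panchromatic band
    at the same (already upsampled) resolution. The intensity I is the
    unweighted mean of the MS bands; (pan - I) is added to each band."""
    ms = np.asarray(ms_bands, dtype=float)   # shape (bands, H, W)
    intensity = ms.mean(axis=0)              # generic intensity component
    return ms + (pan - intensity)            # broadcast over all bands
```

By construction the fused bands' mean equals the panchromatic band (spatial detail injected), while the differences between bands (spectral information) are preserved exactly; spectral distortion appears only where the true intensity is not the plain band mean, which is what the spectral-response weighting addresses.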