• Title/Summary/Keyword: sensor fusion

A Study on the Multi-sensor Data Fusion System for Ground Target Identification (지상표적식별을 위한 다중센서기반의 정보융합시스템에 관한 연구)

  • Gang, Seok-Hun
    • Journal of National Security and Military Science / s.1 / pp.191-229 / 2003
  • Multi-sensor data fusion techniques combine evidence from multiple sensors, through several processing levels, to obtain more accurate and meaningful information than may be possible from a single sensor alone. One of the most important parts of a data fusion system is identification fusion, which can be categorized into physical models, parametric classification, and cognitive-based models; parametric classification is usually used in multi-sensor data fusion systems because of its characteristics. In this paper, we propose a novel heuristic identification fusion method that adopts desirable properties from both parametric classification and cognitive-based models in order to meet real-time processing requirements.
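
To make the parametric-classification idea concrete, here is a minimal sketch of one common instance, Bayesian identity fusion, where each sensor contributes a likelihood over target classes and Bayes' rule combines them. The class names, priors, and likelihood values are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Hypothetical sketch of parametric identification fusion: each sensor
# reports a likelihood vector over target classes, and Bayes' rule
# combines them into a fused posterior.

CLASSES = ["tank", "apc", "truck"]  # illustrative class set

def bayesian_identity_fusion(prior, sensor_likelihoods):
    """Fuse per-sensor class likelihoods into a posterior over classes."""
    posterior = np.asarray(prior, dtype=float)
    for likelihood in sensor_likelihoods:
        posterior = posterior * np.asarray(likelihood, dtype=float)
        posterior = posterior / posterior.sum()  # renormalize after each sensor
    return posterior

# Radar and IR sensors each report P(measurement | class), values assumed.
radar = [0.70, 0.20, 0.10]
ir    = [0.55, 0.35, 0.10]
fused = bayesian_identity_fusion([1/3, 1/3, 1/3], [radar, ir])
print(dict(zip(CLASSES, fused.round(3))))
```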

Control of the Mobile Robot Navigation Using a New Time Sensor Fusion

  • Tack, Han-Ho;Kim, Chang-Geun;Kim, Myeong-Kyu
    • International Journal of Fuzzy Logic and Intelligent Systems / v.4 no.1 / pp.23-28 / 2004
  • This paper proposes a sensor fusion technique in which the data sets from previous moments are properly transformed and fused into the current data sets to enable accurate measurement of quantities such as the distance to an obstacle and the location of the service robot itself. In conventional fusion schemes, the measurement depends only on the current data sets; as a result, more sensors are required to measure a certain physical parameter or to improve the accuracy of the measurement. In this approach, instead of adding more sensors to the system, the temporal sequence of data sets is stored and utilized to improve the measurement. The theoretical basis is illustrated by examples, and the effectiveness is demonstrated through simulations. Finally, the new space and time sensor fusion (STSF) scheme is applied to the control of a mobile robot in both structured and unstructured environments.
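
The core idea, reusing past observations instead of adding sensors, can be sketched as follows. This is a minimal illustration in the spirit of STSF, not the paper's algorithm: a past range reading is carried into the current robot frame via the odometry between time steps, then fused with the current reading by inverse-variance weighting. All functions and numbers are assumptions.

```python
import numpy as np

def to_current_frame(point_xy, dx, dy, dtheta):
    """Re-express a point observed in a past robot frame in the current frame,
    given the odometry (dx, dy, dtheta) from the past pose to the current one."""
    c, s = np.cos(-dtheta), np.sin(-dtheta)
    shifted = np.array(point_xy) - np.array([dx, dy])  # undo the translation
    return np.array([c * shifted[0] - s * shifted[1],
                     s * shifted[0] + c * shifted[1]])

def fuse(points, variances):
    """Inverse-variance weighted average of the transformed observations."""
    w = 1.0 / np.asarray(variances)
    return (w[:, None] * np.asarray(points)).sum(axis=0) / w.sum()

# A past obstacle observation, transformed by the robot's motion since then,
# fused with the current observation (illustrative values).
past_obstacle = to_current_frame([2.0, 0.5], dx=0.1, dy=0.0, dtheta=0.05)
current_obstacle = np.array([1.92, 0.46])
print(fuse([past_obstacle, current_obstacle], [0.04, 0.02]))
```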

Multisensor Bias Estimation with Serial Fusion for Asynchronous Sensors (순차적 정보융합을 이용한 비동기 다중 레이더 환경에서의 바이어스 추정기법)

  • Kim, Hyoung Won;Park, Hyo Dal;Song, Taek Lyul
    • Journal of the Korea Institute of Military Science and Technology / v.15 no.5 / pp.676-686 / 2012
  • This paper presents a sensor bias estimation method with serial fusion for asynchronous multi-sensor systems. Serial fusion processes the sensor measurements on a first-come, first-served basis, and it plays an essential role in asynchronous fusion in practice. The proposed algorithm generates bias measurements from the fusion estimates and sensor measurements, and compensates the sensor biases in the fusion tracks. A simulation study indicates that the proposed algorithm achieves superior bias estimation and more accurate tracking.
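
A simplified sketch of the serial-fusion pattern is shown below: asynchronous measurements are processed strictly in arrival order, each one updating the fused track, with a crude per-sensor bias estimate driven by the fusion residuals. The scalar model, gains, and bias update rule are illustrative assumptions, not the paper's estimator.

```python
import numpy as np

measurements = [  # (timestamp, sensor_id, value) from two asynchronous radars
    (0.10, "radar_A", 10.4),
    (0.13, "radar_B", 10.9),
    (0.21, "radar_A", 10.6),
    (0.27, "radar_B", 11.1),
]

x, P = 10.0, 1.0          # fused state estimate and its variance
R = 0.25                  # measurement noise variance (assumed equal per sensor)
bias = {"radar_A": 0.0, "radar_B": 0.0}
alpha = 0.3               # smoothing gain for the bias estimate (assumed)

for _t, sid, z in sorted(measurements):       # first-come, first-served order
    z_corr = z - bias[sid]                    # compensate current bias estimate
    K = P / (P + R)                           # scalar Kalman gain
    x, P = x + K * (z_corr - x), (1 - K) * P  # serial fusion update
    bias[sid] += alpha * (z - x)              # residual drives the bias estimate

print(x, bias)
```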

Distributed Fusion Moving Average Prediction for Linear Stochastic Systems

  • Song, Il Young;Song, Jin Mo;Jeong, Woong Ji;Gong, Myoung Sool
    • Journal of Sensor Science and Technology / v.28 no.2 / pp.88-93 / 2019
  • This paper is concerned with distributed fusion moving average prediction for continuous-time linear stochastic systems with multiple sensors. A distributed fusion with a weighted sum structure is applied to the optimal local moving average predictors. The distributed fusion prediction algorithm represents the optimal linear fusion by weighting matrices under the minimum mean square criterion. The derivation of the equations for the error cross-covariances between the local predictors is the key contribution of this paper. An example demonstrates the effectiveness of the distributed fusion moving average predictor.
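
The weighted-sum structure can be illustrated with the classical scalar form of minimum-mean-square fusion, where the weights follow directly from the error cross-covariances the paper derives. The two-predictor setup and covariance values below are assumptions for illustration.

```python
import numpy as np

def fusion_weights(Sigma):
    """Weights w minimizing the fused error variance w' Sigma w subject to
    sum(w) = 1, where Sigma[i, j] is the error cross-covariance between
    local predictors i and j."""
    ones = np.ones(Sigma.shape[0])
    w = np.linalg.solve(Sigma, ones)
    return w / w.sum()

# Error covariances of two local moving-average predictors and their
# cross-covariance (illustrative values).
Sigma = np.array([[0.30, 0.05],
                  [0.05, 0.20]])
w = fusion_weights(Sigma)

local_predictions = np.array([4.8, 5.1])
fused = w @ local_predictions   # weighted-sum fused prediction
print(w, fused)
```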

Refinements of Multi-sensor based 3D Reconstruction using a Multi-sensor Fusion Disparity Map (다중센서 융합 상이 지도를 통한 다중센서 기반 3차원 복원 결과 개선)

  • Kim, Si-Jong;An, Kwang-Ho;Sung, Chang-Hun;Chung, Myung-Jin
    • The Journal of Korea Robotics Society / v.4 no.4 / pp.298-304 / 2009
  • This paper describes an algorithm that improves the 3D reconstruction result using a multi-sensor fusion disparity map. LRF (Laser Range Finder) 3D points can be projected onto image pixel coordinates using the extrinsic calibration matrices of the camera-LRF pair (Φ, Δ) and the camera calibration matrix (K). An LRF disparity map can then be generated by interpolating the projected LRF points. In the stereo reconstruction, invalid points caused by repeated patterns and textureless regions can be compensated using the LRF disparity map; the disparity map resulting from this compensation process is the multi-sensor fusion disparity map. The multi-sensor 3D reconstruction based on stereo vision and the LRF is then refined using this fusion disparity map. The refinement algorithm is specified in four subsections dealing with virtual LRF stereo image generation, LRF disparity map generation, multi-sensor fusion disparity map generation, and the 3D reconstruction process. It has been tested on synchronized stereo image pairs and LRF 3D scan data.
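
The projection step described above can be sketched as follows: LRF points are mapped into the camera frame with an extrinsic rotation and translation (standing in for the paper's Φ and Δ), projected with the intrinsic matrix K, and converted to disparity via d = f·B/Z. All numeric values are assumptions.

```python
import numpy as np

K = np.array([[700.0,   0.0, 320.0],    # fx, skew, cx (assumed intrinsics)
              [  0.0, 700.0, 240.0],    # fy, cy
              [  0.0,   0.0,   1.0]])
R = np.eye(3)                           # camera-LRF rotation (assumed)
t = np.array([0.05, 0.0, 0.0])          # camera-LRF translation [m] (assumed)
baseline = 0.12                         # stereo baseline [m] (assumed)

def lrf_to_disparity(points_lrf):
    """Project LRF 3D points into the image and return (u, v, disparity)."""
    cam = points_lrf @ R.T + t           # LRF frame -> camera frame
    uvw = cam @ K.T                      # pinhole projection
    u, v = uvw[:, 0] / uvw[:, 2], uvw[:, 1] / uvw[:, 2]
    disparity = K[0, 0] * baseline / cam[:, 2]  # d = f * B / Z
    return np.column_stack([u, v, disparity])

points = np.array([[0.3, 0.1, 4.0], [-0.2, 0.0, 6.5]])  # LRF points [m]
print(lrf_to_disparity(points))
```

Interpolating these sparse projected disparities over the image grid would then yield the LRF disparity map used to fill invalid stereo regions.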

Development of a Monitoring and Verification Tool for Sensor Fusion (센서융합 검증을 위한 실시간 모니터링 및 검증 도구 개발)

  • Kim, Hyunwoo;Shin, Seunghwan;Bae, Sangjin
    • Transactions of the Korean Society of Automotive Engineers / v.22 no.3 / pp.123-129 / 2014
  • SCC (Smart Cruise Control) and AEBS (Autonomous Emergency Braking System) use various types of sensor data, so sensor data reliability is an important consideration. In this paper, data from radar and vision sensors are fused by applying a Bayesian sensor fusion technique to improve the reliability of the sensor data. A sensor fusion verification tool is then presented, developed to monitor the acquired sensor data and to verify the sensor fusion results efficiently. A parallel computing method was applied to reduce verification time, and a series of simulation results of this method are discussed in detail.
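
As a minimal sketch of Bayesian fusion of radar and vision measurements of the same quantity (e.g., the longitudinal distance to a lead vehicle), assuming independent Gaussian errors, the fused belief is the product of the two Gaussians: a precision-weighted mean with reduced variance. The noise values are illustrative, not from the paper.

```python
import numpy as np

def fuse_gaussian(mu_a, var_a, mu_b, var_b):
    """Product of two Gaussian beliefs: precision-weighted mean, combined variance."""
    var = 1.0 / (1.0 / var_a + 1.0 / var_b)
    mu = var * (mu_a / var_a + mu_b / var_b)
    return mu, var

radar_range, radar_var = 25.3, 0.09    # radar: accurate range (assumed)
vision_range, vision_var = 24.1, 1.00  # vision: noisier range (assumed)
mu, var = fuse_gaussian(radar_range, radar_var, vision_range, vision_var)
print(f"fused range: {mu:.2f} m (std {np.sqrt(var):.2f} m)")
```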

Improvement of Land Cover Classification Accuracy by Optimal Fusion of Aerial Multi-Sensor Data

  • Choi, Byoung Gil;Na, Young Woo;Kwon, Oh Seob;Kim, Se Hun
    • Journal of the Korean Society of Surveying, Geodesy, Photogrammetry and Cartography / v.36 no.3 / pp.135-152 / 2018
  • The purpose of this study is to propose an optimal fusion method for aerial multi-sensor data to improve the accuracy of land cover classification. Recently, in the fields of environmental impact assessment and land monitoring, high-resolution image data have been acquired over many regions for quantitative land management using aerial multi-sensor platforms, but most of these data are used only for the purpose of the original project. Hyperspectral sensor data, which are mainly used for land cover classification, offer high classification accuracy, but because only the visible and near-infrared wavelengths are acquired and the spatial resolution is low, it is difficult to classify the land cover state accurately. Therefore, research is needed on improving the accuracy of land cover classification by fusing hyperspectral sensor data with multispectral sensor and aerial laser sensor data. As fusion methods for aerial multi-sensor data, we propose a pixel ratio adjustment method, a band accumulation method, and a spectral graph adjustment method. Fusion parameters such as the fusion rate, band accumulation, and spectral graph expansion ratio were selected according to the fusion method; fused data were generated and the land cover classification accuracy was calculated while applying incremental changes to the fusion variables. Optimal fusion variables for the hyperspectral, multispectral, and aerial laser data were derived by considering the correlation between land cover classification accuracy and the fusion variables.
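
One plausible reading of the band accumulation idea is stacking co-registered bands from the different sensors into a single feature cube for the classifier. The sketch below is an assumption-laden illustration of that: the multispectral bands and a laser-derived nDSM are resampled to the hyperspectral grid and concatenated. Shapes, the nearest-neighbour resampling, and the nDSM input are all assumptions, not the paper's procedure.

```python
import numpy as np

def resample_nearest(band, out_shape):
    """Nearest-neighbour resample of a 2D band to the target grid."""
    rows = np.arange(out_shape[0]) * band.shape[0] // out_shape[0]
    cols = np.arange(out_shape[1]) * band.shape[1] // out_shape[1]
    return band[np.ix_(rows, cols)]

hyper = np.random.rand(100, 100, 127)   # hyperspectral cube (H, W, bands)
multi = np.random.rand(400, 400, 4)     # higher-resolution multispectral
ndsm  = np.random.rand(200, 200)        # normalized DSM from laser data

extra = [resample_nearest(multi[:, :, b], hyper.shape[:2])
         for b in range(multi.shape[2])]
extra.append(resample_nearest(ndsm, hyper.shape[:2]))

# Accumulate all bands into one cube for the land cover classifier.
fused = np.dstack([hyper] + [e[:, :, None] for e in extra])
print(fused.shape)  # (100, 100, 127 + 4 + 1)
```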

Multi-Attribute Data Fusion for Energy Equilibrium Routing in Wireless Sensor Networks

  • Lin, Kai;Wang, Lei;Li, Keqiu;Shu, Lei
    • KSII Transactions on Internet and Information Systems (TIIS) / v.4 no.1 / pp.5-24 / 2010
  • Data fusion is an attractive technology because it allows various trade-offs among performance metrics, e.g., energy, latency, accuracy, fault tolerance and security, in wireless sensor networks (WSNs). In a complicated environment, each sensor node must be equipped with more than one type of sensor module to monitor multiple targets, so the complexity of the fusion process is increased by the existence of various physical attributes. In this paper, we first investigate the process and performance of multi-attribute fusion in the data gathering of WSNs, and then propose a self-adaptive threshold method to balance the different change rates of each attribute's data. Furthermore, we present a method to measure the energy-conservation efficiency of multi-attribute fusion. Based on the proposed methods, we design a novel energy-equilibrium routing method for WSNs, viz. the multi-attribute fusion tree (MAFT). Simulation results demonstrate that MAFT achieves very good performance in terms of network lifetime.
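
To illustrate the self-adaptive threshold idea, here is a sketch of a per-attribute reporting threshold that scales with each attribute's observed change rate, so fast-changing and slow-changing attributes trigger transmissions at comparable rates. The update rule and constants are assumptions for illustration, not the paper's exact method.

```python
class AdaptiveThreshold:
    """Per-attribute threshold that adapts to the attribute's change rate."""

    def __init__(self, init_threshold=0.5, beta=0.2):
        self.threshold = init_threshold
        self.avg_change = 0.0
        self.beta = beta            # smoothing factor for the change rate

    def should_report(self, prev, current):
        change = abs(current - prev)
        # Track the attribute's typical change rate and scale the threshold.
        self.avg_change = (1 - self.beta) * self.avg_change + self.beta * change
        self.threshold = max(0.1, self.avg_change)
        return change > self.threshold

temp = AdaptiveThreshold()   # slowly varying attribute (e.g., temperature)
vib = AdaptiveThreshold()    # rapidly varying attribute (e.g., vibration)
print(temp.should_report(20.0, 20.4), vib.should_report(3.0, 9.5))
```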

A Study on Multi Sensor Track Fusion Algorithm for Naval Combat System (함정 전투체계 표적 융합 정확도 향상을 위한 알고리즘 연구)

  • Jung, Young-Ran
    • Journal of the Korea Institute of Military Science and Technology / v.10 no.3 / pp.34-42 / 2007
  • It is very important for a modern combat system to process extensive data accurately in a short time for better situational awareness of today's threats. To enhance the accuracy of sensor data fusion in the combat system, this paper suggests adding radial velocity to the decision factors of the existing sensor data fusion algorithm.
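
The gist of adding radial velocity as a decision factor can be sketched as a track-to-track association test whose normalized distance includes a range-rate term alongside position. The weights and gate below are illustrative assumptions, not the paper's tuned values.

```python
import numpy as np

def association_distance(track_a, track_b, sigma_pos=50.0, sigma_rdot=2.0):
    """Normalized distance over position [m] and radial velocity [m/s]."""
    d_pos = np.linalg.norm(np.array(track_a["pos"]) - np.array(track_b["pos"]))
    d_rdot = abs(track_a["rdot"] - track_b["rdot"])
    return (d_pos / sigma_pos) ** 2 + (d_rdot / sigma_rdot) ** 2

radar_track = {"pos": [12000.0, 3500.0], "rdot": -120.0}
other_track = {"pos": [12040.0, 3530.0], "rdot": -118.0}
GATE = 9.0  # chi-square-like gate (assumed)

# Fuse the two tracks only if they pass the combined position/Doppler gate.
print(association_distance(radar_track, other_track) < GATE)
```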

Vision Sensor and Ultrasonic Sensor Fusion Using Neural Network

  • Baek, Sang-Hoon;Oh, Se-Young
    • Institute of Control, Robotics and Systems (ICROS) Conference Proceedings / 2004.08a / pp.668-671 / 2004
  • This paper proposes a new method of fusing an ultrasonic sensor and a vision sensor at the sensor level. In a typical vision system, the edges of objects are detected, while a typical ultrasonic system measures the absolute distance between the robot and an object. The proposed method therefore integrates data of two different types and produces a complete output for robot control. Beyond simply integrating different kinds of data, the method fuses the information received from the different kinds of sensors. It has the advantages of a simply implemented algorithm and real-time robot control.
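
As a rough illustration of sensor-level fusion with a neural network, the sketch below concatenates vision edge features and ultrasonic distances into one input vector and maps it to a steering command with a small feed-forward network. The architecture, feature sizes, and random weights are assumptions, not the paper's trained network.

```python
import numpy as np

rng = np.random.default_rng(0)

vision_edges = rng.random(8)          # e.g., edge strengths in 8 image columns
ultrasonic = rng.random(4) * 2.0      # e.g., distances from 4 sonar beams [m]
x = np.concatenate([vision_edges, ultrasonic])  # fused sensor-level input

# Untrained two-layer network, for shape/flow illustration only.
W1, b1 = rng.standard_normal((6, x.size)) * 0.3, np.zeros(6)
W2, b2 = rng.standard_normal((1, 6)) * 0.3, np.zeros(1)

hidden = np.tanh(W1 @ x + b1)         # fused hidden representation
steering = np.tanh(W2 @ hidden + b2)  # command in [-1, 1]
print(steering[0])
```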
