• Title/Abstract/Keywords: Multi-sensor fusion

Search results: 204 items

Refinements of Multi-sensor based 3D Reconstruction using a Multi-sensor Fusion Disparity Map

  • 김시종;안광호;성창훈;정명진
    • 로봇학회논문지 / Vol. 4, No. 4 / pp.298-304 / 2009
  • This paper describes an algorithm that improves a 3D reconstruction result using a multi-sensor fusion disparity map. LRF (Laser Range Finder) 3D points are projected onto image pixel coordinates using the camera-LRF extrinsic calibration matrices (Φ, Δ) and the camera calibration matrix (K). An LRF disparity map is then generated by interpolating the projected LRF points. In the stereo reconstruction, invalid points caused by repeated patterns and textureless regions are compensated using the LRF disparity map; the disparity map resulting from this compensation process is the multi-sensor fusion disparity map. Using it, the multi-sensor 3D reconstruction based on stereo vision and the LRF can be refined. The refinement algorithm is specified in four subsections dealing with virtual LRF stereo image generation, LRF disparity map generation, multi-sensor fusion disparity map generation, and the 3D reconstruction process. It has been tested on synchronized stereo image pairs and LRF 3D scan data.

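As a rough illustration of the projection and compensation steps described in the abstract above (not the authors' implementation), the following Python sketch projects LRF points into the image with an assumed extrinsic rotation R, translation t, and intrinsic matrix K, converts their depths to disparities with an assumed rectified-stereo baseline, and uses them to fill invalid pixels in a stereo disparity map. All function and parameter names here are illustrative assumptions.

```python
import numpy as np

def fill_stereo_with_lrf(stereo_disp, lrf_points, K, R, t, baseline, invalid=0.0):
    """Illustrative sketch: fill invalid stereo disparities with LRF-derived ones.

    stereo_disp : (H, W) stereo disparity map; `invalid` marks failed pixels.
    lrf_points  : (N, 3) LRF points in the LRF frame.
    K, R, t     : camera intrinsics and LRF-to-camera extrinsics (assumed known).
    baseline    : stereo baseline in metres (assumed rectified pair).
    """
    H, W = stereo_disp.shape
    fused = stereo_disp.copy()

    # Transform LRF points into the camera frame and keep points in front of it.
    pts_cam = (R @ lrf_points.T + t.reshape(3, 1)).T
    pts_cam = pts_cam[pts_cam[:, 2] > 0.1]

    # Project to pixel coordinates.
    uvw = (K @ pts_cam.T).T
    u = np.round(uvw[:, 0] / uvw[:, 2]).astype(int)
    v = np.round(uvw[:, 1] / uvw[:, 2]).astype(int)
    z = pts_cam[:, 2]

    # Depth -> disparity for a rectified stereo pair: d = f * B / Z.
    disp = K[0, 0] * baseline / z

    # Keep projections that land inside the image.
    ok = (u >= 0) & (u < W) & (v >= 0) & (v < H)
    u, v, disp = u[ok], v[ok], disp[ok]

    # Replace invalid stereo pixels (e.g. textureless or repeated-pattern areas).
    for ui, vi, di in zip(u, v, disp):
        if fused[vi, ui] == invalid:
            fused[vi, ui] = di
    return fused
```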

Design of a Multi-Sensor Data Fusion Filter for a Flight Test System

  • 이용재;이자성
    • 대한전기학회논문지:시스템및제어부문D / Vol. 55, No. 9 / pp.414-419 / 2006
  • This paper presents the design of a multi-sensor data fusion filter for a flight test system. The multi-sensor data consist of positional information on the target from radars and a telemetry system. The data fusion filter has the structure of a federated Kalman filter and is based on the Singer dynamic target model. It consists of dedicated local filters for each sensor, generally operating in parallel, plus a master fusion filter. Fault detection and correction algorithms are included in the local filters to handle bad measurements and sensor faults. The data fusion is carried out in the fusion filter using a maximum-likelihood estimation algorithm. The performance of the designed fusion filter is verified using both simulation data and real data.
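
The fusion step of a federated architecture like the one described above can be sketched briefly: each sensor's local Kalman filter yields a state estimate and covariance, and the master filter combines them with an information-weighted (maximum-likelihood) average. This minimal sketch assumes the local estimates are effectively independent; the paper's Singer-model local filters and fault handling are not reproduced here, and the numbers are illustrative.

```python
import numpy as np

def fuse_local_estimates(estimates):
    """Illustrative master-filter combination of independent local Kalman estimates.

    estimates : list of (x, P) pairs, where x is a state vector and P its
                covariance from one sensor's local filter.
    Returns the maximum-likelihood fused state and covariance
    (information-weighted average), assuming cross-covariances are negligible.
    """
    info_sum = None
    info_state_sum = None
    for x, P in estimates:
        P_inv = np.linalg.inv(P)
        info_sum = P_inv if info_sum is None else info_sum + P_inv
        contrib = P_inv @ x
        info_state_sum = contrib if info_state_sum is None else info_state_sum + contrib
    P_fused = np.linalg.inv(info_sum)
    x_fused = P_fused @ info_state_sum
    return x_fused, P_fused

# Example: fuse a radar track and a telemetry track of the same target (position, velocity).
radar = (np.array([101.0, 9.8]), np.diag([4.0, 1.0]))
telem = (np.array([99.5, 10.1]), np.diag([1.0, 0.25]))
x, P = fuse_local_estimates([radar, telem])
print(x, np.diag(P))
```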

Multi-Attribute Data Fusion for Energy Equilibrium Routing in Wireless Sensor Networks

  • Lin, Kai;Wang, Lei;Li, Keqiu;Shu, Lei
    • KSII Transactions on Internet and Information Systems (TIIS) / Vol. 4, No. 1 / pp.5-24 / 2010
  • Data fusion is an attractive technology because it allows various trade-offs among performance metrics, e.g., energy, latency, accuracy, fault tolerance, and security, in wireless sensor networks (WSNs). In a complicated environment, each sensor node must be equipped with more than one type of sensor module to monitor multiple targets, so the complexity of the fusion process increases because various physical attributes are involved. In this paper, we first investigate the process and performance of multi-attribute fusion in data gathering in WSNs, and then propose a self-adaptive threshold method to balance the different change rates of each attribute's data. Furthermore, we present a method to measure the energy-conservation efficiency of multi-attribute fusion. Based on the proposed methods, we design a novel energy-equilibrium routing method for WSNs, namely the multi-attribute fusion tree (MAFT). Simulation results demonstrate that MAFT achieves very good performance in terms of network lifetime.
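
The self-adaptive threshold idea can be sketched as follows: each attribute keeps its own reporting threshold that tracks that attribute's recent change rate, so slowly and quickly varying attributes are balanced before fusion. The update rule, class name, and parameters below are illustrative assumptions, not the formula from the paper.

```python
class AdaptiveThreshold:
    """Per-attribute self-adaptive reporting threshold (illustrative sketch only).

    A node reports an attribute only when it changes by more than a threshold;
    the threshold tracks the attribute's recent average change so that fast-
    and slow-changing attributes are treated comparably before fusion.
    """

    def __init__(self, init_threshold=1.0, alpha=0.3):
        self.threshold = init_threshold
        self.alpha = alpha          # smoothing factor for the change rate (assumed)
        self.last_reported = None
        self.avg_change = 0.0

    def update(self, value):
        """Return True if `value` should be reported (passed on for fusion)."""
        if self.last_reported is None:
            self.last_reported = value
            return True
        change = abs(value - self.last_reported)
        # Track the attribute's recent change magnitude and adapt the threshold to it.
        self.avg_change = (1 - self.alpha) * self.avg_change + self.alpha * change
        self.threshold = max(1e-6, self.avg_change)
        if change > self.threshold:
            self.last_reported = value
            return True
        return False

# Example: temperature changes slowly, light level changes quickly.
temp_gate = AdaptiveThreshold()
light_gate = AdaptiveThreshold()
print([temp_gate.update(v) for v in [20.0, 20.1, 20.3, 21.5]])
print([light_gate.update(v) for v in [300, 420, 430, 800]])
```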

A Study on the Multi-sensor Data Fusion System for Ground Target Identification

  • 강석훈
    • 안보군사학연구 / No. 1 / pp.191-229 / 2003
  • Multi-sensor data fusion techniques combine evidence from multiple sensors in order to obtain, through several processing levels, more accurate and meaningful information than may be possible from a single sensor alone. One of the most important parts of a data fusion system is identification fusion, whose approaches can be categorized into physical models, parametric classification, and cognitive-based models; parametric classification is the technique usually used in multi-sensor data fusion systems because of its characteristics. In this paper, we propose a novel heuristic identification fusion method that adopts desirable properties not only from parametric classification but also from cognitive-based models in order to meet real-time processing requirements.

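For context, a common parametric baseline for identification fusion is to combine per-sensor class probability vectors under a conditional-independence assumption. The sketch below shows only that generic baseline; it does not attempt to reproduce the paper's heuristic method or its cognitive-based properties, and the class names and numbers are illustrative.

```python
import numpy as np

def fuse_class_probabilities(sensor_probs, prior=None):
    """Naive Bayes-style identity fusion (illustrative baseline, not the paper's method).

    sensor_probs : (S, C) array; row s is sensor s's probability over C target
                   classes. Sensors are assumed conditionally independent.
    prior        : optional (C,) prior over classes.
    """
    sensor_probs = np.asarray(sensor_probs, dtype=float)
    fused = np.ones(sensor_probs.shape[1]) if prior is None else np.asarray(prior, float)
    for probs in sensor_probs:
        fused = fused * probs
    return fused / fused.sum()

# Example: three sensors reporting over hypothetical classes (tank, APC, truck).
probs = [[0.6, 0.3, 0.1],
         [0.5, 0.4, 0.1],
         [0.7, 0.2, 0.1]]
print(fuse_class_probabilities(probs))  # strongly favours the first class
```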

MULTI-SENSOR DATA FUSION FOR FUTURE TELEMATICS APPLICATION

  • Kim, Seong-Baek;Lee, Seung-Yong;Choi, Ji-Hoon;Choi, Kyung-Ho;Jang, Byung-Tae
    • Journal of Astronomy and Space Sciences / Vol. 20, No. 4 / pp.359-364 / 2003
  • In this paper, we present multi-sensor data fusion for a telematics application. Successful telematics can be realized through the integration of navigation and spatial information, and accurate acquisition of the vehicle's position plays a vital role in the application service. GPS is used to provide the navigation data, but its performance is limited in areas with poor satellite visibility. Hence, multi-sensor fusion of an IMU (Inertial Measurement Unit), GPS (Global Positioning System), and DMI (Distance Measurement Indicator) is required to provide the vehicle's position to the service provider and the driver behind the wheel. The multi-sensor fusion is implemented via an algorithm based on the Kalman filtering technique, and navigation accuracy is enhanced by this filtering approach. To verify the fusion approach, a land vehicle test was performed and the results were discussed. The results show that the horizontal position errors were suppressed to around the 1 m level under a simulated GPS-denied environment. Under a normal GPS environment, the horizontal position errors were under 40 cm on a curved trajectory and 27 cm on a straight trajectory, depending on the vehicle dynamics.
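
A minimal, loosely coupled sketch of the kind of Kalman-filter fusion described above: a constant-velocity state is propagated each step, DMI speed measurements are applied continuously, and GPS position updates are applied only when satellites are visible. The one-dimensional state, measurement models, and noise values are illustrative assumptions, not the paper's filter design.

```python
import numpy as np

def kf_update(x, P, z, H, R):
    """Standard Kalman measurement update."""
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (z - H @ x)
    P = (np.eye(len(x)) - K @ H) @ P
    return x, P

# One-dimensional position/velocity state; all noise values are assumed for illustration.
dt = 0.1
F = np.array([[1.0, dt], [0.0, 1.0]])         # constant-velocity model
Q = np.diag([0.01, 0.1])                      # process noise
H_gps = np.array([[1.0, 0.0]])                # GPS measures position
H_dmi = np.array([[0.0, 1.0]])                # DMI measures speed
R_gps = np.array([[1.0]])
R_dmi = np.array([[0.05]])

x = np.array([0.0, 0.0])
P = np.eye(2) * 10.0
for k in range(50):
    # Prediction step (vehicle-model / inertial propagation stands in here).
    x = F @ x
    P = F @ P @ F.T + Q
    # DMI speed update every step; GPS position update only when available.
    x, P = kf_update(x, P, np.array([10.0]), H_dmi, R_dmi)
    gps_available = k % 10 == 0               # simulated intermittent GPS visibility
    if gps_available:
        x, P = kf_update(x, P, np.array([10.0 * dt * (k + 1)]), H_gps, R_gps)
print(x, np.diag(P))
```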

Asynchronous Sensor Fusion Using a Multi-rate Kalman Filter

  • 손영섭;김원희;이승희;정정주
    • 전기학회논문지 / Vol. 63, No. 11 / pp.1551-1558 / 2014
  • We propose multi-rate sensor fusion of vision and radar using a Kalman filter to solve the problems of asynchronous, multi-rate sampling periods in object-vehicle tracking. Model-based prediction of object vehicles is performed with a decentralized multi-rate Kalman filter for each sensor (vision and radar). To improve the position prediction performance, different weights are applied to each sensor's predicted object position from the multi-rate Kalman filter. The proposed method can provide the estimated position of the object vehicles at every sampling instant of the ECU. The Mahalanobis distance is used to establish correspondence between the measured and predicted objects. Experimental results validate that the post-processed fusion data yield improved tracking performance. The proposed method achieved a two-fold improvement in object tracking performance over the single-sensor methods (camera or radar alone) in terms of root mean square error.
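
The association and weighted-fusion steps can be sketched briefly: a Mahalanobis-distance gate associates measurements with predicted tracks, and the two sensors' predicted positions are combined with covariance-based weights. The gate threshold and the inverse-covariance weighting below are illustrative choices, not necessarily the weighting scheme used in the paper.

```python
import numpy as np

def mahalanobis(z, x_pred, S):
    """Mahalanobis distance between a measurement and a predicted position."""
    d = z - x_pred
    return float(np.sqrt(d @ np.linalg.inv(S) @ d))

def fuse_predictions(x_vision, P_vision, x_radar, P_radar):
    """Covariance-weighted combination of two sensors' predicted positions.

    Weighting each prediction by the inverse of its covariance is one simple
    choice (an assumption here); the paper's exact weighting may differ.
    """
    W_v = np.linalg.inv(P_vision)
    W_r = np.linalg.inv(P_radar)
    P_fused = np.linalg.inv(W_v + W_r)
    x_fused = P_fused @ (W_v @ x_vision + W_r @ x_radar)
    return x_fused, P_fused

# Example: gate a new radar measurement against the radar prediction, then fuse.
x_vis, P_vis = np.array([12.1, 3.4]), np.diag([0.5, 0.5])
x_rad, P_rad = np.array([12.4, 3.2]), np.diag([0.2, 0.8])
z_new = np.array([12.3, 3.3])
if mahalanobis(z_new, x_rad, P_rad) < 3.0:    # simple gating threshold (assumed)
    x_f, P_f = fuse_predictions(x_vis, P_vis, x_rad, P_rad)
    print(x_f)
```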

Multi-sensor Data Fusion Using a Weighting Method Based on Event Frequency

  • 서동혁;유창근
    • 한국전자통신학회논문지 / Vol. 6, No. 4 / pp.581-587 / 2011
  • A wireless sensor network needs to be composed of heterogeneous multiple sensors so that high-level context information can be inferred. When the data collected by multiple sensors are used for context inference, multi-sensor data fusion is required. In this paper, we propose a method of assigning a weight to each sensor when fusing data in a wireless sensor network, based on Dempster-Shafer evidence theory. The weighting is based on each sensor's event frequency, which is a factor that should be reflected when computing the weight of the context data acquired by that sensor. Weights were computed from the per-sensor event frequencies, and when multi-sensor data fusion was performed with these weights, the belief values showed a clearer separation, which made inferring context information easier.
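
A small sketch of event-frequency-weighted Dempster-Shafer fusion, assuming a two-hypothesis frame and Shafer discounting with weight = (sensor event count) / (total event count); that weight formula, the frame, and the mass values are illustrative assumptions, not the paper's exact scheme.

```python
from itertools import product

FRAME = frozenset({"fire", "normal"})

def discount(mass, alpha):
    """Shafer discounting: scale masses by reliability weight alpha,
    moving the remainder to the whole frame (ignorance)."""
    out = {A: alpha * m for A, m in mass.items() if A != FRAME}
    out[FRAME] = 1.0 - alpha + alpha * mass.get(FRAME, 0.0)
    return out

def dempster_combine(m1, m2):
    """Dempster's rule of combination for two mass functions over subsets of FRAME."""
    combined, conflict = {}, 0.0
    for (A, a), (B, b) in product(m1.items(), m2.items()):
        inter = A & B
        if inter:
            combined[inter] = combined.get(inter, 0.0) + a * b
        else:
            conflict += a * b
    return {A: v / (1.0 - conflict) for A, v in combined.items()}

# Per-sensor masses and event counts; weight = count / total is an illustrative
# choice for the event-frequency weighting, not the paper's formula.
sensors = [
    ({frozenset({"fire"}): 0.7, FRAME: 0.3}, 40),   # smoke sensor, 40 events
    ({frozenset({"fire"}): 0.5, FRAME: 0.5}, 10),   # temperature sensor, 10 events
]
total = sum(c for _, c in sensors)
fused = None
for mass, count in sensors:
    weighted = discount(mass, count / total)
    fused = weighted if fused is None else dempster_combine(fused, weighted)
print(fused)
```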

A Study of Observability Analysis and Data Fusion for Bias Estimation in a Multi-Radar System

  • 원건희;송택렬;김다솔;서일환;황규환
    • 제어로봇시스템학회논문지 / Vol. 17, No. 8 / pp.783-789 / 2011
  • Improving target tracking performance using multi-sensor data fusion is a challenging task, and biases in the measurements must be removed before the various data fusion techniques are applied. In this paper, a bias removal algorithm using measurement data from multi-radar tracking systems is proposed and evaluated by computer simulation. To predict bias estimation performance for various geometric relations between the radar systems and the target, a system observability index is proposed and tested via computer simulation. It is also shown that target tracking utilizing multi-sensor data fusion with bias-removed measurements results in better performance.
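
As a simplified illustration of bias removal before fusion, the sketch below estimates a constant Cartesian offset between two radars from simultaneous measurements of a common target and subtracts it before the measurements are fused. Real radar biases are usually modelled as range and azimuth offsets with a geometry-dependent observability analysis, which this sketch deliberately omits; names and numbers are illustrative.

```python
import numpy as np

def estimate_relative_bias(meas_a, meas_b):
    """Least-squares estimate of a constant position bias between two radars.

    meas_a, meas_b : (N, 2) simultaneous Cartesian measurements of the same
    target from radar A and radar B. With a constant-offset bias model the
    least-squares estimate reduces to the mean measurement difference; real
    radar biases (range/azimuth offsets) need a geometry-dependent model.
    """
    meas_a = np.asarray(meas_a, float)
    meas_b = np.asarray(meas_b, float)
    return (meas_b - meas_a).mean(axis=0)

# Example: radar B reads roughly 30 m east / 10 m north of radar A (synthetic data).
rng = np.random.default_rng(0)
truth = rng.uniform(0, 1000, size=(100, 2))
meas_a = truth + rng.normal(0, 5, size=(100, 2))
meas_b = truth + np.array([30.0, 10.0]) + rng.normal(0, 5, size=(100, 2))
bias = estimate_relative_bias(meas_a, meas_b)
print(bias)                       # ~[30, 10]
corrected_b = meas_b - bias       # bias-removed measurements ready for fusion
```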

Effect of Correcting Radiometric Inconsistency between Input Images on Spatio-temporal Fusion of Multi-sensor High-resolution Satellite Images

  • 박소연;나상일;박노욱
    • 대한원격탐사학회지 / Vol. 37, No. 5-1 / pp.999-1011 / 2021
  • In spatio-temporal fusion, which predicts images with both high spatial and high temporal resolution from multi-sensor images, radiometric inconsistency between the multi-sensor images can affect prediction performance. In this study, we analyzed the effect on fusion results of radiometric correction, which compensates for the different spectral characteristics of multi-sensor satellite images. The effect of relative radiometric correction was quantitatively analyzed through case studies using Sentinel-2, PlanetScope, and RapidEye images acquired over two croplands. The case studies showed that the prediction accuracy of fusion improved when multi-sensor images with relative radiometric correction applied were used. In particular, the improvement in prediction accuracy from relative radiometric correction was most pronounced when the correlation between the input data was low. The prediction performance appears to have improved because multi-sensor data with different spectral characteristics were transformed to be similar to each other. These results indicate that relative radiometric correction is necessary to improve prediction performance in spatio-temporal fusion of weakly correlated multi-sensor satellite images.
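
The relative radiometric correction discussed above can be illustrated with a band-wise linear normalization: each band of one sensor's image is mapped onto the reference sensor's radiometry with a least-squares gain and offset. The all-pixel regression and synthetic data below are simplifying assumptions; operational workflows typically fit on pseudo-invariant features.

```python
import numpy as np

def relative_radiometric_correction(subject, reference):
    """Band-wise linear relative radiometric correction (illustrative sketch).

    subject, reference : (H, W, B) co-registered reflectance arrays from two
    sensors over the same area and date. Each subject band is mapped to the
    reference with a least-squares gain/offset.
    """
    corrected = np.empty_like(subject, dtype=float)
    for b in range(subject.shape[2]):
        x = subject[:, :, b].ravel()
        y = reference[:, :, b].ravel()
        gain, offset = np.polyfit(x, y, 1)   # y ≈ gain * x + offset
        corrected[:, :, b] = gain * subject[:, :, b] + offset
    return corrected

# Example with synthetic 2-band images differing by a linear radiometric shift.
rng = np.random.default_rng(1)
ref = rng.uniform(0.05, 0.6, size=(50, 50, 2))
subj = 0.8 * ref + 0.02 + rng.normal(0, 0.005, size=ref.shape)
adj = relative_radiometric_correction(subj, ref)
print(np.abs(adj - ref).mean())   # much smaller than np.abs(subj - ref).mean()
```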

A Novel Clustering Method with Time Interval for Context Inference Based on Multi-sensor Data Fusion

  • 유창근;박찬봉
    • 한국전자통신학회논문지 / Vol. 8, No. 3 / pp.397-402 / 2013
  • In context awareness using multiple sensors, elapsed time is a factor that must be considered. When attempting to infer context from the information sensed and reported by the sensors, it is useful to examine the reports grouped over fixed time intervals. In this paper, we propose multi-sensor data fusion using a clustering method that takes elapsed time into account. The sensing reports collected from each sensor during a fixed time interval are grouped for a first-stage data fusion, and a second-stage data fusion is then performed on those results. The multi-sensor data fusion is carried out using Dempster-Shafer theory; by partitioning the evaluation by time interval and fusing the interval results again, improved context information can be inferred.
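
A compact sketch of the two-stage, time-interval fusion described above: sensor reports are grouped by a fixed time interval, fused within each interval with Dempster's rule, and the per-interval results are fused again. The frame of discernment, interval length, and mass values here are illustrative assumptions, not the paper's configuration.

```python
from collections import defaultdict
from itertools import product

def combine(m1, m2):
    """Dempster's rule over frozenset-keyed mass functions on a shared frame."""
    out, conflict = {}, 0.0
    for (A, a), (B, b) in product(m1.items(), m2.items()):
        inter = A & B
        if inter:
            out[inter] = out.get(inter, 0.0) + a * b
        else:
            conflict += a * b
    return {A: v / (1.0 - conflict) for A, v in out.items()}

def two_stage_fusion(reports, interval=10.0):
    """Group sensor reports into fixed time intervals, fuse within each interval
    (stage 1), then fuse the per-interval results (stage 2). Illustrative sketch."""
    clusters = defaultdict(list)
    for t, mass in reports:
        clusters[int(t // interval)].append(mass)
    interval_results = []
    for _, masses in sorted(clusters.items()):
        fused = masses[0]
        for m in masses[1:]:
            fused = combine(fused, m)          # stage 1: within-interval fusion
        interval_results.append(fused)
    overall = interval_results[0]
    for m in interval_results[1:]:
        overall = combine(overall, m)          # stage 2: across intervals
    return interval_results, overall

# Example: reports as (timestamp in seconds, mass over an assumed {occupied, empty} frame).
FRAME = frozenset({"occupied", "empty"})
OCC = frozenset({"occupied"})
reports = [(1.2, {OCC: 0.6, FRAME: 0.4}), (4.0, {OCC: 0.5, FRAME: 0.5}),
           (13.5, {OCC: 0.7, FRAME: 0.3})]
per_interval, overall = two_stage_fusion(reports)
print(overall)
```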