• Title/Summary/Keyword: multi-sensor information fusion


Multi-Attribute Data Fusion for Energy Equilibrium Routing in Wireless Sensor Networks

  • Lin, Kai; Wang, Lei; Li, Keqiu; Shu, Lei
    • KSII Transactions on Internet and Information Systems (TIIS), v.4 no.1, pp.5-24, 2010
  • Data fusion is an attractive technology because it allows various trade-offs among performance metrics such as energy, latency, accuracy, fault tolerance, and security in wireless sensor networks (WSNs). In a complex environment, each sensor node must be equipped with more than one type of sensor module to monitor multiple targets, which increases the complexity of the fusion process because of the various physical attributes involved. In this paper, we first investigate the process and performance of multi-attribute fusion in data gathering for WSNs, and then propose a self-adaptive threshold method to balance the different change rates of each attribute's data. Furthermore, we present a method to measure the energy-conservation efficiency of multi-attribute fusion. Based on these methods, we design a novel energy-equilibrium routing method for WSNs, the multi-attribute fusion tree (MAFT). Simulation results demonstrate that MAFT achieves very good performance in terms of network lifetime.
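The self-adaptive threshold idea in this abstract can be sketched as follows: each attribute transmits only when its change exceeds a threshold adapted to that attribute's own recent rate of change. The function name, the adaptation rule (a fraction of the mean recent change), and the data below are illustrative assumptions, not the paper's exact method.

```python
# Illustrative self-adaptive per-attribute threshold (assumed rule,
# not the paper's): transmit when the new reading deviates from the
# last one by more than a fraction of the mean recent change.

def should_transmit(history, new_value, alpha=0.5):
    """Return True if new_value differs from the last reading by more
    than a threshold adapted to the attribute's recent change rate."""
    if not history:
        return True  # no baseline yet: always report the first reading
    changes = [abs(b - a) for a, b in zip(history, history[1:])]
    threshold = alpha * (sum(changes) / len(changes)) if changes else 0.0
    return abs(new_value - history[-1]) > threshold

# A slowly changing attribute gets a tight threshold automatically,
# so a sudden jump is reported while small drift is suppressed.
temps = [20.0, 20.1, 20.2, 20.2]
print(should_transmit(temps, 21.0))   # large jump vs. recent rate
print(should_transmit(temps, 20.23))  # within the adaptive threshold
```

Because the threshold is derived from each attribute's own history, attributes with very different change rates (e.g. temperature vs. light) are balanced without hand-tuned constants.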

MULTI-SENSOR DATA FUSION FOR FUTURE TELEMATICS APPLICATION

  • Kim, Seong-Baek; Lee, Seung-Yong; Choi, Ji-Hoon; Choi, Kyung-Ho; Jang, Byung-Tae
    • Journal of Astronomy and Space Sciences, v.20 no.4, pp.359-364, 2003
  • In this paper, we present multi-sensor data fusion for a telematics application. Successful telematics can be realized through the integration of navigation and spatial information, and accurate acquisition of the vehicle's position plays a vital role in application services. GPS provides the navigation data, but its performance is limited in areas with poor satellite visibility. Hence, multi-sensor fusion of an IMU (Inertial Measurement Unit), GPS (Global Positioning System), and DMI (Distance Measurement Indicator) is required to provide the vehicle's position to the service provider and the driver behind the wheel. The multi-sensor fusion is implemented via an algorithm based on the Kalman filtering technique, which enhances navigation accuracy. To verify the fusion approach, a land-vehicle test was performed and the results are discussed. The horizontal position errors were suppressed to around 1 m accuracy under a simulated GPS outage. Under a normal GPS environment, the horizontal position errors were under 40 cm on curved trajectories and 27 cm on straight trajectories, depending on the vehicle dynamics.
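The Kalman-filter fusion this abstract describes can be illustrated with a minimal scalar filter: the distance sensor (DMI) drives the prediction step by dead reckoning, and GPS positions correct it when available. A real implementation is multi-dimensional and includes IMU attitude states; the state model, noise values, and data here are assumptions for illustration only.

```python
# Minimal scalar Kalman filter sketching the GPS/DMI fusion idea.
# State: 1-D position. DMI displacement is the control input; GPS
# position is the (possibly unavailable) measurement.

def kalman_step(x, P, u, z, Q=0.1, R=1.0):
    """One predict/update cycle.
    x, P : prior position estimate and its variance
    u    : DMI-measured displacement since the last step
    z    : GPS position measurement, or None during a GPS outage
    Q, R : process and measurement noise variances (illustrative)
    """
    # Predict by dead reckoning from the distance sensor.
    x_pred = x + u
    P_pred = P + Q
    if z is None:               # GPS outage: coast on the prediction
        return x_pred, P_pred
    # Update with the GPS measurement.
    K = P_pred / (P_pred + R)   # Kalman gain
    x_new = x_pred + K * (z - x_pred)
    P_new = (1 - K) * P_pred
    return x_new, P_new

x, P = 0.0, 1.0
for u, z in [(1.0, 1.2), (1.0, None), (1.0, 3.1)]:
    x, P = kalman_step(x, P, u, z)
print(x, P)
```

During the outage step the variance grows (prediction only); once GPS returns, the update shrinks it again, which is the mechanism behind the bounded errors the abstract reports.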

Design of Multi-Sensor Data Fusion Filter for a Flight Test System (비행시험시스템용 다중센서 자료융합필터 설계)

  • Lee, Yong-Jae; Lee, Ja-Sung
    • The Transactions of the Korean Institute of Electrical Engineers D, v.55 no.9, pp.414-419, 2006
  • This paper presents the design of a multi-sensor data fusion filter for a flight test system. The multi-sensor data consist of positional information on the target from radars and a telemetry system. The data fusion filter has a federated Kalman filter structure and is based on the Singer dynamic target model. It consists of a dedicated local filter for each sensor, generally operating in parallel, plus a master fusion filter. Fault detection and correction algorithms are included in the local filters to handle bad measurements and sensor faults. The data fusion is carried out in the fusion filter using a maximum likelihood estimation algorithm. The performance of the designed fusion filter is verified using both simulation data and real data.
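The master-filter step of such a federated architecture can be sketched in the scalar case: for independent Gaussian local estimates, maximum likelihood fusion reduces to inverse-variance weighting. The function and values below are an illustrative sketch, not the paper's filter.

```python
# Illustrative master-filter fusion of two local track estimates
# (e.g. a radar track and a telemetry track). For independent Gaussian
# estimates, the ML fusion is the inverse-variance weighted mean.

def ml_fuse(x1, P1, x2, P2):
    """Fuse two independent scalar estimates (value x, variance P)."""
    P = 1.0 / (1.0 / P1 + 1.0 / P2)   # fused variance (always smaller)
    x = P * (x1 / P1 + x2 / P2)       # variance-weighted mean
    return x, P

# The telemetry estimate (variance 1.0) is trusted more than the
# radar estimate (variance 4.0), so the fusion lands closer to it.
x, P = ml_fuse(10.2, 4.0, 9.8, 1.0)
print(x, P)
```

The fused variance is smaller than either input variance, which is why the master filter can outperform any single local filter even before fault detection is considered.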

A Study on the Multi-sensor Data Fusion System for Ground Target Identification (지상표적식별을 위한 다중센서기반의 정보융합시스템에 관한 연구)

  • Gang, Seok-Hun
    • Journal of National Security and Military Science, s.1, pp.191-229, 2003
  • Multi-sensor data fusion techniques combine evidence from multiple sensors through several processing levels to obtain more accurate and meaningful information than would be possible from any single sensor alone. One of the most important parts of a data fusion system is identification fusion, which can be categorized into physical models, parametric classification, and cognitive-based models; parametric classification is usually used in multi-sensor data fusion systems because of its characteristics. In this paper, we propose a novel heuristic identification fusion method that adopts desirable properties from both parametric classification and cognitive-based models in order to meet real-time processing requirements.


Multi-sensor Data Fusion Using Weighting Method based on Event Frequency (다중센서 데이터 융합에서 이벤트 발생 빈도기반 가중치 부여)

  • Suh, Dong-Hyok; Ryu, Chang-Keun
    • The Journal of the Korea Institute of Electronic Communication Sciences, v.6 no.4, pp.581-587, 2011
  • A wireless sensor network needs multiple sensors in order to infer high-level information about its environment, and data fusion is required to use the data collected from those sensors for such inference. This paper, based on Dempster-Shafer evidence theory, proposes data fusion in a wireless sensor network with different weights assigned to different sensors. The frequency of events observed by each sensor is the crucial element in calculating the weight of the situational data that the sensor collects. Data fusion using these weights shows a remarkable difference in reliability, which makes it much easier to infer information about the situation.
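The combination step behind such a scheme can be sketched with Dempster's rule over a two-hypothesis frame, with each sensor's mass function discounted by a reliability weight (here assumed to come from its event frequency; the weights, frame, and masses below are illustrative, not the paper's).

```python
# Sketch of weighted Dempster-Shafer fusion for two sensors over a
# frame {A, B}. Mass functions are dicts mapping frozenset hypotheses
# to belief mass; THETA is the full frame (total ignorance).

THETA = frozenset({"A", "B"})

def discount(m, w):
    """Shafer discounting: scale masses by weight w and move the
    remaining mass to THETA (the sensor's unreliability)."""
    d = {h: w * v for h, v in m.items() if h != THETA}
    d[THETA] = 1.0 - sum(d.values())
    return d

def combine(m1, m2):
    """Dempster's rule of combination for two mass functions."""
    fused, conflict = {}, 0.0
    for h1, v1 in m1.items():
        for h2, v2 in m2.items():
            inter = h1 & h2
            if inter:
                fused[inter] = fused.get(inter, 0.0) + v1 * v2
            else:
                conflict += v1 * v2   # mass on incompatible hypotheses
    k = 1.0 - conflict                # normalization constant
    return {h: v / k for h, v in fused.items()}

m1 = discount({frozenset({"A"}): 0.9, THETA: 0.1}, w=0.8)  # high event frequency
m2 = discount({frozenset({"A"}): 0.6, THETA: 0.4}, w=0.5)  # lower weight
print(combine(m1, m2)[frozenset({"A"})])
```

Discounting keeps an unreliable sensor from dominating the fused belief: its evidence is partially shifted to ignorance before combination.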

A Study on a Multi-sensor Information Fusion Architecture for Avionics (항공전자 멀티센서 정보 융합 구조 연구)

  • Kang, Shin-Woo; Lee, Seoung-Pil; Park, Jun-Hyeon
    • Journal of Advanced Navigation Technology, v.17 no.6, pp.777-784, 2013
  • The synthesis of data produced by different types of sensors into a single body of information has been studied and used on a variety of platforms under the name of multi-sensor data fusion. Heterogeneous sensors have been integrated into various aircraft, and modern avionic systems manage them. As the performance of aircraft sensors improves, the integration of sensor information is increasingly required from the avionics viewpoint. Information fusion has not been widely studied from the software perspective of providing a pilot with fused information, derived from sensor data, as symbology on a display device. The purpose of information fusion is to help pilots make decisions while performing their missions by presenting the correct combat situation from the aircraft's avionics, and consequently to minimize their workload. For aircraft avionics equipped with different types of sensors, this paper presents a software architecture that delivers comprehensive information to the user by applying a multi-sensor data fusion process to the sensor data.

Classification of Multi-sensor Remote Sensing Images Using Fuzzy Logic Fusion and Iterative Relaxation Labeling (퍼지 논리 융합과 반복적 Relaxation Labeling을 이용한 다중 센서 원격탐사 화상 분류)

  • Park, No-Wook; Chi, Kwang-Hoon; Kwon, Byung-Doo
    • Korean Journal of Remote Sensing, v.20 no.4, pp.275-288, 2004
  • This paper presents a fuzzy relaxation labeling approach incorporated into a fuzzy logic fusion scheme for the classification of multi-sensor remote sensing images. The fuzzy logic fusion and iterative relaxation labeling techniques are adopted to effectively integrate multi-sensor remote sensing images and to incorporate spatial neighborhood information into spectral information for contextual classification, respectively. In particular, the iterative relaxation labeling approach can provide additional information that depicts the spatial distributions of pixels updated by spatial information. Experimental results for supervised land-cover classification using optical and multi-frequency/polarization images indicate that the use of multi-sensor images and spatial information can improve classification accuracy.
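The fusion half of this approach can be sketched per pixel: class memberships from the two sensors are combined with a fuzzy operator and the class with the highest fused membership wins. The algebraic product is used here as one common choice (min is another); the classes, membership values, and operator choice are illustrative assumptions, not the paper's exact scheme.

```python
# Minimal sketch of fuzzy logic fusion for per-pixel classification:
# combine per-class memberships from two sensors, then take the
# class with the highest fused membership.

def fuse_memberships(mu1, mu2):
    """Combine two membership dicts {class: membership} with the
    algebraic product (a standard fuzzy intersection operator)."""
    return {c: mu1[c] * mu2[c] for c in mu1}

optical = {"water": 0.7, "forest": 0.2, "urban": 0.1}  # e.g. spectral evidence
radar   = {"water": 0.6, "forest": 0.3, "urban": 0.1}  # e.g. SAR evidence
fused = fuse_memberships(optical, radar)
label = max(fused, key=fused.get)
print(label)  # "water"
```

In the paper's full scheme, the fused memberships would then be iteratively adjusted by relaxation labeling using the labels of neighboring pixels before the final decision.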

Sensor fault diagnosis for bridge monitoring system using similarity of symmetric responses

  • Xu, Xiang; Huang, Qiao; Ren, Yuan; Zhao, Dan-Yang; Yang, Juan
    • Smart Structures and Systems, v.23 no.3, pp.279-293, 2019
  • To ensure that high-quality data are used for data mining and feature extraction in a bridge structural health monitoring (SHM) system, a practical sensor fault diagnosis methodology has been developed based on the similarity of symmetric structural responses. First, the similarity of symmetric responses is discussed using field monitoring data from different sensor types. All sensors are initially paired, and sensor faults are then detected pair by pair to achieve multi-fault diagnosis of the sensor system; Dasarathy's information fusion model is adopted for multi-sensor information fusion, and the Euclidean distance is selected as the index for assessing similarity. To resolve the coupling between structural damage and sensor faults, the similarity for the target zone (where the studied sensor pair is located) is assessed to determine whether localized structural damage or a sensor fault causes the dissimilarity of the studied pair. If a suspected pair is found to contain at least one faulty sensor, a field test can be implemented to support regression analysis, based on the monitoring and field test data, for sensor fault isolation and reconstruction. Finally, a case study demonstrates the effectiveness of the proposed methodology, which is practical for real engineering and ensures the reliability of further analysis based on monitoring data.
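The pairwise screening step can be sketched directly: responses of a symmetric sensor pair are compared by Euclidean distance, and a pair whose distance exceeds a threshold is flagged as suspect (either a sensor fault or localized damage, which the paper's target-zone check then disambiguates). The threshold and data below are illustrative.

```python
# Sketch of the symmetric-pair similarity check using Euclidean
# distance. Responses are assumed to be equally sampled series from
# two sensors at structurally symmetric locations.

import math

def pair_distance(resp_a, resp_b):
    """Euclidean distance between two equally sampled response series."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(resp_a, resp_b)))

def flag_suspect(resp_a, resp_b, threshold=1.0):
    """Flag the pair when its responses are too dissimilar."""
    return pair_distance(resp_a, resp_b) > threshold

healthy = flag_suspect([1.0, 2.0, 1.5], [1.1, 1.9, 1.5])  # near-identical
faulty  = flag_suspect([1.0, 2.0, 1.5], [4.0, 0.1, 3.0])  # dissimilar
print(healthy, faulty)
```

Running every pair through this check pair by pair is what allows the method to localize multiple simultaneous faults rather than only detecting that "something" in the sensor system is wrong.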

Improvement of Land Cover Classification Accuracy by Optimal Fusion of Aerial Multi-Sensor Data

  • Choi, Byoung Gil; Na, Young Woo; Kwon, Oh Seob; Kim, Se Hun
    • Journal of the Korean Society of Surveying, Geodesy, Photogrammetry and Cartography, v.36 no.3, pp.135-152, 2018
  • The purpose of this study is to propose an optimal fusion method for aerial multi-sensor data to improve the accuracy of land cover classification. Recently, in environmental impact assessment and land monitoring, high-resolution image data have been acquired over many regions using aerial multi-sensor platforms for quantitative land management, but most of these data are used only for the purposes of the individual project. Hyperspectral sensor data, the type mainly used for land cover classification, offer high classification accuracy, but precise classification of the land cover state is difficult because only visible and near-infrared wavelengths are acquired, and at low spatial resolution. Therefore, research is needed on improving land cover classification accuracy by fusing hyperspectral sensor data with multispectral sensor and aerial laser sensor data. As fusion methods for aerial multi-sensor data, we propose a pixel-ratio adjustment method, a band accumulation method, and a spectral-graph adjustment method. Fusion parameters such as the fusion rate, band accumulation, and spectral-graph expansion ratio were selected for each method, and fused data were generated and their land cover classification accuracy computed while incrementally varying the fusion variables. Optimal fusion variables for the hyperspectral, multispectral, and aerial laser data were then derived by considering the correlation between classification accuracy and the fusion variables.

Lane Information Fusion Scheme using Multiple Lane Sensors (다중센서 기반 차선정보 시공간 융합기법)

  • Lee, Soomok; Park, Gikwang; Seo, Seung-woo
    • Journal of the Institute of Electronics and Information Engineers, v.52 no.12, pp.142-149, 2015
  • Most mono-camera-based lane detection systems are fragile under poor illumination. To compensate for the limitations of a single sensor, a lane information fusion system using multiple lane sensors is an alternative that stabilizes performance and guarantees high precision. However, conventional fusion schemes, which concern only object detection, are inappropriate for lane information fusion. Even the few studies that consider lane information fusion have treated the additional sensors only as limited back-ups, or have omitted the cases of asynchronous multi-rate operation and differing coverage. In this paper, we propose a lane information fusion scheme that uses multiple lane sensors with different coverage and cycles. Precise lane information fusion is achieved by a framework that considers the individual ranging capability and processing time of diverse types of lane sensors. In addition, a novel lane estimation model is proposed to synchronize multi-rate sensors precisely by up-sampling sparse lane information signals. Through quantitative vehicle-level experiments with an around-view monitoring system and a frontal camera system, we demonstrate the robustness of the proposed lane fusion scheme.