• Title/Summary/Keyword: time sensor fusion

217 search results

Compression Filters Based on Time-Propagated Measurement Fusion (시전달 측정치 융합에 기반한 압축필터)

  • Lee, Hyeong-Geun;Lee, Jang-Gyu
    • The Transactions of the Korean Institute of Electrical Engineers D / v.51 no.9 / pp.389-401 / 2002
  • To complement the conventional fusion methodologies of state fusion and measurement fusion, a time-propagated measurement fusion methodology is proposed. Various aspects of common process noise are investigated with regard to information preservation. Based on the time-propagated measurement fusion methodology, four compression filters are derived. The derived compression filters are efficient for asynchronous sensor fusion and fault detection since they maintain correct statistical information. A new batch Kalman recursion is proposed to show optimality under the time-propagated measurement fusion methodology. A simple simulation evaluates the estimation efficiency and characteristics.
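For context, conventional (time-aligned) measurement fusion, the baseline this methodology extends, can be sketched as inverse-variance weighting of two measurements of the same quantity. This is a minimal scalar sketch, not the paper's time-propagated variant; the function name and values are illustrative.

```python
def fuse_measurements(z1, r1, z2, r2):
    """Inverse-variance fusion of two measurements z1, z2 of the same
    quantity, with noise variances r1 and r2.  Each input is weighted
    by the inverse of its variance, so the more precise sensor
    dominates the fused value."""
    fused_var = 1.0 / (1.0 / r1 + 1.0 / r2)
    fused_z = fused_var * (z1 / r1 + z2 / r2)
    return fused_z, fused_var

# Two range measurements: a noisy one (variance 4.0) and a precise one
# (variance 1.0).  The fused value leans toward the precise sensor.
z, var = fuse_measurements(10.2, 4.0, 9.8, 1.0)
```

The fused variance is always smaller than either input variance, which is why adding a second sensor helps even when it is noisier.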

Design of a Multi-Sensor Data Simulator and Development of Data Fusion Algorithm (다중센서자료 시뮬레이터 설계 및 자료융합 알고리듬 개발)

  • Lee, Yong-Jae;Lee, Ja-Seong;Go, Seon-Jun;Song, Jong-Hwa
    • Journal of the Korean Society for Aeronautical & Space Sciences / v.34 no.5 / pp.93-100 / 2006
  • This paper presents a multi-sensor data simulator and a data fusion algorithm for tracking a high-dynamic flight target using radar and telemetry systems. The designed simulator generates time-asynchronous multi-sensor data with different data rates and communication delays. Measurement noises are incorporated using realistic sensor models. The proposed fusion algorithm is a 21st-order distributed Kalman filter based on the PVA model with sensor bias states. Fault detection and correction logic is included in the algorithm to handle bad data and sensor faults. The designed algorithm is verified using both simulated and actual data.

Asynchronous Sensor Fusion using Multi-rate Kalman Filter (다중주기 칼만 필터를 이용한 비동기 센서 융합)

  • Son, Young Seop;Kim, Wonhee;Lee, Seung-Hi;Chung, Chung Choo
    • The Transactions of The Korean Institute of Electrical Engineers / v.63 no.11 / pp.1551-1558 / 2014
  • We propose a multi-rate sensor fusion of vision and radar using a Kalman filter to solve the problems of asynchronous, multi-rate sampling periods in object vehicle tracking. Model-based prediction of object vehicles is performed with a decentralized multi-rate Kalman filter for each sensor (vision and radar). To improve position prediction performance, a different weight is applied to each sensor's predicted object position from the multi-rate Kalman filter. The proposed method provides estimated positions of the object vehicles at every ECU sampling time. The Mahalanobis distance is used to associate measured with predicted objects. Experimental results validate that the post-processed fusion data improve tracking performance. The proposed method achieves a twofold improvement in object tracking performance over single-sensor methods (camera or radar) in terms of root mean square error.
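The Mahalanobis-distance association step described above can be sketched as follows; the greedy nearest-neighbor pairing and the gate threshold are illustrative assumptions, not the paper's exact procedure.

```python
import numpy as np

def mahalanobis(z, z_pred, S):
    """Mahalanobis distance between measurement z and predicted
    position z_pred, given innovation covariance S."""
    d = z - z_pred
    return float(np.sqrt(d @ np.linalg.inv(S) @ d))

def associate(measurements, predictions, S, gate=3.0):
    """Greedily pair each measurement with its nearest prediction,
    rejecting pairs whose distance exceeds the gate."""
    pairs = []
    for i, z in enumerate(measurements):
        dists = [mahalanobis(z, p, S) for p in predictions]
        j = int(np.argmin(dists))
        if dists[j] <= gate:
            pairs.append((i, j))
    return pairs
```

With `S` set to the filter's innovation covariance, the distance accounts for how uncertain each predicted position is, unlike a plain Euclidean gate.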

Sensor Fusion System for Improving the Recognition Performance of 3D Object (3차원 물체의 인식 성능 향상을 위한 감각 융합 시스템)

  • Kim, Ji-Kyoung;Oh, Yeong-Jae;Chong, Kab-Sung;Wee, Jae-Woo;Lee, Chong-Ho
    • Proceedings of the KIEE Conference / 2004.11c / pp.107-109 / 2004
  • In this paper, the authors propose a sensor fusion system that can recognize multiple 3D objects from 2D projection images and tactile information. The proposed system focuses on improving the recognition performance for 3D objects. Unlike conventional object recognition systems that use an image sensor alone, the proposed method uses tactile sensors in addition to the visual sensor. A neural network is used to fuse this information. Tactile signals are obtained from the reaction force of the pressure sensors at the fingertips when unknown objects are grasped by a four-fingered robot hand. The experiment evaluates the recognition rate and the number of learning iterations for various objects. The merits of the proposed system are not only its strong learning ability but also its reliability, since tactile information allows various objects to be recognized even when the visual information is defective. The experimental results show that the proposed system can improve the recognition rate and reduce learning time. These results verify the effectiveness of the proposed sensor fusion system as a recognition scheme for 3D objects.


Neural Network Approach to Sensor Fusion System for Improving the Recognition Performance of 3D Objects (3차원 물체의 인식 성능 향상을 위한 감각 융합 신경망 시스템)

  • Dong Sung Soo;Lee Chong Ho;Kim Ji Kyoung
    • The Transactions of the Korean Institute of Electrical Engineers D / v.54 no.3 / pp.156-165 / 2005
  • Human beings recognize the physical world by integrating a great variety of sensory inputs, information acquired through their own actions, and their knowledge of the world, using a hierarchical, parallel-distributed mechanism. In this paper, the authors propose a sensor fusion system that can recognize multiple 3D objects from 2D projection images and tactile information. The proposed system focuses on improving the recognition performance for 3D objects. Unlike conventional object recognition systems that use an image sensor alone, the proposed method uses tactile sensors in addition to the visual sensor. A neural network is used to fuse the two sensory signals. Tactile signals are obtained from the reaction force of the pressure sensors at the fingertips when unknown objects are grasped by a four-fingered robot hand. The experiment evaluates the recognition rate and the number of learning iterations for various objects. The merits of the proposed system are not only its strong learning ability but also its reliability, since tactile information allows various objects to be recognized even when the visual sensory signals are defective. The experimental results show that the proposed system can improve the recognition rate and reduce learning time. These results verify the effectiveness of the proposed sensor fusion system as a recognition scheme for 3D objects.

A Study on Mobile Robot Navigation Using a New Sensor Fusion

  • Tack, Han-Ho;Jin, Tae-Seok;Lee, Sang-Bae
    • Proceedings of the Korean Institute of Intelligent Systems Conference / 2003.09a / pp.471-475 / 2003
  • This paper proposes a sensor fusion technique in which the data sets from previous moments are properly transformed and fused into the current data sets to enable accurate measurement of quantities such as the distance to an obstacle and the location of the service robot itself. In conventional fusion schemes, the measurement depends only on the current data sets; as a result, more sensors are required to measure a certain physical parameter or to improve the accuracy of the measurement. In this approach, instead of adding more sensors to the system, the temporal sequence of data sets is stored and utilized to improve the measurement. The theoretical basis is illustrated by examples, and the effectiveness is demonstrated through simulations. Finally, the new space and time sensor fusion (STSF) scheme is applied to the control of a mobile robot in both structured and unstructured environments.
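The core idea, reusing past measurements by transforming them into the current frame, can be sketched as below. The representation (planar obstacle positions, known cumulative robot displacement since each past measurement) and the simple averaging are illustrative assumptions, not the STSF formulation itself.

```python
import numpy as np

def propagate_and_fuse(past_positions, displacements, current_measurement):
    """Transform past obstacle positions (measured in the robot frame at
    earlier times) into the current robot frame by subtracting the robot's
    cumulative displacement since each measurement, then fuse all samples
    by averaging.  Old data thus act like extra virtual sensors."""
    propagated = [p - d for p, d in zip(past_positions, displacements)]
    samples = propagated + [current_measurement]
    return np.mean(samples, axis=0)
```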


Motion and Structure Estimation Using Fusion of Inertial and Vision Data for Helmet Tracker

  • Heo, Se-Jong;Shin, Ok-Shik;Park, Chan-Gook
    • International Journal of Aeronautical and Space Sciences / v.11 no.1 / pp.31-40 / 2010
  • For weapon cueing and Head-Mounted Displays (HMD), it is essential to continuously estimate the motion of the helmet. The problem of estimating and predicting the position and orientation of the helmet is approached by fusing measurements from inertial sensors and a stereo vision system. The sensor fusion approach in this paper is based on nonlinear filtering, specifically the extended Kalman filter (EKF). To reduce computation time and improve vision-processing performance, structure estimation is separated from motion estimation. The structure estimation tracks features that are part of the helmet model structure in the scene, while the motion estimation filter estimates the position and orientation of the helmet. The algorithm is tested using both synthetic and real data, and the results show the sensor fusion to be successful.

Development of System For Cell Fusion Detection (세포 전기 융합 감지 장치에 관한 연구)

  • Kwon, Ki-Jin;Kim, Min-Soo;Park, Se-Kwang
    • Proceedings of the KIEE Conference / 1994.07b / pp.1336-1338 / 1994
  • A cell fusion device is an apparatus that electrically fuses two types of cells, fed from their respective micropumps into a fusion chamber, by applying electric pulses. A sensor that detects the cells flowing along the passage is required to control the timing of the pulses applied to the cells and the injection of cells fed from the inlet to the micropump. Two detection methods for flowing cells are examined: an optical method and an impedance method. The output difference of the optical sensor is about 426 mV at a wavelength of 805 nm and about 37 mV at 665 nm. In the impedance method, the sensor output is 132.33 mV at the middle of the channel and 117.10 mV at the edge. Experimental results show that the optimal frequency range of the sensor output is from 50 Hz to 400 Hz.


A wireless sensor with data-fusion algorithm for structural tilt measurement

  • Dan Li;Guangwei Zhang;Ziyang Su;Jian Zhang
    • Smart Structures and Systems / v.31 no.3 / pp.301-309 / 2023
  • Tilt is a key indicator of structural safety. Real-time monitoring of tilt responses helps to evaluate structural condition, enable cost-effective maintenance, and enhance lifetime resilience. This paper presents a prototype wireless sensing system for structural tilt measurement. Long Range (LoRa) technology is adopted by the sensing system to offer long-range wireless communication with low power consumption. The sensor integrates a gyroscope and an accelerometer as the sensing module. Although tilt can be estimated from either the gyroscope or the accelerometer measurements, these estimates suffer from drift or high noise, respectively. To address this issue and obtain more reliable tilt results, two sensor fusion algorithms, the complementary filter and the Kalman filter, are investigated to fully exploit the advantages of both gyroscope and accelerometer measurements. Numerical simulation is carried out to validate and compare the sensor fusion algorithms. A laboratory experiment is conducted on a simply supported beam under moving vehicle load to further investigate the performance of the proposed wireless tilt sensing system.
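The complementary-filter approach investigated in the paper can be sketched as below: the gyroscope path is integrated (smooth but drift-prone) and blended with the accelerometer-derived tilt (drift-free but noisy). The blending coefficient and the function signature are illustrative assumptions.

```python
def complementary_filter(gyro_rates, accel_tilts, dt, alpha=0.98):
    """Fuse gyroscope angular rates (rad/s) with accelerometer-derived
    tilt angles (rad) sampled at interval dt.  Integrating the gyro
    gives a smooth but drifting tilt; the (1 - alpha) accelerometer
    term slowly pulls the estimate back, suppressing the drift."""
    tilt = accel_tilts[0]  # initialize from the accelerometer
    history = [tilt]
    for rate, acc_tilt in zip(gyro_rates[1:], accel_tilts[1:]):
        tilt = alpha * (tilt + rate * dt) + (1.0 - alpha) * acc_tilt
        history.append(tilt)
    return history
```

A larger `alpha` trusts the gyroscope more (less accelerometer noise leaks in) at the cost of slower drift correction.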

A Fusion Algorithm considering Error Characteristics of the Multi-Sensor (다중센서 오차특성을 고려한 융합 알고리즘)

  • Hyun, Dae-Hwan;Yoon, Hee-Byung
    • Journal of KIISE: Computer Systems and Theory / v.36 no.4 / pp.274-282 / 2009
  • Various location tracking sensors, such as GPS, INS, radar, and optical equipment, are used for tracking moving targets. To track moving targets effectively, an effective fusion method for these heterogeneous devices must be developed. Previous studies treated the estimated values of each sensor as different models and fused them, taking the sensors' different error characteristics into account to improve tracking performance with a heterogeneous multi-sensor system. However, when one sensor's errors increase sharply, the error of the fused estimate also increases, and approaches that replace the estimated sensor values according to the sensor probability could not be applied in real time. In this study, the sensor probability is obtained by comparing the RMSE (root mean square error) of the difference between the updated and measured values of the Kalman filter for each sensor. The step of substituting new combined values for the Kalman filter input of each sensor is excluded. This yields improvements both in the real-time applicability of the estimated sensor values and in tracking performance in regions where a sensor's performance drops rapidly. The proposed algorithm incorporates each sensor's error characteristics as a conditional probability value and achieves greater accuracy by performing track fusion with the most reliable sensors. In the experiments, a UAV trajectory is generated and a performance analysis is conducted against other fusion algorithms.
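The RMSE-based sensor weighting described above can be sketched as follows; the inverse-RMSE normalization and the scalar track fusion are illustrative assumptions about the exact form, not the paper's algorithm.

```python
import numpy as np

def rmse(residuals):
    """Root mean square error of a sequence of filter residuals."""
    return float(np.sqrt(np.mean(np.square(residuals))))

def sensor_weights(residuals_per_sensor):
    """Weight each sensor in inverse proportion to the RMSE of the
    difference between its updated and measured values, normalized so
    the weights sum to one (a 'sensor probability')."""
    inv = np.array([1.0 / rmse(r) for r in residuals_per_sensor])
    return inv / inv.sum()

def fuse_tracks(estimates, weights):
    """Weighted combination of per-sensor scalar state estimates."""
    return float(np.dot(weights, estimates))
```

Because the weights are recomputed from recent residuals only, a sensor whose errors grow sharply is automatically down-weighted without re-running the per-sensor filters.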