• Title/Abstract/Keyword: sensor fusion

Search results: 822 (processing time: 0.029 seconds)

Design of Multi-Sensor Data Fusion Filter for a Flight Test System (비행시험시스템용 다중센서 자료융합필터 설계)

  • Lee, Yong-Jae;Lee, Ja-Sung
    • The Transactions of the Korean Institute of Electrical Engineers D
    • /
    • v.55 no.9
    • /
    • pp.414-419
    • /
    • 2006
  • This paper presents the design of a multi-sensor data fusion filter for a flight test system. The multi-sensor data consist of positional information on the target from radars and a telemetry system. The data fusion filter has the structure of a federated Kalman filter and is based on the Singer dynamic target model. It consists of dedicated local filters for each sensor, generally operating in parallel, plus a master fusion filter. Fault detection and correction algorithms are included in the local filters to handle bad measurements and sensor faults. The data fusion is carried out in the fusion filter using a maximum likelihood estimation algorithm. The performance of the designed fusion filter is verified using both simulation data and real data.
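
As a rough illustration of the master-filter step in such a federated architecture (not code from the paper), the sketch below fuses independent local-filter estimates by inverse-covariance (maximum-likelihood) weighting; the state dimension and the radar/telemetry numbers are hypothetical.

```python
import numpy as np

def ml_fuse(estimates, covariances):
    """Fuse independent local estimates by information (inverse-covariance) weighting.

    This is the standard maximum-likelihood fusion a federated Kalman filter's
    master filter performs; it assumes the local estimation errors are uncorrelated.
    """
    info_sum = np.zeros_like(covariances[0])
    weighted_state = np.zeros_like(estimates[0])
    for x, P in zip(estimates, covariances):
        P_inv = np.linalg.inv(P)
        info_sum += P_inv
        weighted_state += P_inv @ x
    P_fused = np.linalg.inv(info_sum)
    x_fused = P_fused @ weighted_state
    return x_fused, P_fused

# Example: fuse radar and telemetry position estimates (illustrative numbers).
x_radar, P_radar = np.array([100.2, 50.1]), np.diag([4.0, 4.0])
x_telem, P_telem = np.array([99.7, 50.6]), np.diag([1.0, 1.0])
x_f, P_f = ml_fuse([x_radar, x_telem], [P_radar, P_telem])
print(x_f, np.diag(P_f))
```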

A Development of Wireless Sensor Networks for Collaborative Sensor Fusion Based Speaker Gender Classification (협동 센서 융합 기반 화자 성별 분류를 위한 무선 센서네트워크 개발)

  • Kwon, Ho-Min
    • Journal of the Institute of Convergence Signal Processing
    • /
    • v.12 no.2
    • /
    • pp.113-118
    • /
    • 2011
  • In this paper, we develop a speaker gender classification technique using collaborative sensor fusion for use in a wireless sensor network. The distributed sensor nodes remove unwanted input data using BER (Band Energy Ratio) based voice activity detection, process only the relevant data, and transmit hard-labeled decisions to the fusion center, where a global decision fusion is carried out. This approach offers advantages in power consumption and network resource management. Bayesian sensor fusion and global weighting decision fusion methods are proposed to achieve the gender classification. As the number of sensor nodes varies, the Bayesian sensor fusion yields the best classification accuracy when using the optimal operating points of the ROC (Receiver Operating Characteristic) curves. For the weights used in the global decision fusion, the BER and MCL (Mutual Confidence Level) are employed so that the decisions are effectively combined at the fusion center. The simulation results show that as the number of sensor nodes increases, the classification accuracy improves even further under low SNR (Signal to Noise Ratio) conditions.
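
The abstract does not give the exact fusion rule; as a generic reference point, the sketch below implements a Chair-Varshney style Bayesian fusion of hard binary decisions, where each node is summarized by a detection and a false-alarm probability taken from its ROC operating point. All node statistics and the prior are hypothetical.

```python
import numpy as np

def bayesian_decision_fusion(decisions, p_d, p_fa, prior_male=0.5):
    """Fuse hard binary decisions (1 = 'male', 0 = 'female') from sensor nodes.

    Each node i is characterized by p_d[i] (probability it says 'male' when the
    speaker is male) and p_fa[i] (probability it says 'male' when the speaker is
    female), e.g. taken from its ROC operating point. The fused decision is the
    sign of the accumulated log-likelihood ratio.
    """
    llr = np.log(prior_male / (1.0 - prior_male))
    for u, pd, pfa in zip(decisions, p_d, p_fa):
        if u == 1:
            llr += np.log(pd / pfa)
        else:
            llr += np.log((1.0 - pd) / (1.0 - pfa))
    return 1 if llr > 0 else 0

# Hypothetical operating points for three nodes.
print(bayesian_decision_fusion([1, 0, 1], p_d=[0.9, 0.8, 0.85], p_fa=[0.1, 0.2, 0.15]))
```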

Visual Control of Mobile Robots Using Multisensor Fusion System

  • Kim, Jung-Ha;Sugisaka, Masanori
    • 제어로봇시스템학회:학술대회논문집
    • /
    • 2001.10a
    • /
    • pp.91.4-91
    • /
    • 2001
  • In this paper, the development of a sensor fusion algorithm for visual control of a mobile robot is presented. The output data from the visual sensor include a time lag due to the image-processing computation. The sampling rate of the visual sensor is considerably low, so it should be used together with other sensors to control fast motion. The main purpose of this paper is to develop a method that constitutes a sensor fusion system giving optimal state estimates. The proposed sensor fusion system combines the visual sensor and an inertial sensor using a modified Kalman filter. A kind of multi-rate Kalman filter which treats the slow sampling rate ...

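A minimal sketch of the multi-rate idea described above (not the authors' filter): the state is predicted at the fast inertial rate and corrected only when a slow vision measurement arrives. The constant-velocity model, rates, and noise values are assumptions, and the paper's time-lag compensation is not reproduced here.

```python
import numpy as np

# Constant-velocity state [x, vx]; hypothetical rates: IMU at 100 Hz, vision at 10 Hz.
dt = 0.01
F = np.array([[1.0, dt], [0.0, 1.0]])
Q = np.diag([1e-4, 1e-3])           # process noise
H = np.array([[1.0, 0.0]])          # vision measures position only
R = np.array([[0.05]])              # vision measurement noise

x = np.zeros(2)
P = np.eye(2)

def predict(x, P):
    x = F @ x
    P = F @ P @ F.T + Q
    return x, P

def vision_update(x, P, z):
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + (K @ (z - H @ x)).ravel()
    P = (np.eye(2) - K @ H) @ P
    return x, P

for k in range(100):                 # one second of data
    x, P = predict(x, P)             # fast-rate prediction at every IMU step
    if k % 10 == 0:                  # a slow vision measurement every 10th step
        z = np.array([[0.01 * k]])   # placeholder measurement
        x, P = vision_update(x, P, z)
```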

Robust Hierarchical Data Fusion Scheme for Large-Scale Sensor Network

  • Song, Il Young
    • Journal of Sensor Science and Technology
    • /
    • v.26 no.1
    • /
    • pp.1-6
    • /
    • 2017
  • The advanced driver assistance system (ADAS) requires the collection of a large amount of information, including road conditions, the environment, vehicle status, the condition of the driver, and other useful data. In this regard, large-scale sensor networks can be an appropriate solution, since they have been designed for this purpose. Recent advances in sensor network technology have enabled the management and monitoring of large-scale tasks, such as monitoring the road surface temperature on a highway. In this paper, we consider the estimation and fusion problems of large-scale sensor networks used in the ADAS. A hierarchical fusion architecture is proposed for an arbitrary topology of the large-scale sensor network. A robust cluster estimator is proposed to achieve robustness of the network against outliers or sensor failures. Lastly, a robust hierarchical data fusion scheme is proposed for the communication channel between the clusters and the fusion center, considering the non-Gaussian channel noise that is typical in communication systems.
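
As a toy illustration of the two-level structure described above (not the paper's estimator), each cluster below forms an outlier-robust, median-based estimate from its sensors, and the fusion center then combines the cluster estimates by inverse-variance weighting. The road-surface-temperature numbers are made up.

```python
import numpy as np

def robust_cluster_estimate(readings):
    """Median-based cluster estimate; insensitive to a few outliers or failed sensors."""
    readings = np.asarray(readings, dtype=float)
    est = np.median(readings)
    # Robust spread via MAD, scaled to be consistent with a Gaussian sigma.
    mad = 1.4826 * np.median(np.abs(readings - est))
    var = max(mad, 1e-6) ** 2 / len(readings)
    return est, var

def fuse_clusters(cluster_estimates):
    """Inverse-variance weighting of the per-cluster estimates at the fusion center."""
    ests, vars_ = zip(*cluster_estimates)
    w = 1.0 / np.asarray(vars_)
    return float(np.sum(w * np.asarray(ests)) / np.sum(w))

# Road-surface temperature example: one sensor in cluster A has failed (reads 85.0).
cluster_a = robust_cluster_estimate([12.1, 12.3, 11.9, 85.0, 12.0])
cluster_b = robust_cluster_estimate([12.4, 12.6, 12.5])
print(fuse_clusters([cluster_a, cluster_b]))
```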

A Study on the Performance Improvement of Position Estimation using the Multi-Sensor Fusion in a Combat Vehicle (다중센서 융합을 통한 전투차량의 위치추정 성능 개선에 관한 연구)

  • Nam, Yoonwook;Kim, Sungho;Kim, Kitae;Kim, Hyoung-Nam
    • Journal of Korean Society for Quality Management
    • /
    • v.49 no.1
    • /
    • pp.1-15
    • /
    • 2021
  • Purpose: The purpose of this study was to propose a sensor fusion algorithm that integrates a vehicle motion sensor (VMS) into the hybrid navigation system. Methods: The navigation performance was evaluated by comparison tests between the hybrid navigation system alone and the proposed sensor fusion method. Results: The results of this study are as follows. The effects of the sensor fusion method and the α value estimation were found to be significant, and applying them greatly improves the navigation performance. Conclusion: The proposed sensor fusion method improves the navigation performance of a combat vehicle and thereby the reliability of its navigation system.
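
The abstract does not spell out the fusion rule or how the α value is estimated; purely as an illustration of an α-weighted blend between a VMS dead-reckoned position and the hybrid navigation position, one might write something like the following (all names and numbers are hypothetical).

```python
import numpy as np

def blend_position(p_vms, p_hybrid, alpha):
    """Weighted blend of VMS dead reckoning and the hybrid navigation solution.

    alpha in [0, 1]: alpha = 1 trusts the VMS dead reckoning fully, alpha = 0
    trusts the hybrid navigation solution fully. Illustrative only.
    """
    p_vms = np.asarray(p_vms, dtype=float)
    p_hybrid = np.asarray(p_hybrid, dtype=float)
    return alpha * p_vms + (1.0 - alpha) * p_hybrid

print(blend_position([100.0, 200.0], [101.0, 199.0], alpha=0.3))
```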

Development of Data Logging Platform of Multiple Commercial Radars for Sensor Fusion With AVM Cameras (AVM 카메라와 융합을 위한 다중 상용 레이더 데이터 획득 플랫폼 개발)

  • Jin, Youngseok;Jeon, Hyeongcheol;Shin, Young-Nam;Hyun, Eugin
    • IEMEK Journal of Embedded Systems and Applications
    • /
    • v.13 no.4
    • /
    • pp.169-178
    • /
    • 2018
  • Currently, various sensors are used in advanced driver assistance systems. In order to overcome the limitations of individual sensors, sensor fusion has recently attracted attention in the field of intelligent vehicles, and vision and radar based sensor fusion has become a popular concept. A typical method of sensor fusion involves a vision sensor that recognizes targets based on ROIs (Regions Of Interest) generated by radar sensors. Because AVM (Around View Monitor) cameras, with their wide-angle lenses, have limited detection performance at near distances and around the edges of the field of view, exact ROI extraction from the radar sensor is very important for high-performance sensor fusion of AVM cameras and radar sensors. To address this problem, we proposed a sensor fusion scheme based on commercial radar modules from the vendor Delphi. First, we configured a multiple-radar data logging system together with AVM cameras. We also designed radar post-processing algorithms to extract exact ROIs. Finally, using the developed hardware and software platforms, we verified the post-processing algorithm in indoor and outdoor environments.
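
A simplified sketch of the ROI-generation step described above (not the platform's actual code): a radar detection given as range and azimuth is mapped into a top-down AVM composite and turned into a rectangular ROI. The pixels-per-metre scale, image centre, and ROI size are hypothetical placeholders for a real radar-to-AVM calibration.

```python
import numpy as np

def radar_to_avm_roi(range_m, azimuth_rad,
                     px_per_m=20.0, image_center=(400, 400), roi_half_px=30):
    """Map one radar detection to a square ROI in a top-down AVM image.

    Assumes the AVM composite is a metric top-down view centred on the vehicle
    with a hypothetical scale of px_per_m pixels per metre; a real platform
    needs a proper calibration between the radar frame and the AVM mosaic.
    """
    # Radar detection to vehicle-frame ground coordinates (x forward, y left).
    x = range_m * np.cos(azimuth_rad)
    y = range_m * np.sin(azimuth_rad)
    # Ground coordinates to AVM pixel coordinates (u right, v down).
    u = image_center[0] - px_per_m * y
    v = image_center[1] - px_per_m * x
    return (int(u - roi_half_px), int(v - roi_half_px),
            int(u + roi_half_px), int(v + roi_half_px))

print(radar_to_avm_roi(5.0, np.deg2rad(10.0)))   # (left, top, right, bottom)
```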

A Study on the Fail Safety of Electronics Power Steering Using Sensor Fusion (Sensor Fusion을 이용한 전자식 조향장치의 Fail Safety 연구)

  • Kim, Byeong-Woo;Her, Jin;Cho, Hyun-Duck;Lee, Young-Seok
    • The Transactions of The Korean Institute of Electrical Engineers
    • /
    • v.57 no.8
    • /
    • pp.1371-1376
    • /
    • 2008
  • A Steer-by-Wire (SBW) system has so many advantages compared with a conventional mechanical steering system that it is expected to play a key role in future environmentally friendly vehicles and intelligent transportation systems. The mechanical connection between the hand wheel and the front axle will become obsolete. The SBW system provides many benefits in terms of functionality, but at the same time presents significant challenges, such as fault tolerance and fail safety. In this paper, a failure analysis of the SBW system is performed, and then a sensor fusion technique is proposed for the fail safety of the SBW system. A sensor fusion logic for the steering angle, using the steering angle sensor, torque sensor, and rack position sensor, is developed and verified by fault injection simulation.
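
As an illustration of the redundancy idea (not the authors' logic), the sketch below forms a fault-tolerant steering-angle estimate from three sources: the steering angle sensor plus angles derived from the torque sensor and the rack position sensor. The residual threshold and the derived-angle inputs are hypothetical.

```python
import numpy as np

def fuse_steering_angle(angle_sas, angle_from_torque, angle_from_rack, threshold_deg=3.0):
    """Fault-tolerant steering-angle estimate from three redundant sources.

    Residuals against the median identify a sensor that disagrees with the other
    two; the remaining sensors are averaged. With no detected fault, all three
    are averaged. Threshold and derived inputs are illustrative only.
    """
    angles = np.array([angle_sas, angle_from_torque, angle_from_rack])
    median = np.median(angles)
    healthy = np.abs(angles - median) <= threshold_deg
    if healthy.sum() < 2:
        raise RuntimeError("multiple sensors implausible; enter fail-safe mode")
    fault_detected = not healthy.all()
    return float(angles[healthy].mean()), fault_detected

# Fault injection example: the steering angle sensor is stuck at 0 degrees.
print(fuse_steering_angle(0.0, 14.8, 15.3))
```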

Sliding Window Filtering for Ground Moving Targets with Cross-Correlated Sensor Noises

  • Song, Il Young;Song, Jin Mo;Jeong, Woong Ji;Gong, Myoung Sool
    • Journal of Sensor Science and Technology
    • /
    • v.28 no.3
    • /
    • pp.146-151
    • /
    • 2019
  • This paper reports a sliding window filtering approach for ground moving targets with cross-correlated sensor noises and uncertainty. In addition, the effect of uncertain parameters on the tracking error and the model performance is considered. A distributed fusion sliding window filter is also proposed. The distributed fusion filtering algorithm represents the optimal linear combination of the local filters under the minimum mean-square error criterion. The derivation of the error cross-covariances between the local sliding window filters is the key to the proposed method. Simulation results for the motion of a ground moving target demonstrate the high accuracy and computational efficiency of the distributed fusion sliding window filter.
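
For reference (not the paper's derivation), the classical two-estimate fusion formula that accounts for the cross-covariance between local estimation errors, often attributed to Bar-Shalom and Campo, looks like the sketch below; the states, covariances, and cross-covariance are hypothetical.

```python
import numpy as np

def fuse_with_cross_covariance(x1, P1, x2, P2, P12):
    """Optimal linear (MMSE) combination of two correlated local estimates.

    P12 is the cross-covariance between the two local estimation errors;
    setting P12 = 0 recovers the usual independent-estimate fusion.
    """
    P21 = P12.T
    S = P1 + P2 - P12 - P21
    K = (P1 - P12) @ np.linalg.inv(S)
    x_f = x1 + K @ (x2 - x1)
    P_f = P1 - K @ (P1 - P21)
    return x_f, P_f

x1, P1 = np.array([10.0, 1.0]), np.diag([2.0, 0.5])
x2, P2 = np.array([10.4, 0.8]), np.diag([1.0, 0.4])
P12 = 0.2 * np.eye(2)                    # hypothetical error cross-covariance
print(fuse_with_cross_covariance(x1, P1, x2, P2, P12))
```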

Landmark Detection Based on Sensor Fusion for Mobile Robot Navigation in a Varying Environment

  • Jin, Tae-Seok;Kim, Hyun-Sik;Kim, Jong-Wook
    • International Journal of Fuzzy Logic and Intelligent Systems
    • /
    • v.10 no.4
    • /
    • pp.281-286
    • /
    • 2010
  • We propose a space- and time-based sensor fusion method and a robust landmark detection algorithm based on sensor fusion for mobile robot navigation. To fully utilize the information from the sensors, this paper first proposes a new sensor fusion technique in which the data sets from previous moments are properly transformed and fused into the current data sets to enable accurate measurement. Exploration of an unknown environment is an important task for the new generation of mobile robots, which may navigate by means of a number of monitoring systems such as a sonar-sensing system or a visual-sensing system. The newly proposed STSF (Space and Time Sensor Fusion) scheme is applied to landmark recognition for mobile robot navigation in both structured and unstructured environments, and the experimental results demonstrate the performance of the landmark recognition.
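
A minimal sketch of the space-and-time idea (not the STSF implementation): measurements taken in the previous robot frame are transformed into the current frame using the odometry increment before being combined with the current scan. The poses and sonar points are illustrative.

```python
import numpy as np

def transform_to_current_frame(points_prev, dx, dy, dtheta):
    """Express points measured in the previous robot frame in the current frame.

    (dx, dy, dtheta) is the robot motion (odometry) between the two measurement
    instants, given in the previous frame.
    """
    c, s = np.cos(dtheta), np.sin(dtheta)
    R_T = np.array([[c, s], [-s, c]])    # transpose of the rotation by dtheta
    t = np.array([dx, dy])
    return (np.asarray(points_prev) - t) @ R_T.T

# Previous sonar hits (in the previous frame) fused with the current scan.
prev_hits = [[2.0, 0.5], [1.5, -0.2]]
curr_hits = [[1.6, 0.4]]
aligned_prev = transform_to_current_frame(prev_hits, dx=0.3, dy=0.0, dtheta=np.deg2rad(5))
combined = np.vstack([aligned_prev, curr_hits])   # data set enlarged by past measurements
print(combined)
```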

Distributed Fusion Estimation for Sensor Network

  • Song, Il Young;Song, Jin Mo;Jeong, Woong Ji;Gong, Myoung Sool
    • Journal of Sensor Science and Technology
    • /
    • v.28 no.5
    • /
    • pp.277-283
    • /
    • 2019
  • In this paper, we propose a distributed fusion estimation scheme for sensor networks using a receding horizon strategy. The communication channels are modelled as Markov jump systems, and a posterior probability distribution of the communication channel characteristics is calculated and incorporated into the filter, allowing the distributed fusion estimation to handle observation loss due to path loss automatically. To implement the distributed fusion estimation, a Kalman-Consensus filter is then used to obtain the average consensus based on the estimates of sensors randomly distributed across the sensor network. The advantages of the proposed algorithms are verified using a large-scale sensor network example.
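
As a bare-bones illustration of a Kalman-Consensus style node update (not the paper's filter, and without its Markov-jump channel model): a local predict/update step is followed by a consensus correction toward the neighbours' estimates. The models, gain, and measurements are hypothetical.

```python
import numpy as np

def kcf_node_step(x, P, z, neighbor_estimates, F, Q, H, R, consensus_gain=0.05):
    """One predict/update/consensus cycle of a Kalman-Consensus style filter node.

    neighbor_estimates is a list of state estimates received from neighbouring
    nodes; the consensus term nudges the local estimate toward their values.
    Simplified sketch; the gain choice and channel model are not from the paper.
    """
    # Local prediction.
    x = F @ x
    P = F @ P @ F.T + Q
    # Local measurement update.
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (z - H @ x)
    P = (np.eye(len(x)) - K @ H) @ P
    # Consensus step toward neighbours' estimates.
    for x_j in neighbor_estimates:
        x = x + consensus_gain * (x_j - x)
    return x, P

# Hypothetical scalar-position example for one node with two neighbours.
F = np.eye(2); Q = 0.01 * np.eye(2); H = np.array([[1.0, 0.0]]); R = np.array([[0.1]])
x, P = kcf_node_step(np.zeros(2), np.eye(2), np.array([0.9]),
                     [np.array([0.8, 0.0]), np.array([1.1, 0.1])], F, Q, H, R)
print(x)
```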