• Title/Summary/Keyword: Sensor Fusion System

Implementation of a Sensor Fusion FPGA for an IoT System (사물인터넷 시스템을 위한 센서 융합 FPGA 구현)

  • Jung, Chang-Min; Lee, Kwang-Yeob; Park, Tae-Ryong
    • Journal of IKEEE / v.19 no.2 / pp.142-147 / 2015
  • In this paper, a Kalman filter-based sensor fusion filter that measures posture by calibrating and combining information obtained from acceleration and gyro sensors was proposed. Recent advances in sensor network technology have increased the need for sensor fusion technology. In the proposed approach, the nonlinear system model of the filter is converted to a linear system model through a Jacobian matrix operation, and the measurement value is predicted via Euler integration. The proposed filter was implemented at an operating frequency of 74 MHz on a Virtex-6 FPGA board from Xilinx Inc. The accuracy and reliability of the measured posture were validated by comparing the values obtained from the implemented filter with those from existing filters.
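As a rough illustration of the gyro/accelerometer fusion idea in this entry (not the paper's fixed-point FPGA design), the sketch below runs a two-state Kalman filter in Python: the gyro rate drives an Euler-integrated prediction of the pitch angle, and the accelerometer-derived angle serves as the measurement. The noise values are assumptions.

```python
# Minimal sketch of a gyro/accelerometer posture Kalman filter (illustrative only).
# State is [pitch, gyro_bias]; the gyro rate drives the Euler-integrated prediction,
# and the accelerometer-derived pitch angle is the measurement.
import numpy as np

def kalman_posture(gyro_rates, accel_angles, dt=0.01):
    x = np.zeros(2)                            # [pitch (rad), gyro bias (rad/s)]
    P = np.eye(2)
    F = np.array([[1.0, -dt], [0.0, 1.0]])     # state transition
    B = np.array([dt, 0.0])                    # gyro rate input
    H = np.array([[1.0, 0.0]])                 # accelerometer observes pitch only
    Q = np.diag([1e-5, 1e-7])                  # process noise (assumed)
    R = np.array([[1e-2]])                     # accelerometer noise (assumed)
    estimates = []
    for w, z in zip(gyro_rates, accel_angles):
        # Predict: Euler integration of the gyro rate
        x = F @ x + B * w
        P = F @ P @ F.T + Q
        # Update with the accelerometer-derived angle
        y = z - H @ x
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        x = x + K @ y
        P = (np.eye(2) - K @ H) @ P
        estimates.append(x[0])
    return np.array(estimates)
```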

Efficient Digitizing in Reverse Engineering By Sensor Fusion (역공학에서 센서융합에 의한 효율적인 데이터 획득)

  • Park, Young-Kun; Ko, Tae-Jo; Kim, Hee-Sool
    • Journal of the Korean Society for Precision Engineering / v.18 no.9 / pp.61-70 / 2001
  • This paper introduces a new digitization method using sensor fusion for shape measurement in reverse engineering. Digitization can be classified into contact and non-contact types according to the measurement device, and the key requirements are speed and accuracy: the non-contact type excels in speed and the contact type in accuracy. Sensor fusion in digitization aims to combine the merits of both types so that the process can be automated. First, a non-contact vision system rapidly acquires coarse 3D point data; this step is needed to identify and localize an object placed at an unknown position on the table. Second, accurate 3D point data are obtained automatically with a scanning probe, guided by the previously measured coarse data. In this research, a large number of equally spaced measuring points were commanded along the line acquired by the vision system. Finally, the digitized 3D point data are fitted to a rational B-spline surface equation, and the resulting free-form surface information can be transferred to a commercial CAD/CAM system via IGES translation in order to machine the modeled geometric shape.
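The coarse-to-fine step described above can be illustrated with a small resampling routine; the interfaces and spacing below are assumptions, not the authors' software. A coarse polyline from the vision system is resampled at equal arc-length intervals to produce target points for the contact scanning probe.

```python
# Minimal sketch: resample a coarse vision-derived polyline at equal spacing to
# generate target points for a contact scanning probe (assumed data format).
import numpy as np

def equispaced_probe_points(coarse_points, spacing):
    """Resample a coarse 3D polyline (N x 3) at equal arc-length intervals."""
    pts = np.asarray(coarse_points, dtype=float)
    seg = np.linalg.norm(np.diff(pts, axis=0), axis=1)
    s = np.concatenate(([0.0], np.cumsum(seg)))        # cumulative arc length
    targets = np.arange(0.0, s[-1], spacing)
    # Interpolate x, y, z independently against arc length
    return np.column_stack([np.interp(targets, s, pts[:, i]) for i in range(3)])

# Example: coarse vision line resampled every 0.5 mm for probe digitizing
coarse = [[0, 0, 0], [10, 0, 1], [20, 5, 2]]
probe_targets = equispaced_probe_points(coarse, spacing=0.5)
```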

Study on INS/GPS Sensor Fusion for Agricultural Vehicle Navigation System (농업기계 내비게이션을 위한 INS/GPS 통합 연구)

  • Noh, Kwang-Mo; Park, Jun-Gul; Chang, Young-Chang
    • Journal of Biosystems Engineering / v.33 no.6 / pp.423-429 / 2008
  • This study investigated the effects of inertial navigation system (INS)/global positioning system (GPS) sensor fusion for agricultural vehicle navigation. An extended Kalman filter algorithm was adopted for INS/GPS sensor fusion in an integrated mode, and a vehicle dynamic model was used instead of the navigation state error model. The INS/GPS system consisted of a low-cost gyroscope, an odometer, and a GPS receiver, and its performance was tested through computer simulations. When the measurement noise of the GPS receiver was 10, 1.0, 0.5, and 0.2 m (1σ), the RMS position and heading errors of the INS/GPS system on a 5 m/s straight path were remarkably reduced, to 10%, 35%, 40%, and 60% of those obtained from the GPS receiver alone, respectively. The reduction of position and heading errors by using INS/GPS rather than stand-alone GPS can provide more stable steering of agricultural equipment. Therefore, the low-cost INS/GPS system using the extended Kalman filter algorithm may enable autonomous navigation that meets performance requirements such as stable steering and smaller position errors, even in slow-speed operation.
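A minimal sketch of odometer/gyro dead reckoning corrected by GPS position fixes with an extended Kalman filter, in the spirit of this study; the state vector, noise levels, and data interfaces below are assumptions rather than the paper's vehicle dynamic model.

```python
# Sketch of an EKF fusing odometer speed and gyro yaw rate with GPS position fixes.
# State is [x, y, heading]; all noise parameters are illustrative assumptions.
import numpy as np

def ins_gps_ekf(odo_speeds, gyro_rates, gps_fixes, dt=0.1,
                gps_sigma=1.0, q_pos=1e-3, q_head=1e-4):
    x = np.zeros(3)                                # [x (m), y (m), heading (rad)]
    P = np.eye(3)
    Q = np.diag([q_pos, q_pos, q_head])
    R = np.eye(2) * gps_sigma**2
    H = np.array([[1.0, 0.0, 0.0],
                  [0.0, 1.0, 0.0]])                # GPS observes position only
    track = []
    for v, w, z in zip(odo_speeds, gyro_rates, gps_fixes):
        theta = x[2]
        # Jacobian of the kinematic motion model, then the prediction itself
        F = np.array([[1.0, 0.0, -v * dt * np.sin(theta)],
                      [0.0, 1.0,  v * dt * np.cos(theta)],
                      [0.0, 0.0,  1.0]])
        x = x + np.array([v * dt * np.cos(theta), v * dt * np.sin(theta), w * dt])
        P = F @ P @ F.T + Q
        # Correct with a GPS fix when one is available this step
        if z is not None:
            y = np.asarray(z) - H @ x
            S = H @ P @ H.T + R
            K = P @ H.T @ np.linalg.inv(S)
            x = x + K @ y
            P = (np.eye(3) - K @ H) @ P
        track.append(x.copy())
    return np.array(track)
```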

Development of 3D Point Cloud Mapping System Using 2D LiDAR and Commercial Visual-inertial Odometry Sensor (2차원 라이다와 상업용 영상-관성 기반 주행 거리 기록계를 이용한 3차원 점 구름 지도 작성 시스템 개발)

  • Moon, Jongsik; Lee, Byung-Yoon
    • IEMEK Journal of Embedded Systems and Applications / v.16 no.3 / pp.107-111 / 2021
  • A 3D point cloud map is an essential element in various fields, including precise autonomous navigation systems. However, generating a 3D point cloud map with a single sensor is limited by the high price of such sensors. To solve this problem, we propose a precise 3D mapping system using low-cost sensor fusion. Generating a point cloud map requires estimating the current position and attitude and describing the surrounding environment. In this paper, a commercial visual-inertial odometry sensor is used to estimate the current position and attitude states. Based on these state values, the 2D LiDAR measurements describe the surrounding environment to create a point cloud map. To analyze the performance of the proposed algorithm, we compared it with a 3D LiDAR-based SLAM (simultaneous localization and mapping) algorithm. The results confirm that a precise 3D point cloud map can be generated with the low-cost sensor fusion system proposed in this paper.
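The mapping step described above amounts to transforming each 2D LiDAR scan by the pose reported by the visual-inertial odometry sensor; the sketch below assumes simple (rotation, translation) pose tuples and is not the authors' implementation.

```python
# Sketch: lift each 2D LiDAR scan into the sensor plane and transform it by the
# visual-inertial odometry pose to accumulate a 3D point cloud (assumed formats).
import numpy as np

def accumulate_cloud(scans, poses):
    """scans: list of (ranges, angles); poses: list of (R 3x3, t 3-vector) in the map frame."""
    world_points = []
    for (ranges, angles), (R, t) in zip(scans, poses):
        # 2D scan points expressed in the LiDAR frame (z = 0 plane of the sensor)
        local = np.column_stack([ranges * np.cos(angles),
                                 ranges * np.sin(angles),
                                 np.zeros(len(ranges))])
        world_points.append(local @ R.T + t)   # rotate, then translate into the map frame
    return np.vstack(world_points)
```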

Fusion of Sonar and Laser Sensor for Mobile Robot Environment Recognition

  • Kim, Kyung-Hoon; Cho, Hyung-Suck
    • Institute of Control, Robotics and Systems: Conference Proceedings / 2001.10a / pp.91.3-91 / 2001
  • A sensor fusion scheme for mobile robot environment recognition that incorporates range data and contour data is proposed. The ultrasonic sensor provides only a coarse spatial description, but it guarantees, with relatively high belief, that the space within its sonic cone is free of obstacles. The laser structured-light system provides a detailed contour description of the environment, but it is prone to light noise and is easily affected by surface reflectivity. The overall fusion process is composed of two stages: noise elimination and belief updates. Dempster-Shafer evidential reasoning is applied at each stage. Open-space estimation from the sonar range measurements allows noisy lines from the laser sensor to be eliminated. Comparing actual sonar data to the simulated sonar data enables ...
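Dempster-Shafer combination over a simple {occupied, empty} frame, as referenced in this entry, can be sketched as follows; the frame and the mass values are illustrative assumptions, not the paper's belief model.

```python
# Minimal sketch of Dempster's rule of combination for an {occupied, empty} frame.
def dempster_combine(m1, m2):
    """m1, m2: dicts with masses for 'occ', 'emp', and 'unk' (the full frame)."""
    hypotheses = ('occ', 'emp', 'unk')
    combined = {h: 0.0 for h in hypotheses}
    conflict = 0.0
    for a in hypotheses:
        for b in hypotheses:
            mass = m1[a] * m2[b]
            if a == b:
                combined[a] += mass
            elif 'unk' in (a, b):                 # intersection with the full frame
                combined[a if b == 'unk' else b] += mass
            else:                                 # occ vs emp: empty intersection
                conflict += mass
    k = 1.0 - conflict                            # normalization (assumes k > 0)
    return {h: v / k for h, v in combined.items()}

# Example: sonar says probably empty, laser weakly says occupied
sonar = {'occ': 0.1, 'emp': 0.7, 'unk': 0.2}
laser = {'occ': 0.4, 'emp': 0.2, 'unk': 0.4}
print(dempster_combine(sonar, laser))
```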

Sensor Fusion and Neural Network Analysis for Drill-Wear Monitoring (센서퓨젼 기반의 인공신경망을 이용한 드릴 마모 모니터링)

  • Prasopchaichana, Kritsada; Kwon, Oh-Yang
    • Transactions of the Korean Society of Machine Tool Engineers / v.17 no.1 / pp.77-85 / 2008
  • The objective of this study is to construct a sensor fusion system for tool-condition monitoring (TCM) that leads to more efficient and economical drill usage. Drill-wear monitoring is an important attribute of automatic machining processes, as it can help prevent damage to tools and workpieces and optimize drill usage. In this study, we present the architecture of a multi-layer feed-forward neural network, trained with the Levenberg-Marquardt algorithm and based on sensor fusion, for monitoring the drill-wear condition. The input features to the neural networks were extracted from AE, vibration, and current signals using wavelet packet transform (WPT) analysis. Training and testing were performed over a moderate range of cutting conditions in the dry drilling of steel plates. The results show good drill-wear monitoring performance with the proposed method of sensor fusion and neural network analysis.
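A hedged sketch of the feature pipeline described above: wavelet packet energies (computed here with the pywt package) feed a feed-forward network. Note the substitution: scikit-learn's L-BFGS solver stands in for the Levenberg-Marquardt training used in the paper, and the data below are placeholders.

```python
# Sketch: wavelet-packet energy features -> feed-forward network for drill-wear state.
# L-BFGS is used as a stand-in solver; all data here are synthetic placeholders.
import numpy as np
import pywt
from sklearn.neural_network import MLPClassifier

def wpt_energy_features(signal, wavelet="db4", level=3):
    """Relative energy of each wavelet-packet node at the given level."""
    wp = pywt.WaveletPacket(data=signal, wavelet=wavelet, maxlevel=level)
    energies = np.array([np.sum(node.data**2)
                         for node in wp.get_level(level, order="natural")])
    return energies / energies.sum()

# X: one feature vector per drilling pass (AE, vibration, and current signals would be
# handled the same way and concatenated); y: wear-state labels. Both are placeholders.
rng = np.random.default_rng(0)
X = np.array([wpt_energy_features(rng.standard_normal(1024)) for _ in range(40)])
y = rng.integers(0, 2, size=40)                 # 0 = sharp, 1 = worn (illustrative)

clf = MLPClassifier(hidden_layer_sizes=(16, 8), solver="lbfgs", max_iter=2000)
clf.fit(X, y)
print("training accuracy:", clf.score(X, y))
```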

A Cyber-Physical Information System for Smart Buildings with Collaborative Information Fusion

  • Liu, Qing; Li, Lanlan
    • KSII Transactions on Internet and Information Systems (TIIS) / v.16 no.5 / pp.1516-1539 / 2022
  • This article presents a physical-information-fusion IoT system designed for smart buildings. In essence, it is a computer system that combines physical quantities in buildings with quantitative analysis and control. On the Internet-of-Things side, it is controlled by a monitoring system based on sensor networks and computer-based algorithms. Following an agent-oriented design, we realized both human-machine interaction (HMI) and machine-machine interaction (MMI): HMI handles interaction with human users, while MMI is realized through embedded computing, sensors, controllers, execution devices, and a wireless communication network. This article mainly focuses on the role of the wireless sensor network and MMI in environmental monitoring, a function that underpins building security, environmental control, HVAC, and other smart-building control systems. The article discusses various network applications and their agent-based implementation, and it also presents our collaborative information fusion strategy. When individual sensors yield unstable physical measurements, this strategy provides a stable input to the system through collaborative information fusion, thereby preventing jitter and unstable responses caused by uncertain disturbances and environmental factors. The article also reports system test results, which show that, through the CPS interaction of HMI and MMI, the smart-building IoT system can achieve comprehensive monitoring, thereby providing support and room for expansion for advanced automation management.
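One plausible, simplified reading of the collaborative fusion idea above (the abstract does not give the algorithm) is robust weighting of redundant sensor readings, which damps jitter when a single sensor becomes unstable:

```python
# Sketch: fuse redundant readings by inverse-variance weighting after a median-based
# outlier check. Thresholds and noise figures are illustrative assumptions.
import numpy as np

def fuse_readings(values, variances, outlier_tol=3.0):
    values = np.asarray(values, dtype=float)
    variances = np.asarray(variances, dtype=float)
    med = np.median(values)
    mad = np.median(np.abs(values - med)) + 1e-9
    keep = np.abs(values - med) / mad < outlier_tol    # drop wildly inconsistent sensors
    w = 1.0 / variances[keep]                          # trust low-noise sensors more
    return float(np.sum(w * values[keep]) / np.sum(w))

# Example: three temperature nodes, one of them glitching
print(fuse_readings([21.4, 21.6, 35.0], [0.1, 0.2, 0.1]))
```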

The Posture Estimation of Mobile Robots Using Sensor Data Fusion Algorithm (센서 데이터 융합을 이용한 이동 로보트의 자세 추정)

  • 이상룡; 배준영
    • Transactions of the Korean Society of Mechanical Engineers / v.16 no.11 / pp.2021-2032 / 1992
  • A redundant sensor system consisting of two incremental encoders and a gyro sensor is proposed for estimating the posture of mobile robots. A hardware system was built to estimate the heading angle change of the mobile robot from the outputs of the gyro sensor, and it produced accurate estimates of this heading angle change. A sensor data fusion algorithm was developed to find optimal estimates of the heading angle change based on the stochastic measurement equations of the redundant sensor system. The maximum likelihood estimation method is applied to combine the noisy measurement data from both the encoders and the gyro sensor. In various navigation experiments, the proposed fusion algorithm demonstrated satisfactory performance, showing significantly reduced estimation error compared to the conventional method.
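For independent Gaussian errors, the maximum-likelihood fusion of the encoder-derived and gyro-derived heading changes reduces to an inverse-variance weighted average; the sketch below assumes such a noise model rather than the paper's exact measurement equations.

```python
# Sketch: ML fusion of two independent Gaussian estimates of the heading change.
def ml_heading_change(d_theta_encoder, var_encoder, d_theta_gyro, var_gyro):
    w_enc = 1.0 / var_encoder
    w_gyr = 1.0 / var_gyro
    estimate = (w_enc * d_theta_encoder + w_gyr * d_theta_gyro) / (w_enc + w_gyr)
    variance = 1.0 / (w_enc + w_gyr)    # the fused estimate is never noisier than either input
    return estimate, variance

# Example: encoders say 4.8 deg (sigma 1.0 deg), gyro says 5.3 deg (sigma 0.5 deg)
print(ml_heading_change(4.8, 1.0**2, 5.3, 0.5**2))
```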

A Study of Observability Analysis and Data Fusion for Bias Estimation in a Multi-Radar System (다중 레이더 환경에서의 바이어스 오차 추정의 가관측성에 대한 연구와 정보 융합)

  • Won, Gun-Hee; Song, Taek-Lyul; Kim, Da-Sol; Seo, Il-Hwan; Hwang, Gyu-Hwan
    • Journal of Institute of Control, Robotics and Systems / v.17 no.8 / pp.783-789 / 2011
  • Improving target tracking performance using multi-sensor data fusion is challenging work, and biases in the measurements should be removed before the various data fusion techniques are applied. In this paper, a bias-removal algorithm using measurement data from multi-radar tracking systems is proposed and evaluated by computer simulation. To predict bias estimation performance under various geometric relations between the radar systems and the target, a system observability index is proposed and tested via computer simulation. It is also shown that target tracking that utilizes multi-sensor data fusion with bias-removed measurements achieves better performance.
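A very simple instance of the bias-removal idea (not the paper's algorithm): if two radars report Cartesian positions of the same targets, a constant relative registration bias can be estimated by least squares and subtracted before fusion.

```python
# Sketch: estimate a constant relative bias between two radars from paired reports.
import numpy as np

def estimate_relative_bias(radar_a_positions, radar_b_positions):
    """Both inputs are N x 2 arrays of (x, y) reports for the same target instants."""
    a = np.asarray(radar_a_positions, dtype=float)
    b = np.asarray(radar_b_positions, dtype=float)
    return np.mean(b - a, axis=0)       # least-squares constant offset of B relative to A

# Example: radar B carries roughly a (+30 m, -10 m) registration bias
a = np.array([[1000.0, 2000.0], [1100.0, 2050.0], [1200.0, 2100.0]])
b = a + np.array([30.0, -10.0]) + np.random.default_rng(1).normal(0, 2, a.shape)
bias = estimate_relative_bias(a, b)
b_corrected = b - bias                  # bias-removed measurements ready for data fusion
```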

Estimation of Train Position Using Sensor Fusion Technique (센서융합에 의한 열차위치 추정방법)

  • Yoon Hee-Sang; Park Tae-Hyoung; Yoon Yong-Gi; Hwang Jong-Gyu; Lee Jae-Ho
    • Journal of the Korean Society for Railway / v.8 no.2 / pp.155-160 / 2005
  • We propose a train position estimation method for an automatic train control system. An accurate train position must be continuously fed back to the control system for safe and efficient train operation on the railway. In this paper, we propose a sensor fusion method that integrates a tachometer, a transponder, and a Doppler sensor to estimate the train position. The external sensors (transponder, Doppler sensor) are used to compensate for the error of the internal sensor (tachometer). A Kalman filter is also applied to reduce the measurement error of the sensors. Simulation results are presented to verify the usefulness of the proposed method.
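A minimal sketch of tachometer/transponder/Doppler fusion along the lines of this entry: a constant-velocity Kalman filter is driven by the tachometer, corrected by the Doppler speed every step, and corrected by an absolute transponder fix whenever one is available. Noise values and interfaces are assumptions, not the paper's design.

```python
# Sketch of train position estimation fusing tachometer, Doppler, and transponder data.
import numpy as np

def kf_update(x, P, z, H, R):
    """Standard Kalman measurement update."""
    y = z - H @ x
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    return x + K @ y, (np.eye(len(x)) - K @ H) @ P

def train_position_filter(tacho_speeds, doppler_speeds, transponder_fixes, dt=0.1):
    x = np.zeros(2)                            # [position (m), speed (m/s)]
    P = np.eye(2) * 10.0
    F = np.array([[1.0, dt], [0.0, 1.0]])
    Q = np.diag([0.01, 0.1])                   # process noise (assumed)
    H_speed = np.array([[0.0, 1.0]])           # Doppler observes speed
    H_pos = np.array([[1.0, 0.0]])             # transponder observes position
    track = []
    for v_tacho, v_dopp, fix in zip(tacho_speeds, doppler_speeds, transponder_fixes):
        # Predict: dead reckoning from the internal sensor (tachometer)
        x = np.array([x[0] + v_tacho * dt, v_tacho])
        P = F @ P @ F.T + Q
        # Correct with the Doppler speed measurement every step
        x, P = kf_update(x, P, np.array([v_dopp]), H_speed, np.array([[0.25]]))
        # Correct with an absolute transponder fix when one is available
        if fix is not None:
            x, P = kf_update(x, P, np.array([fix]), H_pos, np.array([[1.0]]))
        track.append(x.copy())
    return np.array(track)
```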