• Title/Summary/Keyword: vehicle detection system


An Overloaded Vehicle Identifying System based on Object Detection Model (객체 인식 모델을 활용한 적재 불량 화물차 탐지 시스템)

  • Jung, Woojin;Park, Jinuk;Park, Yongju
    • Journal of the Korea Institute of Information and Communication Engineering / v.26 no.12 / pp.1794-1799 / 2022
  • Recently, the increasing number of overloaded vehicles on the road poses a risk to traffic safety, such as falling objects, road damage, and chain collisions caused by abnormal weight distribution, and can cause great damage once an accident occurs. We therefore propose an object detection-based AI model to identify overloaded vehicles that cause such social problems. In addition, we present a simple yet effective method for constructing an object detection model from large-scale vehicle images. In particular, we utilize the large-scale vehicle image sets provided by the open AI-Hub, which include overloaded vehicles. We inspected specific characteristics such as vehicle sizes and image source types, and pre-processed the images to train a deep learning-based object detection model. We also propose an integrated system for tracking the detected vehicles. Finally, we demonstrate that detection performance for overloaded vehicles improved by about 23% compared to a model trained on the raw data.
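
The abstract does not name a specific detector architecture, so the following is only a minimal sketch of how a pre-trained detector could be fine-tuned on such vehicle images, assuming torchvision's Faster R-CNN; the class labels and dummy training sample are illustrative, not from the paper.

```python
import torch
import torchvision
from torchvision.models.detection.faster_rcnn import FastRCNNPredictor

NUM_CLASSES = 3  # background, normal vehicle, overloaded vehicle (illustrative labels)

# Start from a COCO-pretrained Faster R-CNN and swap in a new prediction head
model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
in_features = model.roi_heads.box_predictor.cls_score.in_features
model.roi_heads.box_predictor = FastRCNNPredictor(in_features, NUM_CLASSES)

# One illustrative training step on a dummy image/target pair
model.train()
images = [torch.rand(3, 480, 640)]
targets = [{"boxes": torch.tensor([[50.0, 60.0, 300.0, 400.0]]),
            "labels": torch.tensor([2])}]
optimizer = torch.optim.SGD(model.parameters(), lr=0.005, momentum=0.9)

loss_dict = model(images, targets)  # detection models return a loss dict in train mode
loss = sum(loss_dict.values())
optimizer.zero_grad()
loss.backward()
optimizer.step()
```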

Vehicle Manufacturer Recognition using Deep Learning and Perspective Transformation

  • Ansari, Israfil;Shim, Jaechang
    • Journal of Multimedia Information System / v.6 no.4 / pp.235-238 / 2019
  • In the real world, object detection is an active research topic for understanding different objects in images. Different models have been presented in the past with significant results. In this paper we present vehicle logo detection using existing object detection models, namely You Only Look Once (YOLO) and Faster Region-based CNN (Faster R-CNN). Both the front and rear views of the vehicles were used for training and testing the proposed method. Along with deep learning, an image pre-processing algorithm called perspective transformation is applied to all the test images; using it, top-view images are transformed into front-view images. This pre-processing yields a higher detection rate than raw images. Furthermore, the YOLO model gives better results than the Faster R-CNN model.
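
As a rough illustration of the perspective-transformation pre-processing step, the sketch below warps a top-view image toward a front view with OpenCV; the corner coordinates are hypothetical and would in practice come from the detected vehicle region.

```python
import cv2
import numpy as np

img = cv2.imread("top_view_vehicle.jpg")  # hypothetical input image

# Hypothetical corners of the vehicle front in the source image (clockwise)
src = np.float32([[120, 80], [520, 90], [560, 400], [90, 390]])
# Target rectangle approximating a straight-on front view
dst = np.float32([[0, 0], [480, 0], [480, 360], [0, 360]])

M = cv2.getPerspectiveTransform(src, dst)
front_view = cv2.warpPerspective(img, M, (480, 360))
cv2.imwrite("front_view_vehicle.jpg", front_view)
```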

A Study on the Measurement of Intruding Vehicles Enforcement System of Traffic Jam (끼어들기위반 단속장비의 교통정체 측정에 관한 연구)

  • Yoo, Sung-Jun;Kim, Jun-Ha;Hong, Soon-Jin;Kang, Soo-Chul
    • The Journal of The Korea Institute of Intelligent Transport Systems / v.12 no.6 / pp.68-77 / 2013
  • This study presents experimental results on a congestion detection method for an intruding-vehicle enforcement system. The congestion detection method is developed to determine the optimal operating criteria of the enforcement system by detecting traffic congestion. In the ITS sector, traffic management systems have generally used sectional travel speed for congestion detection. However, image sensors have a high congestion-detection error rate because of speed errors. This study therefore proposes comprehensive congestion detection criteria based on both speed and occupancy rate, derived from field studies. The field results show that the proposed image-sensor-based intruding-vehicle enforcement system can accurately detect traffic congestion using a sectional speed of 20 km/h and an occupancy rate of 60% as the congestion detection criteria.
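
The abstract gives the two thresholds but not how they are combined; the sketch below assumes both must hold at once, which is only one plausible reading.

```python
def is_congested(sectional_speed_kmh: float, occupancy_pct: float,
                 speed_threshold: float = 20.0, occupancy_threshold: float = 60.0) -> bool:
    """Flag congestion when the sectional speed is at or below 20 km/h
    AND the occupancy rate is at or above 60% (assumed combination rule)."""
    return (sectional_speed_kmh <= speed_threshold
            and occupancy_pct >= occupancy_threshold)

# Illustrative calls
print(is_congested(15.0, 72.0))  # True: slow and heavily occupied
print(is_congested(35.0, 72.0))  # False: occupancy high but traffic still moving
```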

Vehicle Tracking System using HSV Color Space at nighttime (HSV 색 공간을 이용한 야간 차량 검출시스템)

  • Park, Ho-Sik
    • The Journal of Korea Institute of Information, Electronics, and Communication Technology / v.8 no.4 / pp.270-274 / 2015
  • We suggest that the HSV color space can be used in a nighttime vehicle detection system. When a vehicle is under surveillance, it is essential to extract its license plate. To do so, the license plate may be enlarged to a certain size after the target vehicle is photographed from a distance with a Pan-Tilt-Zoom camera. Either the Mean-Shift or the Optical Flow algorithm is generally used for vehicle detection and tracking, but both tend to have difficulty detecting and tracking a vehicle at night. By exploiting the fact that a vehicle's headlights and taillights stand out when the input image is converted into the HSV color space, we improve those algorithms for nighttime detection and tracking. In this paper, we show that at night the suggested method detects vehicles with 93.9% accuracy from the front and 97.7% from the rear.
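
A minimal sketch of the HSV idea, assuming OpenCV and purely illustrative threshold values (the paper does not publish its thresholds): bright, low-saturation blobs approximate headlights and bright red blobs approximate taillights.

```python
import cv2

frame = cv2.imread("night_frame.jpg")  # hypothetical nighttime frame
hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)

# Headlights: very bright, weakly saturated regions (illustrative bounds)
headlights = cv2.inRange(hsv, (0, 0, 200), (179, 60, 255))

# Taillights: bright red regions; red hue wraps around 0/179 in OpenCV
tail_low = cv2.inRange(hsv, (0, 80, 120), (10, 255, 255))
tail_high = cv2.inRange(hsv, (170, 80, 120), (179, 255, 255))
taillights = cv2.bitwise_or(tail_low, tail_high)

# Candidate vehicle lights become contours that a tracker can follow
contours, _ = cv2.findContours(cv2.bitwise_or(headlights, taillights),
                               cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
```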

A Study on the Assessment of Blind Spot Detection for Road Alignment (도로 선형에 따른 사각지역 감시장치 평가에 관한 연구)

  • Lee, Hongguk;Park, Hwanseo;Chang, Kyungjin;Yoo, Songmin
    • Journal of Auto-vehicle Safety Association / v.4 no.1 / pp.27-32 / 2012
  • Recently, in order to reduce traffic-accident fatalities, an increasing number of studies have been conducted on vehicle safety enhancement devices, but very few studies address test procedures and requirements for such systems. Since Blind Spot Detection (BSD) is one of the most important safety features installed on new vehicles, its performance test method has to be evaluated. Factors independent of the device type, including collision position, vehicle speed, and closing speed, are used to calculate the test distance from the subject vehicle. The effect of roadway geometry, expressed as the radius of curvature, is introduced to show how a following vehicle may be misjudged as an adjacent one. The results of this study can be used to enhance the BSD performance test procedure.
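
The abstract does not give the geometric criterion it uses, so the following is only an illustrative sketch, under simple chord geometry, of why curvature matters: on a curve, a vehicle that is directly behind acquires a lateral offset in the subject vehicle's straight-ahead frame and can appear to occupy the adjacent lane.

```python
import math

def lateral_offset_on_curve(gap_m: float, radius_m: float) -> float:
    """Lateral offset of a vehicle that is gap_m behind along a curve of
    radius radius_m, measured from the subject vehicle's heading axis."""
    theta = gap_m / radius_m              # central angle subtended by the gap
    return radius_m * (1.0 - math.cos(theta))

# Illustrative check against roughly half a lane width (~1.75 m, assumed value)
offset = lateral_offset_on_curve(gap_m=30.0, radius_m=250.0)
print(f"offset = {offset:.2f} m, possible misjudgement: {offset > 1.75}")
```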

Traveling Direction Estimation of Autonomous Vehicle using Vision System (비젼 시스템을 이용한 자율 주행 차량의 실시간 주행 방향 추정)

  • 강준필;정길도
    • Proceedings of the IEEK Conference / 2001.06e / pp.127-130 / 2001
  • In this paper, we describe a method of estimating the traveling direction of an autonomous vehicle. For the development of an autonomous vehicle, it is important to detect the road lane and to estimate the traveling direction. The objective of the proposed algorithm is to perform lane detection in real time on a standalone vision system. We then calculate an efficient traveling direction from which the steering angle for the lateral control system is obtained, so that the autonomous vehicle drives toward the center of the lane by adjusting the current steering angle according to the estimated traveling direction.
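
The paper's exact direction-estimation formula is not given in the abstract; the sketch below only illustrates the general idea of turning a lane-center offset in the image into a steering correction, with hypothetical pixel values.

```python
import math

def steering_correction_deg(lane_center_x: float, image_center_x: float,
                            lookahead_px: float) -> float:
    """Steering correction from the lane-center offset at a chosen look-ahead row:
    positive when the lane center lies to the right of the camera axis."""
    offset = lane_center_x - image_center_x
    return math.degrees(math.atan2(offset, lookahead_px))

# Illustrative call: lane center slightly right of a 640-pixel-wide image's axis
print(steering_correction_deg(lane_center_x=352, image_center_x=320, lookahead_px=240))
```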


A Study on Detection of Lane and Situation of Obstacle for AGV using Vision System (비전 시스템을 이용한 AGV의 차선인식 및 장애물 위치 검출에 관한 연구)

  • 이진우;이영진;이권순
    • Proceedings of the Korean Institute of Navigation and Port Research Conference / 2000.11a / pp.207-217 / 2000
  • In this paper, we describe an image processing algorithm that recognizes the road lane and the relationship between the AGV and other vehicles. We experimented with AGV driving tests using a color CCD camera mounted on top of the vehicle to acquire the digital image signal. The paper is composed of two parts. The first is an image preprocessing part that measures the condition of the lane and the vehicle; it extracts line information using an RGB-ratio cutting algorithm, edge detection, and the Hough transform. The second obtains the situation of other vehicles using image processing and a viewport. First, the 2-D image information from the vision sensor is converted into 3-D information using the angle and position of the CCD camera. Through these processes, once the vehicle knows the driving conditions, namely the heading angle, the distance error, and the real positions of other vehicles, the reference steering angle can be calculated.
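
As a rough sketch of the edge-detection and Hough-transform stage described above (the RGB-ratio cutting step is not reproduced here), assuming OpenCV and a hypothetical camera frame:

```python
import cv2
import numpy as np

frame = cv2.imread("agv_frame.jpg")                  # hypothetical camera frame
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
edges = cv2.Canny(gray, 50, 150)                     # edge detection

# Probabilistic Hough transform returns candidate lane-line segments
lines = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180,
                        threshold=40, minLineLength=40, maxLineGap=10)
if lines is not None:
    for x1, y1, x2, y2 in lines[:, 0]:
        cv2.line(frame, (x1, y1), (x2, y2), (0, 255, 0), 2)
cv2.imwrite("agv_lanes.jpg", frame)
```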


Fin failure diagnosis for non-linear supersonic air vehicle based on inertial sensors

  • Ashrafifar, Asghar;Jegarkandi, Mohsen Fathi
    • Advances in aircraft and spacecraft science / v.7 no.1 / pp.1-17 / 2020
  • In this paper, a new model-based Fault Detection and Diagnosis (FDD) method for an agile supersonic flight vehicle is presented. A nonlinear model, controlled by a classical closed-loop controller and proportional navigation guidance in an interception scenario, describes the behavior of the vehicle. The proposed FDD method employs Inertial Navigation System (INS) data and the nonlinear dynamic model of the vehicle to report fin damage to the controller before it leads to degraded performance or mission failure. Broken, burnt, unactuated, or unopened control surfaces cause a drastic change in the aerodynamic coefficients and consequently in the dynamic model. Therefore, in addition to the changes in the control forces and moments, the system dynamics change as well, which makes failure detection more difficult. To this end, an equivalent aerodynamic model is proposed to express the dynamics of the vehicle, and the health of each fin is monitored through a parameter estimated by an adaptive robust filter. The proposed method detects and isolates fin damage within a few seconds with good accuracy.
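
The adaptive robust filter itself is not detailed in the abstract; as a stand-in, the sketch below shows a scalar recursive-least-squares update with a forgetting factor that could track a per-fin effectiveness parameter (close to 1 when healthy, drifting toward 0 as the fin fails). All symbols are illustrative.

```python
def rls_update(theta: float, p: float, phi: float, y: float,
               lam: float = 0.98) -> tuple[float, float]:
    """One recursive-least-squares step with forgetting factor lam.
    theta: estimated fin effectiveness, phi: commanded fin moment (regressor),
    y: moment inferred from INS measurements, p: estimate covariance."""
    k = p * phi / (lam + phi * p * phi)     # gain
    theta = theta + k * (y - phi * theta)   # correct the estimate
    p = (p - k * phi * p) / lam             # update covariance
    return theta, p

# Illustrative run: the measured moment is only 40% of the commanded one
theta, p = 1.0, 100.0
for cmd in [1.0, 0.8, 1.2, 0.9]:
    theta, p = rls_update(theta, p, phi=cmd, y=0.4 * cmd)
print(theta)  # drifts toward 0.4, signalling a damaged fin
```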

Development of an Autonomous Navigation System for Unmanned Ground Vehicle

  • Kim, Yoon-Gu;Lee, Ki-Dong
    • IEMEK Journal of Embedded Systems and Applications / v.3 no.4 / pp.244-250 / 2008
  • This paper describes the design and implementation of an unmanned ground vehicle (UGV) and evaluates how well autonomous navigation and remote control of the UGV can be performed through optimized arbitration of several sensor data streams acquired from vision, obstacle detection, positioning systems, and so on. Autonomous navigation requires lane detection and tracking, global positioning, and obstacle avoidance. In addition, two experimental environments are established for remote control: one uses a commercial racing-wheel module, and the other uses a haptic device suited to a virtual-reality user application. Experimental results show that autonomous navigation and remote control of the designed UGV can be achieved more effectively and accurately by proper arbitration of the sensor data and the navigation plan.
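
The arbitration scheme itself is not specified in the abstract; the following is only a hypothetical sketch of one simple way to blend steering suggestions from several sensor-driven modules by confidence weighting.

```python
def arbitrate_steering(commands: list[tuple[float, float]]) -> float:
    """Blend (steering_deg, confidence) pairs from modules such as a lane
    tracker, an obstacle avoider, and a GPS waypoint follower (all assumed)."""
    total = sum(conf for _, conf in commands)
    if total == 0.0:
        return 0.0
    return sum(angle * conf for angle, conf in commands) / total

# Illustrative call: obstacle avoidance currently has the highest confidence
print(arbitrate_steering([(2.0, 0.3), (-8.0, 0.6), (1.0, 0.1)]))
```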


Real-Time Vehicle Detector with Dynamic Segmentation and Rule-based Tracking Reasoning for Complex Traffic Conditions

  • Wu, Bing-Fei;Juang, Jhy-Hong
    • KSII Transactions on Internet and Information Systems (TIIS) / v.5 no.12 / pp.2355-2373 / 2011
  • Vision-based vehicle detector systems are becoming increasingly important in ITS applications. Real-time operation, robustness, precision, accurate estimation of traffic parameters, and ease of setup are important features to consider in developing such systems. Moreover, accurate vehicle detection is difficult in varied, complex traffic environments, which include changes in weather as well as challenging traffic conditions such as shadow effects and jams. To meet real-time requirements, the proposed system first applies a color background model to extract moving objects, which are then tracked by considering their relative distances and directions. To achieve robustness and precision, the color background is regularly updated by the proposed algorithm to overcome luminance variations. This paper also proposes a feedback-compensation scheme to resolve background convergence errors, which occur when vehicles temporarily park on the roadside while the background image is converging. Next, vehicle occlusion is resolved using the proposed prior-split approach and rule-based tracking reasoning. The approach can automatically detect straight lanes, and the resulting trajectories are used to derive traffic parameters. Finally, to facilitate easy setup, a means of automating the system parameter settings is proposed. Experimental results show that the system can operate well under various complex traffic conditions in real time.
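
The paper's own background-update and feedback-compensation rules are not given in the abstract; the sketch below only illustrates the general pattern of a slowly updated color background with foreground extraction, using OpenCV's running average.

```python
import cv2
import numpy as np

cap = cv2.VideoCapture("traffic.mp4")        # hypothetical traffic video
ok, frame = cap.read()
background = frame.astype(np.float32)        # initial color background

while ok:
    # Blend the current frame in slowly so gradual luminance changes are
    # absorbed without erasing moving vehicles (alpha chosen for illustration).
    cv2.accumulateWeighted(frame, background, alpha=0.01)
    diff = cv2.absdiff(frame, cv2.convertScaleAbs(background))
    gray = cv2.cvtColor(diff, cv2.COLOR_BGR2GRAY)
    _, foreground = cv2.threshold(gray, 30, 255, cv2.THRESH_BINARY)
    ok, frame = cap.read()
cap.release()
```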