• Title/Abstract/Keyword: Vision sensor


Autonomous Cooperative Localization of Mobile Sensors

  • 송하윤
    • 정보처리학회논문지A / Vol. 17A, No. 2 / pp. 53-62 / 2010
  • Mobile sensor vehicles (MSVs), the nodes of a mobile sensor network, determine their own positions by exchanging and integrating the information they acquire about a given area, a process called localization. Each MSV estimates its position using its various onboard sensors. This study presents ways to improve the localization precision of an MSV using dead reckoning, computer vision techniques, and RSSI (Received Signal Strength Indication), and examines whether fusing the strengths of each method yields more precise localization.
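
The fusion step is not spelled out in the abstract, so the following is only a rough sketch of the idea under stated assumptions: a pose is propagated by dead reckoning, an RSSI reading is inverted through a log-distance path-loss model, and the position is nudged toward the resulting range to a neighbor node. The constants (tx_power, path_loss_exp) and the fusion weight are illustrative, not from the paper.

```python
import math

def dead_reckon(x, y, heading, speed, dt):
    """Propagate pose from wheel odometry over one time step."""
    return x + speed * dt * math.cos(heading), y + speed * dt * math.sin(heading)

def rssi_to_distance(rssi_dbm, tx_power=-40.0, path_loss_exp=2.5):
    """Invert an (assumed) log-distance path-loss model to a range in meters."""
    return 10 ** ((tx_power - rssi_dbm) / (10 * path_loss_exp))

def fuse(x, y, nx, ny, measured_range, weight=0.3):
    """Pull the dead-reckoned position toward the RSSI range circle
    around a neighbor at (nx, ny); weight trades odometry vs. RSSI."""
    dx, dy = x - nx, y - ny
    d = math.hypot(dx, dy) or 1e-9
    err = measured_range - d          # positive: RSSI says we are farther away
    return x + weight * err * dx / d, y + weight * err * dy / d

x, y = dead_reckon(0.0, 0.0, math.radians(30), speed=1.0, dt=1.0)
x, y = fuse(x, y, nx=5.0, ny=0.0, measured_range=rssi_to_distance(-62.0))
print(round(x, 3), round(y, 3))
```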

Trends in Biomimetic Vision Sensor Technology

  • 이태재;박윤재;구교인;서종모;조동일
    • 제어로봇시스템학회논문지 / Vol. 21, No. 12 / pp. 1178-1184 / 2015
  • In conventional robotics, charge-coupled device (CCD) and complementary metal-oxide-semiconductor (CMOS) cameras have been utilized for acquiring vision information. These devices have problems, such as narrow optic angles and inefficiencies in visual information processing. Recently, biomimetic vision sensors for robotic applications have been receiving much attention. These sensors are more efficient than conventional vision sensors in terms of the optic angle, power consumption, dynamic range, and redundancy suppression. This paper presents recent research trends on biomimetic vision sensors and discusses future directions.

The Development of the Narrow Gap Multi-Pass Welding System Using Laser Vision System

  • Park, Hee-Chang;Park, Young-Jo;Song, Keun-Ho;Lee, Jae-Woong;Jung, Yung-Hwa;Luc Didier
    • 대한용접접합학회:학술대회논문집 / 2002 Proceedings of the International Welding/Joining Conference-Korea / pp. 706-713 / 2002
  • In the multi-pass welding of pressure vessels or ships, a mechanical touch sensor system is generally used together with a manipulator to measure the gap and depth of the narrow gap to perform seam tracking. Unfortunately, such mechanical touch sensors may commit measuring errors caused by the deterioration of the measuring device. An automation system for narrow gap multi-pass welding has been developed, using a laser vision system that can track the seam line of the narrow gap and control the welding power. The joint profile of the narrow gap, with 250 mm depth and 28 mm width, is captured by a laser vision camera. The image is then processed to define the tracking positions of the torch during welding. The laser vision system then corrects the lateral and vertical position of the torch in real time. Adaptive control of welding conditions, such as welding currents and welding speeds, can also be performed by the laser vision system, which conventional mechanical touch systems cannot do. The developed automation system will be adopted to reduce the idle time of welders, which occurs frequently in conventional long welding processes, and to improve the reliability of the weld quality as well.


The Development of the Narrow Gap Multi-Pass Welding System Using Laser Vision System

  • Park, H.C.;Park, Y.J.;Song, K.H.;Lee, J.W.;Jung, Y.H.;Didier, L.
    • International Journal of Korean Welding Society / Vol. 2, No. 1 / pp. 45-51 / 2002
  • In the multi-pass welding of pressure vessels or ships, a mechanical touch sensor system is generally used together with a manipulator to measure the gap and depth of the narrow gap to perform seam tracking. Unfortunately, such mechanical touch sensors may commit measuring errors caused by the deterioration of the measuring device. An automation system for narrow gap multi-pass welding has been developed, using a laser vision system that can track the seam line of the narrow gap and control the welding power. The joint profile of the narrow gap, with 250 mm depth and 28 mm width, is captured by a laser vision camera. The image is then processed to define the tracking positions of the torch during welding. The laser vision system then corrects the lateral and vertical position of the torch in real time. Adaptive control of welding conditions, such as welding currents and welding speeds, can also be performed by the laser vision system, which conventional mechanical touch systems cannot do. The developed automation system will be adopted to reduce the idle time of welders, which occurs frequently in conventional long welding processes, and to improve the reliability of the weld quality as well.
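
The two entries above (conference and journal versions of the same work) describe real-time lateral and vertical torch correction but do not specify the control law. Below is a minimal sketch, assuming a simple proportional correction per control cycle; the gains, sign convention, and interfaces are hypothetical placeholders.

```python
# Proportional seam-tracking correction from a laser-vision measurement.
KP_LATERAL = 0.6    # fraction of the lateral error corrected per cycle (assumed)
KP_VERTICAL = 0.4   # fraction of the vertical error corrected per cycle (assumed)

def correction(lateral_err_mm: float, vertical_err_mm: float) -> tuple[float, float]:
    """Map the seam-tracking error (torch vs. detected joint center)
    to lateral/vertical torch moves for the next control cycle."""
    return -KP_LATERAL * lateral_err_mm, -KP_VERTICAL * vertical_err_mm

# One cycle: the camera reports the torch 1.2 mm right of and
# 0.5 mm above the joint center extracted from the laser stripe image.
dy, dz = correction(1.2, 0.5)
print(f"move torch {dy:+.2f} mm lateral, {dz:+.2f} mm vertical")
```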


Estimating a Range of Lane Departure Allowance based on Road Alignment in an Autonomous Driving Vehicle

  • 김영민;김형수
    • 한국ITS학회 논문지 / Vol. 15, No. 4 / pp. 81-90 / 2016
  • An autonomous vehicle must respond on its own to a changing road environment, and therefore must achieve road-environment recognition performance at the level of a human driver. Among the sensors of an autonomous vehicle, the vision sensor performs lane detection for steering control, such as determining the direction of travel and preventing lane departure. The currently available lane-detection performance criteria for vision sensors were established from the 'driver assistance' perspective of ADAS (Advanced Driver Assistance Systems), and are expected to differ from the performance requirements for the 'independent perception' an autonomous vehicle needs. This study assumes a situation in which lane detection fails for a sustained period during autonomous driving, so that a vehicle entering a curved section from a straight section departs from its lane due to a steering failure. Lane departure was modeled from the vehicle trajectory, and vision sensor performance levels for autonomous vehicles were derived as a function of the allowable lane departure. The analysis shows that, for a passenger car, a continuous lane-detection malfunction of one second or longer can lead to a hazardous lane departure, and that vision sensor evaluation for autonomous vehicles should consider lane-departure conditions more severe than those in the current ADAS vision sensor performance tests.
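
A standard geometric way to reason about the scenario in this abstract (not necessarily the paper's exact model): if lane detection fails for t seconds just as the road begins to curve with radius R, a vehicle holding its straight course at speed v accumulates a lateral offset of R − √(R² − (vt)²) ≈ (vt)²/2R. A small sketch with illustrative numbers:

```python
import math

def lateral_deviation(v_kmh: float, t_s: float, radius_m: float) -> float:
    """Offset between a straight path and a circular lane after t seconds
    of driving straight (valid while the distance traveled is < radius)."""
    s = v_kmh / 3.6 * t_s                              # distance traveled, m
    return radius_m - math.sqrt(radius_m**2 - s**2)    # chord-offset formula

# 100 km/h, 1 s of missed lane detection, 460 m curve radius (illustrative):
print(f"{lateral_deviation(100, 1.0, 460):.2f} m")     # ~0.84 m lateral offset
```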

Development of Autonomous Combine Using DGPS and Machine Vision

  • 조성인;박영식;최창현;황헌;김명락
    • Journal of Biosystems Engineering / Vol. 26, No. 1 / pp. 29-38 / 2001
  • A navigation system was developed for autonomous guidance of a combine. It consisted of a DGPS, a machine vision system, a gyro sensor, and an ultrasonic sensor. For autonomous operation of the combine, target points were determined first. Second, the heading angle and offset were calculated by comparing current positions obtained from the DGPS with the target points. Third, a fuzzy controller decided the steering angle by fuzzy inference from three inputs: the heading angle, the offset, and the distance to the bank around the rice field. Finally, the hydraulic system was actuated to steer the combine. When the DGPS misbehaved, the machine vision system found the desired travel path. In this way, the combine traveled straight paths to the target point and then turned to the next target point. The gyro sensor was used to check the turning angle. The autonomous combine traveled within 31.11 cm deviation (RMS) on the straight paths and harvested up to 96% of the whole rice field. The field experiments proved the feasibility of autonomous harvesting. Improving the DGPS accuracy by compensating for variations in the combine's attitude due to the unevenness of the rice field should be studied further.
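
The abstract names a three-input fuzzy steering controller without giving its rule base. The sketch below shows the general shape of such a controller: triangular fuzzification of heading and offset, a min-AND rule base, weighted-average defuzzification, and a bank-distance guard. All membership ranges, rules, and output angles are illustrative assumptions, not the paper's values.

```python
def tri(x, a, b, c):
    """Triangular membership function on (a, b, c)."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def fuzzy_steer(heading_deg, offset_m, bank_dist_m):
    # Fuzzify the two tracking inputs into left / center / right grades.
    h = {"L": tri(heading_deg, -30, -15, 0), "C": tri(heading_deg, -10, 0, 10),
         "R": tri(heading_deg, 0, 15, 30)}
    o = {"L": tri(offset_m, -2, -1, 0), "C": tri(offset_m, -0.5, 0, 0.5),
         "R": tri(offset_m, 0, 1, 2)}
    # Rule base: consequent steering angles (deg) for each input pair.
    out = {("L", "L"): 20, ("L", "C"): 15, ("L", "R"): 0,
           ("C", "L"): 10, ("C", "C"): 0, ("C", "R"): -10,
           ("R", "L"): 0, ("R", "C"): -15, ("R", "R"): -20}
    num = den = 0.0
    for (hk, ok), angle in out.items():
        w = min(h[hk], o[ok])          # rule firing strength (AND = min)
        num += w * angle
        den += w
    steer = num / den if den else 0.0
    # Near the bank, limit the steering magnitude to avoid running into it.
    if bank_dist_m < 1.0:
        steer = max(-5.0, min(5.0, steer))
    return steer

print(round(fuzzy_steer(heading_deg=-8.0, offset_m=0.3, bank_dist_m=3.0), 2))
```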


Refinements of Multi-sensor based 3D Reconstruction using a Multi-sensor Fusion Disparity Map

  • 김시종;안광호;성창훈;정명진
    • 로봇학회논문지 / Vol. 4, No. 4 / pp. 298-304 / 2009
  • This paper describes an algorithm that improves a 3D reconstruction result using a multi-sensor fusion disparity map. LRF (Laser Range Finder) 3D points can be projected onto image pixel coordinates using the extrinsic calibration matrices of the camera-LRF pair (Φ, Δ) and a camera calibration matrix (K). The LRF disparity map can be generated by interpolating the projected LRF points. In the stereo reconstruction, invalid points caused by repeated patterns and textureless regions can be compensated using the LRF disparity map. The disparity map resulting from this compensation process is the multi-sensor fusion disparity map. The multi-sensor 3D reconstruction based on stereo vision and the LRF can then be refined using this fused map. The refinement algorithm is specified in four subsections dealing with virtual LRF stereo image generation, LRF disparity map generation, multi-sensor fusion disparity map generation, and the 3D reconstruction process. It has been tested with synchronized stereo image pairs and LRF 3D scan data.
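
The projection step in this abstract follows the usual pinhole model: an LRF point is moved into the camera frame by the extrinsics, projected through K, and its depth is converted to stereo disparity as d = f·B/Z. A minimal sketch, with illustrative calibration values standing in for the paper's Φ, Δ, and K:

```python
import numpy as np

K = np.array([[700.0, 0.0, 320.0],
              [0.0, 700.0, 240.0],
              [0.0, 0.0, 1.0]])          # intrinsics: f = 700 px (illustrative)
R = np.eye(3)                             # LRF-to-camera rotation (illustrative)
t = np.array([0.0, -0.1, 0.0])            # LRF-to-camera translation, m
BASELINE = 0.12                           # stereo baseline, m (illustrative)

def lrf_to_disparity(points_lrf):
    """Project Nx3 LRF points into pixel coords and per-point disparity."""
    cam = points_lrf @ R.T + t            # points in the camera frame
    z = cam[:, 2]
    uv = (cam @ K.T)[:, :2] / z[:, None]  # perspective projection + divide
    disparity = K[0, 0] * BASELINE / z    # d = f * B / Z
    return uv, disparity

pts = np.array([[0.5, 0.0, 4.0], [-0.3, 0.2, 2.5]])   # sample LRF returns
uv, d = lrf_to_disparity(pts)
print(np.round(uv, 1), np.round(d, 2))
```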


Development of Data Logging Platform of Multiple Commercial Radars for Sensor Fusion With AVM Cameras

  • 진영석;전형철;신영남;현유진
    • 대한임베디드공학회논문지 / Vol. 13, No. 4 / pp. 169-178 / 2018
  • Currently, various sensors are used for advanced driver assistance systems. To overcome the limitations of individual sensors, sensor fusion has recently attracted attention in the field of intelligent vehicles, and vision- and radar-based sensor fusion has become a popular concept. The typical method of sensor fusion involves a vision sensor that recognizes targets based on ROIs (Regions Of Interest) generated by radar sensors. In particular, because the wide-angle lenses of AVM (Around View Monitor) cameras limit detection performance at near distances and around the edges of the field of view, exact ROI extraction from the radar sensor is essential for high-performance sensor fusion of AVM cameras and radar sensors. To address this problem, we propose a sensor fusion scheme based on commercial radar modules from the vendor Delphi. First, we configured a multiple-radar data logging system together with AVM cameras. We also designed radar post-processing algorithms to extract exact ROIs. Finally, using the developed hardware and software platforms, we verified the post-processing algorithm in indoor and outdoor environments.
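
A minimal sketch of the ROI-generation idea described above, assuming a simple pinhole approximation for the camera (a real AVM fisheye model and the Delphi radar interface are more involved); the focal length, principal point, and target-size constants are illustrative assumptions:

```python
import math

FX, CX, CY = 400.0, 640.0, 360.0   # assumed pinhole approximation of the camera
TARGET_WIDTH_M = 1.8               # assumed physical width of a vehicle target

def radar_to_roi(range_m: float, azimuth_deg: float):
    """Map a radar target (range, azimuth) to an image ROI
    (u_min, v_min, u_max, v_max) in pixels for the vision algorithm."""
    az = math.radians(azimuth_deg)
    x, z = range_m * math.sin(az), range_m * math.cos(az)   # camera frame
    u = CX + FX * x / z                    # horizontal pixel of the target
    half_w = 0.5 * FX * TARGET_WIDTH_M / z # apparent half-width in pixels
    half_h = 0.75 * half_w                 # aspect-ratio assumption
    return (u - half_w, CY - half_h, u + half_w, CY + half_h)

print([round(v, 1) for v in radar_to_roi(10.0, 15.0)])
```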

Obstacle Avoidance Method for Multi-Agent Robots Using IR Sensor and Image Information

  • 전병승;이도영;최인환;모영학;박정민;임묘택
    • 제어로봇시스템학회논문지 / Vol. 18, No. 12 / pp. 1122-1131 / 2012
  • This paper presents an obstacle avoidance method for scout or industrial robots in unknown environments using IR sensors and a vision system. In the proposed method, the robots share information on where obstacles are located in real time, so each robot can choose the best path for obstacle avoidance. Using the IR sensors and the vision system, multiple robots efficiently evade obstacles through the proposed cooperation method. No landmarks are placed on the walls or floor of the experimental environment, and the obstacles have no specific color or shape. To obtain obstacle information, the vision system extracts the obstacle coordinates using an image labeling method, while the IR sensors provide the obstacle range and the locomotion direction used to decide the optimal avoidance path. The experiment was conducted in a 7 m × 7 m indoor environment with two-wheeled mobile robots. It is shown that the robots move efficiently along optimal paths, cooperating with each other in a space where obstacles are located.
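
The image labeling step in this abstract can be illustrated with plain connected-component labeling on a binary obstacle mask; each component's centroid becomes an obstacle coordinate the robots could share. A minimal sketch (the tiny mask is illustrative):

```python
from collections import deque

def label_obstacles(mask):
    """Return one centroid (row, col) per 4-connected region of 1s."""
    rows, cols = len(mask), len(mask[0])
    seen = [[False] * cols for _ in range(rows)]
    centroids = []
    for r in range(rows):
        for c in range(cols):
            if mask[r][c] and not seen[r][c]:
                queue, cells = deque([(r, c)]), []
                seen[r][c] = True
                while queue:                      # BFS over one component
                    y, x = queue.popleft()
                    cells.append((y, x))
                    for ny, nx in ((y-1, x), (y+1, x), (y, x-1), (y, x+1)):
                        if 0 <= ny < rows and 0 <= nx < cols \
                                and mask[ny][nx] and not seen[ny][nx]:
                            seen[ny][nx] = True
                            queue.append((ny, nx))
                centroids.append((sum(y for y, _ in cells) / len(cells),
                                  sum(x for _, x in cells) / len(cells)))
    return centroids

mask = [[0, 1, 1, 0, 0],
        [0, 1, 1, 0, 0],
        [0, 0, 0, 0, 1],
        [0, 0, 0, 1, 1]]
print(label_obstacles(mask))   # two obstacles -> two centroids
```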

Development of an Extended Kalman Filter Algorithm for the Localization of Underwater Mining Vehicles

  • 원문철;차혁상;홍섭
    • 한국해양공학회지 / Vol. 19, No. 2 / pp. 82-89 / 2005
  • This study deals with the development of an extended Kalman filter (EKF) algorithm for the localization of underwater mining vehicles. Both simulation and experimental studies in a test bed are carried out. For the experiments, a scaled-down tracked vehicle is run in a soil bin containing cohesive soil of a bentonite-water mixture. To develop the EKF algorithm, we use a kinematic model for the vehicle that includes the inner/outer track slips and the slip angle. The measurements include the inner and outer wheel speeds from encoders, the heading angle from a compass sensor and a fiber-optic rate gyro, and x and y position coordinates from a vision system. The vision sensor replaces the LBL (Long Base Line) sonar system used in real underwater positioning situations, and artificial noise signals mimicking real LBL noise are added to the vision sensor information. To determine the mean slip values of the tracks in both straight and cornering maneuvers, several trial runs are executed before applying the EKF algorithm. The experimental results show the effectiveness of the EKF algorithm in rejecting the sensor measurement noise, and the simulation and experimental results show close correlation.
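
The abstract's EKF can be illustrated with a stripped-down predict/update cycle. The sketch below uses plain unicycle kinematics for a state (x, y, theta) and treats the vision/compass/gyro outputs as a direct state measurement; the paper's track-slip and slip-angle terms are omitted, and all noise covariances are illustrative.

```python
import numpy as np

Q = np.diag([0.02, 0.02, 0.01])        # process noise (assumed)
Rm = np.diag([0.10, 0.10, 0.05])       # measurement noise (assumed)
H = np.eye(3)                          # we measure the full state directly

def predict(x, P, v, w, dt):
    """Propagate (x, y, theta) through unicycle kinematics."""
    th = x[2]
    x_pred = x + dt * np.array([v * np.cos(th), v * np.sin(th), w])
    F = np.array([[1, 0, -dt * v * np.sin(th)],   # Jacobian of the motion model
                  [0, 1,  dt * v * np.cos(th)],
                  [0, 0, 1]])
    return x_pred, F @ P @ F.T + Q

def update(x, P, z):
    """Correct the prediction with an (x, y, theta) measurement."""
    S = H @ P @ H.T + Rm
    K = P @ H.T @ np.linalg.inv(S)     # Kalman gain
    return x + K @ (z - H @ x), (np.eye(3) - K @ H) @ P

x, P = np.zeros(3), np.eye(3)
for _ in range(10):
    x, P = predict(x, P, v=0.5, w=0.05, dt=0.1)
    x, P = update(x, P, z=x + np.random.normal(0, 0.05, 3))  # simulated sensor
print(np.round(x, 3))
```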