• Title/Abstract/Keyword: Vision System

Image-based structural dynamic displacement measurement using different multi-object tracking algorithms

  • Ye, X.W.;Dong, C.Z.;Liu, T.
    • Smart Structures and Systems
    • /
    • Volume 17, Issue 6
    • /
    • pp.935-956
    • /
    • 2016
  • With the help of advanced image acquisition and processing technology, vision-based measurement methods have been broadly applied to structural monitoring and condition identification of civil engineering structures. Many noncontact approaches enabled by different digital image processing algorithms have been developed to overcome the problems of conventional structural dynamic displacement measurement. This paper presents three image processing algorithms for structural dynamic displacement measurement: the grayscale pattern matching (GPM) algorithm, the color pattern matching (CPM) algorithm, and the mean shift tracking (MST) algorithm. A vision-based system programmed with the three image processing algorithms is developed for multi-point structural dynamic displacement measurement. The dynamic displacement time histories of multiple vision points are simultaneously measured by the vision-based system and a magnetostrictive displacement sensor (MDS) during laboratory shaking table tests of a three-story steel frame model. The comparative analysis results indicate that the developed vision-based system exhibits excellent performance in structural dynamic displacement measurement with all three image processing algorithms. Field experiments are also carried out on an arch bridge to measure displacement influence lines during loading tests and to validate the effectiveness of the vision-based system.
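
The grayscale pattern matching (GPM) step named above can be sketched as frame-by-frame template matching. The snippet below is only an illustrative sketch: the OpenCV matchTemplate call, the millimetre-per-pixel scale factor, and the synthetic frames are assumptions, not the authors' implementation.

```python
# Illustrative sketch only: grayscale template matching for displacement tracking.
# OpenCV's matchTemplate stands in for the paper's GPM algorithm; the scale
# factor (mm per pixel) and the synthetic frames are hypothetical.
import cv2
import numpy as np

def track_displacement(frames, template, mm_per_pixel=0.5):
    """Return displacement (mm) of the template's best match in each frame."""
    h, w = template.shape
    history = []
    ref_xy = None
    for frame in frames:
        result = cv2.matchTemplate(frame, template, cv2.TM_CCOEFF_NORMED)
        _, _, _, max_loc = cv2.minMaxLoc(result)    # top-left corner of best match
        center = np.array([max_loc[0] + w / 2.0, max_loc[1] + h / 2.0])
        if ref_xy is None:
            ref_xy = center                         # first frame defines zero displacement
        history.append((center - ref_xy) * mm_per_pixel)
    return np.array(history)

if __name__ == "__main__":
    # Synthetic example: a bright square shifted a few pixels between two frames.
    f0 = np.zeros((200, 200), np.uint8); f0[80:100, 80:100] = 255
    f1 = np.zeros((200, 200), np.uint8); f1[85:105, 83:103] = 255
    tmpl = f0[75:105, 75:105]
    print(track_displacement([f0, f1], tmpl))       # ~[[0, 0], [1.5, 2.5]]
```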

A VISION SYSTEM IN ROBOTIC WELDING

  • Absi Alfaro, S. C.
    • The Korean Welding and Joining Society: Conference Proceedings
    • /
    • The Korean Welding and Joining Society, 2002 Proceedings of the International Welding/Joining Conference-Korea
    • /
    • pp.314-319
    • /
    • 2002
  • The Automation and Control Group at the University of Brasilia is developing an automatic welding station based on an industrial robot and a controllable welding machine. Several techniques were applied in order to improve the quality of the welding joints. This paper deals with the implementation of a laser-based computer vision system to guide the robotic manipulator during the welding process. Currently the robot is taught to follow a prescribed trajectory, which is recorded and repeated over and over, relying on the repeatability specification from the robot manufacturer. The objective of the computer vision system is to monitor the actual trajectory followed by the welding torch and to evaluate deviations from the desired trajectory. The position errors are then transferred to a control algorithm in order to actuate the robotic manipulator and cancel the trajectory errors. The computer vision system consists of a CCD camera attached to the welding torch, a laser emitting diode circuit, a PC-based frame grabber card, and a computer vision algorithm. The laser circuit establishes a sharp luminous reference line whose images are captured through the video camera. The raw image data are then digitized and stored in the frame grabber card for further processing using specifically written algorithms. These image-processing algorithms give the actual welding path, the relative position between the pieces, and the required corrections. Two case studies are considered: the first is the joining of two flat metal pieces; the second is concerned with joining a cylindrically shaped piece to a flat surface. An implementation of this computer vision system using parallel computer processing is being studied.
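
The structured-light principle described above (a laser stripe whose image encodes the weld path) can be sketched as a column-wise centroid extraction; the brightness threshold and the synthetic groove image in the snippet are assumptions, not the paper's algorithm.

```python
# Illustrative sketch: extract a laser stripe from a grayscale image by taking,
# for each image column, the intensity-weighted centroid of bright pixels.
# The threshold value and the synthetic test image are assumptions for the demo.
import numpy as np

def extract_laser_line(gray, threshold=200):
    """Return an array of (column, row) points along the detected stripe."""
    points = []
    for col in range(gray.shape[1]):
        column = gray[:, col].astype(float)
        bright = column >= threshold
        if bright.any():
            rows = np.nonzero(bright)[0]
            weights = column[rows]
            centroid = float((rows * weights).sum() / weights.sum())
            points.append((col, centroid))
    return np.array(points)

if __name__ == "__main__":
    img = np.zeros((100, 120), np.uint8)
    for c in range(120):                     # synthetic V-shaped groove profile
        img[40 + abs(c - 60) // 10, c] = 255
    line = extract_laser_line(img)
    print(line[:3], line.shape)              # stripe row per column
```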

Development of a Computer Vision System to Measure Low Flow Rate of Solid Particles

  • 이경환;서상룡;문정기
    • Journal of Biosystems Engineering
    • /
    • Volume 23, Issue 5
    • /
    • pp.481-490
    • /
    • 1998
  • A computer vision system to measure the low flow rate of solid particles was developed and tested with seven kinds of seeds of various sizes: perilla, mung bean, paddy, small red bean, black soybean, Cuba bean, and small potato tuber. The test was performed for two types of particle flow, continuous and discontinuous. For the continuous flow, tested with perilla, mung bean, and paddy, the correlation coefficients between the flow rates measured by the computer vision system and by the direct method were about 0.98, and the average errors of the computer vision measurement were in the range of 6∼9%. For the discontinuous flow, tested with small red bean, black soybean, Cuba bean, and small potato tuber, the correlation coefficients between the two methods were 0.98∼0.99, and the average errors were in the range of 5∼10%. The performance of the computer vision system was also compared with that of a conventional optical sensor for counting particles in discontinuous flow. The comparison, carried out with black soybean, Cuba bean, and small potato tuber, showed that the computer vision system was considerably more precise than the optical sensor.
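
The particle-counting side of the system described above can be sketched as thresholding followed by connected-component labeling; the threshold, minimum blob area, and synthetic frame below are assumed values rather than the paper's calibration.

```python
# Illustrative sketch: count and size particles in one frame by thresholding
# and connected-component labeling. Threshold and size limits are assumed values.
import numpy as np
from scipy import ndimage

def count_particles(gray, threshold=128, min_area=5):
    """Return (count, list of pixel areas) of blobs brighter than threshold."""
    binary = gray > threshold
    labels, num = ndimage.label(binary)
    areas = ndimage.sum(binary, labels, index=range(1, num + 1))
    areas = [a for a in areas if a >= min_area]      # ignore specks of noise
    return len(areas), areas

if __name__ == "__main__":
    frame = np.zeros((60, 60), np.uint8)
    frame[10:16, 10:16] = 200                        # particle 1 (36 px)
    frame[30:38, 40:47] = 220                        # particle 2 (56 px)
    frame[50, 5] = 255                               # single-pixel noise
    print(count_particles(frame))                    # (2, [36.0, 56.0])
```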

Development of the Vision System to Inspect the Inside of the Brake Calipers

  • 권경훈;추형곤;김진영;강준희
    • Journal of Sensor Science and Technology
    • /
    • Volume 26, Issue 1
    • /
    • pp.39-43
    • /
    • 2017
  • Development of a vision system as a nondestructive evaluation system can be very useful in screening defective mechanical parts before they are assembled into the final product. Since tens of thousands of mechanical parts are used in an automobile, carefully inspecting the quality of these parts is very important to maximize the performance of the automobile. To sort out defective parts before assembly, auto parts manufacturers employ various inspection systems, among which nondestructive evaluation systems are rapidly gaining popularity. In this study, we developed a vision system to inspect the inside of the brake caliper, a component of the brake, which is the most important part for the safety of drivers and passengers. In a brake caliper, a piston is pushed against the brake disk by oil pressure, causing friction that damps the rotation of the wheel. Inside the caliper, a groove is positioned to accept an oil seal that prevents oil leaks. Inspecting the groove with our vision system, we could examine the existence of contaminants, which are normally tiny residual pieces from the machining process. The vision system used a high-resolution GigE camera, a 360-degree lens to view the entire inside of the caliper at once, and a special illumination system. We used an edge detection technique to successfully detect the contaminants, which were in the form of small metal chips. A LabVIEW graphical program was used to process the digital data from the camera and to display the image and the statistics of the contaminants. We were very successful in detecting contaminants in calipers of various sizes and consider the vision system ready to be deployed in caliper production factories.
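
The edge-detection idea mentioned above can be sketched with standard OpenCV calls: flag a groove image as contaminated when edge blobs exceed an assumed area. This is only an illustrative stand-in for the authors' LabVIEW pipeline, and all thresholds are assumptions.

```python
# Illustrative sketch: flag contamination in a groove image when edge blobs
# exceed an assumed area threshold. Canny/findContours stand in for the paper's
# LabVIEW edge-detection pipeline; all threshold values are assumptions.
import cv2
import numpy as np

def has_contaminant(gray, canny_lo=50, canny_hi=150, min_area=20):
    edges = cv2.Canny(gray, canny_lo, canny_hi)
    edges = cv2.dilate(edges, np.ones((3, 3), np.uint8))     # close small gaps
    # [-2] keeps this working on both OpenCV 3.x and 4.x return conventions
    contours = cv2.findContours(edges, cv2.RETR_EXTERNAL,
                                cv2.CHAIN_APPROX_SIMPLE)[-2]
    return any(cv2.contourArea(c) >= min_area for c in contours)

if __name__ == "__main__":
    clean = np.full((80, 200), 120, np.uint8)                # uniform groove surface
    dirty = clean.copy()
    cv2.rectangle(dirty, (90, 30), (110, 45), 255, -1)       # simulated metal chip
    print(has_contaminant(clean), has_contaminant(dirty))    # False True
```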

The Effect of Vision Training on Exophoria and Intermittent Exotropia Using the MYSTERY CIRCLE System

  • 이창선;김건규;전영기;김종기;최철희;김기홍
    • Journal of Korean Ophthalmic Optics Society
    • /
    • Volume 15, Issue 4
    • /
    • pp.373-379
    • /
    • 2010
  • Purpose: The purpose of this study was to investigate the effect of vision training with the MYSTERY CIRCLE system on patients with convergence-insufficiency exophoria and intermittent exotropia. Methods: Twenty-six subjects with convergence-insufficiency exophoria (n=18) or intermittent exotropia (n=8), free of ocular disease, accommodative anomalies, and vertical phoria, were followed up; over a training period of 8 weeks they visited the optical shop once a week so that changes resulting from the vision training could be measured. Results: Vision training with the MYSTERY CIRCLE system improved functional and sensory symptoms, and the Worth 4 Dot test, the stereopsis test, and the red-lens fusion speed test also showed improvement. Conclusions: Vision training using the MYSTERY CIRCLE vision training system showed a good effect in improving binocular vision anomalies.

Servo control of mobile robot using vision system

  • 백승민;국태용
    • Institute of Control, Robotics and Systems: Conference Proceedings
    • /
    • Institute of Control, Robotics and Systems, Proceedings of the 1997 Korea Automatic Control Conference; KEPCO Seoul Training Institute; 17-18 Oct. 1997
    • /
    • pp.540-543
    • /
    • 1997
  • In this paper, a precise trajectory tracking method for a mobile robot using a vision system is presented. To achieve precise trajectory tracking, a hierarchical control structure is used, composed of a path planner, a vision system, and a dynamic controller. In designing the dynamic controller, non-ideal conditions such as parameter variation, frictional force, and external disturbance are considered. The proposed controller can learn a bounded control input for repetitive or periodic dynamics compensation, which provides robust and adaptive learning capability. Moreover, the use of the vision system allows the mobile robot to compensate for the cumulative localization error that arises when a relative sensor such as an encoder is used to locate the robot. The effectiveness of the proposed control scheme is shown through computer simulation.
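
The drift-correction role of the vision system described above can be sketched as blending an occasional absolute vision fix into a drifting encoder estimate; the gains, noise levels, and update period below are assumptions, not the paper's hierarchical controller.

```python
# Illustrative sketch: encoder-only pose estimation drifts, while blending in an
# occasional absolute "vision" pose bounds the error. Noise levels, the blend
# gain, and the vision update period are assumed values for the demo.
import random

def simulate(steps=200, dt=0.05, vision_period=20, blend=0.5):
    random.seed(0)
    true_x = est_x = 0.0                      # 1-D position along the path
    v_cmd = 0.2                               # constant commanded velocity (m/s)
    max_err = 0.0
    for k in range(steps):
        true_x += v_cmd * dt
        # encoder-based update with a small systematic scale error plus noise
        est_x += v_cmd * dt * 1.03 + random.gauss(0.0, 1e-3)
        if k % vision_period == 0:            # occasional absolute vision fix
            vision_x = true_x + random.gauss(0.0, 2e-3)
            est_x += blend * (vision_x - est_x)
        max_err = max(max_err, abs(est_x - true_x))
    return max_err

if __name__ == "__main__":
    print("max |error| with vision correction:", round(simulate(), 4))
    print("max |error| without correction:   ",
          round(simulate(vision_period=10**9), 4))
```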

A Three-Degree-of-Freedom Anthropomorphic Oculomotor Simulator

  • Bang Young-Bong;Paik Jamie K.;Shin Bu-Hyun;Lee Choong-Kil
    • International Journal of Control, Automation, and Systems
    • /
    • Volume 4, Issue 2
    • /
    • pp.227-235
    • /
    • 2006
  • For a sophisticated humanoid that explores and learns its environment and interacts with humans, anthropomorphic physical behavior is much desired. The human vision system orients each eye with three degrees of freedom (3-DOF), about the horizontal, vertical, and torsional axes. Thus, in order to accurately replicate the human vision system, it is imperative to have a simulator with a 3-DOF end-effector. We present a 3-DOF anthropomorphic oculomotor system that reproduces realistic human eye movements for human-sized humanoid applications. The parallel-link architecture of the oculomotor system is sized and designed to match the performance capabilities of human vision. In this paper, a biologically inspired mechanical design and the structural kinematics of the prototype are described in detail. The motility of the prototype about each axis of rotation was replicated through computer simulation, and performance tests comparable to human eye movements were recorded.
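
The three rotation axes mentioned above (horizontal, vertical, torsional) can be sketched by composing elementary rotation matrices into a single eye orientation; the Fick-like rotation order chosen here is an assumed convention, not the paper's parallel-link kinematics.

```python
# Illustrative sketch: compose horizontal (yaw), vertical (pitch), and torsional
# (roll) eye rotations into one orientation matrix. The rotation order is an
# assumed Fick-like convention, not the paper's mechanism kinematics.
import numpy as np

def rot_z(a):  # horizontal (yaw) rotation about the vertical axis
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

def rot_y(a):  # vertical (pitch) rotation about the horizontal axis
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])

def rot_x(a):  # torsional (roll) rotation about the line of sight
    c, s = np.cos(a), np.sin(a)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

def eye_orientation(yaw_deg, pitch_deg, torsion_deg):
    y, p, t = np.radians([yaw_deg, pitch_deg, torsion_deg])
    return rot_z(y) @ rot_y(p) @ rot_x(t)

if __name__ == "__main__":
    R = eye_orientation(15.0, -10.0, 2.0)
    gaze = R @ np.array([1.0, 0.0, 0.0])   # line of sight taken as the +x axis
    print(np.round(gaze, 3))
```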

Vision Sensor and Ultrasonic Sensor Fusion Using Neural Network

  • Baek, Sang-Hoon;Oh, Se-Young
    • Institute of Control, Robotics and Systems: Conference Proceedings
    • /
    • Institute of Control, Robotics and Systems, ICCAS 2004
    • /
    • pp.668-671
    • /
    • 2004
  • This paper proposes a new method of fusing an ultrasonic sensor and a vision sensor at the sensor level. In a typical vision system, the vision sensor finds the edges of objects, and in a typical ultrasonic system, the ultrasonic sensor finds the absolute distance between the robot and an object. The proposed method integrates these two different types of data so that the system ultimately produces a complete output for robot control. Moreover, the paper proposes not only integrating different kinds of data but also fusing the information received from the different kinds of sensors. The method has the advantages that the algorithm can be implemented simply and that the robot can be controlled in real time.
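
Sensor-level fusion as described above can be sketched by concatenating a vision-derived edge feature with an ultrasonic range reading and feeding both to a small feedforward network; the feature choice, layer sizes, and untrained random weights below are placeholders that only illustrate the data flow, not the paper's network.

```python
# Illustrative sketch: concatenate vision edge features with an ultrasonic range
# reading and pass them through a tiny feedforward network. Layer sizes and the
# (untrained, random) weights are placeholders that only demonstrate the data flow.
import numpy as np

rng = np.random.default_rng(0)

def edge_features(gray, bins=8):
    """Crude vision feature: histogram of horizontal intensity differences."""
    diff = np.abs(np.diff(gray.astype(float), axis=1))
    hist, _ = np.histogram(diff, bins=bins, range=(0, 255))
    return hist / max(hist.sum(), 1)

def fuse(vision_feat, ultrasonic_range_m, w1, b1, w2, b2):
    x = np.concatenate([vision_feat, [ultrasonic_range_m]])   # sensor-level fusion
    h = np.tanh(w1 @ x + b1)                                   # hidden layer
    return w2 @ h + b2                                         # e.g. a control command

if __name__ == "__main__":
    gray = rng.integers(0, 256, size=(32, 32), dtype=np.uint8)
    feat = edge_features(gray)                                 # 8 vision features
    w1, b1 = rng.normal(size=(6, 9)), np.zeros(6)              # 9 = 8 + 1 range input
    w2, b2 = rng.normal(size=(1, 6)), np.zeros(1)
    print(fuse(feat, ultrasonic_range_m=0.75, w1=w1, b1=b1, w2=w2, b2=b2))
```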

3-D vision sensor system for arc welding robot with coordinated motion by transputer system

  • Ishida, Hirofumi;Kasagami, Fumio;Ishimatsu, Takakazu
    • Institute of Control, Robotics and Systems: Conference Proceedings
    • /
    • Institute of Control, Robotics and Systems, Proceedings of the 1993 Korea Automatic Control Conference (International Session); Seoul National University, Seoul; 20-22 Oct. 1993
    • /
    • pp.446-450
    • /
    • 1993
  • In this paper we propose an arc welding robot system in which two robots work coordinately and employ a vision sensor. In this system, one robot arm holds the welding target as a positioning device, and the other robot moves the welding torch. The vision sensor consists of two laser slit-ray projectors and one CCD TV camera and is mounted on the top of one robot. The vision sensor detects the three-dimensional shape of the groove on the target workpiece that needs to be welded, and the two robots move coordinately to trace the groove accurately. In order to realize fast image processing, five sets of high-speed parallel processing units (Transputers) are employed. The teaching tasks for the coordinated motions are simplified considerably due to this vision sensor. Experimental results reveal the applicability of our system.
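
The slit-ray principle described above can be sketched as plane-ray triangulation: each bright stripe pixel defines a viewing ray that is intersected with the known laser plane; the camera intrinsics and plane parameters below are assumed values, not the paper's calibration.

```python
# Illustrative sketch: laser slit-ray triangulation. Each bright stripe pixel
# defines a viewing ray from the camera; intersecting that ray with the known
# laser plane gives a 3-D point. Intrinsics and plane parameters are assumed.
import numpy as np

def pixel_to_ray(u, v, fx, fy, cx, cy):
    """Unit direction of the viewing ray through pixel (u, v), camera at origin."""
    d = np.array([(u - cx) / fx, (v - cy) / fy, 1.0])
    return d / np.linalg.norm(d)

def intersect_plane(ray_dir, plane_n, plane_d):
    """Point where the ray t*ray_dir (t > 0) meets the plane n.x + d = 0."""
    t = -plane_d / float(plane_n @ ray_dir)
    return t * ray_dir

if __name__ == "__main__":
    fx = fy = 800.0; cx, cy = 320.0, 240.0           # assumed camera intrinsics
    plane_n = np.array([0.0, -np.sin(np.radians(30)), np.cos(np.radians(30))])
    plane_d = -0.5                                    # assumed laser plane: n.x + d = 0
    for u, v in [(300, 200), (320, 240), (340, 280)]:
        ray = pixel_to_ray(u, v, fx, fy, cx, cy)
        print((u, v), np.round(intersect_plane(ray, plane_n, plane_d), 3))
```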
