• Title/Abstract/Keywords: a vision system

Search results: 3,197 items (processing time: 0.035 s)

DEVELOPMENT OF A MACHINE VISION SYSTEM FOR WEED CONTROL USING PRECISION CHEMICAL APPLICATION

  • Lee, Won-Suk;David C. Slaughter;D.Ken Giles
    • Korean Society for Agricultural Machinery: Conference Proceedings
    • /
    • Korean Society for Agricultural Machinery, 1996 International Conference on Agricultural Machinery Engineering Proceedings
    • /
    • pp.802-811
    • /
    • 1996
  • Farmers need alternatives for weed control due to the desire to reduce the chemicals used in farming. However, conventional mechanical cultivation cannot selectively remove weeds located in the seedline between crop plants, and there are no selective herbicides for some crop/weed situations. Since hand labor is costly, an automated weed control system could be feasible. A robotic weed control system can also reduce or eliminate the need for chemicals. Currently no such system exists for removing weeds located in the seedline between crop plants. The goal of this project is to build a real-time, machine vision weed control system that can detect crop and weed locations, remove weeds, and thin crop plants. In order to accomplish this objective, a real-time robotic system was developed to identify and locate outdoor plants using machine vision technology, pattern recognition techniques, knowledge-based decision theory, and robotics. The prototype weed control system is composed of a real-time computer vision system, a uniform illumination device, and a precision chemical application system. The prototype system is mounted on the UC Davis Robotic Cultivator, which finds the center of the seedline of crop plants. Field tests showed that the robotic spraying system correctly targeted simulated weeds (metal coins of 2.54 cm diameter) with an average error of 0.78 cm and a standard deviation of 0.62 cm.

Passive Ranging Based on Planar Homography in a Monocular Vision System

  • Wu, Xin-mei;Guan, Fang-li;Xu, Ai-jun
    • Journal of Information Processing Systems
    • /
    • Vol. 16, No. 1
    • /
    • pp.155-170
    • /
    • 2020
  • Passive ranging is a critical part of machine vision measurement. Most passive ranging methods based on machine vision use binocular technology, which requires strict hardware conditions and lacks universality. To measure the distance of an object placed on a horizontal plane, we present a passive ranging method based on a monocular vision system using a smartphone. Experimental results show that, given the same abscissas, the ordinates of the image points are linearly related to their actual imaging angles. Based on this principle, we first establish a depth extraction model by assuming a linear function and substituting the actual imaging angles and ordinates of the special conjugate points into the linear function. The vertical distance of the target object to the optical axis is then calculated according to the imaging principle of the camera, and the passive range can be derived from the depth and the vertical distance of the target object to the optical axis. Experimental results show that ranging by this method has higher accuracy compared with methods based on binocular vision systems. The mean relative error of the depth measurement is 0.937% when the distance is within 3 m; at 3-10 m, the mean relative error is 1.71%. Compared with other methods based on monocular vision systems, this method does not require calibration before ranging and avoids the error caused by data fitting.
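
The depth extraction model described above can be sketched as follows. This is a hedged illustration of the idea only: the imaging angle of a ground point is assumed linear in its image ordinate (pixel row), with the coefficients fitted from known conjugate points. The calibration pairs and camera height are hypothetical values, not the paper's data.

```python
import math

def fit_linear(points):
    """Least-squares fit of angle = a*v + b from (ordinate, angle) pairs."""
    n = len(points)
    sv = sum(v for v, _ in points)
    sa = sum(a for _, a in points)
    svv = sum(v * v for v, _ in points)
    sva = sum(v * a for v, a in points)
    a = (n * sva - sv * sa) / (n * svv - sv * sv)
    return a, (sa - a * sv) / n

def depth_from_ordinate(v, coeffs, camera_height):
    """Depth of a ground point: horizontal distance = height / tan(angle)."""
    a, b = coeffs
    angle = a * v + b              # imaging angle below horizontal, radians
    return camera_height / math.tan(angle)

# Hypothetical conjugate points (ordinate in pixels, imaging angle in radians)
coeffs = fit_linear([(400, 0.20), (800, 0.60)])
d = depth_from_ordinate(600, coeffs, camera_height=1.5)   # metres
```

Once the linear function is calibrated, every pixel row maps directly to a depth, which is what lets the method skip per-session calibration and data fitting.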

Quantization and Calibration of Color Information From Machine Vision System for Beef Color Grading

  • 김정희;최선;한나영;고명진;조성호;황헌
    • Journal of Biosystems Engineering
    • /
    • Vol. 32, No. 3
    • /
    • pp.160-165
    • /
    • 2007
  • This study was conducted to evaluate beef using a color machine vision system. The machine vision system has the advantage of measuring a larger area than a colorimeter and can also measure other quality factors such as the distribution of fat. However, machine vision measurements are affected by the system components. To measure beef color with the machine vision system, the effect of color balancing control was tested and a calibration model was developed. A neural network for color calibration, trained on reference color patches, showed a high correlation with the colorimeter in L*a*b* coordinates and adapted to various measurement environments. The trained network also showed a very high correlation with the colorimeter when measuring beef color.
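
The calibration idea can be sketched as a small network that learns a map from camera RGB to colorimeter L*a*b* from reference patches. This is a hedged stand-in: the patch data (synthetic here), network size, and training settings are hypothetical, not the paper's.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical reference patches: normalized camera RGB and a synthetic
# "colorimeter" L*a*b* target generated from a fixed linear map.
rgb = rng.uniform(0.0, 1.0, (20, 3))
lab = rgb @ np.array([[95.0,   5.0,   2.0],
                      [ 3.0, -60.0,  40.0],
                      [ 1.0,  50.0, -70.0]]) + 2.0

mu, sd = lab.mean(0), lab.std(0)
t = (lab - mu) / sd                       # standardized training targets

# One hidden tanh layer, linear output.
W1 = rng.normal(0.0, 0.5, (3, 8)); b1 = np.zeros(8)
W2 = rng.normal(0.0, 0.5, (8, 3)); b2 = np.zeros(3)

def forward(x):
    h = np.tanh(x @ W1 + b1)
    return h, h @ W2 + b2

_, y = forward(rgb)
loss0 = float(np.mean((y - t) ** 2))      # loss before training

lr = 0.2
for _ in range(5000):                     # plain batch gradient descent
    h, y = forward(rgb)
    g = 2.0 * (y - t) / t.size            # dLoss/dy
    gW2, gb2 = h.T @ g, g.sum(0)
    gh = (g @ W2.T) * (1.0 - h ** 2)      # backprop through tanh
    gW1, gb1 = rgb.T @ gh, gh.sum(0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

_, y = forward(rgb)
loss1 = float(np.mean((y - t) ** 2))      # loss after training

# De-standardize to obtain calibrated L*a*b* predictions.
pred_lab = y * sd + mu
```

Training against colorimeter readings of standard patches is what gives the network its adaptability: re-measuring the patches under a new illumination environment and retraining re-calibrates the system.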

Pedestrian Navigation System using Inertial Sensors and Vision

  • 박상경;서영수
    • The Transactions of the Korean Institute of Electrical Engineers
    • /
    • Vol. 59, No. 11
    • /
    • pp.2048-2057
    • /
    • 2010
  • In this paper, a pedestrian inertial navigation system aided by vision is proposed. A navigation system using inertial sensors has the problems that it is difficult to determine the initial position and that the position error increases over time. To solve these problems, a vision system is used in addition to the inertial navigation system, with a camera attached to the pedestrian. Landmarks are installed at known positions so that the position and orientation of the camera can be computed whenever the camera views a landmark. Using this position information, estimation errors in the inertial navigation system are compensated.
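
The landmark-correction scheme can be sketched as follows: dead-reckoned position drifts over time, and whenever the camera recognizes a landmark installed at a known position, the absolute position is recomputed. The step model, landmark table, and camera measurement below are hypothetical.

```python
import math

LANDMARKS = {7: (5.0, 0.0)}   # landmark id -> known world position (m)

class PedestrianNav:
    def __init__(self, x=0.0, y=0.0, heading=0.0):
        self.x, self.y, self.heading = x, y, heading

    def step(self, stride, dheading):
        """Dead reckoning from IMU-derived step length and heading change."""
        self.heading += dheading
        self.x += stride * math.cos(self.heading)
        self.y += stride * math.sin(self.heading)

    def landmark_fix(self, lm_id, rel_x, rel_y):
        """Camera sees landmark lm_id at (rel_x, rel_y) in the body frame:
        reset the absolute position from the known landmark location."""
        lx, ly = LANDMARKS[lm_id]
        c, s = math.cos(self.heading), math.sin(self.heading)
        ox = c * rel_x - s * rel_y     # landmark offset in the world frame
        oy = s * rel_x + c * rel_y
        self.x, self.y = lx - ox, ly - oy

nav = PedestrianNav()
for _ in range(3):
    nav.step(1.1, 0.0)            # biased stride: drifts to x = 3.3
nav.landmark_fix(7, 2.0, 0.0)     # camera: landmark 7 is 2 m straight ahead
# position corrected to (3.0, 0.0)
```

The same observation also fixes the initial-position problem: the very first landmark sighting anchors the dead-reckoning origin.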

Integrated System for Autonomous Proximity Operations and Docking

  • Lee, Dae-Ro;Pernicka, Henry
    • International Journal of Aeronautical and Space Sciences
    • /
    • Vol. 12, No. 1
    • /
    • pp.43-56
    • /
    • 2011
  • An integrated guidance, navigation, and control (GNC) system for autonomous proximity operations and docking of two spacecraft was developed. The position maneuvers were determined through the integration of the state-dependent Riccati equation, formulated from nonlinear relative motion dynamics, and relative navigation using rendezvous laser vision (lidar) and a vision sensor system. In the vision sensor system, a switch between sensors was made along the approach phase to provide continuously effective navigation. As an extension of the rendezvous laser vision system, an automated terminal guidance scheme based on the Clohessy-Wiltshire state transition matrix was used to formulate a "V-bar hopping approach" reference trajectory. A proximity operations strategy was then adapted from the approach strategy used with the Automated Transfer Vehicle. The attitude maneuvers, determined from a linear quadratic Gaussian-type control including quaternion-based attitude estimation using star trackers or a vision sensor system, provided precise attitude control and robustness under uncertainties in the moments of inertia and external disturbances. These functions were then integrated into an autonomous GNC system that can perform proximity operations and meet all conditions for successful docking. A six-degree-of-freedom simulation was used to demonstrate the effectiveness of the integrated system.
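
The terminal guidance above relies on the Clohessy-Wiltshire state transition matrix. As a sketch (not the paper's implementation), the standard closed-form CW STM for a circular target orbit is shown below; the frame convention (state [x, y, z, vx, vy, vz] with x radial, y along-track, z cross-track) and the mean-motion value in the example are assumptions.

```python
import math

def cw_stm(n, t):
    """Clohessy-Wiltshire state transition matrix over time t, mean motion n."""
    s, c = math.sin(n * t), math.cos(n * t)
    return [
        [4 - 3*c,     0, 0,    s/n,         2*(1 - c)/n,     0],
        [6*(s - n*t), 1, 0,    2*(c - 1)/n, (4*s - 3*n*t)/n, 0],
        [0,           0, c,    0,           0,               s/n],
        [3*n*s,       0, 0,    c,           2*s,             0],
        [6*n*(c - 1), 0, 0,    -2*s,        4*c - 3,         0],
        [0,           0, -n*s, 0,           0,               c],
    ]

def propagate(state, n, t):
    """Propagate a relative state through the CW dynamics."""
    Phi = cw_stm(n, t)
    return [sum(Phi[i][j] * state[j] for j in range(6)) for i in range(6)]

# A pure along-track offset is an equilibrium of the CW equations: a chaser
# holding 100 m behind the target on V-bar stays there without control, which
# is what makes V-bar hold points and hopping arcs convenient for guidance.
hold = propagate([0.0, -100.0, 0.0, 0.0, 0.0, 0.0], n=0.0011, t=600.0)
```

A V-bar hop is then formulated by inverting the position block of this matrix to find the impulsive velocity that carries the chaser between two V-bar hold points in a chosen transfer time.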

Development of a Vision System for the Measurement of the Pendulum Test

  • 김철승;문기욱;이수영;엄광문
    • The Transactions of the Korean Institute of Electrical Engineers
    • /
    • Vol. 56, No. 4
    • /
    • pp.817-819
    • /
    • 2007
  • The purpose of this work is to develop a measurement system for the pendulum test with minimal restriction of the experimental environment and little influence of noise. In this work, we developed a vision system without any wired connection between the markers and the camera. System performance is little affected by the experimental environment as long as lighting is sufficient to recognize the markers. To validate the system, we compared knee joint angle trajectories measured by the developed system and by a magnetic sensor system during the nominal pendulum test and during maximum-speed voluntary knee joint rotation. The joint angle trajectories of the developed system during both tests matched well with those of the magnetic system. Therefore, we suggest the vision system as an alternative to previous systems with limited practicality for the pendulum test.
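
The knee joint angle measurement can be sketched from marker image coordinates as the angle between the thigh and shank segments. The marker placement (hip, knee, ankle) and the coordinates in the example are hypothetical; the paper's actual marker layout may differ.

```python
import math

def segment_angle(p_prox, p_dist):
    """Orientation of a limb segment from two marker positions (image coords)."""
    return math.atan2(p_dist[1] - p_prox[1], p_dist[0] - p_prox[0])

def knee_angle(hip, knee, ankle):
    """Knee angle as the angle between the thigh and shank segments (rad)."""
    return segment_angle(hip, knee) - segment_angle(knee, ankle)

# Hypothetical marker positions in one video frame
theta = knee_angle((0.0, 0.0), (0.0, -1.0), (1.0, -2.0))   # -pi/4 rad
```

Repeating this per video frame yields the joint angle trajectory that the study compared against the magnetic sensor system.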

Development of Vision based Autonomous Obstacle Avoidance System for a Humanoid Robot

  • 강태구;김동원;박귀태
    • The Transactions of the Korean Institute of Electrical Engineers
    • /
    • Vol. 60, No. 1
    • /
    • pp.161-166
    • /
    • 2011
  • This paper addresses a vision based autonomous walking control system. To handle obstacles that lie beyond the field of view (FOV), we used a 3D panoramic depth image. Moreover, to let a humanoid robot decide its avoidance direction and walking motion for obstacle avoidance by itself, we propose vision based path planning using the 3D panoramic depth image. In the vision based path planning, the path and walking motion are decided from environmental conditions such as the size of the obstacle and the available avoidance space. The vision based path planning is applied to a humanoid robot, URIA. The evaluation results show that the proposed method can effectively decide the avoidance direction and the walking motion of a practical humanoid robot.

Servo control of mobile robot using vision system

  • 백승민;국태용
    • Institute of Control, Robotics and Systems: Conference Proceedings
    • /
    • Institute of Control, Robotics and Systems, 1997 Korea Automatic Control Conference Proceedings; KEPCO Seoul Training Institute; 17-18 Oct. 1997
    • /
    • pp.540-543
    • /
    • 1997
  • In this paper, a precise trajectory tracking method for a mobile robot using a vision system is presented. To solve the problem of precise trajectory tracking, a hierarchical control structure is used, composed of a path planner, a vision system, and a dynamic controller. In designing the dynamic controller, non-ideal conditions such as parameter variation, frictional force, and external disturbance are considered. The proposed controller can learn a bounded control input for repetitive or periodic dynamics compensation, providing robust and adaptive learning capability. Moreover, the use of the vision system allows the mobile robot to compensate for the cumulative location error that arises when a relative sensor such as an encoder is used to locate the robot's position. The effectiveness of the proposed control scheme is shown through computer simulation.
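
The learning capability described above can be sketched in iterative-learning style: for repetitive (periodic) motion, the control input over one period is refined from the previous period's tracking error. The first-order plant, learning gain, and reference below are hypothetical stand-ins, not the paper's robot model.

```python
import math

N = 100
r = [math.sin(2 * math.pi * i / N) for i in range(N + 1)]  # periodic reference

def run_period(u, a=0.3, b=0.5):
    """Simple plant x[t+1] = a*x[t] + b*u[t], starting at the reference."""
    x = [r[0]]
    for t in range(N):
        x.append(a * x[-1] + b * u[t])
    return x

u = [0.0] * N
err0 = max(abs(r[t] - x) for t, x in enumerate(run_period(u)))  # before learning

L = 1.0                     # learning gain; |1 - b*L| < 1 gives convergence
for _ in range(50):
    x = run_period(u)
    # update this period's input from last period's (shifted) tracking error
    u = [u[t] + L * (r[t + 1] - x[t + 1]) for t in range(N)]

err1 = max(abs(r[t] - x) for t, x in enumerate(run_period(u)))  # after learning
```

Because the same trajectory repeats every period, the learned input converges to a bounded feedforward signal that cancels the repetitive dynamics, which is the robustness property the abstract claims.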

Vision Sensor and Ultrasonic Sensor Fusion Using Neural Network

  • Baek, Sang-Hoon;Oh, Se-Young
    • Institute of Control, Robotics and Systems: Conference Proceedings
    • /
    • Institute of Control, Robotics and Systems, ICCAS 2004
    • /
    • pp.668-671
    • /
    • 2004
  • This paper proposes a new method for fusing an ultrasonic sensor and a vision sensor at the sensor level. In a typical vision system, the vision system finds the edges of objects, while a typical ultrasonic system finds the absolute distance between the robot and an object. The method therefore integrates data of two different types into a single output suitable for robot control. Beyond simply combining different kinds of data, the paper also fuses the information received from the different kinds of sensors. The method is simple to implement as an algorithm and can control the robot in real time.
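
A minimal illustration of the complementary data being combined (the paper uses a neural network; this hand-written stand-in only shows the two sensor modalities): vision contributes an obstacle bearing from edge detection, the ultrasonic sensor contributes absolute range, and together they yield a 2D obstacle position in the robot frame. The values are hypothetical.

```python
import math

def fuse(bearing_rad, range_m):
    """Combine a vision-derived bearing with an ultrasonic range reading
    into an obstacle position (x forward, y left) in the robot frame."""
    return (range_m * math.cos(bearing_rad), range_m * math.sin(bearing_rad))

obstacle = fuse(math.radians(30.0), 2.0)   # obstacle 2 m away, 30 deg left
```

Neither sensor alone yields this position: vision lacks absolute scale and the ultrasonic sensor lacks direction, which is the motivation for fusing at the sensor level.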

An Autonomous Operational Service System for Machine Vision-based Inspection towards Smart Factory of Manufacturing Multi-wire Harnesses

  • Seung Beom, Hong;Kyou Ho, Lee
    • Journal of information and communication convergence engineering
    • /
    • Vol. 20, No. 4
    • /
    • pp.317-325
    • /
    • 2022
  • In this study, we propose a technological system designed to provide machine vision-based automatic inspection and autonomous operation services for the entire product inspection process in wire harness manufacturing. Although the smart factory paradigm is a valuable and necessary goal, small companies may encounter steep barriers to entry. Therefore, the best approach is to move toward it gradually, in stages, starting with relatively simple improvements to manufacturing processes, such as replacing manual quality assurance stages with machine vision-based inspection. In this study, we consider the design issues of a system based on the proposed technology, describe an experimental implementation, and evaluate it. The test results show that adopting the proposed machine vision-based automatic inspection and operation service system for multi-wire harness production is justified, and the effectiveness of the proposed technology was verified.