• Title/Abstract/Keyword: robotic vision

126 search results (processing time 0.023 s)

육묘용 로봇 이식기의 개발(I)-로봇 이식기의 기계시각 시스템의 개발- (Development of a Robotic Transplanter for Bedding Plants (I)-Development of the Machine Vision System of a Robotic Transplanter-)

  • 류관희;이희환;김기영;황호준
    • 한국농업기계학회: Conference Proceedings / Proceedings of the 1997 Winter Conference / pp.392-400 / 1997
  • This study was conducted to develop the machine vision system of a robotic transplanter for bedding plants. The specific objectives were 1) to obtain the coordinates of the healthy seedlings, excluding empty cells and bad seedlings, in a high-density plug tray, and 2) to obtain the angle of the leaves of the healthy seedlings so that the gripper does not damage them. The results of this study are summarized as follows. (1) The machine vision system of a robotic transplanter was developed. (2) The success rates of detecting empty cells and bad seedlings in 72-cell and 128-cell plug trays were 98.8% and 94.9%, respectively. (3) The success rates of calculating the angle of leaves in 72-cell and 128-cell plug trays were 93.5% and 91.0%, respectively.
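
The two detection steps this abstract describes (flagging empty or bad cells, then estimating leaf angle) can be sketched with image moments on a binary plant mask. The pixel-count threshold and the moment-based angle below are illustrative assumptions, not the paper's actual algorithm:

```python
import numpy as np

def leaf_angle(mask):
    """Principal-axis angle (degrees) of a binary leaf mask, estimated
    from its second-order central moments. Returns None for an empty cell."""
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return None  # no plant pixels: empty cell
    x0, y0 = xs.mean(), ys.mean()
    mu20 = ((xs - x0) ** 2).mean()
    mu02 = ((ys - y0) ** 2).mean()
    mu11 = ((xs - x0) * (ys - y0)).mean()
    return np.degrees(0.5 * np.arctan2(2 * mu11, mu20 - mu02))

def is_healthy(mask, min_pixels=200):
    """Flag a cell as holding a usable seedling when enough plant pixels
    are present (the threshold is an assumption, not from the paper)."""
    return int(mask.sum()) >= min_pixels
```

A horizontal leaf blob yields an angle near 0°, so the gripper could approach perpendicular to the leaf axis.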

DEVELOPMENT OF A MACHINE VISION SYSTEM FOR WEED CONTROL USING PRECISION CHEMICAL APPLICATION

  • Lee, Won-Suk;David C. Slaughter;D.Ken Giles
    • 한국농업기계학회: Conference Proceedings / Proceedings of the 1996 International Conference on Agricultural Machinery Engineering / pp.802-811 / 1996
  • Farmers need alternatives for weed control because of the desire to reduce the chemicals used in farming. However, conventional mechanical cultivation cannot selectively remove weeds located in the seedline between crop plants, and there are no selective herbicides for some crop/weed situations. Since hand labor is costly, an automated weed control system could be feasible. A robotic weed control system can also reduce or eliminate the need for chemicals. Currently no such system exists for removing weeds located in the seedline between crop plants. The goal of this project is to build a real-time, machine vision weed control system that can detect crop and weed locations, remove weeds, and thin crop plants. To accomplish this objective, a real-time robotic system was developed to identify and locate outdoor plants using machine vision technology, pattern recognition techniques, knowledge-based decision theory, and robotics. The prototype weed control system is composed of a real-time computer vision system, a uniform illumination device, and a precision chemical application system. The prototype is mounted on the UC Davis Robotic Cultivator, which finds the center of the seedline of crop plants. Field tests showed that the robotic spraying system correctly targeted simulated weeds (metal coins of 2.54 cm diameter) with an average error of 0.78 cm and a standard deviation of 0.62 cm.
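
The seedline-based discrimination described above can be sketched as a simple positional classifier: detections near the seedline centerline are treated as crop, everything else as spray targets. The centroid coordinates, seedline position, and 2 cm tolerance are illustrative assumptions, not values from the paper:

```python
def classify_plants(centroids, seedline_x, tol_cm=2.0):
    """Split detected plant centroids (x, y in cm) into crop and weed
    lists by lateral distance from the seedline centerline."""
    crops, weeds = [], []
    for x, y in centroids:
        # within tolerance of the seedline -> assumed crop; else weed target
        (crops if abs(x - seedline_x) <= tol_cm else weeds).append((x, y))
    return crops, weeds
```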

얇은 막대 배치작업을 위한 최적의 가중치 행렬을 사용한 실시간 로봇 비젼 제어기법 (Real-time Robotic Vision Control Scheme Using Optimal Weighting Matrix for Slender Bar Placement Task)

  • 장민우;김재명;장완식
    • 한국생산제조학회지 / Vol. 26, No. 1 / pp.50-58 / 2017
  • This paper proposes a real-time robotic vision control scheme that uses a weighting matrix to efficiently process the vision data obtained while the robot moves to a target. The scheme is based on a vision system model that can actively control the camera parameters and robot position, an improvement over previous studies. The vision control algorithm involves parameter estimation, joint angle estimation, and weighting matrix models. To demonstrate the effectiveness of the proposed control scheme, the study is divided into two parts: not applying and applying the weighting matrix to the vision data obtained while the camera moves toward the target. Finally, the position accuracy of the two cases is compared by performing the slender-bar placement task experimentally.
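
The role of a weighting matrix in such an estimation step can be illustrated with a weighted least-squares update, where measurements deemed more reliable get larger weights. The diagonal W and the normal-equation solution below are a generic sketch, not the paper's estimation model:

```python
import numpy as np

def weighted_update(J, e, w):
    """Solve for the update dq that minimizes (e - J dq)^T W (e - J dq),
    with W = diag(w): dq = (J^T W J)^{-1} J^T W e. Larger w_i makes
    measurement e_i count more in the fit."""
    W = np.diag(w)
    return np.linalg.solve(J.T @ W @ J, J.T @ W @ e)
```

With equal weights this reduces to ordinary least squares; unequal weights pull the solution toward the more trusted vision measurements.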

육묘용 로봇 이식기의 개발(III)-로봇이식기의 개발- (Development of a Robotic Transplanter for Bedding Plants(III)-Development of a Robotic Transplanter)

  • 류관희;이희환;김기영;한재성
    • 한국농업기계학회: Conference Proceedings / Proceedings of the 1997 Summer Conference / pp.238-246 / 1997
  • This study was conducted to develop a robotic transplanter for bedding plants. The robotic transplanter consisted of a machine vision system, a manipulator, a gripper, and a plug tray transfer system. Its performance was tested and compared with two transplanting methods, one considering the leaf orientation of the seedlings and one not. The results of this study were as follows. (1) A Cartesian-coordinate manipulator with 3 degrees of freedom was constructed for the robotic transplanter. The accuracy of position control was ±1 mm. (2) The robotic transplanter, comprising the machine vision system, the manipulator, the gripper, and the transfer system, was developed and tested with a shovel-type finger. Without considering the orientation of leaves, the success rates of transplanting healthy cucumber seedlings in 72-cell and 128-cell plug trays were 95.5% and 94.5%, respectively. Considering the orientation of leaves, the success rates were 96.0% and 95.0%, respectively.

육묘용 로봇 이식기의 개발(I) - 기계시각 시스템 - (Development of a Robotic Transplanter for Bedding Plants(I) - Machine Vision System -)

  • 류관희;김기영;이희환;황호준
    • Journal of Biosystems Engineering / Vol. 22, No. 3 / pp.317-324 / 1997
  • This study was conducted to develop a machine vision system for a robotic transplanter for bedding plants. The specific objectives were 1) to obtain the coordinates of the healthy seedlings in a high-density plug tray, and 2) to obtain the angle of the leaves of the healthy seedlings so that the gripper does not damage them. The results of this study were as follows. (1) The machine vision system of a robotic transplanter was developed. (2) Success rates of detecting empty cells and bad seedlings for 72-cell and 128-cell plug trays were 98.8% and 94.9%, respectively. (3) Success rates of calculating the angle of leaves for 72-cell and 128-cell plug trays were 93.5% and 91.0%, respectively.

A Robotic Vision System for Turbine Blade Cooling Hole Detection

  • Wang, Jianjun;Tang, Qing;Gan, Zhongxue
    • 제어로봇시스템학회: Conference Proceedings / ICCAS 2003 / pp.237-240 / 2003
  • Gas turbines are extensively used in flight propulsion, electrical power generation, and other industrial applications. During its life span, a turbine blade is taken out periodically for repair and maintenance, which includes re-coating the blade surface and re-drilling the cooling holes/channels. A successful laser re-drilling requires measuring a hole to within ±0.15 mm in position and ±3° in orientation. Detecting the position and orientation of gas turbine blade/vane cooling holes is thus a very important step in the vane/blade repair process, and the industry is in urgent need of an automated system for this task. This paper proposes approaches and algorithms to detect the cooling hole position and orientation using a vision system mounted on a robot arm. The channel orientation is determined by aligning the vision system with the channel axis. The opening position of the channel is the intersection between the channel axis and the surface around the channel opening. Experimental results have indicated that the concept of cooling hole identification is feasible: under the current test conditions, the cooling channel position is detected reproducibly to within ±0.15 mm and the orientation to within ±3°. The average processing time to search for and identify a channel's position and orientation is less than 1 minute.
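
The abstract's definition of the opening position, the intersection of the channel axis with the surface around the opening, reduces to a line-plane intersection when the local surface is approximated by a plane. The planar approximation and the input vectors below are assumptions for illustration:

```python
import numpy as np

def channel_opening(axis_point, axis_dir, surf_point, surf_normal):
    """Intersection of the channel axis (a 3-D line through axis_point
    along axis_dir) with the local surface plane (through surf_point
    with normal surf_normal). Returns None if the axis is parallel
    to the surface."""
    axis_dir = axis_dir / np.linalg.norm(axis_dir)
    denom = np.dot(surf_normal, axis_dir)
    if abs(denom) < 1e-9:
        return None  # axis lies in or parallel to the surface plane
    t = np.dot(surf_normal, surf_point - axis_point) / denom
    return axis_point + t * axis_dir
```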

Towards a Ubiquitous Robotic Companion: Design and Implementation of Ubiquitous Robotic Service Framework

  • Ha, Young-Guk;Sohn, Joo-Chan;Cho, Young-Jo;Yoon, Hyun-Soo
    • ETRI Journal / Vol. 27, No. 6 / pp.666-676 / 2005
  • In recent years, motivated by the emergence of ubiquitous computing technologies, a new class of networked robots, ubiquitous robots, has been introduced. The Ubiquitous Robotic Companion (URC) is our conceptual vision of ubiquitous service robots that provide users with the services they need, anytime and anywhere, in ubiquitous computing environments. To realize the vision of URC, one essential requirement for robotic systems is to support ubiquity of services: that is, a robot service must always be available even when the service environment changes. Specifically, robotic systems need to interoperate automatically with the sensors and devices in the current service environment, rather than being statically preprogrammed for them. In this paper, the design and implementation of a semantic-based ubiquitous robotic space (SemanticURS) is presented. SemanticURS enables automated integration of networked robots into ubiquitous computing environments by exploiting Semantic Web Services and AI-based planning technologies.

기계시각을 이용한 육묘용 로봇 이식기의 개발 (Development of a Robotic Transplanter Using Machine Vision for Bedding Plants)

  • 류관희;김기영;이희환;한재성;황호준
    • 생물환경조절학회지 / Vol. 6, No. 1 / pp.55-65 / 1997
  • This study was conducted to develop a robotic transplanter for bedding plants, with the following results. 1. Machine-vision detection of empty cells and bad seedlings achieved accuracies of 98.8% for 72-cell plug trays and 94.9% for 128-cell plug trays; when leaf orientation was also determined, the accuracies were 93.5% for 72-cell trays and 91.0% for 128-cell trays. 2. The mechanism and drive units of the robotic transplanter, together with a controller for them, were developed; the position-control accuracy of the manipulator was ±1 mm. 3. Without considering leaf orientation, the transplanting success rates were 95.5% for 72-cell plug trays and 94.5% for 128-cell plug trays; considering leaf orientation, the success rates were 96.0% for 72-cell trays and 95.0% for 128-cell trays.

A VISION SYSTEM IN ROBOTIC WELDING

  • Absi Alfaro, S. C.
    • 대한용접접합학회: Conference Proceedings / Proceedings of the 2002 International Welding/Joining Conference-Korea / pp.314-319 / 2002
  • The Automation and Control Group at the University of Brasilia is developing an automatic welding station based on an industrial robot and a controllable welding machine. Several techniques were applied to improve the quality of the welding joints. This paper deals with the implementation of a laser-based computer vision system to guide the robotic manipulator during the welding process. Currently the robot is taught a prescribed trajectory, which is recorded and repeated over and over relying on the repeatability specification of the robot manufacturer. The objective of the computer vision system is to monitor the actual trajectory followed by the welding torch and to evaluate deviations from the desired trajectory. The position errors are then passed to a control algorithm that actuates the robotic manipulator and cancels the trajectory errors. The computer vision system consists of a CCD camera attached to the welding torch, a laser-emitting-diode circuit, a PC-based frame grabber card, and a computer vision algorithm. The laser circuit establishes a sharp luminous reference line, whose images are captured through the video camera. The raw image data are then digitized and stored in the frame grabber card for further processing by purpose-written algorithms. These image-processing algorithms give the actual welding path, the relative position between the pieces, and the required corrections. Two case studies are considered: the first is the joining of two flat metal pieces; the second concerns joining a cylindrical piece to a flat surface. An implementation of this computer vision system using parallel processing is being studied.
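
A minimal sketch of laser-stripe extraction, assuming the stripe is the brightest pixel in each column of a grayscale frame; the paper's actual image-processing algorithms are not specified in the abstract, so the peak detector and the deviation measure below are illustrative:

```python
import numpy as np

def laser_line_profile(img):
    """Per-column row index of the brightest pixel: a simple peak
    detector for the laser stripe in a grayscale frame (rows x cols)."""
    return np.argmax(img, axis=0)

def torch_deviation(profile, reference_row):
    """Mean offset (in pixels) of the detected stripe from the taught
    path, usable as the correction fed to the trajectory controller."""
    return float(np.mean(profile - reference_row))
```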

Trinocular Vision System을 이용한 물체 자세정보 인식 향상방안 (A Study on the Improvement of Pose Information of Objects by Using Trinocular Vision System)

  • 김종형;장경재;권혁동
    • 한국생산제조학회지 / Vol. 26, No. 2 / pp.223-229 / 2017
  • Recently, robotic bin-picking tasks have drawn considerable attention because flexibility is required in robotic assembly tasks. Stereo camera systems have been widely used for robotic bin-picking, but they have two limitations: first, the computational burden of solving the correspondence problem on stereo images increases calculation time; second, errors in image processing and camera calibration reduce accuracy. Moreover, errors in the robot kinematic parameters directly affect robot gripping. In this paper, we propose a method of correcting the bin-picking error by using a trinocular vision system, which consists of two stereo cameras and one hand-eye camera. First, the two stereo cameras, with a wide viewing angle, measure the object's pose roughly. Then, the third, hand-eye camera approaches the object and corrects the previous measurement of the stereo camera system. Experimental results show the usefulness of the proposed method.
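
The coarse-to-fine idea, a rough position from the wide-view stereo pair followed by a close-range hand-eye correction, can be sketched as follows; the rectified-stereo disparity model and the additive correction step are illustrative assumptions, not the paper's calibrated procedure:

```python
import numpy as np

def stereo_depth(xl, xr, f_px, baseline_m):
    """Rough depth from a rectified stereo pair via the textbook
    disparity relation Z = f * B / (xl - xr)."""
    d = xl - xr
    if d <= 0:
        raise ValueError("non-positive disparity")
    return f_px * baseline_m / d

def refine_position(coarse_xyz, handeye_offset):
    """Apply the residual offset measured by the hand-eye camera at
    close range to the coarse stereo estimate."""
    return np.asarray(coarse_xyz, float) + np.asarray(handeye_offset, float)
```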