• Title/Summary/Keyword: single camera


Depth Evaluation from Pattern Projection Optimized for Automated Electronics Assembling Robots

  • Park, Jong-Rul;Cho, Jun Dong
    • IEIE Transactions on Smart Processing and Computing / v.3 no.4 / pp.195-204 / 2014
  • This paper presents depth evaluation for object detection by automated assembling robots. Pattern distortion analysis from a structured light system identifies the object with the greatest depth relative to its background. An automated assembling robot should preferentially select and pick the object with the greatest depth to reduce physical harm during the picking action of the robot arm. Object detection is then combined with the depth evaluation to provide a contour showing the edges of the object with the greatest depth. The contour provides shape information to an automated assembling robot, equipped with a laser-based proximity sensor, for picking up an object and placing it in the intended place. The depth evaluation process is accelerated so that each image frame can be processed with the simplest experimental setup, which consists of a single camera and a projector. The depth evaluation required 31 ms to 32 ms per frame, which suits a robot vision system equipped with a 30-frames-per-second camera.
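In the simplest rectified camera-projector setup of the kind this abstract describes, depth follows from triangulating the disparity of a projected pattern feature. A minimal sketch (all numbers illustrative, not taken from the paper):

```python
def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Triangulated depth for a rectified camera-projector (or stereo) pair.

    depth = focal * baseline / disparity; a larger pattern shift means
    the surface is closer to the camera.
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# A pattern feature shifted 40 px, with an 800 px focal length and a
# 0.1 m camera-projector baseline, lies at 2.0 m depth.
print(depth_from_disparity(800.0, 0.1, 40.0))  # -> 2.0
```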

Obstacle Avoidance Algorithm of a Mobile Robot using Image Information (화상 정보를 이용한 이동 로봇의 장애물 회피 알고리즘)

  • Kwon, O-Sang;Lee, Eung-Hyuk;Han, Yong-Hwan;Hong, Seung-Hong
    • Journal of IKEEE / v.2 no.1 s.2 / pp.139-149 / 1998
  • Robot navigation with a single kind of sensor poses several problems. We propose a system that takes advantage of both a CCD camera and ultrasonic sensors to address this issue. A coordinate-extraction algorithm for avoiding obstacles during navigation is also proposed. We mounted a CCD-based vision system at the front of the vehicle and performed experiments to verify the proposed algorithm. The experimental results show that the error rate was lower when the CCD camera was used than when only ultrasonic sensors were used. The measured values can also be used to generate a path that avoids the detected obstacles.


Algorithms for Multi-sensor and Multi-primitive Photogrammetric Triangulation

  • Shin, Sung-Woong;Habib, Ayman F.;Ghanma, Mwafag;Kim, Chang-Jae;Kim, Eui-Myoung
    • ETRI Journal / v.29 no.4 / pp.411-420 / 2007
  • The steady evolution of mapping technology is leading to an increasing availability of multi-sensory geo-spatial datasets, such as data acquired by single-head frame cameras, multi-head frame cameras, line cameras, and light detection and ranging systems, at a reasonable cost. The complementary nature of the data collected by these systems makes their integration attractive for obtaining a complete description of the object space. However, such integration is only possible after accurate co-registration of the collected data to a common reference frame. The registration can be carried out reliably through a triangulation procedure that considers the characteristics of the involved data. This paper introduces algorithms for a multi-primitive and multi-sensory triangulation environment, geared towards taking advantage of the complementary characteristics of spatial data available from the above-mentioned sensors. The triangulation procedure ensures the alignment of the involved data to a common reference frame. The devised methodologies are tested and proven efficient through experiments using real multi-sensory data.


Analysis of Rotational Motion of Skid Steering Mobile Robot using Marker and Camera (마커와 카메라를 이용한 스키드 구동 이동 로봇의 회전 운동 분석)

  • Ha, Jong-Eun
    • The Journal of the Korea institute of electronic communication sciences / v.11 no.2 / pp.185-190 / 2016
  • This paper deals with analyzing the characteristics of a mobile robot's motion by automatically detecting markers on the robot with a camera. Analyzing motion behavior as a function of parameters is important in developing control algorithms for robot operation and autonomous navigation. For this purpose, we use four chessboard patterns on the robot, with their locations adjusted to lie on a single plane. A homography is used to compute the actual amount of movement of the robot. The presented method is tested on a P3-AT robot and gives reliable results.
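Computing movement from coplanar markers, as this abstract describes, rests on mapping image points through a plane-to-plane homography. A minimal sketch of that mapping (the matrix below is a hypothetical example, not calibration data from the paper):

```python
import numpy as np

def apply_homography(H, pts):
    """Map Nx2 image points through a 3x3 homography, with the
    perspective divide that makes the result Euclidean again."""
    pts = np.asarray(pts, dtype=float)
    homog = np.hstack([pts, np.ones((len(pts), 1))])  # homogeneous coords
    mapped = homog @ H.T
    return mapped[:, :2] / mapped[:, 2:3]

# A pure-translation homography shifts plane points by (2, 3).
H = np.array([[1.0, 0.0, 2.0],
              [0.0, 1.0, 3.0],
              [0.0, 0.0, 1.0]])
print(apply_homography(H, [[0.0, 0.0], [1.0, 1.0]]))
```

In practice the matrix would come from a calibration step (e.g. matching chessboard corners to their known positions on the ground plane).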

Performing Missions of a Minicar Using a Single Camera (단안 카메라를 이용한 소형 자동차의 임무 수행)

  • Kim, Jin-Woo;Ha, Jong-Eun
    • The Journal of the Korea institute of electronic communication sciences / v.12 no.1 / pp.123-128 / 2017
  • This paper deals with performing missions through autonomous navigation using a camera and other sensors. Extracting the pose of the car is necessary to navigate safely within the given road, and a homography is used to find it. The color image is converted into a gray image, and thresholding and edge detection are used to find control points. Two control points are converted into world coordinates using the homography to find the angle and position of the car. Color is used to detect the traffic signal. Experiments confirmed that the given tasks were performed well.
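Once two control points are in world coordinates, the car's position and heading reduce to a midpoint and an `atan2`. A minimal sketch (the function name and point ordering are hypothetical, not from the paper):

```python
import math

def pose_from_control_points(front, rear):
    """Car position (midpoint of the two world-frame points) and
    heading angle in radians, measured from the +x axis."""
    x = (front[0] + rear[0]) / 2.0
    y = (front[1] + rear[1]) / 2.0
    heading = math.atan2(front[1] - rear[1], front[0] - rear[0])
    return (x, y), heading

# Front point straight ahead of the rear point along +x: heading is 0 rad.
position, heading = pose_from_control_points((2.0, 1.0), (0.0, 1.0))
print(position, heading)  # -> (1.0, 1.0) 0.0
```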

A Study of the Behavior of Liquid Phase Spray Considering Critical Condition of the Fuel (연료의 임계조건을 고려한 디젤 액상분무거동에 관한 연구)

  • Park, Jong-Sang;Kim, Si-Pom;Chung, Sung-Sik;Ha, Jong-Yul;Yeom, Jeong-Kuk
    • Transactions of the Korean Society of Mechanical Engineers B / v.31 no.5 / pp.467-472 / 2007
  • In this study, the penetration distance of liquid-phase fuel (i.e., the liquid-phase length) was investigated in an evaporative field. An exciplex fluorescence method was applied to the evaporating fuel spray to measure and investigate both the liquid and the vapor phase of the injected spray. For accurate investigation, images of the liquid- and vapor-phase regions were recorded using a 35 mm still camera and a CCD camera, respectively. Liquid fuel was injected from a single-hole nozzle (l/d = 1.0 mm/0.2 mm) into a constant-volume chamber under high pressure and temperature in order to visualize the spray phenomena. The experimental results indicate that the liquid-phase length decreased down to a certain constant value with increasing ambient gas density and temperature. The constant value, about 40 mm in this study, is reached when the ambient density and temperature exceed the critical condition of the fuel used.

A Study on the Visual Odometer using Ground Feature Point (지면 특징점을 이용한 영상 주행기록계에 관한 연구)

  • Lee, Yoon-Sub;Noh, Gyung-Gon;Kim, Jin-Geol
    • Journal of the Korean Society for Precision Engineering / v.28 no.3 / pp.330-338 / 2011
  • Odometry is a critical factor in estimating the location of a robot. In a wheeled mobile robot, odometry can be performed using information from the encoders. However, the location information from the encoders is inaccurate because of errors caused by wheel misalignment or slip. In general, a visual odometer has been used to compensate for these kinematic errors. When visual odometry is applied to a given robot system, a kinematic analysis is required to compensate for the errors, which means that conventional visual odometry cannot be easily carried over to other types of robot systems. In this paper, a novel visual odometer, which employs only a single camera facing the ground, is proposed. The camera is mounted at the center of the bottom of the mobile robot. Feature points of the ground image are extracted using a median filter and a color contrast filter. The linear and angular vectors of the mobile robot are then calculated by matching feature points, and visual odometry is performed using these vectors. The proposed odometer is verified through driving tests using both the encoders and the new visual odometer.
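Recovering linear and angular motion from matched ground feature points, as described above, can be done with a least-squares 2D rigid fit. This is the standard Kabsch/Procrustes estimate, sketched here as a generic illustration rather than the paper's exact formulation:

```python
import numpy as np

def rigid_motion_2d(prev_pts, curr_pts):
    """Least-squares rotation angle and translation mapping matched
    previous-frame points onto current-frame points (2D Kabsch fit)."""
    P = np.asarray(prev_pts, float)
    Q = np.asarray(curr_pts, float)
    pc, qc = P.mean(axis=0), Q.mean(axis=0)
    H = (P - pc).T @ (Q - qc)          # cross-covariance of centered sets
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:           # guard against a reflection solution
        Vt[-1] *= -1
        R = Vt.T @ U.T
    theta = np.arctan2(R[1, 0], R[0, 0])
    t = qc - R @ pc
    return theta, t

# Points rotated by 30 degrees and shifted by (1, 2) between frames.
ang = np.pi / 6
R0 = np.array([[np.cos(ang), -np.sin(ang)], [np.sin(ang), np.cos(ang)]])
prev = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
curr = prev @ R0.T + np.array([1.0, 2.0])
theta, t = rigid_motion_2d(prev, curr)
print(theta, t)
```

Integrating these per-frame angles and translations over time yields the odometry trajectory.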

A Study on the Metal Transfer and Spatter Generation in High Current $CO_2$ Welding (고전류 $CO_2$용접에서의 금속이행 및 스패터 발생 현상에 관한 연구)

  • 김남훈;유회수;김희진;고진현
    • Journal of Welding and Joining / v.21 no.3 / pp.51-57 / 2003
  • Metal transfer in $CO_2$ welding shows a transition of the transfer mode from short-circuiting to repelled transfer with increasing welding current. While the short-circuiting mode in $CO_2$ welding has been studied very extensively in relation to droplet formation and spatter generation, repelled transfer has been little understood. In this study, high-current $CO_2$ welding was performed with bead-on-plate welds, using a waveform analyzer and a high-speed camera. The high-speed camera images were synchronized with the waveform so that the moment of spatter generation during drop detachment could be identified. As a result, it was found that the welding arc changes its location either once or three times, producing single or double pulse signals in the voltage waveform. Whenever the arc moved, a new arc developed in an explosive way, causing spatter generation. Especially severe spattering took place when the waveform showed a double-peak pattern. Based on these results, new waveform control techniques could be suggested for suppressing spatter generation in high-current $CO_2$ welding.

Experimental Verification of Multi-Sensor Geolocation Algorithm using Sequential Kalman Filter (순차적 칼만 필터를 적용한 다중센서 위치추정 알고리즘 실험적 검증)

  • Lee, Seongheon;Kim, Youngjoo;Bang, Hyochoong
    • Journal of Institute of Control, Robotics and Systems / v.21 no.1 / pp.7-13 / 2015
  • Unmanned air vehicles (UAVs) are becoming popular not only for private uses such as aerial photography but also for military uses such as surveillance, reconnaissance, and supply missions. For a UAV to successfully achieve these kinds of missions, geolocation (localization) must be employed to track a target of interest or to fly along a reference. In this research, we adopted a multi-sensor fusion (MSF) algorithm to increase the accuracy of geolocation and verified the algorithm using two multicopter UAVs. One UAV is equipped with an optical camera, and the other is equipped with an optical camera and a laser range finder. Throughout the experiment, we obtained measurements of a fixed ground target and estimated the target position through a series of coordinate transformations and a sequential Kalman filter. The results showed that the MSF estimates the target location better than a single sensor does. Moreover, the experimental results imply that the multi-sensor geolocation algorithm allows further improvements in localization accuracy and is feasible for more complicated applications such as moving-target tracking and multiple-target tracking.
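A sequential Kalman filter of the kind this abstract mentions applies each sensor's measurement to the same state one after another, shrinking the uncertainty at every step. A one-dimensional sketch (all noise values illustrative, not from the experiment):

```python
def kf_update(x, P, z, R):
    """One scalar Kalman measurement update with identity measurement model.

    x, P : prior state estimate and its variance
    z, R : measurement and its noise variance
    """
    K = P / (P + R)                    # Kalman gain
    return x + K * (z - x), (1.0 - K) * P

# Prior target estimate, then camera and laser fixes fused in sequence.
x, P = 0.0, 4.0
x, P = kf_update(x, P, 1.0, 1.0)       # camera fix (noisier)
x, P = kf_update(x, P, 1.2, 0.25)      # laser range fix (more precise)
print(x, P)
```

Because the updates commute in the linear-Gaussian case, processing the sensors sequentially gives the same posterior as stacking them into one batch update.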

IMPLEMENTATION OF WHOLE SHAPE MEASUREMENT SYSTEM USING A CYLINDRICAL MIRROR

  • Uranishi, Yuki;Manabe, Yoshitsugu;Sasaki, Hiroshi;Chihara, Kunihiro
    • Proceedings of the Korean Society of Broadcast Engineers Conference / 2009.01a / pp.601-605 / 2009
  • We have proposed a measurement system for easily measuring the whole shape of an object. The proposed system consists of a camera and a cylinder whose inside is coated with a mirror layer. A target object is placed inside the cylinder, and an image is captured by the camera from directly above. The captured image includes sets of points observed from multiple viewpoints: one is observed directly, and the others are observed via the mirror. Therefore, the whole shape of the object can be measured using stereo vision in a single shot. This paper shows that a prototype of the proposed system was implemented and an actual object was measured with it. A method based on pattern matching using the sum of squared differences (SSD) and a method based on dynamic programming (DP) are employed to identify sets of corresponding points in the warped captured images.
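The SSD correspondence search the abstract mentions can be sketched as a brute-force block match along an image strip; this is a generic illustration of the scoring, not the prototype's code:

```python
import numpy as np

def ssd_match(patch, strip):
    """Return the column in `strip` where a patch-sized window has the
    lowest sum-of-squared-differences score against `patch`."""
    h, w = patch.shape
    best_ssd, best_col = float("inf"), 0
    for c in range(strip.shape[1] - w + 1):
        ssd = float(np.sum((strip[:h, c:c + w] - patch) ** 2))
        if ssd < best_ssd:
            best_ssd, best_col = ssd, c
    return best_col

# A bright 4x3 block embedded at column 5 is recovered exactly.
strip = np.zeros((4, 10))
strip[:, 5:8] = 7.0
patch = strip[:, 5:8].copy()
print(ssd_match(patch, strip))  # -> 5
```

In the warped mirror images the search path is a curve rather than a horizontal strip, which is where the DP-based matching the authors describe becomes useful.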
