• Title/Summary/Keyword: Vision sensor


The Design of Controller for Unlimited Track Mobile Robot

  • Park, Han-Soo;Jeong, Heon;Park, Sei-Seung
    • Institute of Control, Robotics and Systems: Conference Proceedings
    • /
    • 2001.10a
    • /
    • pp.41.6-41
    • /
    • 2001
  • As autonomous mobile robots become more widely used in industry, the importance of navigation systems is rising. The primary method of locomotion is with wheels, which causes many problems when controlling tracked mobile robots. In this paper, we discuss the navigation control of tracked mobile robots with multiple sensors. The multiple sensors are composed of ultrasonic sensors and vision sensors. The vision sensors gauge distance using a laser and create visual images to estimate the robot's position. The 80196 microcontroller is used at close range and the vision board is used at long range. Data is managed in the main PC, and management is distributed to every sensor. The controller employs fuzzy logic. (A minimal sketch of this range-based sensor arbitration follows this entry.)

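As a rough illustration of the arbitration the abstract describes (ultrasonic sensing handled at close range, the laser/vision estimate used at long range), here is a minimal sketch. The 1.0 m switch-over threshold and the sample readings are assumptions for illustration, not values from the paper.

```python
def fuse_range(ultrasonic_m: float, vision_m: float, close_limit_m: float = 1.0) -> float:
    """Pick a distance estimate: trust the ultrasonic reading up close,
    the laser/vision estimate farther away (threshold is an assumption)."""
    return ultrasonic_m if ultrasonic_m < close_limit_m else vision_m

print(fuse_range(0.4, 0.5))   # nearby obstacle  -> ultrasonic reading: 0.4
print(fuse_range(3.2, 2.8))   # distant obstacle -> vision estimate:   2.8
```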

A study on vision algorithm for bin-picking using labeling method (Labeling 방법을 이용한 Bin-Picking용 시각 기능 연구)

  • Choi, J.W.;Park, K.T.;Chung, G.J.
    • Journal of the Korean Society for Precision Engineering
    • /
    • v.10 no.4
    • /
    • pp.248-254
    • /
    • 1993
  • This paper proposes a labeling method for solving the bin-picking problem in robot vision. The processing steps are image thresholding, region labeling, and moment computation. To distinguish a target object from the other objects, a modified labeling method is used. The moment concept is applied to determine the position and orientation of the target object. Finally, some experimental results are presented and compared with the results of the conventional shrinking algorithm and the collision-fronts algorithm. The proposed labeling method reduces processing time. (A minimal sketch of the thresholding-labeling-moment pipeline follows this entry.)

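A minimal sketch of the pipeline described above (thresholding, region labeling, moment computation) is given below. It is not the authors' implementation: the flood-fill labeling, the threshold of 128, and the synthetic image are illustrative choices.

```python
import numpy as np
from collections import deque

def label_regions(binary: np.ndarray) -> np.ndarray:
    """4-connected region labeling by flood fill (label 0 = background)."""
    labels = np.zeros(binary.shape, dtype=int)
    current = 0
    for seed in zip(*np.nonzero(binary)):
        if labels[seed]:
            continue
        current += 1
        labels[seed] = current
        queue = deque([seed])
        while queue:
            r, c = queue.popleft()
            for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
                if (0 <= nr < binary.shape[0] and 0 <= nc < binary.shape[1]
                        and binary[nr, nc] and not labels[nr, nc]):
                    labels[nr, nc] = current
                    queue.append((nr, nc))
    return labels

def pose_from_moments(mask: np.ndarray):
    """Centroid and orientation of one labeled region from image moments."""
    rows, cols = np.nonzero(mask)
    cy, cx = rows.mean(), cols.mean()
    mu20 = ((cols - cx) ** 2).sum()
    mu02 = ((rows - cy) ** 2).sum()
    mu11 = ((cols - cx) * (rows - cy)).sum()
    theta = 0.5 * np.arctan2(2.0 * mu11, mu20 - mu02)
    return (cx, cy), theta

image = np.zeros((8, 8))
image[2:6, 1:4] = 200                 # synthetic bright object
binary = image > 128                  # thresholding (threshold value is an assumption)
labels = label_regions(binary)        # region labeling
target = labels == 1                  # pick the first region as the target object
print(pose_from_moments(target))      # position (centroid) and orientation
```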

Autonomous Calibration of a 2D Laser Displacement Sensor by Matching a Single Point on a Flat Structure (평면 구조물의 단일점 일치를 이용한 2차원 레이저 거리감지센서의 자동 캘리브레이션)

  • Joung, Ji Hoon;Kang, Tae-Sun;Shin, Hyeon-Ho;Kim, SooJong
    • Journal of Institute of Control, Robotics and Systems
    • /
    • v.20 no.2
    • /
    • pp.218-222
    • /
    • 2014
  • In this paper, we introduce an autonomous calibration method for a 2D laser displacement sensor (e.g. a laser vision sensor or a laser range finder) by matching a single point on a flat structure. Many arc welding robots carry a 2D laser displacement sensor to expand their application range by recognizing their environment (e.g. base metal and seam). In such systems, the sensing data must be transformed into the robot's coordinates, and the geometric relation (i.e. rotation and translation) between the robot coordinates and the sensor coordinates must be known for this transformation. Calibration is the process of inferring this geometric relation between the sensor and the robot. Generally, matching at least three points is required to infer the geometric relation. We introduce a novel method that calibrates with only a single point match, using a specific flat structure (a circular hole) that makes the geometric relation recoverable from one matched point. The rotation component of the calibration result is fixed as a constant by moving the robot to a specific pose, so that only a single point is needed. The flat structure can be installed easily at a manufacturing site because it has almost no volume (it is nearly a 2D structure). The calibration process is fully autonomous and needs no manual operation. The robot carrying the sensor moves to the specific pose by sensing features of the circular hole, such as the length and center position of a chord. We show the precision of the proposed method through repeated experiments in various situations. Furthermore, we applied the result of the proposed method to sensor-based seam tracking with a robot and report the deviation of the robot's TCP (Tool Center Point) trajectory. This experiment confirms the precision of the proposed method. (A minimal sketch of the single-point translation estimate follows this entry.)
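To make the single-point idea concrete: once the rotation between the sensor frame and the robot frame is held constant (by driving the robot to the specific pose), one matched point fixes the translation. The rotation matrix and point coordinates below are made-up values for illustration only.

```python
import numpy as np

# Rotation between sensor and robot frames, held constant by the specific pose
# (illustrative value, not from the paper).
R = np.array([[0.0, -1.0, 0.0],
              [1.0,  0.0, 0.0],
              [0.0,  0.0, 1.0]])

# The same point on the flat structure, expressed in both frames (hypothetical).
p_sensor = np.array([0.10, 0.02, 0.00])
p_robot  = np.array([0.50, 0.30, 0.20])

# From p_robot = R @ p_sensor + t, a single correspondence determines t.
t = p_robot - R @ p_sensor
print("sensor-to-robot translation:", t)
```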

A New Robotic 3D Inspection System of Automotive Screw Hole

  • Baeg, Moon-Hong;Baeg, Seung-Ho;Moon, Chan-Woo;Jeong, Gu-Min;Ahn, Hyun-Sik;Kim, Do-Hyun
    • International Journal of Control, Automation, and Systems
    • /
    • v.6 no.5
    • /
    • pp.740-745
    • /
    • 2008
  • This paper presents a new non-contact 3D robotic inspection system that measures the precise positions of screw and punch holes on a car body frame. The newly developed sensor consists of a CCD camera, two laser line generators, and an LED light. This lightweight sensor can be mounted on an industrial robot hand. An inspection algorithm and system that work with this sensor are presented. In performance evaluation tests, the measurement accuracy of the inspection system was about 200 μm, which is sufficient accuracy for the automotive industry.

A Study on the Sensor Calibration of Motion Capture System using PSD Sensor to Improve the Accuracy (PSD 센서를 이용한 모션캡쳐센서의 정밀도 향상을 위한 보정에 관한 연구)

  • Choi, Hun-Il;Jo, Yong-Jun;Ryu, Young-Kee
    • Proceedings of the KIEE Conference
    • /
    • 2004.11c
    • /
    • pp.583-585
    • /
    • 2004
  • In this paper we deal with a calibration method for a low-cost motion capture system using a PSD (Position Sensitive Detector) optical sensor. The PSD is used to measure the incident direction of the light from an LED marker: the ratio of output currents on the electrodes of the PSD is proportional to the position at which the light, focused by a lens, strikes the sensor. To detect the direction of the light, the current outputs are converted into digital voltage values by op-amp circuits, a peak detector, and an AD converter, and the incident position is computed from the digital values. Unfortunately, the non-linearity of the circuit leads to poor position accuracy. To overcome this problem, we compensated for the non-linearity using a least-squares fitting method. After the non-linearity of the circuit was compensated, the system showed improved position accuracy. (A minimal sketch of the least-squares compensation follows this entry.)

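The least-squares compensation of the circuit non-linearity can be sketched as a polynomial fit from raw PSD readings to known reference positions. The cubic order and the calibration samples below are assumptions, not the paper's data.

```python
import numpy as np

# Hypothetical calibration table: raw PSD readings vs. known marker positions (mm).
raw_reading = np.array([0.05, 0.18, 0.34, 0.52, 0.66, 0.81, 0.97])
true_pos_mm = np.array([0.0, 2.0, 4.0, 6.0, 8.0, 10.0, 12.0])

# Least-squares fit of a correction polynomial (cubic order is an assumption).
coeffs = np.polyfit(raw_reading, true_pos_mm, deg=3)

def compensate(reading: float) -> float:
    """Map a raw, non-linear PSD reading to a corrected position estimate (mm)."""
    return float(np.polyval(coeffs, reading))

print(compensate(0.40))   # corrected position for an intermediate raw reading
```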

Automatic Registration of Two Parts using Robot with Multiple 3D Sensor Systems

  • Ha, Jong-Eun
    • Journal of Electrical Engineering and Technology
    • /
    • v.10 no.4
    • /
    • pp.1830-1835
    • /
    • 2015
  • In this paper, we propose an algorithm for the automatic registration of two rigid parts using multiple 3D sensor systems on a robot. Four structured laser stripe systems, each consisting of a camera and a visible laser stripe projector, are used for the acquisition of 3D information. Detailed procedures are presented, including extrinsic calibration among the four 3D sensor systems and hand/eye calibration of the 3D sensing system on the robot arm. We find the best pose using a search-based pose estimation algorithm whose cost function reflects geometric constraints between the sensor systems and the target objects. The pose with the minimum gap and height difference is found by greedy search. Experimental results using a demo system show the robustness and feasibility of the proposed algorithm. (A minimal sketch of such a greedy pose search follows this entry.)
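A caricature of the greedy pose search is sketched below: starting from an initial guess, keep whichever neighbouring candidate lowers a cost built from gap and height difference, and shrink the step when no candidate improves. The one-parameter pose, the cost terms, and the edge heights are illustrative stand-ins, not the paper's formulation.

```python
import numpy as np

def cost(dz: float, edge_a: np.ndarray, edge_b: np.ndarray) -> float:
    """Toy cost: gap plus mean height difference after shifting edge_b by dz."""
    shifted = edge_b + dz
    gap = np.abs(edge_a - shifted).min()
    height_diff = abs(edge_a.mean() - shifted.mean())
    return gap + height_diff

def greedy_search(edge_a, edge_b, dz=0.0, step=1.0, min_step=1e-3):
    """Greedily refine one pose parameter, halving the step when no candidate improves."""
    best = cost(dz, edge_a, edge_b)
    while step > min_step:
        improved = False
        for cand in (dz - step, dz + step):
            c = cost(cand, edge_a, edge_b)
            if c < best:
                dz, best, improved = cand, c, True
        if not improved:
            step *= 0.5
    return dz, best

edge_a = np.array([10.0, 10.1, 9.9, 10.0])    # hypothetical scanned edge heights
edge_b = np.array([7.0, 7.2, 6.9, 7.1])
print(greedy_search(edge_a, edge_b))          # offset minimizing gap + height difference
```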

Overview of sensor fusion techniques for vehicle positioning (차량정밀측위를 위한 복합측위 기술 동향)

  • Park, Jin-Won;Choi, Kae-Won
    • The Journal of the Korea Institute of Electronic Communication Sciences
    • /
    • v.11 no.2
    • /
    • pp.139-144
    • /
    • 2016
  • This paper provides an overview of recent trends in sensor fusion technologies for vehicle positioning. GNSS by itself cannot satisfy the precision and reliability required for autonomous driving. We survey sensor fusion techniques that combine the outputs of GNSS with inertial navigation sensors such as an odometer and a gyroscope. Moreover, we review landmark-based positioning, which matches landmarks detected by a lidar or stereo vision against high-precision digital maps. (A minimal Kalman-filter fusion sketch follows this entry.)
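GNSS/inertial fusion of this kind is commonly implemented with a Kalman filter. The 1D sketch below (odometer-driven prediction, occasional GNSS position updates, made-up noise values) only illustrates the predict/correct structure, not any specific system from the survey.

```python
import numpy as np

# Minimal 1D Kalman filter: predict position from odometer speed,
# correct with noisy GNSS fixes. All noise values are illustrative.
dt, q_odom, r_gnss = 0.1, 0.05, 2.0
x, p = 0.0, 1.0                          # position estimate and its variance

rng = np.random.default_rng(0)
true_x = 0.0
for k in range(50):
    speed = 5.0                          # odometer-reported speed (m/s)
    true_x += speed * dt
    # Prediction step (dead reckoning from the odometer).
    x += speed * dt
    p += q_odom
    # Correction step with a GNSS position measurement every 10th cycle.
    if k % 10 == 0:
        z = true_x + rng.normal(0.0, np.sqrt(r_gnss))
        k_gain = p / (p + r_gnss)
        x += k_gain * (z - x)
        p *= (1.0 - k_gain)

print(f"true {true_x:.2f} m, fused estimate {x:.2f} m")
```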

Environmental Monitoring System for Base Station with Sensor Node Networks

  • Hur, Chung-Inn;Kim, Hwan-Yong
    • Journal of Information and Communication Convergence Engineering
    • /
    • v.7 no.3
    • /
    • pp.258-262
    • /
    • 2009
  • A practical application of an environmental monitoring system for a base station, based on a wireless sensor node network built around the STR711FR2 embedded microprocessor, is presented in this paper. The adaptable and classifiable wireless sensor node network is used to achieve data acquisition and multi-hop wireless communication of parameters of the monitored base station environment, including repeaters. The structure of the system is proposed, the hardware architecture is designed, and the system operating procedures are described. In field tests, the designed hardware platform operated at a 50 kbps bit rate and 5 MHz channel spacing at 2040 Hz. The wireless monitoring system can be managed and swiftly retrieved without on-site support at the base station.

Hand/Eye calibration of Robot arms with a 3D visual sensing system (3차원 시각 센서를 탑재한로봇의 Hand/Eye 캘리브레이션)

  • 김민영;노영준;조형석;김재훈
    • Institute of Control, Robotics and Systems: Conference Proceedings
    • /
    • 2000.10a
    • /
    • pp.76-76
    • /
    • 2000
  • The calibration of a robot system with a visual sensor consists of robot, hand-to-eye, and sensor calibration. This paper describes a new technique for computing the 3D position and orientation of a 3D sensor system relative to the end effector of a robot manipulator in an eye-on-hand configuration. When the 3D coordinates of the feature points at each robot movement and the relative robot motion between two robot movements are known, a homogeneous equation of the form AX = XB is derived. To solve for X uniquely, it is necessary to make two robot arm movements and form a system of two equations of the form A₁X = XB₁ and A₂X = XB₂. A closed-form solution to this system of equations is developed, and the constraints for solution existence are described in detail. Test results from a series of simulations show that this technique is simple, efficient, and accurate for hand/eye calibration. (A minimal sketch of solving AX = XB from two motions follows this entry.)

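To illustrate how two motions determine X in AX = XB: the rotation axes of the Aᵢ and Bᵢ are related by the unknown rotation, and the translation then follows from a stacked linear system. The sketch below uses this standard axis-alignment construction on synthetic transforms; it is an illustration, not necessarily the paper's exact closed form.

```python
import numpy as np

def rot_axis(R):
    """Unit rotation axis of R (assumes the rotation angle is not 0 or pi)."""
    v = np.array([R[2, 1] - R[1, 2], R[0, 2] - R[2, 0], R[1, 0] - R[0, 1]])
    return v / np.linalg.norm(v)

def frame(a1, a2):
    """Orthonormal frame built from two non-parallel unit vectors."""
    u2 = np.cross(a1, a2); u2 /= np.linalg.norm(u2)
    return np.column_stack([a1, u2, np.cross(a1, u2)])

def solve_ax_xb(A1, B1, A2, B2):
    """Hand/eye transform X from two motion pairs satisfying Ai X = X Bi."""
    # Rotation: axes of Ai equal R_x applied to the axes of Bi.
    Rx = frame(rot_axis(A1[:3, :3]), rot_axis(A2[:3, :3])) @ \
         frame(rot_axis(B1[:3, :3]), rot_axis(B2[:3, :3])).T
    # Translation: (R_Ai - I) t_x = R_x t_Bi - t_Ai, stacked for i = 1, 2.
    M = np.vstack([A1[:3, :3] - np.eye(3), A2[:3, :3] - np.eye(3)])
    b = np.hstack([Rx @ B1[:3, 3] - A1[:3, 3], Rx @ B2[:3, 3] - A2[:3, 3]])
    tx = np.linalg.lstsq(M, b, rcond=None)[0]
    X = np.eye(4); X[:3, :3], X[:3, 3] = Rx, tx
    return X

def rot(axis, angle):
    """Rotation matrix about a unit axis (Rodrigues' formula)."""
    a = np.asarray(axis, float); a /= np.linalg.norm(a)
    K = np.array([[0, -a[2], a[1]], [a[2], 0, -a[0]], [-a[1], a[0], 0]])
    return np.eye(3) + np.sin(angle) * K + (1 - np.cos(angle)) * K @ K

def se3(R, t):
    T = np.eye(4); T[:3, :3], T[:3, 3] = R, t
    return T

# Synthetic check: build a ground-truth X, two robot motions Ai, and Bi = X^-1 Ai X.
X_true = se3(rot([0, 0, 1], 0.3), [0.1, 0.0, 0.2])
A1 = se3(rot([1, 0, 0], 0.7), [0.2, 0.1, 0.0])
A2 = se3(rot([0, 1, 0], 0.5), [0.0, 0.3, 0.1])
B1 = np.linalg.inv(X_true) @ A1 @ X_true
B2 = np.linalg.inv(X_true) @ A2 @ X_true
print(np.allclose(solve_ax_xb(A1, B1, A2, B2), X_true))   # expected: True
```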

Fuzzy Neural Network Based Sensor Fusion and It's Application to Mobile Robot in Intelligent Robotic Space

  • Jin, Tae-Seok;Lee, Min-Jung;Hashimoto, Hideki
    • International Journal of Fuzzy Logic and Intelligent Systems
    • /
    • v.6 no.4
    • /
    • pp.293-298
    • /
    • 2006
  • In this paper, a sensor fusion based robot navigation method for the autonomous control of a miniature human interaction robot is presented. The navigation method blends the optimality of the Fuzzy Neural Network (FNN) based control algorithm with the knowledge representation and learning capabilities of the networked Intelligent Robotic Space (IRS). States of the robot and the IR space, for example the distance between the mobile robot and obstacles and the velocity of the mobile robot, are used as the inputs of the fuzzy logic controller. The navigation strategy is based on a combination of fuzzy rules tuned for both goal approach and obstacle avoidance. To identify the environment, a sensor fusion technique is introduced in which the sensory data of ultrasonic sensors and a vision sensor are fused into the identification process. Preliminary experiments and results are shown to demonstrate the merit of the introduced navigation control algorithm. (A minimal fuzzy-rule combination sketch follows this entry.)
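The blend of goal-approach and obstacle-avoidance rules can be sketched with simple triangular membership functions and a weighted, centroid-style defuzzification. The membership shapes, the fixed avoidance turn, and the sample inputs are illustrative assumptions, not the controller from the paper.

```python
def tri(x, a, b, c):
    """Triangular membership function peaking at b on the support [a, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def steer(obstacle_dist_m: float, goal_angle_rad: float) -> float:
    """Weighted blend of two fuzzy rules:
       IF obstacle is NEAR THEN turn away (avoidance);
       IF obstacle is FAR  THEN turn toward the goal (goal approach)."""
    near = tri(obstacle_dist_m, 0.0, 0.2, 1.5)   # membership shapes are assumptions
    far = tri(obstacle_dist_m, 0.5, 2.5, 4.5)
    avoid_cmd = 0.8                              # fixed avoidance turn (rad)
    total = near + far
    return (near * avoid_cmd + far * goal_angle_rad) / total if total else goal_angle_rad

print(steer(0.4, -0.3))   # close obstacle -> avoidance dominates
print(steer(2.5, -0.3))   # clear path     -> goal approach dominates
```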