• Title/Summary/Keyword: 2d laser sensor


Solder Paste Inspection of PCB using Laser Sensor (Laser 거리센서를 이용한 PCB에서의 납 도포상태검사)

  • O, Seung-Yong;Choe, Gyeong-Jin;Lee, Yong-Hyeon;Park, Jong-Guk
    • Proceedings of the KIEE Conference / 2003.11b / pp.291-294 / 2003
  • In this paper, 2D and 3D inspection algorithms for solder paste printed on a PCB are introduced. The aim of the inspection is to detect defects such as rich solder, poor solder, and missing solder. A laser distance sensor is used for the inspection. For 2D inspection, a laser image created by normalizing the laser data to the range 0-255 is used; the reference image is generated from the Gerber file, and image-processing algorithms are applied to it. By adding the thickness of the metal stencil to the laser image, the solder volume can be calculated, and 3D inspection is carried out.
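
As a rough sketch of the 2D step described in this abstract, the code below normalizes raw laser height data to an 8-bit image and integrates heights over a pad mask to approximate solder volume. The array names, the pad mask, and the use of the stencil thickness as a nominal reference are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def laser_to_image(heights: np.ndarray) -> np.ndarray:
    """Normalize raw laser height data (mm) to an 8-bit grayscale image."""
    h_min, h_max = heights.min(), heights.max()
    scaled = (heights - h_min) / (h_max - h_min + 1e-9)
    return (scaled * 255).astype(np.uint8)

def solder_volume(heights: np.ndarray, pad_mask: np.ndarray,
                  pixel_area_mm2: float, stencil_thickness_mm: float) -> float:
    """Approximate solder volume (mm^3) over one pad region.

    Heights are taken relative to the board surface; the stencil thickness
    is used here only as a nominal print-height reference (assumption).
    """
    deposited = np.clip(heights, 0.0, None) * pad_mask
    measured = deposited.sum() * pixel_area_mm2
    nominal = pad_mask.sum() * pixel_area_mm2 * stencil_thickness_mm
    print(f"measured {measured:.3f} mm^3 vs nominal {nominal:.3f} mm^3")
    return measured
```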


Linearly Polarized 1-kW 20/400-㎛ Yb-doped Fiber Laser with 10-GHz Linewidth (선편광된 10 GHz 선폭의 1 kW급 20/400-㎛ 이터븀 첨가 광섬유 레이저)

  • Jung, Yeji;Jung, Minwan;Lee, Kangin;Kim, Taewoo;Kim, Jae-Ihn;Lee, Yongsoo;Cho, Joonyong
    • Korean Journal of Optics and Photonics / v.32 no.3 / pp.120-125 / 2021
  • We have developed a linearly polarized high-power Yb-doped fiber laser in the master oscillator power amplifier (MOPA) scheme for efficient spectral beam combining. We modulated the phase of the seed laser with a pseudo-random binary sequence (PRBS), with the bit length optimized to suppress stimulated Brillouin scattering (SBS), and subsequently amplified the seed power in a 3-stage amplifier system. We suppressed mode instability (MI) by coiling the polarization-maintaining (PM) Yb-doped fiber, with core and cladding diameters of 20 ㎛ and 400 ㎛ respectively, to a diameter of 9-12 cm. Finally, we obtained an output power of 1.004 kW with a slope efficiency of 83.7% in the main amplification stage. The beam quality factor M2 and the polarization extinction ratio (PER) were measured to be 1.12 and 21.5 dB, respectively. Furthermore, the peak-intensity difference between the Rayleigh signal and the SBS signal in the backward spectra was observed to be 2.36 dB, indicating that SBS is successfully suppressed. In addition, MI is not expected to occur, since there is no decrease in slope efficiency and the beam quality is maintained at each amplified output power.
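
The PRBS phase modulation mentioned above is conventionally generated with a linear-feedback shift register. The snippet below produces a generic PRBS-7 bit stream as an illustration only; the paper's actual bit length and modulator drive are not specified here.

```python
def prbs7(n_bits: int, seed: int = 0x7F) -> list[int]:
    """Generate a PRBS-7 bit stream (polynomial x^7 + x^6 + 1) with an LFSR."""
    state = seed & 0x7F
    bits = []
    for _ in range(n_bits):
        feedback = ((state >> 6) ^ (state >> 5)) & 1  # taps: x^7 and x^6
        bits.append(feedback)
        state = ((state << 1) | feedback) & 0x7F
    return bits

# A PRBS-7 sequence repeats every 2**7 - 1 = 127 bits.
sequence = prbs7(127)
```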

Road marking classification method based on intensity of 2D Laser Scanner (신호세기를 이용한 2차원 레이저 스캐너 기반 노면표시 분류 기법)

  • Park, Seong-Hyeon;Choi, Jeong-hee;Park, Yong-Wan
    • IEMEK Journal of Embedded Systems and Applications / v.11 no.5 / pp.313-323 / 2016
  • With the development of autonomous vehicles, there has been active research on advanced driver-assistance systems that detect road markings using vision sensors and 3D laser scanners. However, a vision sensor has the weaknesses that detection is difficult under severe illumination variation, such as at night, inside a tunnel, or in a shaded area, and both the vision sensor and the 3D laser scanner produce a large amount of data, so the processing time is long. Accordingly, this paper proposes a road-marking detection and classification method using a single 2D laser scanner. The method detects and classifies road markings based on accumulated distance and intensity data acquired with the 2D laser scanner. Experiments with a real autonomous vehicle in a real environment showed that the computation time decreased in comparison with a 3D laser scanner-based method, demonstrating the feasibility of road-marking classification using a single 2D laser scanner.
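
A minimal sketch of the underlying idea, accumulating 2D scans and separating high-reflectivity road-marking returns with an intensity threshold, is shown below. The scan layout and the threshold value are assumptions, not the authors' classifier.

```python
import numpy as np

def detect_markings(scans, intensity_threshold=180):
    """Accumulate (x, y, intensity) points from successive 2D scans and
    keep the high-intensity returns that likely belong to road markings."""
    points = np.vstack(scans)                 # each scan: N x 3 array
    marking = points[points[:, 2] > intensity_threshold]
    road = points[points[:, 2] <= intensity_threshold]
    return marking, road
```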

CALOS : Camera And Laser for Odometry Sensing (CALOS : 주행계 추정을 위한 카메라와 레이저 융합)

  • Bok, Yun-Su;Hwang, Young-Bae;Kweon, In-So
    • The Journal of Korea Robotics Society / v.1 no.2 / pp.180-187 / 2006
  • This paper presents a new sensor system, CALOS, for motion estimation and 3D reconstruction. A 2D laser sensor provides accurate depth information on a single plane, not the whole 3D structure. In contrast, CCD cameras provide a projected image of the whole 3D scene, but not its depth. To overcome these limitations, we combine the two types of sensors, the laser sensor and the CCD cameras, and develop a motion estimation scheme appropriate for this sensor system. In the proposed scheme, the motion between two frames is estimated using three points among the scan data and their corresponding image points, and is then refined by non-linear optimization. We validate the accuracy of the proposed method by 3D reconstruction using real images. The results show that the proposed system can be a practical solution for motion estimation as well as for 3D reconstruction.
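
OpenCV's solvePnP can serve as a generic stand-in for this kind of 3D-2D motion estimation, although it requires at least four correspondences rather than the paper's minimal three-point formulation; the sketch below is an illustration under that substitution, not the CALOS algorithm itself.

```python
import cv2
import numpy as np

def estimate_motion(object_points, image_points, K, dist):
    """Estimate camera pose from 3D-2D correspondences.

    object_points: 3D points taken from the laser scan (world/laser frame)
    image_points:  their corresponding pixel locations in the CCD image
    K, dist:       camera intrinsics and distortion coefficients
    """
    ok, rvec, tvec = cv2.solvePnP(
        np.asarray(object_points, dtype=np.float64),
        np.asarray(image_points, dtype=np.float64),
        K, dist, flags=cv2.SOLVEPNP_ITERATIVE)  # iterative = nonlinear refinement
    R, _ = cv2.Rodrigues(rvec)
    return ok, R, tvec
```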


3D Spreader Position Information by the CCD Cameras and the Laser Distance Measuring Unit for ATC

  • Bae, Dong-Suk;Lee, Jung-Jae;Lee, Bong-Ki;Lee, Jang-Myung
    • Proceedings of the Institute of Control, Robotics and Systems (ICROS) Conference / 2004.08a / pp.1679-1684 / 2004
  • This paper introduces a novel approach that provides three-dimensional information on the movement of a spreader using two CCD cameras and a laser distance sensor, enabling an ALS (Automatic Landing System) for yard cranes at a harbor. Until now, 2D laser scanners or laser distance-measuring units have been used as corner detectors for the geometric matching between the spreader and a container; they provide only 2D information, which is not sufficient for the accurate and fast ALS required today. In addition to this performance deficiency, the price of such a system is too high for it to be adopted widely for the ALS. To overcome these drawbacks, a method to acquire three-dimensional information on the movement of a spreader, including skew and sway angles, is proposed using two CCD cameras and a laser distance sensor. To show the efficiency of the proposed algorithm, real experiments demonstrate the improvement in distance-measurement accuracy obtained by fusing the sensory information of the CCD cameras and the laser distance sensor.
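
As a toy illustration of this kind of camera/laser fusion, the sketch below back-projects an image pixel into a viewing ray and scales it by the laser range, assuming the range is measured along that same ray; the geometry and variable names are hypothetical, not the paper's method.

```python
import numpy as np

def corner_position(pixel, K, laser_range_m):
    """Combine a camera bearing (from a pixel and intrinsics K) with a laser
    range measurement to estimate a 3D corner position in the camera frame.

    Assumes the laser range is measured along the same viewing ray.
    """
    u, v = pixel
    ray = np.linalg.inv(K) @ np.array([u, v, 1.0])  # back-project pixel to a ray
    ray /= np.linalg.norm(ray)
    return ray * laser_range_m                       # scale the unit ray by range
```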


Autonomous Calibration of a 2D Laser Displacement Sensor by Matching a Single Point on a Flat Structure (평면 구조물의 단일점 일치를 이용한 2차원 레이저 거리감지센서의 자동 캘리브레이션)

  • Joung, Ji Hoon;Kang, Tae-Sun;Shin, Hyeon-Ho;Kim, SooJong
    • Journal of Institute of Control, Robotics and Systems / v.20 no.2 / pp.218-222 / 2014
  • In this paper, we introduce an autonomous calibration method for a 2D laser displacement sensor (e.g. a laser vision sensor or laser range finder) that matches only a single point on a flat structure. Many arc-welding robots carry a 2D laser displacement sensor to expand their applications by recognizing their environment (e.g. base metal and seam). In such systems, the sensing data must be transformed into the robot's coordinates, and the geometric relation (i.e. rotation and translation) between the robot coordinates and the sensor coordinates must be known for this transformation. Calibration means inferring this geometric relation between the sensor and the robot. Generally, matching more than three points is required to infer the relation. We introduce a method that calibrates using only one matched point together with a specific flat structure (a circular hole), which makes it possible to find the geometric relation from a single point. By moving the robot to a specific pose, the rotation component of the calibration result is fixed as a constant, so a single point suffices to recover the translation. The flat structure can be installed easily at a manufacturing site, because it has almost no volume (i.e. it is essentially a 2D structure). The calibration process is fully autonomous and needs no manual operation. The robot carrying the sensor moves to the specific pose by sensing features of the circular hole, such as the length of a chord and the center position of the chord. We show the precision of the proposed method through repeated experiments in various situations. Furthermore, we applied the result of the proposed method to sensor-based seam tracking with a robot and report the difference in the robot's TCP (Tool Center Point) trajectory. This experiment shows that the proposed method ensures precision.
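
The single-point idea can be summarized in one line: once the rotation between the sensor and robot frames is fixed by the chosen pose, one matched point determines the translation. A minimal sketch, assuming the rotation R is already known:

```python
import numpy as np

def translation_from_single_point(R, p_sensor, p_robot):
    """Given a known rotation R (sensor -> robot) and one point observed in
    both frames, recover the translation of the sensor in the robot frame:
        p_robot = R @ p_sensor + t   =>   t = p_robot - R @ p_sensor
    """
    return np.asarray(p_robot) - R @ np.asarray(p_sensor)
```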

Map Building Based on Sensor Fusion for Autonomous Vehicle (자율주행을 위한 센서 데이터 융합 기반의 맵 생성)

  • Kang, Minsung;Hur, Soojung;Park, Ikhyun;Park, Yongwan
    • Transactions of the Korean Society of Automotive Engineers / v.22 no.6 / pp.14-22 / 2014
  • An autonomous vehicle requires technology for generating maps by recognizing the surrounding environment. The vehicle's environment can be recognized using distance information from a 2D laser scanner and color information from a camera, and such sensor information is used to generate 2D or 3D maps. A 2D map is used mostly for generating routes, because it contains information only about a single section. In contrast, a 3D map also contains height values, and therefore can be used not only for generating routes but also for finding the space accessible to the vehicle. Nevertheless, an autonomous vehicle using 3D maps has difficulty recognizing the environment in real time. Accordingly, this paper proposes a technique for generating 2D maps that guarantees real-time recognition. The proposed technique uses only the color information obtained by removing the height values from 3D maps generated by fusing 2D laser scanner and camera data.
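
A minimal sketch of the described projection, dropping the height value of a colored 3D point cloud and rasterizing the remaining (x, y, color) data into a 2D grid, is given below; the grid size and resolution are arbitrary assumptions.

```python
import numpy as np

def project_to_2d_grid(points_xyz, colors_rgb, resolution=0.1, size=500):
    """Drop the height value of a colored 3D point cloud and rasterize the
    remaining (x, y, color) information into a 2D grid map."""
    grid = np.zeros((size, size, 3), dtype=np.uint8)
    ij = np.floor(points_xyz[:, :2] / resolution).astype(int) + size // 2
    inside = np.all((ij >= 0) & (ij < size), axis=1)
    grid[ij[inside, 1], ij[inside, 0]] = colors_rgb[inside]
    return grid
```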

A Study on Intelligent Robot Bin-Picking System with CCD Camera and Laser Sensor (CCD카메라와 레이저 센서를 조합한 지능형 로봇 빈-피킹에 관한 연구)

  • Kim, Jin-Dae;Lee, Jeh-Won;Shin, Chan-Bai
    • Journal of the Korean Society for Precision Engineering / v.23 no.11 s.188 / pp.58-67 / 2006
  • Because of the variety of signal processing involved and the complicated mathematical analysis, 3D bin-picking with a non-contact sensor is not easy to accomplish. To resolve these difficulties, a reliable signal-processing algorithm and a good sensing device are required. In this research, a 3D laser scanner and a CCD camera are applied as the sensing devices. With these sensors we develop a two-step bin-picking method and a reliable algorithm for recognizing 3D bin objects. In the proposed bin-picking, the problem is first reduced to 2D initial recognition with the CCD camera, followed by 3D pose detection with the laser scanner. To obtain correct motion in the robot base frame, hand-eye calibration between the robot's end effector and the sensing device must also be carried out; this paper examines an auto-calibration technique for this step. A new thinning algorithm and a constrained Hough transform are also studied for robustness in real-environment use. The experimental results show robust bin-picking operation on non-aligned 3D hole objects.
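
For the 2D initial-recognition step, circular hole objects could, for instance, be located with a Hough circle transform. The sketch below uses OpenCV's HoughCircles as a generic stand-in for the paper's constrained Hough transform; the parameter values are placeholders.

```python
import cv2

def find_holes(gray_image):
    """Locate circular hole candidates in the CCD image (generic stand-in
    for the paper's constrained Hough transform)."""
    blurred = cv2.medianBlur(gray_image, 5)
    circles = cv2.HoughCircles(blurred, cv2.HOUGH_GRADIENT, dp=1.2,
                               minDist=30, param1=100, param2=40,
                               minRadius=10, maxRadius=80)
    return [] if circles is None else circles[0]  # each entry: (x, y, radius)
```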

Capturing Distance Parameters Using a Laser Sensor in a Stereoscopic 3D Camera Rig System

  • Chung, Wan-Young;Ilham, Julian;Kim, Jong-Jin
    • Journal of Sensor Science and Technology / v.22 no.6 / pp.387-392 / 2013
  • Camera rigs for shooting 3D video are classified as manual, motorized, or fully automatic. Even with an automatic camera rig, the process of stereoscopic 3D (S3D) video capture is very complex and time-consuming. One of the key time-consuming operations is capturing the distance parameters: the near distance, far distance, and convergence distance. Traditionally these distances are measured with a tape measure or by triangular indirect measurement, and both methods take a long time for every scene that is shot. In our study, a compact laser distance-sensing system with long-range sensitivity was developed. The system is small enough to be installed on top of a camera, and its measuring accuracy is within 2% even at a range of 50 m. The shooting time of an automatic camera rig equipped with the laser distance-sensing system can be reduced significantly, to less than a minute.
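
The convergence distance reported by the laser feeds directly into standard toe-in rig geometry: the per-camera convergence angle follows from the interaxial separation and the convergence distance. A minimal sketch of that relation (not the rig's actual control formula):

```python
import math

def convergence_angle_deg(interaxial_mm: float, convergence_distance_mm: float) -> float:
    """Toe-in angle per camera for a given interaxial separation and the
    convergence distance reported by the laser sensor (standard rig geometry)."""
    return math.degrees(math.atan((interaxial_mm / 2.0) / convergence_distance_mm))

# e.g. a 65 mm interaxial converging at 5 m needs about 0.37 degrees of toe-in per camera
angle = convergence_angle_deg(65.0, 5000.0)
```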

Development of multi-line laser vision sensor and welding application (멀티 라인 레이저 비전 센서를 이용한 고속 3차원 계측 및 모델링에 관한 연구)

  • 성기은;이세헌
    • Proceedings of the Korean Society of Precision Engineering Conference / 2002.05a / pp.169-172 / 2002
  • A laser vision sensor measures range data using a laser light source. Such a sensor generally uses a patterned laser shaped as a single line, but this cannot satisfy the trend toward faster and more precise processing. The sensor's sampling rate increases as the image-processing time is reduced; however, it cannot exceed 30 fps because the camera has a mechanical sampling limit. If a multi-line laser pattern is used, multiple range profiles can be measured in one image. With a camera of the same sampling rate, the number of 2D range-data profiles per second is directly proportional to the number of laser lines. For example, a vision sensor using 5 laser lines can sample 150 profiles per second under ideal conditions.
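
The profile-rate arithmetic in this abstract is simply the camera frame rate multiplied by the number of laser lines; a one-line check:

```python
def profiles_per_second(camera_fps: int, n_laser_lines: int) -> int:
    """2D range profiles captured per second with a multi-line laser pattern."""
    return camera_fps * n_laser_lines

assert profiles_per_second(30, 5) == 150  # matches the 5-line example above
```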
