• Title/Summary/Keyword: camera image

Search Results: 4,918

Development of a Mobile Robot System for Visual Inspection under Hot Environment

  • Park, Sang-Deok;Lee, Ho-Gil;Kim, Hong-Seok;Son, Woong-Hee
    • Proceedings of the Institute of Control, Robotics and Systems Conference / 2004.08a / pp.1506-1510 / 2004
  • A mobile robot system is developed to inspect the condition of industrial facilities in hot environments. The mobile robot is equipped with internal and external heat-insulating material, an internal cooling mechanism, two CCD cameras, wireless communication devices for both the control and image signals, and an embedded controller. The portable controller is equipped with two joysticks, one for the mobile robot and one for the inspection CCD camera, an LCD monitor, and several buttons. The developed mobile robot travels on the internal floor of hot furnaces under the operator's joystick control, captures images of the facilities in the furnaces using a zoom CCD camera, and sends the images to the portable controller through wireless communication. The mobile robot can operate without any problem for 30 minutes in environments below 400°C. This kind of automatic inspection robot can help prevent significant failures of industrial facilities without exposing human workers to harmful environments.


A STUDY ON PERCEPTION METHOD OF THE MARKING LOCATION FOR AN AUTOMATION OF BILLET MARKING PROCESSES

  • Park, Jin-Woo;Yook, Hyun-Ho;Boo, Kwang-Suck;Che, Woo-Seong
    • Proceedings of the Institute of Control, Robotics and Systems Conference / 2004.08a / pp.1953-1957 / 2004
  • Machine vision has been applied to a number of industrial applications for quality control and automation to improve manufacturing processes. In this paper, an automation system using machine vision is developed that is applicable to the marking process in a steel production line. The working environment is very harsh for workers, so automatic systems are increasingly required in the steel industry. The developed automatic marking system consists of several mechanical and electrical elements, including a laser position-detecting sensor system in which a structured laser beam is projected onto the billet to detect its geometry. An image processing algorithm has been developed to locate the two center positions of the camera and the billet, respectively, and to align the two centers. A series of experiments has been conducted to investigate the performance of the proposed algorithm. The results show that the two centers of the camera and the billet can be detected very well, and that the difference between the two center positions can also be decreased via the proposed tracking algorithm.
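
The center-alignment idea in this abstract can be illustrated with a minimal sketch: segment the bright structured-light region, take its centroid as the billet center, and measure its offset from the camera's image center. The threshold value and the toy image below are assumptions for illustration, not the paper's actual algorithm.

```python
import numpy as np

def center_offset(image, threshold=128):
    """Locate the bright structured-light region in a grayscale image and
    return the (x, y) offset of its centroid from the image center.
    The fixed threshold is an illustrative assumption."""
    ys, xs = np.nonzero(image > threshold)           # pixels lit by the laser stripe
    if xs.size == 0:
        return None                                  # no billet detected
    billet_center = np.array([xs.mean(), ys.mean()]) # centroid of the lit region
    cam_center = np.array([image.shape[1] / 2.0, image.shape[0] / 2.0])
    return billet_center - cam_center                # tracking error to drive to zero

# toy image: a bright patch to the right of the image center
img = np.zeros((100, 100))
img[40:60, 70:90] = 255
print(center_offset(img))   # → [29.5 -0.5]
```

A tracking loop would feed this offset back into the marker positioning until it approaches zero.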


Design of Miniaturized Telemetry Module for Bi-Directional Wireless Endoscopy

  • Park, H. J.;Nam, H. W.;Song, B. S.;Cho, J. H.
    • Proceedings of the IEEK Conference / 2002.07a / pp.494-496 / 2002
  • A bi-directional, multi-channel wireless telemetry capsule, 11 mm in diameter, is presented that can transmit video images from inside the human body and receive a control signal from an external control unit. The proposed telemetry capsule includes transmitting and receiving antennas, a demodulator, a decoder, four LEDs, and a CMOS image sensor, along with their driving circuits. The receiver demodulates the signal radiated from the external control unit. Next, the decoder receives the stream of control signals and interprets five of the binary digits as an address code; the remaining signal is interpreted as four bits of binary data. Consequently, the proposed telemetry module can demodulate external signals so as to control the behavior of the camera and the four LEDs during the transmission of video images. The proposed telemetry capsule can simultaneously transmit a video signal and receive a control signal determining the behavior of the capsule itself. As a result, the total power consumption of the telemetry capsule can be reduced by turning off the camera power during dead time and separately controlling the LEDs for proper illumination of the intestine.
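
The decode step described above, five binary digits as an address code followed by four bits of data, can be sketched as follows. The MSB-first bit ordering and the example values are assumptions not stated in the abstract.

```python
def decode_control(bits):
    """Split a received 9-bit control stream into a 5-bit address code
    and 4 bits of data, mirroring the capsule's decode scheme.
    MSB-first ordering is an assumption."""
    assert len(bits) == 9, "expected 5 address bits + 4 data bits"
    address = int("".join(map(str, bits[:5])), 2)  # which device (camera, an LED, ...)
    data = int("".join(map(str, bits[5:])), 2)     # command value for that device
    return address, data

# hypothetical packet: address 0b10110 = 22, data 0b1010 = 10
print(decode_control([1, 0, 1, 1, 0, 1, 0, 1, 0]))  # → (22, 10)
```

The external control unit would use distinct address codes to switch the camera power or set an individual LED's state.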


Experimental Setup for Autonomous Navigation of Robotic Vehicle for University Campus (대학 캠퍼스용 로봇차량의 자율주행을 위한 실험환경 구축)

  • Cho, Sung Taek;Park, Young Jun;Jung, Seul
    • Journal of the Korean Institute of Intelligent Systems / v.26 no.2 / pp.105-112 / 2016
  • This paper presents the experimental setup for autonomous navigation of a robotic vehicle for touring a university campus. The robotic vehicle is developed for navigating specific areas such as university campuses or amusement parks, and can carry two passengers over short distances. For the robotic vehicle to navigate autonomously along the route from the main gate to the administration building of the university, an experimental setup for SLAM is presented. As an initial step, a simple method of following the line detected by a single camera is implemented for part of the route. The central line on the pavement, painted in two colors, red and yellow, is detected by image processing, and the robotic vehicle is commanded to follow it. Experimental studies are conducted to demonstrate the navigation performance of the vehicle as a possible touring platform.
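
The line-following step can be sketched as: mask reddish/yellowish pixels in the lower part of the frame, take their mean column, and steer proportionally toward it. The color thresholds, gain, and sign convention below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def steering_from_line(rgb, gain=0.005):
    """Follow a painted center line: mask red/yellow pixels in the lower
    part of the frame, take their mean column, and turn toward it.
    Thresholds and gain are illustrative assumptions."""
    strip = rgb[-40:]                           # look only at the road just ahead
    r, b = strip[..., 0], strip[..., 2]
    mask = (r > 150) & (b < 100)                # crude mask for red and yellow paint
    cols = np.nonzero(mask)[1]
    if cols.size == 0:
        return 0.0                              # no line found: hold course
    error = cols.mean() - strip.shape[1] / 2.0  # pixels off-center, right positive
    return -gain * error                        # steering command; sign is a convention

img = np.zeros((120, 160, 3), dtype=np.uint8)
img[:, 100:104] = (255, 200, 0)                 # yellow line right of image center
print(steering_from_line(img))                  # negative → steer toward the line
```

A real implementation would work in a color space such as HSV and smooth the command over frames, but the proportional structure is the same.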

The Development of Outdoor Augmented Reality System by GPS (GPS를 이용한 옥외용 증강현실 시스템 개발)

  • Choi, T.J.;Jang, B.T.;Han, S.H.;Kim, J.K.;Hur, W.
    • Proceedings of the IEEK Conference / 2000.06c / pp.153-156 / 2000
  • In this paper, we developed an outdoor augmented reality system capable of acquiring real scenes remotely. The real-scene acquisition system consists of an image acquisition system, a tracking system, a wireless data transceiver, and a power supply. The tracking system consists of a TANS Vector and an RT-20, which measure the position and attitude of the CCD camera attached to a remote-control helicopter. The wireless data transceiver transmits the attitude and position information of the remote system together with the real-scene data acquired by the CCD camera. The maximum lifting capacity of the remote-control helicopter is 15 kg, so we used a 7.2 V Li-ion cell as the power supply to minimize the system weight. The experimental results show that the developed system demonstrates the applicability of remote information acquisition to areas such as construction simulation and estimation, broadcasting, and tour guidance.


Study on Biophoton Emission from roots of Angelica sinensis D., Angelica acutiloba K., and Angelica pubescens M. (국내 수입되는 바디나물속 기원 한약재의 Biophoton(생체광자) 방출 특성 연구)

  • Park, Wan-Su;Lee, Chang-Hoon
    • The Korea Journal of Herbology / v.22 no.3 / pp.39-45 / 2007
  • Objectives: The purpose of this study is to investigate the delayed luminescence (biophoton emission) from the roots of Angelica sinensis D., Angelica acutiloba K., and Angelica pubescens M. These three species of the genus Angelica are currently imported into the Republic of Korea from other nations. Methods: Randomly selected root samples of Angelica sinensis D., Angelica acutiloba K., and Angelica pubescens M. were irradiated with a 150 W metal halide lamp for 1 minute. After irradiation, the biophoton emission of each sample was detected by an electron-multiplication (EM) charge-coupled device camera. The detected biophoton image was quantified in units of counts per pixel. Results: The average biophoton emissions of delayed luminescence at EM ratios of ×150 and ×250 were significantly distinguished, as were the maximum biophoton emissions of delayed luminescence at an EM ratio of ×250. Conclusions: These results suggest that biophoton imaging of the roots of Angelica sinensis D., Angelica acutiloba K., and Angelica pubescens M. could become a meaningful method for differentiating these three species of the genus Angelica.
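
The counts-per-pixel quantification used to compare the species can be sketched as computing the average and maximum emission over a detected frame. This is a minimal sketch; real data would additionally need dark-frame subtraction and calibration, which are omitted here.

```python
import numpy as np

def biophoton_stats(frame):
    """Summarize an EM-CCD biophoton frame, in counts per pixel,
    as the (average, maximum) emission used to compare samples.
    Dark-frame subtraction and calibration are omitted."""
    return float(frame.mean()), float(frame.max())

# toy counts-per-pixel frame standing in for a detected delayed-luminescence image
frame = np.array([[2.0, 5.0, 1.0],
                  [0.0, 7.0, 3.0]])
print(biophoton_stats(frame))  # → (3.0, 7.0)
```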


FOCAL REDUCER FOR CQUEAN (Camera for QUasars in EArly uNiverse)

  • Lim, Juhee;Chang, Seunghyuk;Pak, Soojong;Kim, Youngju;Park, Won-Kee;Im, Myungshin
    • Journal of The Korean Astronomical Society / v.46 no.4 / pp.161-172 / 2013
  • A focal reducer is developed for CQUEAN (Camera for QUasars in EArly uNiverse), a CCD imaging system on the 2.1 m Otto Struve telescope at the McDonald Observatory. It allows CQUEAN to secure a wider field of view by reducing the effective focal length by a factor of three. The optical point spread function, without seeing effects, is designed to be within one pixel (0.283″) over the field of view of 4.82′ × 4.82′ in the optimum wavelength range of 0.8-1.1 μm. In this paper, we describe and discuss the characteristics of the optical design, the lens and barrel fabrication, and the alignment processes. The observation results show that the image quality of the focal reducer meets the expectations from the design.
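
The quoted 0.283″ pixel scale can be roughly sanity-checked with the plate-scale relation, scale (″/pixel) = 206265 × pixel size / focal length. The 13 μm pixel size and f/13.65 native focal ratio below are assumptions not stated in this abstract.

```python
def plate_scale(pixel_um, focal_length_m):
    """Arcseconds subtended by one pixel: 206265 * pixel_size / focal_length."""
    return 206265.0 * pixel_um * 1e-6 / focal_length_m

native = 2.1 * 13.65      # ≈ 28.7 m focal length (assumed f/13.65 focal ratio)
reduced = native / 3.0    # the focal reducer shortens it by a factor of three
print(plate_scale(13.0, reduced))  # ≈ 0.28 ″/pixel, near the quoted 0.283″
```

Under these assumptions the threefold reduction brings the scale from roughly 0.09″ to about 0.28″ per pixel, consistent with the wider field of view.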

Construction of Two-Dimensional Database of Korean Traditional Shoes for the Development of Cultural Contents(1) (문화콘텐츠개발을 위한 한국 전통신발의 2D데이터베이스 구축(1))

  • Park, Hea-Ryung
    • Fashion & Textile Research Journal / v.12 no.6 / pp.796-811 / 2010
  • Research materials on Korean traditional shoes have so far consisted mainly of literary explanations, flat pictures drawn on the basis of those explanations, and photographs of incomplete excavated relics, which makes it difficult to observe the shoes visually and to produce products from them through design application. This project establishes a database of textual and visual data on Korean traditional shoes, including 3D data, in order to lay the foundation for developing culture-industry contents based on them. According to the initial research plan, first, as the research goal of the first year, the Korean traditional shoes were analyzed and arranged by period, sex, and function; their form, composition, materials, patterns, and colors were categorized; and a database of these materials was built as text. Second, visual image materials covering the forms, composition, materials, patterns, and colors of the traditional shoes were entered into the database using a scanner, a digital camera, and 2D computer graphics. The results of this database can serve as important foundational material for developing culture-industry contents and digital culture contents based on traditional shoes, and for teaching Korean traditional culture.

A real-time robust body-part tracking system for intelligent environment (지능형 환경을 위한 실시간 신체 부위 추적 시스템 -조명 및 복장 변화에 강인한 신체 부위 추적 시스템-)

  • Jung, Jin-Ki;Cho, Kyu-Sung;Choi, Jin;Yang, Hyun S.
    • Proceedings of the HCI Society of Korea Conference / 2009.02a / pp.411-417 / 2009
  • We propose a robust body-part tracking system for intelligent environments that does not limit the freedom of users. Unlike previous gesture recognizers, we improved the generality of the system by adding the ability to recognize details, such as detecting the difference between long sleeves and short sleeves. For precise tracking of each body part, we obtained images of the hands, head, and feet separately from a single camera, and when detecting each body part, we chose the appropriate feature for that part. Using a calibrated camera, we lifted the 2D detected body parts into a 3D posture. In the experiments, this system showed advanced hand-tracking performance in real time (50 fps).
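
The calibrated-camera 2D-to-3D lift can be sketched for a body part known to lie on the floor plane, such as a foot: back-project the pixel through the intrinsics, rotate the ray into world coordinates, and intersect it with z = 0. This is a generic pinhole-camera formulation; the paper's actual calibration model may differ.

```python
import numpy as np

def pixel_to_floor(u, v, K, R, t):
    """Back-project an image point of a foot onto the floor plane z = 0,
    given a calibrated camera with intrinsics K and extrinsics (R, t),
    where X_cam = R @ X_world + t. Generic sketch, not the paper's code."""
    ray_cam = np.linalg.inv(K) @ np.array([u, v, 1.0])  # viewing ray in camera frame
    ray_w = R.T @ ray_cam                               # same ray in world frame
    origin = -R.T @ t                                   # camera center in world frame
    s = -origin[2] / ray_w[2]                           # scale where the ray hits z = 0
    return origin + s * ray_w

K = np.array([[500.0, 0.0, 320.0],
              [0.0, 500.0, 240.0],
              [0.0,   0.0,   1.0]])
R = np.diag([1.0, -1.0, -1.0])       # camera looking straight down (assumed setup)
t = np.array([0.0, 0.0, 2.0])        # camera center 2 m above the floor
# pixel where a foot at world (0.5, 0.3, 0) projects under this setup
print(pixel_to_floor(445.0, 165.0, K, R, t))  # ≈ (0.5, 0.3, 0.0) metres
```

Hands and the head are not on a known plane, so a single camera needs extra constraints (e.g. a body model) for them; the floor-plane case above is the simplest instance of the idea.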


Cylindrical Object Recognition using Sensor Data Fusion (센서데이터 융합을 이용한 원주형 물체인식)

  • Kim, Dong-Gi;Yun, Gwang-Ik;Yun, Ji-Seop;Gang, Lee-Seok
    • Journal of Institute of Control, Robotics and Systems / v.7 no.8 / pp.656-663 / 2001
  • This paper presents a sensor fusion method to recognize a cylindrical object using a CCD camera, a laser slit beam, and ultrasonic sensors mounted on a pan/tilt device. For object recognition with the vision sensor, an active light source projects a stripe pattern of light onto the object surface. The 2D image data are transformed into 3D data using the geometry between the camera and the laser slit beam. The ultrasonic sensor uses an ultrasonic transducer array mounted horizontally on the pan/tilt device. The time of flight is estimated by finding the maximum correlation between the received ultrasonic pulse and a set of stored templates, also called a matched filter. The distance of flight is calculated by simply multiplying the time of flight by the speed of sound, and the maximum amplitude of the filtered signal is used to determine the face angle to the object. To determine the position and radius of cylindrical objects, we use statistical sensor fusion. Experimental results show that the fused data increase the reliability of the object recognition.
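
The matched-filter time-of-flight estimate described above can be sketched as: cross-correlate the received signal with a stored pulse template, take the lag of the correlation peak as the time of flight, and multiply by the speed of sound (the abstract's distance convention). The sampling rate, pulse shape, and echo below are illustrative assumptions.

```python
import numpy as np

def time_of_flight(received, template, fs):
    """Estimate ultrasonic time of flight as the lag maximizing the
    cross-correlation with a stored pulse template (matched filter).
    Returns (tof_seconds, peak_amplitude); fs is the sampling rate in Hz."""
    corr = np.correlate(received, template, mode="valid")
    lag = int(np.argmax(corr))             # sample offset of best alignment
    return lag / fs, corr[lag]             # peak amplitude feeds the face-angle step

fs = 1_000_000                             # 1 MHz sampling (illustrative)
template = np.sin(2 * np.pi * 40e3 * np.arange(200) / fs)  # 40 kHz pulse template
received = np.zeros(5000)
received[1200:1400] = 0.5 * template       # simulated echo delayed by 1200 samples
tof, amp = time_of_flight(received, template, fs)
print(tof * 343.0)                         # distance = time of flight × speed of sound
```

In the paper's setup, the same filtered-peak amplitude would then be compared across the transducer array to estimate the face angle to the object.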
