• Title/Summary/Keyword: camera monitoring


Analysis on Mapping Accuracy of a Drone Composite Sensor: Focusing on Pre-calibration According to the Circumstances of Data Acquisition Area (드론 탑재 복합센서의 매핑 정확도 분석: 데이터 취득 환경에 따른 사전 캘리브레이션 여부를 중심으로)

  • Jeon, Ilseo;Ham, Sangwoo;Lee, Impyeong
    • Korean Journal of Remote Sensing / v.37 no.3 / pp.577-589 / 2021
  • Drone mapping systems can be applied to many fields such as disaster damage investigation, environmental monitoring, and construction process monitoring. To integrate the individual sensors attached to a drone, complicated procedures including time synchronization have traditionally been required. Recently, a variety of composite sensors consisting of visual sensors and GPS/INS have been released; they integrate multi-sensory data internally and provide geotagged image files to users. Therefore, to use composite sensors in drone mapping systems, the mapping accuracy obtainable from them should be examined. In this study, we analyzed the mapping accuracy of a composite sensor, focusing on the data acquisition area and the effect of pre-calibration. In the first experiment, we analyzed how mapping accuracy varies with the number of ground control points (GCPs). When 2 GCPs were used, the total RMSE was reduced by about 40 cm, from more than 1 m to about 60 cm. In the second experiment, we assessed mapping accuracy depending on whether pre-calibration was conducted. When a few ground control points were used, pre-calibration did not affect mapping accuracy. When the image sequences formed weak geometry, however, pre-calibration proved essential to reduce possible mapping errors. In the absence of ground control points, pre-calibration can also reduce mapping errors. Based on this study, we expect that future drone mapping systems using composite sensors will help streamline the survey and calibration process depending on the data acquisition circumstances.
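The accuracy figures above are checkpoint RMSE values; a minimal sketch of how a total 3D RMSE over checkpoints might be computed from mapped and surveyed coordinates is given below. The coordinate arrays and noise level are hypothetical, not data from the paper.

```python
import numpy as np

def total_rmse(mapped_xyz: np.ndarray, surveyed_xyz: np.ndarray) -> float:
    """Total 3D RMSE between mapped checkpoint coordinates and surveyed ground truth.

    Both arrays are N x 3 (X, Y, Z) in the same projected coordinate system.
    """
    residuals = mapped_xyz - surveyed_xyz            # per-checkpoint error vectors
    squared_norms = np.sum(residuals ** 2, axis=1)   # squared 3D error per checkpoint
    return float(np.sqrt(squared_norms.mean()))

# Hypothetical example: 4 checkpoints with errors of a few decimeters
mapped = np.array([[100.12, 200.05, 50.33],
                   [150.48, 210.61, 51.02],
                   [120.77, 250.09, 49.88],
                   [180.33, 230.44, 50.56]])
surveyed = mapped + np.random.normal(0, 0.3, mapped.shape)
print(f"Total RMSE: {total_rmse(mapped, surveyed):.2f} m")
```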

Individual Ortho-rectification of Coast Guard Aerial Images for Oil Spill Monitoring (유출유 모니터링을 위한 해경 항공 영상의 개별정사보정)

  • Oh, Youngon;Bui, An Ngoc;Choi, Kyoungah;Lee, Impyeong
    • Korean Journal of Remote Sensing / v.38 no.6_1 / pp.1479-1488 / 2022
  • Oil spill accidents occur intermittently in the ocean due to ship collisions and sinkings. To prepare prompt countermeasures when such an accident occurs, it is necessary to accurately identify the current status of the spilled oil. To this end, the Coast Guard patrols the target area with a fixed-wing airplane or helicopter and checks it with the naked eye or on video, but it has been difficult to determine the area contaminated by the spilled oil and its exact location on the map. Accordingly, this study develops a technology for direct ortho-rectification by automatically geo-referencing aerial images collected by the Coast Guard, without individual ground control points, to identify the current status of spilled oil. First, the metadata required for georeferencing are extracted by optical character recognition (OCR) from the sensor information visualized on the video frames. Based on the extracted information, the exterior orientation parameters of each image are determined, and the images are individually orthorectified using these parameters. The accuracy of the individual orthoimages generated through this method was evaluated to be from tens of meters up to about 100 m. This level is reasonably acceptable considering the inherent errors of the position and attitude sensors, the inaccuracies in interior orientation parameters such as camera focal length, and the absence of ground control points, and it is judged to be appropriate for identifying the current status of oil-contaminated areas at sea. In the future, if real-time transmission of images captured during flight becomes possible, individual orthoimages can be generated in real time through the proposed orthorectification technology, which can then be used to quickly identify the current status of oil spill contamination and establish countermeasures.
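A minimal sketch of the OCR step described above, assuming a hypothetical overlay format for the position/attitude text burned into the video frames (the actual Coast Guard overlay layout and field names are not specified here):

```python
import re
import cv2
import pytesseract  # assumes the Tesseract OCR engine is installed

def read_overlay_metadata(frame_path: str) -> dict:
    """Extract hypothetical position/attitude text burned into a video frame.

    The overlay format (e.g. "LAT 35.1234 LON 129.5678 ALT 300 HDG 87")
    is an assumption for illustration only.
    """
    frame = cv2.imread(frame_path, cv2.IMREAD_GRAYSCALE)
    # Threshold to isolate the bright overlay text before running OCR
    _, binary = cv2.threshold(frame, 200, 255, cv2.THRESH_BINARY)
    text = pytesseract.image_to_string(binary)
    pattern = r"LAT\s+([\d.]+)\s+LON\s+([\d.]+)\s+ALT\s+([\d.]+)\s+HDG\s+([\d.]+)"
    match = re.search(pattern, text)
    if match is None:
        raise ValueError("overlay metadata not found")
    lat, lon, alt, heading = map(float, match.groups())
    return {"lat": lat, "lon": lon, "alt": alt, "heading": heading}
```

In practice, the parsed values would feed the exterior orientation parameters used for the individual orthorectification.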

Comparative Analysis of Pre-processing Method for Standardization of Multi-spectral Drone Images (다중분광 드론영상의 표준화를 위한 전처리 기법 비교·분석)

  • Ahn, Ho-Yong;Ryu, Jae-Hyun;Na, Sang-il;Lee, Byung-mo;Kim, Min-ji;Lee, Kyung-do
    • Korean Journal of Remote Sensing / v.38 no.6_1 / pp.1219-1230 / 2022
  • Multi-spectral drone observations in agriculture require quantitative and reliable data based on physical quantities such as radiance or reflectance for crop yield analysis. Remote sensing data for crop monitoring must consist of images of the same area acquired as a time series. In particular, biophysical variables such as leaf area index or chlorophyll content can only be analyzed directly when the time-series data are produced under the same reference, so comparable reflectance data are required. In orthoimagery generated from drone images, pixel values over the entire image can be distorted, or pixel values can differ at mosaic seam boundaries, which limits accurate estimation of physical quantities. In this study, reflectance and vegetation indices were calculated from drone images according to different pre-processing (correction) methods for time-series crop monitoring, and the drone-derived reflectance was compared with ground-measured data to analyze its spectral characteristics.
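A minimal sketch of a one-point reflectance calibration and NDVI computation of the kind such pre-processing comparisons rely on; the panel readings and digital numbers are hypothetical, and this is not the paper's specific correction method.

```python
import numpy as np

def dn_to_reflectance(band_dn: np.ndarray, panel_dn: float, panel_reflectance: float) -> np.ndarray:
    """Convert raw digital numbers to reflectance with a single reference panel.

    One-point empirical calibration: a panel of known reflectance observed
    under the same illumination scales the whole band. Values are hypothetical.
    """
    return band_dn.astype(np.float32) * (panel_reflectance / panel_dn)

def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    """Normalized Difference Vegetation Index from reflectance bands."""
    return (nir - red) / (nir + red + 1e-6)

# Hypothetical digital numbers for red and NIR bands and a 50% reflectance panel
red_refl = dn_to_reflectance(np.array([[1200., 1500.], [900., 1100.]]),
                             panel_dn=3000., panel_reflectance=0.5)
nir_refl = dn_to_reflectance(np.array([[2800., 3100.], [2500., 2600.]]),
                             panel_dn=3200., panel_reflectance=0.5)
print(ndvi(nir_refl, red_refl))
```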

Cause of Rockfall at Natural Monument Pohang Daljeon-ri Columnar Joint (천연기념물 포항 달전리 주상절리의 낙석 발생원인)

  • Kim, Jae Hwan;Kong, Dal-Yong
    • Economic and Environmental Geology / v.55 no.5 / pp.497-510 / 2022
  • Monthly monitoring, 3D scan surveys, and electrical resistivity surveys were conducted from January 2018 to August 2022 to identify the cause of rockfall occurring at the Daljeon-ri Columnar Joint (Natural Monument No. 415) in Pohang. A total of 3,231 rocks fell from the columnar joint over the past 5 years: 1,521 (47%) of the fallen rocks were less than 20 cm in length, 978 (30.3%) were 20-30 cm, and 732 (22.7%) were over 30 cm. While the annual number of rockfalls has decreased since 2018, the frequency of rockfalls larger than 30 cm has tended to increase. Large-scale rockfalls occurred mainly during the thawing season (March-April) and the rainy season (June-July), and analysis of the relationship between cumulative rainfall and rockfall occurrence showed that the 3- to 4-day cumulative rainfall is also closely related to rockfall occurrence. Smectite and illite, expansive clay minerals, were observed in the XRD analysis of the slope material (joint-filling minerals) of the columnar joint, and the presence of a fault fracture zone was confirmed by the electrical resistivity survey. In addition, the confirmed fault fracture zone and the point of maximum erosion analyzed through 3D precision measurement coincided with the main rockfall occurrence point observed by the BTC-6PXD camera. Therefore, the main cause of rockfall at the Daljeon-ri columnar joint in Pohang is a combination of internal factors (development of fault fracture zones and joints, weathering of rocks, presence of expansive clay minerals) and external factors (precipitation, rapid thawing), which together result in large-scale rockfall. Meanwhile, it was also confirmed that the Pohang-Gyeongju earthquakes, repeatedly raised as a possible cause, were not the main cause.
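A minimal sketch, with hypothetical daily records, of how a 3- to 4-day antecedent cumulative rainfall series can be derived and compared against rockfall counts:

```python
import pandas as pd

# Hypothetical daily records: rainfall (mm) and rockfall counts from monitoring
df = pd.DataFrame({
    "rain_mm":   [0, 12, 35, 60, 5, 0, 0, 48, 70, 10],
    "rockfalls": [0,  0,  1,  4, 2, 0, 0,  1,  5,  2],
}, index=pd.date_range("2021-06-01", periods=10, freq="D"))

# 3-day and 4-day antecedent cumulative rainfall ending on each day
df["rain_3d"] = df["rain_mm"].rolling(window=3, min_periods=1).sum()
df["rain_4d"] = df["rain_mm"].rolling(window=4, min_periods=1).sum()

# Simple correlation check between cumulative rainfall and rockfall counts
print(df[["rain_3d", "rain_4d", "rockfalls"]].corr()["rockfalls"])
```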

Participation in G-CLEF Preliminary Design Study by KASI

  • Kim, Kang-Min;Chun, Moo-Young;Park, Chan;Park, Sung-Joon;Kim, Jihun;Oh, Jae Sok;Jang, Jeong Gyun;Jang, Bi Ho;Tahk, Gyungmo;Nah, Jakyoung;Yu, Young Sam;Szentgyorgyi, Andrew;Norton, Timothy;Podgorski, William;Evans, Ian;Mueller, Mark;Uomoto, Alan;Crane, Jeffrey;Hare, Tyson
    • The Bulletin of The Korean Astronomical Society / v.40 no.1 / pp.52.3-53 / 2015
  • The GMT-Consortium Large Earth Finder (G-CLEF) is a fiber-fed, optical-band, high-dispersion echelle spectrograph that was selected as the first-light instrument for the Giant Magellan Telescope (GMT). G-CLEF has been designed as a general-purpose echelle spectrograph with a precision radial velocity (PRV) capability of 10 cm/s as a goal. The preliminary design review (PDR) was held on April 8-10, 2015, and scientific observations are planned to start in 2022 with four mirrors installed on the GMT. We have been participating in this preliminary design study in the areas of the flexure control camera (slit monitoring system), the calibration lamp sources, the dichroic assembly, and the fabrication of the proto Mangin mirror. We present the design concepts for the parts KASI has undertaken and introduce the specifications and capabilities of G-CLEF.

Position Control of Mobile Robot for Human-Following in Intelligent Space with Distributed Sensors

  • Jin Tae-Seok;Lee Jang-Myung;Hashimoto Hideki
    • International Journal of Control, Automation, and Systems / v.4 no.2 / pp.204-216 / 2006
  • The latest advances in hardware technology and the state of the art in mobile robot and artificial intelligence research can be employed to develop autonomous and distributed monitoring systems. A mobile service robot requires knowledge of its current position to coexist with humans and support them effectively in populated environments, and to realize these abilities the robot needs to keep track of relevant changes in the environment. This paper proposes a localization method for a mobile robot that uses images from distributed intelligent networked devices (DINDs) in an intelligent space (ISpace) to achieve these goals. The scheme combines the position observed by dead-reckoning sensors with the position estimated from images of a moving object, such as a walking human, to determine the location of the mobile robot. The moving object is assumed to be a point object and is projected onto an image plane to form a geometrical constraint equation that provides position data of the object based on the kinematics of the intelligent space. Using the a priori known path of the moving object and a perspective camera model, the geometric constraint equations relating the image-frame coordinates of the moving object to the estimated position of the robot are derived. The proposed method uses the error between the observed and estimated image coordinates to localize the mobile robot, and a Kalman filtering scheme is used to estimate the location of the moving robot. The approach is applied to a mobile robot in ISpace to show the reduction of uncertainty in determining the robot's location, and its performance is verified by computer simulation and experiment.
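A minimal, generic Kalman-filter sketch of the fusion idea described above (dead-reckoning prediction corrected by a camera-derived position observation); the state model, matrices, and noise values are illustrative and do not reproduce the paper's derivation.

```python
import numpy as np

# Minimal 2D Kalman filter sketch: dead reckoning predicts the robot position,
# a camera-derived position observation from the DINDs corrects it.
x = np.array([0.0, 0.0])          # state: robot position (x, y)
P = np.eye(2) * 1.0               # state covariance
Q = np.eye(2) * 0.05              # dead-reckoning (process) noise
R = np.eye(2) * 0.2               # camera observation noise
H = np.eye(2)                     # observation model: camera measures position directly

def predict(x, P, odometry_delta):
    """Propagate the state with the dead-reckoned displacement."""
    x = x + odometry_delta
    P = P + Q
    return x, P

def update(x, P, z_camera):
    """Correct the prediction with the position estimated from camera images."""
    y = z_camera - H @ x                      # innovation (observed - estimated)
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)            # Kalman gain
    x = x + K @ y
    P = (np.eye(2) - K @ H) @ P
    return x, P

x, P = predict(x, P, odometry_delta=np.array([0.10, 0.02]))
x, P = update(x, P, z_camera=np.array([0.12, 0.01]))
print(x)
```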

PID Controled UAV Monitoring System for Fire-Event Detection (PID 제어 UAV를 이용한 발화 감지 시스템의 구현)

  • Choi, Jeong-Wook;Kim, Bo-Seong;Yu, Je-Min;Choi, Ji-Hoon;Lee, Seung-Dae
    • The Journal of the Korea institute of electronic communication sciences / v.15 no.1 / pp.1-8 / 2020
  • If a dangerous situation arises in a place out of human reach, UAVs can be used to determine the size and location of the situation and reduce further damage. With this in mind, this paper tunes the proportional, integral, and derivative (PID) values for roll, pitch, and yaw using Betaflight so that the UAV stays horizontal and hovers smoothly, minimizing errors for safe hovering. For fire detection, a camera connected to a Raspberry Pi running OpenCV converts each frame to the HSV (hue, saturation, value) color space and applies a filter that renders the image black and white except for red, the color closest to the fire we want to detect, so that the UAV can analyze the scene from the air in real time. Finally, it was confirmed that hovering was possible at heights of 0.5 to 5 m and that red color recognition was possible at distances from 5 cm up to 5 m.
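A minimal OpenCV sketch of the HSV red-masking step described above; the threshold values and camera index are illustrative, not the values tuned in the paper.

```python
import cv2
import numpy as np

def red_fire_mask(frame_bgr: np.ndarray) -> np.ndarray:
    """Return a black-and-white mask that keeps only reddish pixels.

    Red wraps around the hue axis, so two HSV ranges are combined.
    """
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    lower = cv2.inRange(hsv, (0, 120, 70), (10, 255, 255))
    upper = cv2.inRange(hsv, (170, 120, 70), (180, 255, 255))
    return cv2.bitwise_or(lower, upper)

# Typical Raspberry Pi loop: read frames from the camera and show the mask
cap = cv2.VideoCapture(0)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    cv2.imshow("fire mask", red_fire_mask(frame))
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```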

Development of vision system for the recognition of character image which was included at the slab image (슬라브 영상에 포함된 문자영상의 인식을 위한 비전시스템의 개발)

  • Park, Sang-Gug
    • Journal of Korea Society of Industrial Information Systems / v.12 no.1 / pp.95-100 / 2007
  • In iron and steel processing lines, characters for material management are marked on the surface of the material. This paper describes the development of a vision system for recognizing the material management characters included in slab images. Our vision system for character recognition includes a CCD camera system that acquires the slab image, an optical transmission system that transmits the captured image over a long distance, an input/output system for interfacing with the existing plant system, and a monitoring system for checking the recognition results. We installed our vision system on a continuous casting line and tested it, inspecting its durability, reliability, and recognition rate. Through this testing, we confirmed that our system achieves a high recognition rate of 97.4%.

A Development of Urban Farm Management System based on USN (USN 기반의 도시 농업 관리 시스템 개발)

  • Ryu, Dae-Hyun
    • The Journal of the Korea institute of electronic communication sciences / v.8 no.12 / pp.1917-1922 / 2013
  • The objective of this study is to develop a USN-based urban farm management system for remote monitoring and control. The system makes it easy to manage an urban farm and builds a database of the collected information in order to provide the best environment for growing crops. For this purpose, we built a greenhouse and installed several types of sensors and a camera through which the remote sensing information is collected. In addition, a web page was built for user convenience, providing information in real time and enabling control. We experimentally confirmed all functions, such as the collection and transfer of information and environmental control in the greenhouse, and verified long-term stability through field tests. The system will make it convenient for farmers to grow crops by relaxing time and space constraints and providing a great deal of flexibility, and it can also be extended to other environments and facilities such as factories, offices, and homes.

Measurement of Soil Deformation around the Tip of Model Pile by Close-Range Photogrammetry (근접 사진측량에 의한 모형말뚝 선단부 주변의 지반 변형 측정)

  • Lee, Chang No;Oh, Jae Hong
    • Journal of the Korean Society of Surveying, Geodesy, Photogrammetry and Cartography / v.31 no.2 / pp.173-180 / 2013
  • In this paper, we studied the measurement of soil deformation around the tip of a model pile by close-range photogrammetry. A rigorous bundle adjustment method was utilized to monitor the soil deformation in a laboratory model pile-load test as a function of the incremental penetration of the pile. Control points were installed on the frame of the laboratory model box, and more than 150 target points were inserted in the soil around the model pile and on its surface. Four overlapping images, three horizontal and one vertical, were acquired with a non-metric camera for each penetration step. The images were processed to automatically locate the control and target points for self-calibration and bundle adjustment. During the bundle adjustment, the refractive index of the acrylic case of the laboratory model was accounted for to ensure accurate measurement. The experiment showed that the proposed approach enables automated photogrammetric monitoring of soil deformation around the tip of a model pile.
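A minimal sketch of the collinearity projection that sits at the core of a bundle adjustment, i.e. the mapping from an object-space point and exterior orientation to image coordinates whose residuals the adjustment minimizes; the rotation convention and numbers are illustrative only, and the paper's refraction correction for the acrylic case is not modeled.

```python
import numpy as np

def rotation_matrix(omega: float, phi: float, kappa: float) -> np.ndarray:
    """Camera-to-object rotation built from omega-phi-kappa angles (radians)."""
    Rx = np.array([[1, 0, 0],
                   [0, np.cos(omega), -np.sin(omega)],
                   [0, np.sin(omega),  np.cos(omega)]])
    Ry = np.array([[ np.cos(phi), 0, np.sin(phi)],
                   [0, 1, 0],
                   [-np.sin(phi), 0, np.cos(phi)]])
    Rz = np.array([[np.cos(kappa), -np.sin(kappa), 0],
                   [np.sin(kappa),  np.cos(kappa), 0],
                   [0, 0, 1]])
    return Rz @ Ry @ Rx

def project(point_xyz, camera_xyz, angles, focal_length):
    """Collinearity equations: project an object point into image coordinates (x, y).

    Sign conventions vary between textbooks; this is one consistent choice.
    """
    R = rotation_matrix(*angles)
    d = R.T @ (np.asarray(point_xyz, dtype=float) - np.asarray(camera_xyz, dtype=float))
    x = -focal_length * d[0] / d[2]
    y = -focal_length * d[1] / d[2]
    return x, y

# Hypothetical target point, camera position/attitude, and 35 mm focal length
print(project([0.20, 0.10, 0.05], [0.0, 0.0, 1.0], (0.0, 0.0, 0.0), 0.035))
```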