• Title/Summary/Keyword: Polar Mapping


Sensitivity Experiment of Surface Reflectance to Error-inducing Variables Based on the GEMS Satellite Observations (GEMS 위성관측에 기반한 지면반사도 산출 시에 오차 유발 변수에 대한 민감도 실험)

  • Shin, Hee-Woo; Yoo, Jung-Moon
    • Journal of the Korean Earth Science Society / v.39 no.1 / pp.53-66 / 2018
  • Information on surface reflectance ($R_{sfc}$) is important for heat-balance and environmental/climate monitoring. The sensitivity of $R_{sfc}$ to error-inducing variables in the Geostationary Environment Monitoring Spectrometer (GEMS) retrieval from geostationary-orbit satellite observations at 300-500 nm was investigated, using polar-orbit satellite data from the MODerate resolution Imaging Spectroradiometer (MODIS) and the Ozone Monitoring Instrument (OMI), together with radiative transfer model (RTM) experiments. The variables considered in this study are cloud, Rayleigh scattering, aerosol, ozone, and surface type. Cloud detection in high-resolution MODIS pixels ($1\,\mathrm{km}\times1\,\mathrm{km}$) was compared with that in GEMS-scale pixels ($8\,\mathrm{km}\times7\,\mathrm{km}$). The GEMS detection was consistent (~79%) with the MODIS result; however, the detection probability in partially cloudy ($\leq 40\%$) GEMS pixels decreased due to other effects (i.e., aerosol and surface type). The Rayleigh-scattering effect in RGB images was noticeable over the ocean, based on the RTM calculation. The reflectance at the top of the atmosphere ($R_{toa}$) increased with aerosol amount when $R_{sfc}<0.2$, but decreased when $R_{sfc}\geq 0.2$. The $R_{sfc}$ errors due to aerosol increased with wavelength in the UV, but were constant or slightly decreasing in the visible. Ozone absorption was most sensitive at 328 nm within the UV region (328-354 nm). The $R_{sfc}$ error was +0.1 for a negative total-ozone anomaly (-100 DU) under the condition $R_{sfc}=0.15$. This study can be useful for estimating $R_{sfc}$ uncertainties in the GEMS retrieval.
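The cloud-detection comparison described above rests on collapsing the 1 km × 1 km MODIS cloud mask onto the coarser 8 km × 7 km GEMS footprint and examining the resulting cloud fraction. The sketch below illustrates that aggregation step; the array shapes, the synthetic mask, the function name, and the thresholds are illustrative assumptions, not values taken from the paper.

```python
# Minimal sketch of the MODIS -> GEMS-scale cloud-mask aggregation.
# Assumptions (not from the paper): a binary 1 km cloud mask, a GEMS
# footprint approximated as an 8 x 7 block of MODIS pixels, and a simple
# cloud-fraction threshold used only for illustration.
import numpy as np

def aggregate_cloud_mask(modis_mask, block=(8, 7)):
    """Collapse a 1 km binary cloud mask onto GEMS-scale pixels.

    modis_mask: 2-D array of 0 (clear) / 1 (cloudy) at ~1 km resolution.
    block:      number of MODIS pixels per GEMS pixel (rows, cols).
    Returns the cloud fraction of each GEMS-scale pixel.
    """
    ny, nx = block
    h = (modis_mask.shape[0] // ny) * ny   # trim to an integer number of blocks
    w = (modis_mask.shape[1] // nx) * nx
    trimmed = modis_mask[:h, :w]
    # Average each (ny x nx) block -> cloud fraction per GEMS-scale pixel
    return trimmed.reshape(h // ny, ny, w // nx, nx).mean(axis=(1, 3))

# Usage with a synthetic 1 km mask: share of GEMS-scale pixels that contain
# any cloud, and share that are only partially cloudy (<= 40% cloud fraction).
rng = np.random.default_rng(0)
modis = (rng.random((80, 70)) < 0.3).astype(float)
frac = aggregate_cloud_mask(modis)
cloudy = frac > 0.0                       # hypothetical "any cloud" flag
partial = (frac > 0.0) & (frac <= 0.4)    # partially cloudy GEMS pixels
print(f"cloudy: {cloudy.mean():.1%}, partially cloudy: {partial.mean():.1%}")
```

Partially cloudy pixels defined this way are exactly the cases where sub-pixel effects (aerosol, surface type) can confuse a coarse-resolution detection, which is the regime the abstract reports as degraded.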

Physical Offset of UAVs Calibration Method for Multi-sensor Fusion (다중 센서 융합을 위한 무인항공기 물리 오프셋 검보정 방법)

  • Kim, Cheolwook; Lim, Pyeong-chae; Chi, Junhwa; Kim, Taejung; Rhee, Sooahm
    • Korean Journal of Remote Sensing / v.38 no.6_1 / pp.1125-1139 / 2022
  • In an unmanned aerial vehicle (UAV) system, a physical offset can exist between the global positioning system/inertial measurement unit (GPS/IMU) sensor and an observation sensor such as a hyperspectral sensor or a lidar sensor. Because of this physical offset, a misalignment between images can occur along the flight direction. In particular, in a multi-sensor system the observation sensor must be swapped regularly, and acquiring new calibration parameters for each configuration is costly. In this study, we establish a precise sensor model equation that can be applied to multiple sensors in common and propose an independent physical-offset estimation method. The proposed method consists of three steps. First, we define an appropriate rotation matrix for our system and an initial sensor model equation for direct georeferencing. Next, an observation equation for the physical-offset estimation is established by extracting corresponding points between ground control points and the data observed by the sensor. Finally, the physical offset is estimated from these observations, and the precise sensor model equation is obtained by applying the estimated parameters to the initial sensor model equation. Datasets from four regions with different latitudes and longitudes (Jeonju, Incheon, Alaska, Norway) were compared to analyze the effect of the calibration parameters. We confirmed that the misalignment between images was corrected after applying the physical offset in the sensor model equation. Absolute position accuracy was analyzed for the Incheon dataset against ground control points: the root mean square error (RMSE) in the X and Y directions was 0.12 m for the hyperspectral image and 0.03 m for the point cloud. Furthermore, the relative position accuracy of specific points between the adjusted point cloud and the hyperspectral images was 0.07 m, confirming that precise data mapping is possible without ground control points through the proposed estimation method and demonstrating the feasibility of multi-sensor fusion. From this study, we expect that a flexible multi-sensor platform can be operated at reduced cost through the independent parameter estimation method.
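The three-step procedure summarized above amounts to writing the direct-georeferencing model with an explicit lever-arm term and solving a small least-squares problem for that term from ground-control-point correspondences. The sketch below illustrates the idea under assumed notation: the model X = t + R(offset + s·ray), the per-exposure pose (t, R), the ray and scale per correspondence, and all function names are assumptions made for illustration, not the paper's actual sensor model equation.

```python
# Minimal sketch of lever-arm (physical offset) estimation from GCPs.
# Assumed model: ground point X = t + R * (offset + s * ray), where
#   t      : GPS/IMU position in world coordinates (per exposure)
#   R      : body-to-world rotation from the IMU attitude (per exposure)
#   ray    : viewing ray of the observation sensor in the body frame
#   s      : range/scale along the ray to the ground point
#   offset : constant body-frame offset between GPS/IMU and the sensor.
import numpy as np

def ground_point(t, R, ray, offset, s):
    """Direct-georeferencing model under the assumptions above."""
    return t + R @ (offset + s * ray)

def estimate_offset(observations):
    """Least-squares estimate of the constant body-frame offset.

    observations: iterable of (t, R, ray, s, gcp) tuples, one per GCP
    correspondence. Rearranging the model gives R @ offset = gcp - t - s*R@ray,
    which is stacked and solved for the 3-D offset.
    """
    A, b = [], []
    for t, R, ray, s, gcp in observations:
        A.append(R)                        # 3x3 block per observation
        b.append(gcp - t - s * (R @ ray))  # residual without the offset term
    offset, *_ = np.linalg.lstsq(np.vstack(A), np.concatenate(b), rcond=None)
    return offset

# Synthetic usage: recover a known offset from three exact GCP correspondences.
true_offset = np.array([0.15, -0.05, 0.30])
rng = np.random.default_rng(1)
obs = []
for _ in range(3):
    t = rng.uniform(-10, 10, 3)
    ax, ay, az = rng.uniform(-0.1, 0.1, 3)      # small Euler angles
    Rz = np.array([[np.cos(az), -np.sin(az), 0], [np.sin(az), np.cos(az), 0], [0, 0, 1]])
    Ry = np.array([[np.cos(ay), 0, np.sin(ay)], [0, 1, 0], [-np.sin(ay), 0, np.cos(ay)]])
    Rx = np.array([[1, 0, 0], [0, np.cos(ax), -np.sin(ax)], [0, np.sin(ax), np.cos(ax)]])
    R = Rz @ Ry @ Rx
    ray = np.array([0.0, 0.0, -1.0])             # nadir-looking ray, for illustration
    s = rng.uniform(50, 100)
    obs.append((t, R, ray, s, ground_point(t, R, ray, true_offset, s)))
print("estimated offset:", estimate_offset(obs))
```

Because the offset is constant in the body frame, a single estimate obtained this way can be reused across flights with the same mounting, which is what makes the calibration independent of any particular observation sensor in the abstract's sense.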