• Title/Summary/Keyword: Flight vehicles


Transient Heat Transfer Analysis of Small Launch Vehicle Common Bulkhead Propellant Tank with Different Insulation Thickness (소형발사체 공통격벽 추진제 탱크의 단열재 두께 변화에 따른 과도 열전달 해석)

  • Ji-Yoon Yang;Gyeong-Han Lee;Sang-Woo Kim;Soo-Yong Lee
    • Journal of Aerospace System Engineering / v.18 no.3 / pp.70-75 / 2024
  • The insulation performance of a common bulkhead propellant tank for small launch vehicles was analyzed for varying insulation thickness. A common bulkhead propellant tank, built as a single part, allows a lightweight design because it eliminates the need for inter-tank connections. However, heat transfer driven by the temperature difference between the oxidizer and fuel can cause problems such as propellant loss and ignition delay. It is therefore essential to verify the insulation performance of the common bulkhead structure that separates the oxidizer tank from the fuel tank. In this study, transient heat transfer analysis was conducted for propellant tanks with insulation thicknesses of 50, 55, 60, 65, and 70 mm, and the insulation performance was evaluated using boil-off mass. The boil-off mass of oxidizer generated during the first-stage flight time was then determined. The results confirmed that increasing the insulation thickness reduces the boil-off mass, thereby improving the insulation performance.
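The relationship the abstract describes can be sketched with a simple conduction model: a thicker bulkhead insulation layer reduces the heat leak and hence the boil-off mass. All numbers below (conductivity, areas, temperatures, flight time) are illustrative assumptions, not values from the paper.

```python
# Hypothetical sketch: boil-off mass from a steady conductive heat leak
# through the common bulkhead. Q = k*A*dT/L; boil-off m = Q*t / h_fg.
# Every numeric value here is an assumption for illustration only.

def boil_off_mass(k, area, thickness, delta_t, duration, latent_heat):
    """Return mass (kg) vaporized by conduction through an insulated wall."""
    q = k * area * delta_t / thickness   # conductive heat flow, W
    return q * duration / latent_heat    # vaporized mass, kg

# Assumed: 60 mm foam (k ~ 0.03 W/m K), 1 m^2 bulkhead, 90 K difference
# between fuel and oxidizer sides, 150 s first-stage flight,
# LOX latent heat ~ 213 kJ/kg.
m = boil_off_mass(k=0.03, area=1.0, thickness=0.060,
                  delta_t=90.0, duration=150.0, latent_heat=213e3)
print(f"{m*1000:.1f} g")  # prints 31.7 g
```

Increasing `thickness` in this toy model directly lowers `q` and therefore the boil-off mass, which is the trend the study reports.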

Insurance system for legal settlement of drone accidents (드론사고의 법적 구제에 관한 보험제도)

  • Kim, Sun-Ihee;Kwon, Min-Hee
    • The Korean Journal of Air & Space Law and Policy / v.33 no.1 / pp.227-260 / 2018
  • Recently, as the use of drones has increased, so has the risk of drone accidents and third-party property damage. In Korea, accidents stemming from the growing use of drones have been frequently reported in the media, and the number of citizen reports and military and police calls regarding illegal or inappropriate drone use has also been rising. Drone operators may be liable for damages to third parties caused by drone accidents and for settlements arising from illegal video recording. It is therefore necessary to study drone insurance, which can mitigate the liability and risk caused by drone accidents. In the US, comprehensive homeowner's insurance covers damage caused by recreational drones around the property. In the UK, the drone owner or operator bears strict liability when a drone accident occurs, and the obligation to carry drone insurance depends on the drone's weight and intended use. In Germany, the drone owner bears strict liability for personal or material damage as long as the drone is registered as an aircraft, and all drone owners are required by law to carry liability insurance. In Korea, insurance is required only for "ultra-light aircraft use businesses, airplane rental companies and leisure sports businesses," where the aircraft is "paid for according to the demand of others." It can therefore be difficult to claim compensation for third-party damage caused by unmanned aerial vehicles in personal use. Foreign insurance companies sell drone insurance covering a variety of damages that can occur in drone accidents. Some insurance companies in Korea have also developed and sell drone insurance, but the premiums are very high, and drone insurance that addresses the specific problems of drone accidents is still lacking.
For drone insurance to be viable, it is first necessary to reduce premiums or rates. To trim the excess cost of drone insurance premiums, drone flight data should be made accessible to insurers, possibly provided through the drone pilot project. Finally, to facilitate claims by third parties, it is necessary to study how to establish specific policy language that addresses drone weight, location, and flight frequency.

Derivation of Green Coverage Ratio Based on Deep Learning Using MAV and UAV Aerial Images (유·무인 항공영상을 이용한 심층학습 기반 녹피율 산정)

  • Han, Seungyeon;Lee, Impyeong
    • Korean Journal of Remote Sensing / v.37 no.6_1 / pp.1757-1766 / 2021
  • The green coverage ratio is the ratio of green coverage area to land area, and it is used as a practical urban greening index. It is conventionally calculated from the land cover map, but the map's low spatial resolution and inconsistent production cycle make it difficult to compute the correct green coverage area and to analyze green coverage precisely. This study therefore proposes a new method that calculates the green coverage area using aerial images and deep neural networks. The green coverage ratio can be calculated quickly from manned aerial images, which local governments acquire every year for various purposes, but precise analysis is difficult because image properties such as acquisition date, resolution, and sensor cannot be selected or modified. This limitation can be supplemented by unmanned aerial vehicles, which can mount various sensors and acquire high-resolution images through low-altitude flight. Aerial images enable precise analysis thanks to their high resolution and relatively constant acquisition cycles, and deep learning can automatically detect the green coverage area in them. Accordingly, the green coverage ratio was calculated from both types of aerial image, and the proposed method was experimentally verified: the ratio could be computed with high accuracy for all green types.
However, the green coverage ratio calculated from manned aerial images had limitations in complex environments. The unmanned aerial images used to compensate for this yielded a highly accurate green coverage ratio even in complex environments, and additional band images enabled more precise green area detection. In the future, the green coverage ratio is expected to be calculated effectively by using newly acquired unmanned aerial imagery to supplement existing manned aerial imagery.
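Once a segmentation network has labeled each aerial-image pixel as green cover or not, the ratio itself is a simple pixel fraction. The sketch below is illustrative only (a toy binary mask, not the paper's pipeline or data).

```python
# Illustrative sketch: the green coverage ratio as the fraction of pixels
# a (hypothetical) deep-learning model labeled as green cover.

def green_coverage_ratio(mask):
    """mask: 2D list of 0/1 labels (1 = green cover) for one site."""
    total = sum(len(row) for row in mask)
    green = sum(sum(row) for row in mask)
    return green / total

# Toy 4x4 mask with 6 green pixels out of 16.
mask = [[1, 1, 0, 0],
        [1, 0, 0, 0],
        [0, 1, 1, 0],
        [0, 0, 1, 0]]
print(green_coverage_ratio(mask))  # → 0.375
```

Because the ratio is dimensionless, the same computation applies to manned and unmanned imagery alike; only the mask resolution changes.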

Comparison of Reflectance and Vegetation Index Changes by Type of UAV-Mounted Multi-Spectral Sensors (무인비행체 탑재 다중분광 센서별 반사율 및 식생지수 변화 비교)

  • Lee, Kyung-do;Ahn, Ho-yong;Ryu, Jae-hyun;So, Kyu-ho;Na, Sang-il
    • Korean Journal of Remote Sensing / v.37 no.5_1 / pp.947-958 / 2021
  • This study was conducted to provide basic data for crop monitoring by comparing and analyzing changes in reflectance and vegetation index across multi-spectral sensors mounted on unmanned aerial vehicles. Four UAV-mounted multispectral sensors (RedEdge-MX, S110 NIR, Sequoia, and P4M) acquired aerial images on September 14 and 15, 2020, once in the morning and once in the afternoon each day (four flights in total), and reflectance and vegetation index were calculated and compared. For reflectance, the time-series coefficient of variation of all sensors averaged about 10% or more, indicating that its direct use is limited. The coefficient of variation of the vegetation index averaged 1.2 to 3.6% for crop plots with high vitality due to thick vegetation, i.e., variability within 5%. This was nevertheless higher than the coefficient of variation on a clear day, presumably because weather conditions such as cloud cover differed between morning and afternoon during the experiment; UAV flight plans should therefore be established and executed with weather conditions in mind. Comparing NDVI between the UAV multi-spectral sensors, the RedEdge-MX sensors in this experiment could be used together without special correction of the NDVI value, even when several sensors of the same type are used, in a stable light environment. The RedEdge-MX, P4M, and Sequoia sensors showed a linear relationship with one another, but supplementary experiments are needed to evaluate joint use through offset correction between their vegetation indices.
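The two quantities this abstract compares can be written out directly: NDVI from red and near-infrared reflectance, and the coefficient of variation (CV) of repeated measurements as a stability metric. The reflectance numbers below are invented for illustration, not the study's data.

```python
# Sketch of the metrics in the abstract: NDVI = (NIR - Red)/(NIR + Red),
# and CV = std/mean (%) over repeated flights. Values are made up.
import statistics

def ndvi(nir, red):
    """Normalized difference vegetation index from band reflectances."""
    return (nir - red) / (nir + red)

def cv_percent(values):
    """Coefficient of variation of a series, as a percentage."""
    return statistics.pstdev(values) / statistics.mean(values) * 100

# Four hypothetical flights (morning/afternoon, two days) over one plot.
ndvi_series = [ndvi(0.45, 0.05), ndvi(0.46, 0.06),
               ndvi(0.44, 0.05), ndvi(0.47, 0.05)]
print(round(cv_percent(ndvi_series), 2))
```

A CV of a few percent over dense vegetation, as in the abstract's 1.2 to 3.6% range, indicates the index is fairly stable across flights even when raw reflectance varies more.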

Physical Offset of UAVs Calibration Method for Multi-sensor Fusion (다중 센서 융합을 위한 무인항공기 물리 오프셋 검보정 방법)

  • Kim, Cheolwook;Lim, Pyeong-chae;Chi, Junhwa;Kim, Taejung;Rhee, Sooahm
    • Korean Journal of Remote Sensing / v.38 no.6_1 / pp.1125-1139 / 2022
  • In an unmanned aerial vehicle (UAV) system, a physical offset can exist between the global positioning system/inertial measurement unit (GPS/IMU) sensor and an observation sensor such as a hyperspectral sensor or a lidar sensor. This physical offset causes misalignment between images along the flight direction. In a multi-sensor system in particular, observation sensors must be replaced regularly to mount another sensor, and acquiring a calibration parameter each time is costly. In this study, we establish a precise sensor model equation that applies to multiple sensors in common and propose an independent physical offset estimation method. The proposed method consists of three steps. First, we define an appropriate rotation matrix for our system and an initial sensor model equation for direct georeferencing. Next, an observation equation for physical offset estimation is established by extracting correspondences between ground control points and the data observed by a sensor. Finally, the physical offset is estimated from the observed data, and the precise sensor model equation is obtained by applying the estimated parameters to the initial sensor model equation. Datasets from four regions at different latitudes and longitudes (Jeonju, Incheon, Alaska, Norway) were compared to analyze the effect of the calibration parameters. We confirmed that the misalignment between images was corrected after applying the physical offset in the sensor model equation. Absolute position accuracy was analyzed on the Incheon dataset against ground control points: the root mean square error (RMSE) in the X and Y directions was 0.12 m for the hyperspectral image and 0.03 m for the point cloud.
Furthermore, the relative position accuracy at specific points between the adjusted point cloud and the hyperspectral images was 0.07 m, confirming that precise data mapping is possible without ground control points through the proposed estimation method and showing the potential of multi-sensor fusion. We expect that a flexible multi-sensor platform can be operated at lower cost through this independent parameter estimation method.
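The role of the physical offset (lever arm) in direct georeferencing can be sketched as: ground position = GPS/IMU position + rotation × body-frame offset. The 2D yaw-only rotation and all numeric values below are invented for illustration; the paper's full sensor model uses a complete rotation matrix.

```python
# Toy sketch of why the lever-arm (physical offset) matters in direct
# georeferencing. Simplified to a yaw-only 2D rotation; all values are
# hypothetical, not from the paper.
import math

def apply_offset(gps_xyz, yaw_deg, offset_body):
    """Rotate the body-frame sensor offset by platform yaw and add it
    to the GPS/IMU position to get the sensor's ground position."""
    c = math.cos(math.radians(yaw_deg))
    s = math.sin(math.radians(yaw_deg))
    dx = c * offset_body[0] - s * offset_body[1]
    dy = s * offset_body[0] + c * offset_body[1]
    return (gps_xyz[0] + dx, gps_xyz[1] + dy, gps_xyz[2] + offset_body[2])

# Assumed mounting: sensor 0.10 m forward, 0.05 m right, 0.02 m below
# the GPS antenna, platform heading 90 degrees.
pos = apply_offset((100.0, 200.0, 50.0), yaw_deg=90.0,
                   offset_body=(0.10, 0.05, -0.02))
```

Ignoring this offset shifts every observation by a heading-dependent amount, which is exactly the along-track misalignment the abstract describes; estimating the offset once per platform removes it for all mounted sensors.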