• Title/Abstract/Keyword: crop separability

Search results: 3

Characteristics of Spectral Reflectance for Corns and Legumes at OSMI (Ocean Scanning Multi-spectral Imager) Bands (original title: OSMI 파장영역에서 옥수수와 두류작물의 분광반사특성)

  • 홍석영;임상규;황선주;김선오
    • Korean Journal of Remote Sensing
    • /
    • Vol. 14, No. 4
    • /
    • pp. 343-352
    • /
    • 1998
  • Time-series spectral reflectance data were collected for the major upland crops corn, peanut, and soybean to examine whether the crops can be discriminated based on differences in their cropping calendars and spectral reflectance characteristics. Using a spectroradiometer, spectral reflectance was measured about 12 times over each crop's growing season in the 0.33-1.10 μm wavelength range. The OSMI centre wavelengths, or the channels closest to them (445 nm, 490 nm, 510 nm, 555 nm, 690 nm, and 865 nm), were designated OSMI equivalent bands 1-6, respectively. A general linear model (GLM) analysis using the OSMI equivalent bands and their band ratios showed that, among the measurement dates, June 22 was the most suitable date for discriminating corn from the legume crops. In addition, multiple regression equations were developed to estimate the growth stage of the silage corn cultivar rs510 during the vegetative phase from spectral reflectance and band ratios.
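The band handling in the abstract above, sampling a measured spectrum at the six OSMI-equivalent band centres and forming pairwise band ratios as GLM inputs, can be sketched in a few lines of numpy. This is a minimal illustration, not the paper's code: the helper names and the toy step spectrum are hypothetical.

```python
import numpy as np

# OSMI equivalent band centres used in the study (nm): bands 1-6
OSMI_BANDS_NM = [445, 490, 510, 555, 690, 865]

def osmi_equivalent_reflectance(wavelengths_nm, reflectance):
    """Sample a measured reflectance spectrum at the six OSMI-equivalent
    band centres by linear interpolation (hypothetical helper)."""
    return np.interp(OSMI_BANDS_NM, wavelengths_nm, reflectance)

def band_ratios(bands):
    """All pairwise band ratios b_i / b_j (i != j), keyed by 1-based band
    numbers, as candidate predictors for the GLM analysis."""
    bands = np.asarray(bands, dtype=float)
    n = len(bands)
    return {(i + 1, j + 1): bands[i] / bands[j]
            for i in range(n) for j in range(n) if i != j}

# Toy spectrum: 10% reflectance in the visible, 45% NIR plateau,
# spanning the 0.33-1.10 um measurement range (in nm).
wl = np.arange(330, 1101, 10)
refl = np.where(wl < 700, 0.10, 0.45)
b = osmi_equivalent_reflectance(wl, refl)   # six band reflectances
r = band_ratios(b)                          # 30 pairwise ratios
```

With six bands there are 6 x 5 = 30 ordered ratios; a NIR/red ratio such as band 6 over band 5 is the kind of predictor that separates green vegetation states.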

Spectral Reflectance Signatures of Major Upland Crops at OSMI Bands

  • Hong, Suk-Young;Rim, Sang-Kyu;Jung, Won-Kyo
    • Korean Society of Remote Sensing: Conference Proceedings
    • /
    • Proceedings of the 1998 International Symposium on Remote Sensing, Korean Society of Remote Sensing
    • /
    • pp.370-375
    • /
    • 1998
  • Spectral reflectance signatures of upland crops at OSMI bands were collected and evaluated for the feasibility of crop discrimination based on knowledge of the crop calendar. Effective bands and band ratios for discriminating corn from the two legumes were identified among the OSMI equivalent bands and their ratio values. Among the measurement dates, June 22 was the best date for discriminating corn from the two legumes, peanut and soybean, because all OSMI equivalent bands and their ratio values on June 22 were highly significant for corn separability. The phenological growth stage of a silage corn (rs510) could be estimated as a function of spectral reflectance signatures in the vegetative stage. Five growth-stage prediction models were generated by the SAS procedures REG and STEPWISE with OSMI equivalent bands and their ratio values in the vegetative stage.
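The growth-stage prediction models above were fitted with SAS PROC REG/STEPWISE; the underlying multiple regression can be approximated with an ordinary least-squares fit. A minimal numpy sketch on synthetic data, with hypothetical predictor and coefficient values that are not the paper's fitted models:

```python
import numpy as np

def fit_growth_stage_model(X, y):
    """Ordinary least-squares fit of growth stage on band/ratio predictors.
    X: (n_samples, n_predictors) OSMI-equivalent bands or band ratios,
    y: (n_samples,) growth-stage values. Returns (intercept, coefficients)."""
    A = np.column_stack([np.ones(len(X)), X])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef[0], coef[1:]

def predict_growth_stage(intercept, coefs, X):
    """Apply the fitted linear model to new predictor rows."""
    return intercept + np.asarray(X) @ coefs

# Synthetic example: growth stage rising linearly with a NIR/red band ratio
# (noiseless toy relationship, chosen so the fit is exactly recoverable).
rng = np.random.default_rng(0)
ratio = rng.uniform(1.0, 8.0, size=(40, 1))  # hypothetical band-6/band-5 ratio
stage = 1.0 + 0.5 * ratio[:, 0]
b0, w = fit_growth_stage_model(ratio, stage)
```

Stepwise selection, as in PROC STEPWISE, would additionally add or drop predictors by significance; the sketch keeps a single fixed predictor for clarity.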


Two-stage Deep Learning Model with LSTM-based Autoencoder and CNN for Crop Classification Using Multi-temporal Remote Sensing Images

  • Kwak, Geun-Ho;Park, No-Wook
    • Korean Journal of Remote Sensing
    • /
    • Vol. 37, No. 4
    • /
    • pp. 719-731
    • /
    • 2021
  • This study proposes a two-stage hybrid classification model for crop classification using multi-temporal remote sensing images; the model combines feature embedding using an autoencoder (AE) with a convolutional neural network (CNN) classifier to fully exploit informative temporal and spatial signatures. A long short-term memory (LSTM)-based AE (LAE) is fine-tuned using class label information to extract latent features that contain less noise and useful temporal signatures. A CNN classifier is then applied to effectively account for the spatial characteristics of the extracted latent features. A crop classification experiment with multi-temporal unmanned aerial vehicle images is conducted to illustrate the potential application of the proposed hybrid model. The classification performance of the proposed model is compared with various combinations of conventional deep learning models (CNN, LSTM, and convolutional LSTM) and different inputs (original multi-temporal images and features from a stacked AE). In the experiment, the best classification accuracy was achieved by the proposed model, which used the latent features from the fine-tuned LAE as input to the CNN classifier. The latent features, which contain useful temporal signatures and are less noisy, increased the class separability between crops with similar spectral signatures, leading to superior classification accuracy. The results demonstrate the importance of effective feature extraction and the potential of the proposed model for crop classification using multi-temporal remote sensing images.
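The two-stage idea, compressing each pixel's multi-temporal profile into latent features and then classifying on those features, can be illustrated with a deliberately simplified analogue. In this sketch a linear autoencoder (truncated SVD) stands in for the fine-tuned LSTM autoencoder and a nearest-centroid rule stands in for the CNN, with spatial context omitted; it is a structural sketch on synthetic data, not the authors' model.

```python
import numpy as np

def encode(X, k):
    """Stage 1 analogue: compress (n_pixels, n_dates) temporal profiles
    into (n_pixels, k) latent features via a linear autoencoder (SVD)."""
    mu = X.mean(axis=0)
    U, s, Vt = np.linalg.svd(X - mu, full_matrices=False)
    comps = Vt[:k]                      # encoder weights
    return (X - mu) @ comps.T, comps, mu

def nearest_centroid_fit(Z, labels):
    """Stage 2 analogue: one centroid per class in latent space."""
    classes = np.unique(labels)
    return classes, np.stack([Z[labels == c].mean(axis=0) for c in classes])

def nearest_centroid_predict(Z, classes, centroids):
    d = ((Z[:, None, :] - centroids[None]) ** 2).sum(axis=2)
    return classes[d.argmin(axis=1)]

# Synthetic two-crop scene: 12 acquisition dates, distinct temporal profiles.
rng = np.random.default_rng(1)
t = np.linspace(0, 1, 12)
crop_a = np.sin(np.pi * t)                 # earlier-peaking profile
crop_b = np.sin(np.pi * t) ** 2 + 0.3 * t  # later, brighter profile
X = np.vstack([crop_a + 0.05 * rng.normal(size=(50, 12)),
               crop_b + 0.05 * rng.normal(size=(50, 12))])
y = np.array([0] * 50 + [1] * 50)

Z, comps, mu = encode(X, k=3)              # denoised temporal features
classes, cent = nearest_centroid_fit(Z, y)
pred = nearest_centroid_predict(Z, classes, cent)
acc = (pred == y).mean()
```

The point the sketch preserves is the division of labour: the encoder discards per-date noise while keeping the temporal signature, so even a trivial classifier separates the two crops; the paper's LAE and CNN play these two roles with far more capacity, plus spatial context.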