• Title/Summary/Keyword: Weed Detection using Deep Learning


Towards Real Time Detection of Rice Weed in Uncontrolled Crop Conditions (통제되지 않는 농작물 조건에서 쌀 잡초의 실시간 검출에 관한 연구)

  • Umraiz, Muhammad;Kim, Sang-cheol
    • Journal of Internet of Things and Convergence / v.6 no.1 / pp.83-95 / 2020
  • Precisely detecting weeds in a practical crop-field environment is a dense and complex task, and previous approaches fall short in processing image frames both quickly and accurately. While much attention has been given to classifying plant diseases, the crop-weed detection problem has remained out of the limelight. Previous approaches report using fast algorithms, but their inference times are far from real time, making them impractical in uncontrolled conditions. We therefore propose a detection model for the complex task of rice-weed detection. Experimental results show that our approach reduces inference time by a significant margin, making it practically deployable in real conditions. The samples were collected at two different growth stages of rice and annotated manually.
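The abstract's real-time claim rests on per-frame inference latency. As a minimal sketch of how such a claim can be checked, the helper below times any inference callable over a sequence of frames; the function name `benchmark_fps` and the warm-up convention are illustrative assumptions, not part of the paper.

```python
import time

def benchmark_fps(infer, frames, warmup=3):
    """Average per-frame latency and frames-per-second for `infer`.

    `infer` is any callable taking one frame (e.g. a detection model's
    forward pass). A few warm-up calls are run first and discarded so
    lazy initialization does not skew the timing.
    """
    for frame in frames[:warmup]:
        infer(frame)
    start = time.perf_counter()
    for frame in frames:
        infer(frame)
    elapsed = time.perf_counter() - start
    latency = elapsed / len(frames)  # seconds per frame
    return latency, 1.0 / latency   # (latency, FPS)
```

A model is usually called real-time when the measured FPS matches or exceeds the camera's frame rate (commonly 25-30 FPS).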

Deep Learning Approaches for Accurate Weed Area Assessment in Maize Fields (딥러닝 기반 옥수수 포장의 잡초 면적 평가)

  • Hyeok-jin Bak;Dongwon Kwon;Wan-Gyu Sang;Ho-young Ban;Sungyul Chang;Jae-Kyeong Baek;Yun-Ho Lee;Woo-jin Im;Myung-chul Seo;Jung-Il Cho
    • Korean Journal of Agricultural and Forest Meteorology / v.25 no.1 / pp.17-27 / 2023
  • Weeds are one of the factors that reduce crop yield through competition for nutrients and photosynthesis. Quantification of weed density is an important part of making accurate decisions for precision weeding. In this study, we quantified the density of weeds in images of maize fields taken by an unmanned aerial vehicle (UAV). UAV image data were collected in maize fields from May 17 to June 4, 2021, when the maize was in its early growth stage. UAV images were labeled into maize and non-maize pixels, then cropped for use as input to the semantic segmentation networks of the maize detection model. We trained models to separate maize from background using the deep learning segmentation networks DeepLabV3+, U-Net, LinkNet, and FPN. All four models showed a pixel accuracy of 0.97; the mIoU scores of 0.76 and 0.74 for DeepLabV3+ and U-Net were higher than the 0.69 of LinkNet and FPN. Weed density was calculated as the difference between the green area classified by the ExGR (excess green minus excess red) index and the maize area predicted by the model. The evaluated images were then recombined to quantify and visualize the distribution and density of weeds across a wide maize field. We propose a method to quantify weed density for accurate weeding by effectively separating weeds, maize, and background in UAV images of maize fields.
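The core computation described above is weed area = ExGR vegetation mask minus the model's predicted maize mask. A minimal NumPy sketch, using the standard ExGR definitions (ExG = 2g − r − b, ExR = 1.4r − g on channel-normalized RGB) and a hypothetical threshold of 0, since the paper's exact threshold is not given here:

```python
import numpy as np

def exgr_mask(rgb, thresh=0.0):
    """Boolean vegetation mask from the ExGR index.

    ExG = 2g - r - b and ExR = 1.4r - g are computed on channels
    normalized by pixel intensity; pixels with ExG - ExR > thresh
    are treated as vegetation (green area).
    """
    rgb = rgb.astype(np.float64)
    total = rgb.sum(axis=-1, keepdims=True) + 1e-9  # avoid div by zero
    r, g, b = np.moveaxis(rgb / total, -1, 0)
    exg = 2.0 * g - r - b
    exr = 1.4 * r - g
    return (exg - exr) > thresh

def weed_density(rgb, maize_mask):
    """Weed pixels = ExGR vegetation minus model-predicted maize.

    `maize_mask` is the boolean output of the segmentation model
    (DeepLabV3+, U-Net, etc.); returns the weed pixel fraction.
    """
    weed = exgr_mask(rgb) & ~maize_mask
    return weed.mean()
```

Subtracting the predicted maize mask from the vegetation mask avoids having to label weeds directly: only maize annotations are needed to train the segmentation network.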