Journal of Agricultural Big Data ›› 2024, Vol. 6 ›› Issue (4): 497-508. doi: 10.19788/j.issn.2096-6369.000065



Severity Recognition Method of Field Wheat Fusarium Head Blight Based on AR Glasses and Improved YOLOv8m-seg

XU Wei(), ZHOU JiaLiang*(), QIAN Xiao, FU ShouFu   

  1. Beijing Jinhe Tiancheng Technology Co., Ltd. (JinHeTech), Beijing 100027, China
  • Received: 2024-09-09  Accepted: 2024-10-20  Published: 2024-12-26  Online: 2024-12-02


Abstract:

Timely detection of Fusarium head blight (FHB) in the field, combined with control measures matched to disease severity, helps improve wheat yield and quality. Most current methods for identifying FHB severity operate on one or a few wheat ears at a time, which is too inefficient for field surveys. To address this problem, this study proposes an efficient and accurate method for identifying FHB severity in the field. The CBAM attention mechanism is introduced to improve the performance of the YOLOv8m-seg model. The improved YOLOv8m-seg model first performs instance segmentation of wheat ears in collected long-shot images; individual wheat ears are then cropped out using a non-target suppression method; the improved YOLOv8m-seg model next segments the diseased and healthy spikelets of each ear; finally, the FHB severity of each ear is calculated from the numbers of diseased and healthy spikelets. To verify the effectiveness of the proposed method, two datasets were constructed for testing: a wheat-ear dataset (D-WE) and a wheat-spikelet dataset (D-WS). Experimental results show that YOLOv8m-seg has better overall performance than YOLOv8n-seg, YOLOv8s-seg, YOLOv8l-seg, and YOLOv8x-seg on both datasets. The model with CBAM is superior to models with the SE, ECA, or CA attention mechanisms; compared with the original model, the mean average precision of the improved YOLOv8m-seg model increased by 0.9 and 1.2 percentage points on the two datasets, respectively. The severity recognition method proposed in this study improves severity accuracy by 38.4, 6.2, and 2.4 percentage points over three other recognition methods.
After the improved YOLOv8m-seg model was deployed with the TensorRT inference framework, the total algorithm runtime dropped to 1/7 of the original. Finally, this study conducted FHB severity surveys in wheat fields at three locations using AR glasses. The results show that the average diseased-ear counting accuracy of the AR-glasses-based intelligent recognition of wheat FHB reached 0.953, and the survey took only one-third of the time of a manual survey. These results fully demonstrate the effectiveness of the proposed method and lay a good foundation for intelligent field surveys of wheat FHB.
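The final step of the pipeline, computing per-ear severity from the counts of diseased and healthy spikelets, can be sketched as follows. This is a minimal illustration only: the discrete grade thresholds below are illustrative assumptions, since the abstract does not specify the grading scale used in the study.

```python
def fhb_severity(diseased: int, healthy: int) -> float:
    """Severity of one wheat ear: share of diseased spikelets
    among all spikelets detected on that ear."""
    total = diseased + healthy
    if total == 0:
        raise ValueError("no spikelets detected for this ear")
    return diseased / total


def severity_grade(ratio: float,
                   thresholds: tuple = (0.0, 0.25, 0.50, 0.75)) -> int:
    """Map a severity ratio to a discrete grade 0-4.
    The thresholds are hypothetical, for illustration only."""
    grade = 0
    for t in thresholds:
        if ratio > t:
            grade += 1
    return grade


# Example: an ear with 3 diseased and 9 healthy spikelets.
ratio = fhb_severity(3, 9)       # 0.25
grade = severity_grade(ratio)    # grade 1 under these thresholds
```

In the paper's pipeline these counts come from the spikelet-level instance segmentation of each cropped ear, so the same computation is repeated for every ear detected in a long-shot image.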

Key words: Fusarium head blight, CNN, YOLOv8, attention mechanism, AR glasses