Received: April 01, 2024    Revised: May 06, 2024
Abstract: To achieve intelligent citrus picking, fast and accurate recognition of citrus in the orchard environment is critical. To address the poor environmental adaptability and low efficiency of existing object detection algorithms, this study proposes YOLOv8n-CMD (YOLOv8n citrus maturity detection), a lightweight citrus maturity detection algorithm based on the YOLOv8n model. First, the backbone network structure is optimized to improve the detection of small targets. Second, the CBAM attention mechanism is added to improve the classification performance of the model. Then, Ghost convolution is introduced and combined with the C2f module in the neck of the original YOLOv8 model to reduce the amount of computation and the number of parameters. Finally, the SimSPPF module replaces the original spatial pyramid pooling layer to improve detection efficiency. Experimental results show that, compared with the original model, YOLOv8n-CMD reduces the number of parameters and the amount of computation by 31.8% and 7.4%, respectively, and improves precision by 3.0%, making it better suited to citrus detection in the orchard environment.
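The abstract names three off-the-shelf building blocks: the CBAM attention module, Ghost convolution (fused with the neck C2f module), and SimSPPF in place of the original spatial pyramid pooling layer. The paper's exact implementation is not reproduced here; the following is a minimal PyTorch sketch of these modules as they are commonly defined in their source papers (CBAM, GhostNet, YOLOv6). All class names, channel arguments, and hyperparameters (reduction ratio 16, 7×7 spatial kernel, 5×5 pooling kernel) are illustrative assumptions rather than the authors' configuration.

```python
# Minimal sketches of the modules named in the abstract (assumptions, not the
# authors' code): CBAM attention, Ghost convolution, and SimSPPF.
import torch
import torch.nn as nn
import torch.nn.functional as F


class ChannelAttention(nn.Module):
    """CBAM channel attention: shared MLP over avg- and max-pooled features."""
    def __init__(self, channels, reduction=16):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Conv2d(channels, channels // reduction, 1, bias=False),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, 1, bias=False),
        )

    def forward(self, x):
        avg_out = self.mlp(F.adaptive_avg_pool2d(x, 1))
        max_out = self.mlp(F.adaptive_max_pool2d(x, 1))
        return x * torch.sigmoid(avg_out + max_out)


class SpatialAttention(nn.Module):
    """CBAM spatial attention: 7x7 conv over channel-wise avg and max maps."""
    def __init__(self, kernel_size=7):
        super().__init__()
        self.conv = nn.Conv2d(2, 1, kernel_size, padding=kernel_size // 2, bias=False)

    def forward(self, x):
        avg_map = x.mean(dim=1, keepdim=True)
        max_map = x.max(dim=1, keepdim=True).values
        return x * torch.sigmoid(self.conv(torch.cat([avg_map, max_map], dim=1)))


class CBAM(nn.Module):
    """Channel attention followed by spatial attention."""
    def __init__(self, channels, reduction=16, kernel_size=7):
        super().__init__()
        self.ca = ChannelAttention(channels, reduction)
        self.sa = SpatialAttention(kernel_size)

    def forward(self, x):
        return self.sa(self.ca(x))


class GhostConv(nn.Module):
    """Ghost convolution: a primary conv generates half the output channels,
    a cheap depthwise conv generates the other half (c_out assumed even)."""
    def __init__(self, c_in, c_out, k=1, s=1):
        super().__init__()
        c_hidden = c_out // 2
        self.primary = nn.Sequential(
            nn.Conv2d(c_in, c_hidden, k, s, k // 2, bias=False),
            nn.BatchNorm2d(c_hidden), nn.SiLU(inplace=True))
        self.cheap = nn.Sequential(
            nn.Conv2d(c_hidden, c_hidden, 5, 1, 2, groups=c_hidden, bias=False),
            nn.BatchNorm2d(c_hidden), nn.SiLU(inplace=True))

    def forward(self, x):
        y = self.primary(x)
        return torch.cat([y, self.cheap(y)], dim=1)


class SimSPPF(nn.Module):
    """Simplified SPPF (as in YOLOv6): three stacked 5x5 max-pools with
    ReLU-activated 1x1 convs instead of the SiLU convs used by SPPF."""
    def __init__(self, c_in, c_out, pool_k=5):
        super().__init__()
        c_hidden = c_in // 2
        self.cv1 = nn.Sequential(
            nn.Conv2d(c_in, c_hidden, 1, bias=False),
            nn.BatchNorm2d(c_hidden), nn.ReLU(inplace=True))
        self.cv2 = nn.Sequential(
            nn.Conv2d(c_hidden * 4, c_out, 1, bias=False),
            nn.BatchNorm2d(c_out), nn.ReLU(inplace=True))
        self.pool = nn.MaxPool2d(pool_k, stride=1, padding=pool_k // 2)

    def forward(self, x):
        x = self.cv1(x)
        y1 = self.pool(x)
        y2 = self.pool(y1)
        return self.cv2(torch.cat([x, y1, y2, self.pool(y2)], dim=1))


if __name__ == "__main__":
    x = torch.randn(1, 64, 80, 80)
    print(CBAM(64)(x).shape)            # torch.Size([1, 64, 80, 80])
    print(GhostConv(64, 128)(x).shape)  # torch.Size([1, 128, 80, 80])
    print(SimSPPF(64, 64)(x).shape)     # torch.Size([1, 64, 80, 80])
```

In a full reproduction these blocks would be wired into the YOLOv8n model definition (e.g. the neck C2f stage and the final pooling stage); the script above only checks that input and output shapes behave as expected.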
Citation:
XIAO Yang,XIANG Ming-Yu,LI Xi.Lightweight Citrus Maturity Detection Based on Improved YOLOv8n.COMPUTER SYSTEMS APPLICATIONS,2024,33(11):202-208