Computer Systems & Applications, 2020, 29(6): 89-96
Recognition Model of Crop Pests and Diseases Images Based on CNN
(1.Computer Network Information Center, Chinese Academy of Sciences, Beijing 100190, China;2.University of Chinese Academy of Sciences, Beijing 100049, China;3.Nanjing Institute of Environmental Sciences, Ministry of Ecology and Environment, Nanjing 210042, China)
Received: November 18, 2019    Revised: December 11, 2019
Abstract: China is a traditional agricultural country. Agriculture is not only the foundation for the construction and development of the national economy, but also the guarantee for the normal, stable, and orderly operation of society. However, the losses caused by crop pests and diseases each year are huge, and traditional methods for identifying crop pests and diseases do not perform well. Meanwhile, deep learning has developed rapidly in recent years and has made great progress in image classification and recognition. Therefore, this study builds an image recognition model for crop pests and diseases using deep learning methods, and improves the convolutional network's loss function to address the sample imbalance problem. Experiments show that the model can effectively identify crop pests and diseases, and that its accuracy is further improved after the loss function is optimized.
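The abstract does not spell out the specific loss-function modification used to counter sample imbalance. As a minimal, illustrative sketch (not the authors' exact method), a common approach is to re-weight the cross-entropy loss by inverse class frequency in a CNN classifier. All names below (NUM_CLASSES, class_counts, SimpleCNN) and the sample counts are assumptions for the example only, written against PyTorch.

```python
# Sketch: class-weighted cross-entropy for an imbalanced crop pest/disease dataset.
# The class counts, class number, and network are illustrative assumptions,
# not values from the paper.
import torch
import torch.nn as nn

NUM_CLASSES = 10  # assumed number of pest/disease categories
class_counts = torch.tensor([500., 120., 80., 300., 60.,
                             200., 90., 400., 150., 100.])  # assumed per-class sample counts

# Inverse-frequency weights, normalized so they average to 1 across classes.
weights = class_counts.sum() / (NUM_CLASSES * class_counts)

class SimpleCNN(nn.Module):
    """Small CNN for 3-channel crop images, purely illustrative."""
    def __init__(self, num_classes: int):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Linear(32 * 56 * 56, num_classes)  # assumes 224x224 input

    def forward(self, x):
        x = self.features(x)
        return self.classifier(x.flatten(1))

model = SimpleCNN(NUM_CLASSES)
criterion = nn.CrossEntropyLoss(weight=weights)  # weighted loss penalizes rare-class errors more

# One illustrative training step on random tensors standing in for real images.
images = torch.randn(8, 3, 224, 224)
labels = torch.randint(0, NUM_CLASSES, (8,))
loss = criterion(model(images), labels)
loss.backward()
```

Other choices, such as focal loss, serve the same goal of keeping majority classes from dominating the gradient; the weighted cross-entropy above is only one standard option.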
Funding: National Key Research and Development Program of China (2017YFC0505205); Regional Key Project of the CAS Science and Technology Service Network Initiative, Sichuan Wolong Nature Reserve Integrated Informatization Service Platform (Y82E01); CAS Informatization Special Project (XXH13505-03-205); Biodiversity Survey, Observation, and Assessment Project of the Ministry of Ecology and Environment
Citation:
SHI Bing-Ying, LI Jia-Qi, ZHANG Lei, LI Jian. Recognition Model of Crop Pests and Diseases Images Based on CNN. Computer Systems & Applications, 2020, 29(6): 89-96