Computer Systems & Applications, 2020, 29(2): 181-186
Vehicle Detection Based on Faster R-CNN and Incremental Learning
ZHANG Zi-Ying, WANG Min
(College of Computer and Information, Hohai University, Nanjing 210098, China)
Received: July 02, 2019    Revised: July 23, 2019
Abstract: With the research boom in deep learning, vehicle detection has gradually shifted from machine learning methods to deep learning methods in recent years. At present, most deep learning methods still produce false detections and missed detections of vehicle targets to varying degrees. To address the false and missed detection of small targets and the missed detection of truncated and overlapping (occluded) targets, a vehicle detection method based on an incrementally expanded training dataset is proposed; the method is combined with the Faster R-CNN algorithm to detect and classify vehicle targets. Finally, the effect of training with and without incremental learning is compared in two respects: subjective visual judgment and objective detection metrics. The experiments show that the vehicle detection method based on incremental learning and Faster R-CNN clearly reduces false and missed detections in the subjective evaluation; on the objective metrics, compared with training without incremental learning, the mAP increases by 4% with the VGG16 backbone and by 6% with the ResNet101 backbone.
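The abstract only outlines the pipeline, but the procedure it describes (fine-tune Faster R-CNN, then repeatedly grow the training set with re-annotated hard cases such as small, truncated, or occluded vehicles and fine-tune again) can be sketched as below. This is a minimal sketch under stated assumptions, not the authors' code: the paper evaluates VGG16 and ResNet101 backbones, whereas torchvision's off-the-shelf fasterrcnn_resnet50_fpn is used here only as an accessible stand-in, and the dataset objects, round structure, and hyperparameters are illustrative placeholders.

```python
# Minimal sketch of "Faster R-CNN + incremental dataset" training, assuming
# torchvision-style detection datasets that yield (image_tensor, target_dict)
# pairs. NOT the authors' implementation: backbone, rounds, and
# hyperparameters are illustrative assumptions. Requires torchvision >= 0.13.
import torch
from torch.utils.data import ConcatDataset, DataLoader
from torchvision.models.detection import fasterrcnn_resnet50_fpn
from torchvision.models.detection.faster_rcnn import FastRCNNPredictor


def build_model(num_classes):
    # Start from a COCO-pretrained detector and replace the box head so it
    # predicts the vehicle classes (plus background) of the target task.
    model = fasterrcnn_resnet50_fpn(weights="DEFAULT")
    in_features = model.roi_heads.box_predictor.cls_score.in_features
    model.roi_heads.box_predictor = FastRCNNPredictor(in_features, num_classes)
    return model


def train_one_round(model, loader, optimizer, device, epochs=1):
    # In train mode, torchvision's Faster R-CNN returns a dict of RPN and
    # detection-head losses; summing them gives the total training loss.
    model.train()
    for _ in range(epochs):
        for images, targets in loader:
            images = [img.to(device) for img in images]
            targets = [{k: v.to(device) for k, v in t.items()} for t in targets]
            losses = model(images, targets)
            loss = sum(losses.values())
            optimizer.zero_grad()
            loss.backward()
            optimizer.step()


def fit(model, dataset, device, epochs=1):
    loader = DataLoader(dataset, batch_size=2, shuffle=True,
                        collate_fn=lambda batch: tuple(zip(*batch)))
    params = [p for p in model.parameters() if p.requires_grad]
    optimizer = torch.optim.SGD(params, lr=0.005, momentum=0.9,
                                weight_decay=5e-4)
    train_one_round(model, loader, optimizer, device, epochs)


def incremental_training(model, base_dataset, hard_example_rounds, device):
    # Fit on the base set, then grow it round by round with newly annotated
    # hard examples (small / truncated / occluded vehicles the current model
    # misses) and fine-tune again -- the simplest reading of the abstract's
    # "incremental learning dataset" idea.
    pool = base_dataset
    fit(model, pool, device)
    for hard_pool in hard_example_rounds:
        pool = ConcatDataset([pool, hard_pool])
        fit(model, pool, device)
    return model
```

In this sketch, the baseline the abstract compares against would correspond to running fit(model, base_dataset, device) alone; the reported mAP gains (4% for VGG16, 6% for ResNet101) come from the additional incremental rounds.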
Citation:
ZHANG Zi-Ying, WANG Min. Vehicle Detection Based on Faster R-CNN and Incremental Learning. Computer Systems & Applications, 2020, 29(2): 181-186.