Pedestrian Detection Model Based on Improved Faster R-CNN with SENet

Author: Li Kewen, Li Xinyu
Affiliation:

Author Biography:

Corresponding Author:

CLC Number:

Fund Project:

Abstract:

Computer vision is an important branch of machine learning, and with the development of driverless and intelligent driving technology it faces increasingly strict requirements on the real-time performance and accuracy of video image detection. Existing pedestrian detection methods struggle to balance detection speed against detection accuracy. To address this problem, an improved Faster Region-based Convolutional Neural Network (Faster R-CNN) model is proposed: SE network (SENet) units are added to the backbone feature extraction module of Faster R-CNN, and the resulting model is applied to road pedestrian detection. The improved model not only achieves relatively high accuracy but also reaches a good detection rate on video, so its overall performance is better than that of the original Faster R-CNN. In experiments on the INRIA dataset and a private dataset, run on an NVIDIA GTX 1080Ti GPU under relatively modest experimental conditions, the model achieves a best mAP of 93.76% and a peak detection speed of 13.79 f/s, which basically meets the requirements of real-time pedestrian detection.
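This page does not include the paper's code, but the core building block it describes, a Squeeze-and-Excitation (SE) unit inserted into the Faster R-CNN backbone, can be sketched. The following is a minimal PyTorch sketch of a standard SE block in the style of Hu et al.; the class name, reduction ratio, and the example feature-map shape are illustrative assumptions, not the authors' exact configuration.

```python
# Minimal PyTorch sketch of a Squeeze-and-Excitation (SE) unit, the kind of
# module the paper inserts into the Faster R-CNN backbone. The reduction
# ratio (16) and placement are assumptions, not the authors' exact settings.
import torch
import torch.nn as nn

class SEBlock(nn.Module):
    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        # "Squeeze": global average pooling collapses each channel to a scalar.
        self.squeeze = nn.AdaptiveAvgPool2d(1)
        # "Excitation": a two-layer bottleneck MLP yields per-channel weights.
        self.excitation = nn.Sequential(
            nn.Linear(channels, channels // reduction, bias=False),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels, bias=False),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, _, _ = x.shape
        w = self.squeeze(x).view(b, c)           # (B, C)
        w = self.excitation(w).view(b, c, 1, 1)  # channel weights in (0, 1)
        return x * w                             # recalibrate the feature map

# Usage example: recalibrating a hypothetical 256-channel backbone feature map.
if __name__ == "__main__":
    feat = torch.randn(2, 256, 38, 50)
    se = SEBlock(256)
    out = se(feat)
    print(out.shape)  # torch.Size([2, 256, 38, 50])
```

In an SE-augmented backbone, such a block would typically be applied after a convolutional stage so that informative channels are amplified and less useful ones suppressed before the features reach the region proposal network.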

Cite This Article

Li Kewen, Li Xinyu. Pedestrian Detection Model Based on Improved Faster R-CNN with SENet. Computer Systems & Applications, 2020, 29(4): 266-271.
History
  • Received: 2019-08-02
  • Revised: 2019-09-09
  • Accepted:
  • Published online: 2020-04-09
  • Publication date: