Received: May 21, 2023  Revised: June 26, 2023
Abstract: To address the small target scale and low detection accuracy in traffic light detection, this study proposes a traffic light detection algorithm based on an improved YOLOv5s. First, a feature pyramid module, RSN-BiFPN, is constructed to fully fuse traffic light features at different scales and thereby reduce missed and false detections. Second, a new feature fusion layer and prediction head are introduced to improve the network's perception of small objects and enhance detection accuracy. Finally, the EIoU function is adopted to optimize the loss and accelerate network convergence. Experiments on the public S2TLD dataset show that, compared with the baseline network, the proposed method improves precision by 4.1 percentage points to 96.1%, recall by 3 percentage points to 95.9%, and average precision by 1.9 percentage points to 96.5%. Meanwhile, the improved algorithm achieves a faster detection speed of 22.7 frames per second. The proposed method thus enables rapid and accurate traffic light detection and can be widely applied in research on traffic light analysis on roads.
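The EIoU loss mentioned in the abstract extends the plain IoU loss with a center-distance penalty and separate width and height penalties, which speeds up box-regression convergence. Below is a minimal sketch of the general EIoU formulation, not the paper's actual implementation; the function name and the (x1, y1, x2, y2) box format are illustrative assumptions.

```python
def eiou_loss(pred, target):
    """EIoU loss for two axis-aligned boxes in (x1, y1, x2, y2) format.

    L_EIoU = 1 - IoU + rho^2(centers)/c^2 + dw^2/Cw^2 + dh^2/Ch^2,
    where c, Cw, Ch are the diagonal, width, and height of the
    smallest box enclosing both inputs.
    """
    px1, py1, px2, py2 = pred
    tx1, ty1, tx2, ty2 = target

    # Intersection and union for the IoU term.
    inter_w = max(0.0, min(px2, tx2) - max(px1, tx1))
    inter_h = max(0.0, min(py2, ty2) - max(py1, ty1))
    inter = inter_w * inter_h
    union = (px2 - px1) * (py2 - py1) + (tx2 - tx1) * (ty2 - ty1) - inter
    iou = inter / union if union > 0 else 0.0

    # Smallest enclosing box: its diagonal normalizes the center distance,
    # its width/height normalize the size-difference penalties.
    cw = max(px2, tx2) - min(px1, tx1)
    ch = max(py2, ty2) - min(py1, ty1)
    c2 = cw ** 2 + ch ** 2

    # Squared distance between box centers.
    rho2 = ((px1 + px2) - (tx1 + tx2)) ** 2 / 4 + ((py1 + py2) - (ty1 + ty2)) ** 2 / 4

    # Squared width and height differences.
    dw2 = ((px2 - px1) - (tx2 - tx1)) ** 2
    dh2 = ((py2 - py1) - (ty2 - ty1)) ** 2

    eps = 1e-9  # guard against division by zero for degenerate boxes
    return 1 - iou + rho2 / (c2 + eps) + dw2 / (cw ** 2 + eps) + dh2 / (ch ** 2 + eps)
```

For identical boxes the loss is 0; it grows as the predicted box drifts in position or differs in width/height from the ground truth, penalizing each factor separately rather than through an aggregate aspect ratio as CIoU does.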
Funding: National Natural Science Foundation of China (41875184, 41975183)
Citation:
WANG Jun, GE Bao-Kang, CHENG Yong. Traffic Light Detection Based on Improved YOLOv5s Algorithm. Computer Systems & Applications, 2023, 32(12): 243-252.