Automatic Reading Method of Electric Energy Meter Based on YOLOv3
Author:
Affiliation:
Fund Project: National Science and Technology Major Project of China (2017ZX05013-001)


    Abstract:

    With the continuous development of the smart grid, automatic meter-reading systems based on digital image processing are widely used for electric energy meters. To improve the accuracy of automatically recognizing the readings of traditional electric energy meters, a new reading-recognition method based on the YOLOv3 (You Only Look Once) network is proposed. For the meter image, a counter-localization model based on the YOLOv3-Tiny network is constructed and trained; the trained model locates the counter region, which is then cropped to produce a counter image. For the counter image, a digit-recognition model based on the YOLOv3 network is constructed and trained; the trained model recognizes the digits in the counter region. The public electric energy meter dataset released by the Federal University of Paraná, Brazil, was selected as the research object. Comparative experiments against the YOLOv2-Tiny localization model and the CR-NET recognition model show that the proposed method achieves higher localization and recognition accuracy.
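The two-stage pipeline described above ends by turning the per-digit detections of the recognition network into a meter reading. A minimal sketch of that final assembly step (assuming the recognition model returns one bounding box per digit with its class label; the function name, tuple format, and example values are illustrative, not from the paper):

```python
def assemble_reading(detections):
    """Assemble per-digit YOLO detections into the meter-reading string.

    Each detection is (digit_class, x_center, confidence). The digits on a
    counter are read left to right, so sorting boxes by their horizontal
    center recovers the display order.
    """
    ordered = sorted(detections, key=lambda d: d[1])  # sort left-to-right
    return "".join(str(digit) for digit, _, _ in ordered)

# Example: digit boxes returned out of order by the recognition model
dets = [(3, 0.52, 0.98), (0, 0.12, 0.99), (7, 0.91, 0.95), (1, 0.31, 0.97)]
print(assemble_reading(dets))  # -> "0137"
```

In practice one would also filter boxes below a confidence threshold and apply non-maximum suppression before sorting, as is standard for YOLO-family detectors.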

Cite this article:

龚安, 张洋, 唐永红. Automatic reading method of electric energy meter based on YOLOv3. Computer Systems & Applications, 2020, 29(1): 196-202.

History
  • Received: 2019-05-30
  • Revised: 2019-06-28
  • Accepted:
  • Published online: 2019-12-30
Copyright: Institute of Software, Chinese Academy of Sciences