Image Matching Algorithm Based on Improved SURF
Author:
Affiliation:
Author biography:
Corresponding author:
CLC number:
Fund project: National Natural Science Foundation of China (11701159)

    Abstract:

    To address the limited accuracy and speed of the traditional SURF (Speeded-Up Robust Features) algorithm, an optimized image matching algorithm is proposed. In the feature point extraction stage, local two-dimensional entropy is introduced to characterize the distinctiveness of feature points: the local two-dimensional entropy of each feature point is computed and an appropriate threshold is set to eliminate some of the false points. In the matching stage, the Euclidean distance is replaced by the Manhattan distance, and the concepts of nearest neighbor and second-nearest neighbor are introduced: for each feature point in the template image, the two feature points in the image to be matched with the smallest Manhattan distances are extracted, and the pair is accepted as a match only if the ratio of the nearest distance to the second-nearest distance is below a preset threshold T, thereby reducing false matches. Experimental results show that the proposed algorithm outperforms the traditional algorithm, with improvements in both accuracy and speed.
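    To illustrate the feature point screening step described above, the following Python/OpenCV sketch filters SURF keypoints by local two-dimensional entropy. The entropy definition used here (a joint histogram of pixel gray level and 3x3 neighborhood mean over a small window), as well as the window size, bin count, and threshold value, are illustrative assumptions rather than parameters taken from the paper; SURF itself requires the non-free opencv-contrib build.

        import cv2
        import numpy as np

        def local_2d_entropy(gray, x, y, win=9):
            # 2D entropy of a (win x win) patch centred on (x, y), computed from the
            # joint histogram of (pixel gray level, 3x3 neighbourhood mean).
            h = win // 2
            patch = gray[max(y - h, 0):y + h + 1, max(x - h, 0):x + h + 1].astype(np.float32)
            if patch.size == 0:
                return 0.0
            mean = cv2.blur(patch, (3, 3))                 # neighbourhood mean of each pixel
            hist, _, _ = np.histogram2d(patch.ravel(), mean.ravel(),
                                        bins=64, range=[[0, 255], [0, 255]])
            p = hist / hist.sum()
            p = p[p > 0]
            return float(-(p * np.log2(p)).sum())

        def filter_keypoints_by_entropy(gray, keypoints, threshold=3.0):
            # Keep only keypoints whose surrounding patch is distinctive enough,
            # i.e. whose local 2D entropy is at least `threshold` (illustrative value).
            kept = []
            for kp in keypoints:
                x, y = int(round(kp.pt[0])), int(round(kp.pt[1]))
                if local_2d_entropy(gray, x, y) >= threshold:
                    kept.append(kp)
            return kept

        # Usage (SURF lives in the non-free opencv-contrib module):
        # img = cv2.imread("template.png", cv2.IMREAD_GRAYSCALE)
        # surf = cv2.xfeatures2d.SURF_create(hessianThreshold=400)
        # kps = surf.detect(img, None)
        # kps = filter_keypoints_by_entropy(img, kps)
        # kps, desc = surf.compute(img, kps)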

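    The matching stage can be sketched in the same spirit: nearest/second-nearest neighbor matching under the Manhattan (L1) distance with a ratio test against a threshold T. The function name, the 0.7 default for T, and the use of raw descriptor arrays are assumptions for illustration, not values fixed by the paper.

        import numpy as np

        def match_ratio_manhattan(desc_t, desc_q, T=0.7):
            # Match each template descriptor to the query descriptors using the Manhattan
            # (L1) distance; accept a match only if the nearest distance is less than
            # T times the second-nearest distance (nearest / second-nearest ratio test).
            matches = []
            for i, d in enumerate(desc_t):
                dist = np.abs(desc_q - d).sum(axis=1)      # L1 distance to every query descriptor
                if dist.size < 2:
                    continue
                nearest, second = np.argsort(dist)[:2]
                if dist[nearest] < T * dist[second]:
                    matches.append((i, int(nearest)))
            return matches

        # Usage with the descriptors from the previous sketch:
        # good = match_ratio_manhattan(desc_template, desc_query, T=0.7)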
Cite this article:

陈雪松, 陈秀芳, 毕波, 唐锦萍. 基于改进SURF的图像匹配算法 [Image Matching Algorithm Based on Improved SURF]. 计算机系统应用 (Computer Systems & Applications), 2020, 29(12): 222-227

History
  • Received: 2020-04-13
  • Revised: 2020-06-23
  • Accepted:
  • Available online: 2020-12-02
  • Published: