Geometric Distortion Correction Based on Uniformly-Spaced Lines
Authors: 管海兵 (Guan Haibing), 蒙水金 (Meng Shuijin), 彭中伟 (Peng Zhongwei)

Abstract:

Geometric distortion in camera vision measurement systems reduces the accuracy of dimensional measurement, so it must be corrected. Existing single-image correction methods fall into two categories: non-parametric methods based on a transfer matrix between the feature points of a calibration pattern and their images, and parametric methods based on geometric constraints such as the collinearity of feature points or vanishing points. Both require the coordinates of many feature points, and locating isolated points is error-prone, which degrades the correction. This paper proposes a uniformly-spaced lines method: the distortion model yields a relation between line spacing and the distortion parameters, so the parameters can be computed from the line spacings detected in the image. Finally, distorted images are corrected with the uniformly-spaced lines method; the results show that it requires little computation and corrects the distortion well.
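The method as described decomposes into three steps: detect the line positions in the distorted image, fit the distortion parameter from their spacings, and resample the image. The Python sketches below illustrate one plausible realization; they assume a single-parameter polynomial radial model, x_d = x_u(1 + k·r_u²), with the distortion centre at the image centre (the abstract fixes neither), and every function name in them is hypothetical. First, a crude detector for the x-coordinates where dark vertical lines cross the middle row of a grayscale grid image, standing in for whatever (likely sub-pixel) detector the paper actually uses:

```python
# A minimal sketch (not the paper's detector): locate the x-centres of
# dark vertical lines along the middle row of a distorted grid image.
import numpy as np

def detect_line_positions(gray):
    """gray: 2-D array, dark vertical lines on a light background.
    Returns the x-centre of each line crossing the middle row."""
    row = gray[gray.shape[0] // 2, :].astype(float)
    dark = row < row.mean() - 0.5 * row.std()   # crude line/background split
    d = np.r_[0, dark.astype(int), 0]           # pad so every run closes
    edges = np.flatnonzero(np.diff(d))          # alternating run starts/ends
    starts, ends = edges[::2], edges[1::2]
    return (starts + ends - 1) / 2.0            # centre of each dark run
```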

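Next, the spacing-to-parameter relation the abstract mentions: on the central row the model reads x_d − c_x = (x_u − c_x)(1 + k(x_u − c_x)²), and lines that are uniformly spaced in the undistorted image satisfy x_u = offset + i·spacing. Fitting (k, offset, spacing) to the measured crossings by least squares recovers the distortion parameter; this sketch uses SciPy rather than whatever closed-form solution the paper derives:

```python
# A sketch, under the assumed single-parameter radial model, of computing
# the distortion coefficient k from the measured line positions.
import numpy as np
from scipy.optimize import least_squares

def fit_distortion(measured_x, cx):
    """measured_x: x-coordinates of the line crossings on the middle row
    of the distorted image; cx: assumed distortion-centre x-coordinate.
    Returns (k, offset, spacing) of the best-fitting model."""
    measured = np.asarray(measured_x, dtype=float) - cx
    idx = np.arange(len(measured))

    def residuals(params):
        k, offset, spacing = params
        x_u = offset + spacing * idx                 # ideal uniform positions
        return x_u * (1.0 + k * x_u**2) - measured   # model minus measurement

    # start from "no distortion" with the mean measured gap as the spacing
    guess = [0.0, measured[0], float(np.mean(np.diff(measured)))]
    return least_squares(residuals, guess).x
```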
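Finally, the correction step: for each pixel of the undistorted output, compute where it falls in the distorted source under the same model and sample there, here with OpenCV's remap (again an assumed implementation, not the paper's):

```python
# A sketch of the correction: resample the distorted image onto an
# undistorted grid under x_d = x_u * (1 + k * r_u**2), with the
# distortion centre assumed at the image centre.
import cv2
import numpy as np

def undistort(image, k):
    h, w = image.shape[:2]
    cx, cy = (w - 1) / 2.0, (h - 1) / 2.0
    xs, ys = np.meshgrid(np.arange(w, dtype=np.float32),
                         np.arange(h, dtype=np.float32))
    dx, dy = xs - cx, ys - cy
    scale = 1.0 + k * (dx * dx + dy * dy)            # radial factor 1 + k*r_u^2
    map_x = (cx + dx * scale).astype(np.float32)     # where each output pixel
    map_y = (cy + dy * scale).astype(np.float32)     # sits in the source image
    return cv2.remap(image, map_x, map_y, interpolation=cv2.INTER_LINEAR)
```

With a grid image loaded via cv2.imread('grid.png', cv2.IMREAD_GRAYSCALE), the chain would be: positions = detect_line_positions(gray); k, _, _ = fit_distortion(positions, (gray.shape[1] - 1) / 2.0); corrected = undistort(gray, k).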
Cite this article:

管海兵, 蒙水金, 彭中伟. 基于等间距线的几何畸变校正. 计算机系统应用, 2011, 20(11): 107-109, 122
(Guan Haibing, Meng Shuijin, Peng Zhongwei. Geometric Distortion Correction Based on Uniformly-Spaced Lines. Computer Systems & Applications, 2011, 20(11): 107-109, 122)

History
  • Received: 2011-03-18
  • Revised: 2011-04-07