Computer Systems & Applications, 2015, 24(10): 111-115
Novel Moving Target Detection Algorithm Based on Improved Difference between Frames and Optical Flow
(School of Mechanical Engineering, Jiangsu University of Science and Technology, Zhenjiang 212003, China)
Received:February 05, 2015    Revised:April 15, 2015
Abstract: To overcome cavities and false targets in moving-target detection, a novel detection method based on an improved inter-frame difference and an improved optical flow is proposed. First, seven successive frames are preprocessed, differenced, gray-scale transformed, and binarized; the first three and the last three of the resulting binary difference images are accumulated separately, and a logical AND of the two accumulations yields a rough region of the moving target in the middle frame. Next, the middle frame is differenced with a background frame, and edge extraction, binarization, and pixel-wise arithmetic operations on the result give an accurate region of the moving target in the middle frame. On this basis, accurate motion information is obtained with the improved optical flow method. Finally, threshold segmentation and morphological processing complete the segmentation of the target. Comparative experiments show that the proposed method detects and segments moving targets accurately and quickly.
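The first stage of the abstract (seven-frame differencing, separate accumulation of the first and last three binary difference images, then a logical AND) can be sketched as follows. This is a minimal NumPy illustration on a synthetic moving square, not the authors' implementation: the threshold value, OR-style accumulation, and the synthetic data are all assumptions made for the example.

```python
import numpy as np

def rough_motion_mask(frames, thresh=20):
    """Difference 7 consecutive grayscale frames, binarize, accumulate
    the first three and last three difference images, and AND the two
    accumulations to localize motion in the middle (4th) frame.
    `thresh` is an illustrative binarization threshold."""
    assert len(frames) == 7
    diffs = [np.abs(frames[i + 1].astype(int) - frames[i].astype(int)) > thresh
             for i in range(6)]               # six binary difference images
    front = diffs[0] | diffs[1] | diffs[2]    # accumulate first three
    back = diffs[3] | diffs[4] | diffs[5]     # accumulate last three
    return front & back                       # logical AND -> rough region

# Synthetic example: a 5x5 bright square sliding right by 1 px per frame.
frames = []
for t in range(7):
    img = np.zeros((32, 32), dtype=np.uint8)
    img[10:15, 5 + t:10 + t] = 200
    frames.append(img)

mask = rough_motion_mask(frames)  # True only where both accumulations agree
```

The AND keeps only pixels marked as moving in both the early and the late halves of the window, which is what suppresses the trailing "ghost" regions that plain frame differencing leaves behind; the later background-difference and optical-flow stages then refine this rough region.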
Foundation item: Talent Project of Jiangsu University of Science and Technology (35020902)
Citation:
JI Ming, WANG Hong-Ru, TONG Wei. Novel Moving Target Detection Algorithm Based on Improved Difference between Frames and Optical Flow. Computer Systems & Applications, 2015, 24(10): 111-115