 Computer Systems & Applications, 2018, Vol. 27, Issue (9): 25–32

Hierarchical Belief Propagation for Optical Flow Estimation
ZHANG Zi-Xing, WEN Ying
Department of Computer Science and Technology, East China Normal University, Shanghai 200062, China
Foundation item: National Natural Science Foundation of China (61773166); Natural Science Foundation of Shanghai Municipality (17ZR1408200)
Abstract: As an effective way to find correspondences between images, Belief Propagation (BP) has been widely used for optical flow estimation in recent years. Nevertheless, applying it directly to high-accuracy, large-displacement optical flow estimation requires a huge label space and a long processing time. To overcome this drawback of BP, we propose a Hierarchical Belief Propagation (HBP) algorithm to estimate high-accuracy, large-displacement optical flow. We treat the input images as Markov Random Fields (MRFs). To accelerate computation, we perform BP on hierarchical MRFs, i.e., a superpixel MRF and a pixel MRF. The coarse displacements obtained on the superpixel MRF serve as a reference that constrains the label space on the pixel MRF to a much smaller size. Based on this constrained label space, we can estimate accurate optical flow efficiently. Experiments on the MPI Sintel dataset show that the proposed method is competitive in both speed and accuracy.
Key words: optical flow; large displacements; Markov Random Fields (MRFs); Belief Propagation (BP); superpixel

1 Introduction

 Figure 1. Flowchart of the proposed hierarchical belief propagation optical flow estimation method
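The pipeline in Figure 1 first estimates a coarse flow per superpixel, then restricts each pixel's candidate labels to a window around its superpixel's flow. A minimal sketch of that label-space constraint (the function name and the square window shape are illustrative assumptions; the radius is called `d` here for illustration):

```python
def constrained_labels(sp_flow, d):
    """Enumerate pixel-level candidate flow labels around the coarse
    flow (u0, v0) estimated for the pixel's superpixel: all integer
    flows within radius d, i.e. a (2d+1)^2 label space instead of the
    full search range."""
    u0, v0 = sp_flow
    return [(u0 + du, v0 + dv)
            for du in range(-d, d + 1)
            for dv in range(-d, d + 1)]
```

With `d = 1` this yields only 9 candidates per pixel, which is what makes pixel-level BP affordable after the superpixel pass.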

2 Related Work

 $E(f) = \sum\limits_{p \in {\cal{P}}} {{C_p}({f_p})} + \sum\limits_{(p,q) \in N} S ({f_p},{f_q})$ (1)

 $m_{pq}^t({f_q}) = \mathop {\min }\limits_{{f_p}} \left( S({f_p},{f_q}) + {C_p}({f_p}) + \sum\limits_{s \in {N_p}\backslash q} {m_{sp}^{t - 1}({f_p})} \right)$ (2)

 ${b_p}({f_p}) = {C_p}({f_p}) + \sum\limits_{q \in {N_p}} {{m_{qp}}({f_p})}$ (3)
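Equations (2) and (3) define min-sum message passing and the resulting beliefs. A minimal, self-contained sketch on a 1D chain MRF with a linear pairwise cost (the chain topology, synchronous update schedule, and cost model are illustrative assumptions, not the paper's exact grid setup):

```python
import numpy as np

def min_sum_bp(unary, smooth_weight, n_iters=10):
    """Synchronous min-sum belief propagation on a 1D chain MRF.

    unary: (n_nodes, n_labels) data costs C_p(f_p).
    Pairwise cost S(f_p, f_q) = smooth_weight * |f_p - f_q|.
    Returns beliefs b_p(f_p) = C_p(f_p) + sum of incoming messages (eq. (3)).
    """
    n, k = unary.shape
    labels = np.arange(k)
    pair = smooth_weight * np.abs(labels[:, None] - labels[None, :])  # S(f_p, f_q)

    # msg[0, p] = message into node p from its left neighbor,
    # msg[1, p] = message into node p from its right neighbor
    msg = np.zeros((2, n, k))
    for _ in range(n_iters):
        new = np.zeros_like(msg)
        for p in range(n):
            if p + 1 < n:
                # eq. (2): message from p to p+1 excludes the message
                # that came from p+1, so only the left-incoming one is summed
                h = unary[p] + msg[0, p]
                new[0, p + 1] = (h[:, None] + pair).min(axis=0)
            if p - 1 >= 0:
                h = unary[p] + msg[1, p]
                new[1, p - 1] = (h[:, None] + pair).min(axis=0)
        msg = new

    return unary + msg[0] + msg[1]  # eq. (3)
```

Running it on a toy chain where only one node has a confident data term shows the smoothness prior propagating that label to the rest of the chain.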

3 Proposed Method

 $E(f) = \sum\limits_{p \in {\cal P}} {\left(\alpha {C_{\rm {color}}}({f_p}) + \beta {C_{\rm {desc}}}({f_p})\right)} + \sum\limits_{(p,q) \in N} {S({f_p},{f_q})}$ (4)

 ${C_p}({f_p}) = \alpha {C_{\rm {color}}}({f_p}) + \beta {C_{\rm {desc}}}({f_p})$ (5)
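Equation (5) combines a color cost and a descriptor cost with weights α and β. As a hedged illustration, the sketch below uses the absolute intensity difference for C_color and the Hamming distance between 3×3 census signatures for C_desc; the census choice, function names, and default weights are illustrative assumptions:

```python
import numpy as np

def census3(img):
    """3x3 census transform: an 8-bit signature per interior pixel,
    one bit per neighbor-vs-center comparison."""
    h, w = img.shape
    sig = np.zeros((h - 2, w - 2), dtype=np.uint8)
    center = img[1:-1, 1:-1]
    bit = 0
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            if dy == 0 and dx == 0:
                continue
            nb = img[1 + dy:h - 1 + dy, 1 + dx:w - 1 + dx]
            sig |= (nb > center).astype(np.uint8) << bit
            bit += 1
    return sig

def data_cost(i1, i2, c1, c2, y, x, fy, fx, alpha=0.1, beta=1.0):
    """C_p(f_p) = alpha * C_color + beta * C_desc for one interior pixel
    (y, x) and one candidate flow label (fy, fx)."""
    ty, tx = y + fy, x + fx
    c_color = abs(float(i1[y, x]) - float(i2[ty, tx]))
    # Hamming distance between the two census signatures
    c_desc = bin(int(c1[y - 1, x - 1]) ^ int(c2[ty - 1, tx - 1])).count("1")
    return alpha * c_color + beta * c_desc
```

At the true displacement both terms vanish, so the combined cost is zero; elsewhere the census term in particular stays discriminative under illumination changes.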

3.1 Hierarchical Structure

 ${m_{pq}}({u_q},{v_q}) = \mathop {\min }\limits_{{v_p}} \left( {m_{pq|{v_p}}}({u_q}) + \rho ({v_q} - {v_p}) \right)$ (6)
 ${m_{pq|{v_p}}}({u_q}) = \mathop {\min }\limits_{{u_p}} \left( {\phi _{pq}}({u_p},{v_p}) + \rho ({u_q} - {u_p}) \right)$ (7)
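Equations (6) and (7) reduce the 2D message update to two 1D min-convolutions. For a (truncated) linear cost ρ, each 1D lower envelope can be computed in linear time with a forward and a backward sweep plus a truncation cap (cf. Figure 2); a sketch under that assumption, with names chosen for illustration:

```python
import numpy as np

def min_convolution_linear(h, weight, trunc=np.inf):
    """Lower envelope out[u] = min_{u_p} ( h[u_p] + rho(u - u_p) ) for
    the truncated linear cost rho(d) = min(weight * |d|, trunc).
    Two O(k) sweeps handle the linear part; a final elementwise min
    applies the truncation."""
    out = h.astype(float).copy()
    k = len(out)
    for u in range(1, k):              # forward sweep: costs from the left
        out[u] = min(out[u], out[u - 1] + weight)
    for u in range(k - 2, -1, -1):     # backward sweep: costs from the right
        out[u] = min(out[u], out[u + 1] + weight)
    if np.isfinite(trunc):             # truncation cap
        out = np.minimum(out, h.min() + trunc)
    return out
```

This is what makes the per-message cost linear in the number of labels instead of quadratic, which in turn makes BP over a 2D flow label space tractable.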

 Figure 2. Computing the lower envelope with the min-convolution algorithm

3.2 Data Term Definition

3.3 Multi-Frame Information

 ${C_p}({f_p}) = \min \left \{ {C_{t,t - 1}}( - {f_p}),{C_{t,t + 1}}({f_p}),{C_{t,t + 2}}(2{f_p})\right \}$ (8)
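Equation (8) fuses matching costs against frames t−1, t+1, and t+2 by taking their minimum, assuming locally constant velocity so that frame t−1 is reached with −f_p and frame t+2 with 2f_p. A minimal sketch (the callable cost interface and the example cost values are illustrative assumptions):

```python
def multi_frame_cost(cost_prev, cost_next, cost_next2, f):
    """Eq. (8): C_p(f_p) = min{ C_{t,t-1}(-f_p),
                                C_{t,t+1}(f_p),
                                C_{t,t+2}(2 f_p) }.
    Each cost_* is a callable mapping a flow vector (fy, fx) to a
    matching cost against the corresponding frame."""
    fy, fx = f
    return min(cost_prev((-fy, -fx)),
               cost_next((fy, fx)),
               cost_next2((2 * fy, 2 * fx)))
```

Taking the minimum lets a pixel that is occluded in one frame still receive a reliable cost from another frame.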

3.4 Post-Processing

 Figure 3. Illustration of multi-frame information

4 Experiments and Analysis

4.1 Parameter Analysis

 Figure 4. Effect of different values of d on the estimation error

 Figure 5. Effect of different values of δ on the estimation error

4.2 Results on the MPI Sintel Dataset

 Figure 6. Visual comparison between the proposed method and related methods

5 Conclusion
