zbMATH — the first resource for mathematics

A homotopy-based approach for computing defocus blur and affine transform simultaneously. (English) Zbl 1138.68503
Summary: This paper presents a homotopy-based algorithm for the simultaneous recovery of defocus blur and the affine parameters of apparent shifts between planar patches of two images. These parameters are recovered from two images of the same scene acquired by a camera moving in time and/or space and whose intrinsic parameters are known. Using a truncated Taylor expansion, one of the images (and its partial derivatives) is expressed as a function of the partial derivatives of the two images, the blur difference, the affine parameters, and a continuation parameter derived from homotopy methods. All of these unknowns can thus be computed directly by solving a system of equations at a single scale. The proposed algorithm is tested on synthetic and real images. The results confirm that dense and accurate estimates of these parameters can be obtained.
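The core numerical device named in the summary is homotopy continuation: the unknowns (blur difference and affine parameters) are embedded in a nonlinear system that is deformed from an easy problem to the target one as a continuation parameter moves from 0 to 1. The following sketch illustrates that idea only; the system F here is a toy stand-in, not the paper's actual blur/affine equations.

```python
# Illustrative sketch (not the authors' implementation): a Newton-type
# homotopy continuation solver for a nonlinear system F(x) = 0, as
# outlined in the summary. The toy system F below is an assumption
# standing in for the paper's blur/affine equations.
import numpy as np

def newton_correct(H, J, x, t, iters=20, tol=1e-12):
    """Correct x so that H(x, t) is (approximately) zero via Newton steps."""
    for _ in range(iters):
        r = H(x, t)
        if np.linalg.norm(r) < tol:
            break
        x = x - np.linalg.solve(J(x, t), r)
    return x

def homotopy_solve(F, JF, x0, steps=50):
    """Track the zero of H(x, t) = (1 - t)(x - x0) + t F(x) from t=0 to t=1.

    At t = 0 the solution is trivially x0; at t = 1 it solves F(x) = 0.
    Each step warm-starts Newton from the previous point on the path.
    """
    H = lambda x, t: (1.0 - t) * (x - x0) + t * F(x)
    J = lambda x, t: (1.0 - t) * np.eye(len(x0)) + t * JF(x)
    x = x0.copy()
    for t in np.linspace(0.0, 1.0, steps + 1)[1:]:
        x = newton_correct(H, J, x, t)
    return x

# Toy 2x2 system (hypothetical stand-in for the blur/affine equations):
# F1 = x0^2 + x1 - 3,  F2 = x0 + x1^2 - 5
F = lambda x: np.array([x[0]**2 + x[1] - 3.0, x[0] + x[1]**2 - 5.0])
JF = lambda x: np.array([[2.0 * x[0], 1.0], [1.0, 2.0 * x[1]]])

root = homotopy_solve(F, JF, np.array([0.5, 0.5]))
print(root)  # a zero of F, reached by following the continuation path
```

The attraction of this scheme, and the reason the summary stresses a single scale, is that the trivial start system guides Newton's method along a solution path instead of relying on a coarse-to-fine pyramid for initialization.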

68T10 Pattern recognition, speech recognition
68T45 Machine vision and scene understanding
Full Text: DOI
[1] Horn, B.K.P.; Schunck, B.G., Determining optical flow, Artif. intell., 17, 185-204, (1981)
[2] B. Lucas, T. Kanade, An iterative image registration technique with an application to stereo vision, in: Proceedings of DARPA Image Understanding Workshop, 1981, pp. 121-130.
[3] Barron, J.L.; Fleet, D.J.; Beauchemin, S.S., Performance of optical flow techniques, Int. J. comput. vision, 12, 43-77, (1994)
[4] Lavest, J.M.; Rives, G.; Dhome, M., 3D reconstruction by zooming, IEEE trans. robotics autom., 9, 2, 196-208, (1993)
[5] Waxman, A.M.; Wohn, K., Contour evolution, neighbourhood deformation and global image flow: planar surface in motion, Int. J. robotics res., 4, 95-108, (1985)
[6] Werkhoven, P.; Koenderink, J.J., Extraction of motion parallax structure in the visual system I, Biol. cybern., 63, 185-191, (1990) · Zbl 0705.92026
[7] D.G. Jones, J. Malik, A computational framework for determining stereo correspondence from a set of linear spatial filters, in: Proceedings of the 2nd European Conference on Computer Vision (ECCV), 1992, pp. 395-410.
[8] R. Manmatha, A framework for recovering affine transforms using points, lines, or image brightness, in: Proceedings of IEEE, International Conference on Computer Vision and Pattern Recognition, 1994, pp. 141-146.
[9] Schechner, Y.Y.; Kiryati, N., Depth from defocus vs. stereo: how different really are they?, Int. J. comput. vision, 39, 141-162, (2000) · Zbl 1060.68646
[10] F. Deschênes, D. Ziou, P. Fuchs, Enhanced depth from defocus estimation: tolerance to spatial displacements, in: Proceedings of the International Conference on Image and Signal Processing, vol. 2, Agadir, Morocco, May 3-5, 2001, pp. 978-985.
[11] Deschênes, F.; Ziou, D.; Fuchs, P., A unified approach for a simultaneous and cooperative estimation of defocus blur and spatial shifts, Image vision comput., 22, 1, 35-58, (2004)
[12] Negahdaripour, S., Revised definition of optical flow: integration of radiometric and geometric cues for dynamic scene analysis, IEEE trans. pattern anal. Mach. intell., 20, 9, 961-979, (1998)
[13] Myles, Z.; Lobo, N.V., Recovering affine motion and defocus blur simultaneously, IEEE trans. pattern anal. Mach. intell., 20, 6, 652-658, (1998)
[14] Black, M.J.; Fleet, D.J.; Yacoob, Y., Robustly estimating changes in image appearance, Comput. vision image understanding, 78, 8-31, (2000)
[15] Haussecker, H.W.; Fleet, D.J., Computing optical flow with physical models of brightness variation, IEEE trans. pattern anal. Mach. intell., 23, 6, 661-673, (2001)
[16] H. Jin, P. Favaro, S. Soatto, Real-time feature tracking and outlier rejection with changes in illumination, in: Proceedings of the 8th International Conference on Computer Vision, 2001, pp. 684-689.
[17] Campani, M.; Verri, A., Motion analysis from first-order properties of optical flow, Comput. vision graphics image process., 50, 1, 90-107, (1992) · Zbl 0780.68132
[18] Chou, W.-S.; Chen, Y.-C., Estimation of the velocity field of two-dimensional deformable motion, Pattern recognition, 26, 2, 351-364, (1993)
[19] Bergen, J.R.; Burt, P.J.; Hingorani, R.; Peleg, S., A three-frame algorithm for estimating two-component image motion, IEEE trans. pattern anal. Mach. intell., 14, 9, 886-896, (1992)
[20] A. Kubota, K. Kodama, K. Aizawa, Registration and blur estimation methods for multiple differently focused images, in: International Conference on Image Processing, vol. 2, Kobe, Japan, October 25-28, 1999, pp. 447-451.
[21] Zhang, Y.; Wen, C.; Zhang, Y., Estimation of motion parameters from blurred images, Pattern recognition lett., 21, 5, 425-433, (2000)
[22] Y. Zhang, C. Wen, Y. Zhang, Simultaneously recovering affine motion and defocus blur using moments, in: Proceedings of the 15th International Conference on Pattern Recognition (ICPR’00), Barcelona, 2000, pp. 881-884.
[23] Flusser, J.; Suk, T., Degraded image analysis: an invariant approach, IEEE trans. pattern anal. Mach. intell., 20, 6, 590-603, (1998)
[24] Deschênes, F.; Ziou, D.; Fuchs, P., Improved estimation of defocus blur and spatial shifts in spatial domain: a homotopy-based approach, Pattern recognition, 36, 2105-2125, (2003) · Zbl 1035.68126
[25] U. Mudenagudi, S. Chaudhuri, Depth estimation using defocused stereo image pairs, in: Proceedings of the 7th IEEE International Conference on Computer Vision (ICCV), Kerkyra, Greece, 1999, pp. 483-488.
[26] Ziou, D.; Deschênes, F., Depth from defocus estimation in spatial domain, Comput. vision image understanding, 81, 2, 143-165, (2001) · Zbl 1011.68545
[27] Papoulis, A., Systems and transforms with applications in optics, (1968), McGraw-Hill New York
[28] Subbarao, M.; Surya, G., Depth from defocus: a spatial domain approach, Int. J. comput. vision, 13, 3, 271-294, (1994)
[29] Massey, W.S., A basic course in algebraic topology, (1991), Springer New York, USA · Zbl 0725.55001
[30] Martínez, J.M., Algorithms for solving nonlinear systems of equations, (), 81-108 · Zbl 0828.90125
[31] Allgower, E.L.; Georg, K., Simplicial and continuation methods for approximating fixed points and solutions to systems of equations, SIAM rev., 22, 1, 29-85, (1980) · Zbl 0432.65027
[32] Melville, R.C.; Trajkovic, L.; Fang, S.C.; Watson, L.T., Artificial parameter homotopy methods for the DC operating point problem, IEEE trans. CAD, 12, 861-877, (1993)
[33] Stonick, V.L.; Alexander, S.T., Global optimal rational approximation using homotopy continuation methods, IEEE trans. signal process., 40, 9, 2358-2361, (1992)
[34] Watson, L.T., Globally convergent homotopy algorithms for nonlinear systems of equations, Nonlinear dyn., 1, 143-191, (1990)
[35] F.M. Coetzee, V.L. Stonick, Sequential homotopy-based computation of multiple solutions to nonlinear equations, in: Proceedings of the International Conference on Acoustics, Speech and Signal Processing (ICASSP), May 1995.
[36] J. Verschelde, Homotopy continuation methods for solving polynomial systems, Ph.D. Thesis, Katholieke Universiteit Leuven, 1996.
[37] Li, T.Y., Solving polynomial systems, Math. intelligencer, 9, 3, 33-39, (1987) · Zbl 0637.65047
[38] F. Deschênes, D. Ziou, P. Fuchs, A homotopy-based approach for computing defocus blur and geometric transformations simultaneously, Technical Report No. 282, DMI, Université de Sherbrooke, 2002.
[39] Xiong, Y.; Shafer, S.A., Hypergeometric filters for optical flow and affine matching, Int. J. comput. vision, 24, 2, 163-177, (1997)
[40] Farid, H.; Simoncelli, E.P., Range estimation by optical differentiation, J. opt. soc. am., 15, 7, 1777-1786, (1998)
[41] Lee, H.C., Review of image-blur models in a photographic system using the principles of optics, Opt. eng., 29, 5, 405-421, (1990)