
Patch-based image restoration using expectation propagation. (English) Zbl 1483.94014

Summary: This paper presents a new Expectation Propagation (EP) framework for image restoration using patch-based prior distributions. While Monte Carlo techniques are classically used to sample from intractable posterior distributions, they can suffer from scalability issues in high-dimensional inference problems such as image restoration. To address this issue, EP is used here to approximate the posterior distributions using products of multivariate Gaussian densities. Moreover, imposing structural constraints on the covariance matrices of these densities allows for greater scalability and distributed computation. While the method is naturally suited to handle additive Gaussian observation noise, it can also be extended to non-Gaussian noise. Experiments conducted for denoising, inpainting, and deconvolution problems with Gaussian and Poisson noise illustrate the potential benefits of such a flexible approximate Bayesian method for uncertainty quantification in imaging problems, at a reduced computational cost compared to sampling techniques.
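The core EP step the summary describes, replacing an intractable factor with a multivariate Gaussian and refining it by moment matching, can be illustrated in one dimension. The following sketch is not the paper's method: it uses a toy Laplace prior factor, a single Gaussian site, and numerical quadrature, all chosen only to make the moment-matching update concrete.

```python
import numpy as np

# Toy 1D sketch of an EP site update: a non-Gaussian prior factor
# (here a Laplace density, illustrative only) is replaced by a Gaussian
# "site" whose parameters come from matching the moments of the tilted
# distribution, computed here by quadrature on a grid.
y, s2, b = 1.5, 0.5, 1.0              # observation, noise variance, Laplace scale
grid = np.linspace(-10.0, 10.0, 4001)

# Cavity distribution: the Gaussian likelihood alone (site removed),
# in natural parameters (precision, precision * mean).
cav_tau, cav_nu = 1.0 / s2, y / s2

# Tilted distribution: cavity times the exact Laplace factor.
log_t = -0.5 * cav_tau * grid**2 + cav_nu * grid - np.abs(grid) / b
w = np.exp(log_t - log_t.max())
w /= w.sum()
m = float(np.sum(w * grid))               # tilted mean
v = float(np.sum(w * (grid - m) ** 2))    # tilted variance

# Moment matching: the new Gaussian site is tilted / cavity
# in natural parameters.
site_tau = 1.0 / v - cav_tau
site_nu = m / v - cav_nu

# Resulting Gaussian posterior approximation (likelihood * site):
# mean is shrunk toward 0 relative to y, and the variance gives the
# kind of uncertainty estimate the summary refers to.
post_var = 1.0 / (cav_tau + site_tau)
post_mean = post_var * (cav_nu + site_nu)
print(post_mean, post_var)
```

With a single non-Gaussian factor, one moment-matching pass is exact; with many coupled patch-based factors, as in the paper, EP cycles through such site updates until the Gaussian approximations are mutually consistent, and structured covariance matrices keep each update cheap.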

MSC:

94A08 Image processing (compression, reconstruction, etc.) in information and communication theory
62F15 Bayesian inference
68U10 Computing methodologies for image processing
