
A concavity property for the reciprocal of Fisher information and its consequences on Costa’s EPI. (English) Zbl 1400.94103

Summary: We prove that the reciprocal of the Fisher information of a log-concave probability density \(X\) in \(\mathbb{R}^n\) is concave in \(t\) with respect to the addition of Gaussian noise \(Z_t = N(0, t I_n)\). As a byproduct of this result, we show that the third derivative of the entropy power of a log-concave probability density \(X\) in \(\mathbb{R}^n\) is nonnegative in \(t\) with respect to the addition of Gaussian noise \(Z_t\). For log-concave densities this improves the well-known concavity property of the entropy power due to Costa [M. H. M. Costa, IEEE Trans. Inf. Theory 31, 751–760 (1985; Zbl 0585.94006)].
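In symbols, writing \(I\) for the Fisher information and \(N(X) = \frac{1}{2\pi e}\, e^{2h(X)/n}\) for the entropy power (the standard definitions, not restated in the summary, with \(h\) the differential entropy), the two results of the summary can be sketched as:

```latex
% Concavity of the reciprocal of Fisher information along the heat
% semigroup (for log-concave X, the main result of the paper):
\frac{d^2}{dt^2}\,\frac{1}{I(X + Z_t)} \;\le\; 0, \qquad t > 0,
% which yields the nonnegativity of the third derivative of the
% entropy power (the stated byproduct):
\frac{d^3}{dt^3}\, N(X + Z_t) \;\ge\; 0, \qquad t > 0,
% improving, for log-concave densities, Costa's concavity property:
\frac{d^2}{dt^2}\, N(X + Z_t) \;\le\; 0 .
```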

MSC:

94A17 Measures of information, entropy

Citations:

Zbl 0585.94006

References:

[1] Shannon, C. E., A mathematical theory of communication, Bell Syst. Tech. J., 27, 379-423, 623-656 (1948) · Zbl 1154.94303
[2] Stam, A. J., Some inequalities satisfied by the quantities of information of Fisher and Shannon, Inf. Control, 2, 101-112 (1959) · Zbl 0085.34701
[3] Costa, M. H. M., A new entropy power inequality, IEEE Trans. Inform. Theory, 31, 751 (1985) · Zbl 0585.94006
[4] Guo, D.; Shamai, S.; Verdú, S., Mutual information and minimum mean-square error in Gaussian channels, IEEE Trans. Inform. Theory, 51, 1261 (2005) · Zbl 1309.94099
[5] Guo, D.; Shamai, S.; Verdú, S., A simple proof of the entropy-power inequality, IEEE Trans. Inform. Theory, 52, 2165 (2006) · Zbl 1318.94029
[6] Rioul, O., Information theoretic proofs of entropy power inequalities, IEEE Trans. Inform. Theory, 57, 33 (2011) · Zbl 1366.94205
[7] Toscani, G., Heat equation and convolution inequalities, Milan J. Math., 82, 183 (2014) · Zbl 1303.26023
[8] Zamir, R.; Feder, M., A generalization of the entropy power inequality with applications, IEEE Trans. Inform. Theory, 39, 1723 (1993) · Zbl 0802.94003
[9] Blachman, N. M., The convolution inequality for entropy powers, IEEE Trans. Inform. Theory, 11, 267 (1965) · Zbl 0134.37401
[10] Dembo, A., A simple proof of the concavity of the entropy power with respect to the variance of additive normal noise, IEEE Trans. Inform. Theory, 35, 887 (1989)
[12] Villani, C., A short proof of the concavity of entropy power, IEEE Trans. Inform. Theory, 46, 1695 (2000) · Zbl 0994.94018
[13] Marshall, A. W.; Olkin, I., Inequalities: Theory of Majorization and its Applications (1979), Academic Press: Academic Press Orlando · Zbl 0437.26007
[15] McKean, H. P., Speed of approach to equilibrium for Kac’s caricature of a Maxwellian gas, Arch. Ration. Mech. Anal., 21, 343 (1966) · Zbl 1302.60049
[16] Dharmadhikari, S.; Joag-Dev, K., Unimodality, Convexity, and Applications (1988), Academic Press: Academic Press Boston · Zbl 0646.62008
[17] Lions, P. L.; Toscani, G., A strengthened central limit theorem for smooth densities, J. Funct. Anal., 129, 148 (1995) · Zbl 0822.60018
[18] Dembo, A.; Cover, T. M.; Thomas, J. A., Information theoretic inequalities, IEEE Trans. Inform. Theory, 37, 1501 (1991) · Zbl 0741.94001
[19] Toscani, G., An information-theoretic proof of Nash’s inequality, Rend. Lincei Sci. Fis. Nat., 24, 83 (2013) · Zbl 1301.62006
[20] Toscani, G., Lyapunov functionals for the heat equation and sharp inequalities, Atti Accad. Peloritana Pericolanti Cl. Sci. Fis. Mat. Natur., 91, 1 (2013)
[21] Savaré, G.; Toscani, G., The concavity of Rényi entropy power, IEEE Trans. Inform. Theory, 60, 2687 (2014) · Zbl 1360.94169
[22] Toscani, G., Rényi entropies and nonlinear diffusion equations, Acta Appl. Math., 132, 595 (2014) · Zbl 1308.94044