Ball, Keith; Nayar, Piotr; Tkocz, Tomasz
A reverse entropy power inequality for log-concave random vectors. (English)
Zbl 1407.94055
Stud. Math. 235, No. 1, 17-30 (2016).

Summary: We prove that the exponent of the entropy of one-dimensional projections of a log-concave random vector defines a \(1/5\)-seminorm. We make two conjectures concerning reverse entropy power inequalities in the log-concave setting and discuss some examples.

Cited in 1 Review · Cited in 8 Documents

MSC:
94A17 Measures of information, entropy
52A40 Inequalities and extremum problems involving convexity in convex geometry
60E15 Inequalities; stochastic orderings

Keywords: entropy; log-concave; reverse entropy power inequality

Cite: \textit{K. Ball} et al., Stud. Math. 235, No. 1, 17--30 (2016; Zbl 1407.94055)
Full Text: DOI arXiv

References:
[1] S. Artstein, K. M. Ball, F. Barthe and A. Naor, On the rate of convergence in the entropic central limit theorem, Probab. Theory Related Fields 129 (2004), 381–390. · Zbl 1055.94004
[2] S. Artstein, K. M. Ball, F. Barthe and A. Naor, Solution of Shannon’s problem on the monotonicity of entropy, J. Amer. Math. Soc. 17 (2004), 975–982. · Zbl 1062.94006
[3] K. Ball, Logarithmically concave functions and sections of convex sets in \(\mathbb{R}^n\), Studia Math. 88 (1988), 69–84. · Zbl 0642.52011
[4] K. Ball, F. Barthe and A. Naor, Entropy jumps in the presence of a spectral gap, Duke Math. J. 119 (2003), 41–63. · Zbl 1036.94003
[5] K. Ball and V. H. Nguyen, Entropy jumps for isotropic log-concave random vectors and spectral gap, Studia Math. 213 (2012), 81–96. · Zbl 1264.94077
[6] N. M. Blachman, The convolution inequality for entropy powers, IEEE Trans. Inform. Theory 11 (1965), 267–271. · Zbl 0134.37401
[7] S. G. Bobkov and G. P. Chistyakov, Entropy power inequality for the Rényi entropy, IEEE Trans. Inform. Theory 61 (2015), 708–714. · Zbl 1359.94300
[8] S. G. Bobkov, G. P. Chistyakov and F. Götze, Stability problems in Cramér-type characterization in case of i.i.d. summands, Theory Probab. Appl. 57 (2013), 568–588. · Zbl 1303.60015
[9] S. G. Bobkov, G. P. Chistyakov and F. Götze, Stability of Cramér’s characterization of normal laws in information distances, arXiv:1512.03571 (2015).
[10] S. G. Bobkov and M. Madiman, Reverse Brunn–Minkowski and reverse entropy power inequalities for convex measures, J. Funct. Anal. 262 (2012), 3309–3339. · Zbl 1246.52012
[11] S. G. Bobkov and M. Madiman, On the problem of reversibility of the entropy power inequality, in: Limit Theorems in Probability, Statistics and Number Theory, Springer Proc. Math. Statist. 42, Springer, Heidelberg, 2013, 61–74. · Zbl 1304.60029
[12] S. G. Bobkov and M. Madiman, The entropy per coordinate of a random vector is highly constrained under convexity conditions, IEEE Trans. Inform. Theory 57 (2011), 4940–4954. · Zbl 1365.94135
[13] E. A. Carlen, Superadditivity of Fisher’s information and logarithmic Sobolev inequalities, J. Funct. Anal. 101 (1991), 194–211. · Zbl 0732.60020
[14] E. A. Carlen and A. Soffer, Entropy production by block variable summation and central limit theorems, Comm. Math. Phys. 140 (1991), 339–371. · Zbl 0734.60024
[15] M. H. M. Costa, A new entropy power inequality, IEEE Trans. Inform. Theory 31 (1985), 751–760. · Zbl 0585.94006
[16] T. M. Cover and Z. Zhang, On the maximum entropy of the sum of two dependent random variables, IEEE Trans. Inform. Theory 40 (1994), 1244–1246. · Zbl 0811.94016
[17] A. Dembo, Simple proof of the concavity of the entropy power with respect to added Gaussian noise, IEEE Trans. Inform. Theory 35 (1989), 887–888.
[18] A. Dembo, T. M. Cover and J. A. Thomas, Information-theoretic inequalities, IEEE Trans. Inform. Theory 37 (1991), 1501–1518. · Zbl 0741.94001
[19] B. Grünbaum, Partitions of mass-distributions and of convex bodies by hyperplanes, Pacific J. Math. 10 (1960), 1257–1261. · Zbl 0101.14603
[20] O. Johnson and A. Barron, Fisher information inequalities and the central limit theorem, Probab. Theory Related Fields 129 (2004), 391–409. · Zbl 1047.62005
[21] N. J. Kalton, N. T. Peck and J. W. Roberts, An F-space Sampler, London Math. Soc. Lecture Note Ser. 89, Cambridge Univ. Press, Cambridge, 1984.
[22] E. H. Lieb, Proof of an entropy conjecture of Wehrl, Comm. Math. Phys. 62 (1978), 35–41. · Zbl 0385.60089
[23] M. Madiman and I. Kontoyiannis, The Ruzsa divergence for random elements in locally compact abelian groups, arXiv:1508.04089v1 (2015).
[24] C. E. Shannon, A mathematical theory of communication, Bell System Tech. J. 27 (1948), 379–423, 623–656. · Zbl 1154.94303
[25] A. J. Stam, Some inequalities satisfied by the quantities of information of Fisher and Shannon, Inform. Control 2 (1959), 101–112. · Zbl 0085.34701
[26] S. Szarek, On measures of symmetry and floating bodies, arXiv:1302.2076 (2013).
[27] S. Verdú and D. Guo, A simple proof of the entropy-power inequality, IEEE Trans. Inform. Theory 52 (2006), 2165–2166. · Zbl 1318.94029
[28] C. Villani, A short proof of the "concavity of entropy power", IEEE Trans. Inform. Theory 46 (2000), 1695–1696. · Zbl 0994.94018

This reference list is based on information provided by the publisher or from digital mathematics libraries. Its items are heuristically matched to zbMATH identifiers and may contain data conversion errors. In some cases that data have been complemented or enhanced by data from zbMATH Open. It attempts to reflect the references listed in the original paper as accurately as possible, without claiming completeness or a perfect matching.
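For background (not part of the zbMATH record): the "entropy power inequality" that the summary and many of the listed references revisit is the classical Shannon–Stam inequality; the paper concerns the extent to which it can be reversed for log-concave distributions. In standard notation it reads:

```latex
% Differential entropy of a random vector X with density f on R^n:
%   h(X) = -\int_{\mathbb{R}^n} f \log f.
% Shannon--Stam entropy power inequality: for independent X, Y in R^n,
N(X+Y) \;\ge\; N(X) + N(Y),
\qquad \text{where } N(X) = \frac{1}{2\pi e}\, e^{2h(X)/n},
% with equality iff X and Y are Gaussian with proportional covariances.
```

In this language, the paper's \(1/5\)-seminorm statement asserts a triangle-type (hence reverse-direction) inequality for the quantity \(e^{h(\langle X,\theta\rangle)}\) over one-dimensional projections of a log-concave vector \(X\); see the paper itself for the precise formulation.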