
Maximizing free entropy. (English) Zbl 0913.94003

The paper investigates variational properties of free entropy under constraints on the \(p\)-th absolute moments. The first two sections collect a series of preliminary technical results. Proposition 3.1 then establishes that, for any positive \(p\) and \(r\), the Ullman distribution \(\nu^{(p)}_r\) solves the maximum free entropy problem in the class of probability measures on \(\mathbb{R}\) satisfying \[ \int | x| ^p \,d\mu(x)\leq r^p \frac{\Gamma\left(\frac{p+1}2\right)} {2\sqrt{\pi}\,\Gamma\left(\frac p2 +1\right)}. \]
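As a numerical sanity check on the displayed constant (a sketch with \(r=1\); the function names are illustrative, not from the paper), one can verify that at \(p=2\) it reduces to \(1/4\), which is exactly the second absolute moment of the semicircle law on \([-1,1]\):

```python
import math

def ullman_moment_constant(p):
    """Constant on the right-hand side with r = 1:
    Gamma((p+1)/2) / (2 * sqrt(pi) * Gamma(p/2 + 1))."""
    return math.gamma((p + 1) / 2) / (2 * math.sqrt(math.pi) * math.gamma(p / 2 + 1))

def semicircle_abs_moment(p, n=200000):
    """Numerical p-th absolute moment of the semicircle law on [-1, 1],
    density (2/pi) * sqrt(1 - x^2), via the midpoint rule."""
    h = 2.0 / n
    total = 0.0
    for i in range(n):
        x = -1.0 + (i + 0.5) * h
        total += abs(x) ** p * (2.0 / math.pi) * math.sqrt(1.0 - x * x)
    return total * h

# At p = 2: Gamma(3/2) / (2 sqrt(pi) Gamma(2)) = (sqrt(pi)/2) / (2 sqrt(pi)) = 1/4.
print(ullman_moment_constant(2))   # ≈ 0.25
print(semicircle_abs_moment(2))    # ≈ 0.25
```

The agreement of the closed-form constant with the direct quadrature supports the reconstructed placement of \(\Gamma\left(\frac p2+1\right)\) outside the square root.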
As a direct corollary, the semicircle law \(w_{m,r}\) is the maximizer of free entropy in the class of probability measures on \(\mathbb{R}\) with variance bounded by \(\frac{r^2}2\). Propositions 3.2 and 3.3 supply analogous results, with the maximizers computed explicitly for the classes of probability distributions supported on \(\mathbb{R}^+\), \([-1,1]\), and \(\mathbb{C}\).
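The semicircle corollary can also be checked at the level of densities. Assuming the standard normalization of the Ullman density on \([-1,1]\), \(u_p(x)=\frac p\pi\int_{|x|}^1 \frac{t^{p-1}}{\sqrt{t^2-x^2}}\,dt\) (this formula is not stated in the review and is an assumption here), the case \(p=2\) recovers the semicircle density:

```python
import math

def ullman_density(x, p, n=20000):
    """Ullman density on [-1, 1] in its standard normalization (an assumption;
    the review does not state it): u_p(x) = (p/pi) * int_{|x|}^1 t^(p-1)/sqrt(t^2 - x^2) dt.
    The integrand has a square-root singularity at t = |x|; the substitution
    t = sqrt(x^2 + s^2) removes it, leaving int_0^{sqrt(1-x^2)} t^(p-2) ds."""
    a = abs(x)
    b = math.sqrt(1.0 - a * a)
    h = b / n
    total = 0.0
    for i in range(n):
        s = (i + 0.5) * h
        t = math.sqrt(a * a + s * s)
        total += t ** (p - 2)
    return (p / math.pi) * total * h

# At p = 2 the transformed integrand is constant, so
# u_2(x) = (2/pi) * sqrt(1 - x^2): the semicircle density.
x = 0.3
print(ullman_density(x, 2))                  # ≈ 0.6073
print((2 / math.pi) * math.sqrt(1 - x * x))  # ≈ 0.6073
```

This illustrates why the semicircle law appears as the special case \(p=2\) of the family of maximizers in Proposition 3.1.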
It is interesting to note that both classical distributions and new probability densities appear as maximizers. An extension of the computational schemes to the multivariate case is presented, and two extremal problems on free entropy in \(\mathbb{C}\) are treated in the final sections of the paper.
Given the relationship between free entropy and the rate function in the large deviation principle for the empirical eigenvalue distribution of random matrices, the computational schemes developed here, together with the explicitly computed maximizers, could serve as a suitable basis for further research in this direction.

MSC:

94A17 Measures of information, entropy
94A15 Information theory (general)