Gell-Mann, Murray; Lloyd, Seth
Information measures, effective complexity, and total information. (English) Zbl 1294.94011
Complexity 2, No. 1, 44-52 (1996).

Summary: This article defines the concept of an information measure and shows how common information measures such as entropy, Shannon information, and algorithmic information content can be combined to solve problems of characterization, inference, and learning for complex systems. Particularly useful quantities are the effective complexity, which is roughly the length of a compact description of the identified regularities of an entity, and the total information, which is the effective complexity plus an entropy term that measures the information required to describe the random aspects of the entity. Mathematical definitions are given for both quantities, and some applications are discussed. In particular, it is pointed out that if one compares different sets of identified regularities of an entity, the 'best' set minimizes the total information and then, subject to that constraint and to constraints on computation time, minimizes the effective complexity; the resulting effective complexity is then in many respects independent of the observer.

Cited in 1 Review and in 20 Documents.

MSC:
94A15 Information theory (general)
68Q30 Algorithmic information theory (Kolmogorov complexity, etc.)
94A17 Measures of information, entropy

Keywords: information measures; effective complexity

Cite: \textit{M. Gell-Mann} and \textit{S. Lloyd}, Complexity 2, No. 1, 44--52 (1996; Zbl 1294.94011)
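The summary's verbal definitions translate directly into formulas. The LaTeX sketch below fixes one possible notation; the symbols $E$, $K$, $H$, $Y$, and $\Sigma$ are illustrative choices by the reviewer, not notation quoted from the paper.

% A minimal notational sketch of the quantities described in the
% summary; the symbols are assumptions, not quoted from the paper.
\documentclass{article}
\usepackage{amsmath}
\begin{document}
Let the entity be a string $s$, and let a set of identified
regularities be represented by an ensemble $E$ of strings, with
probabilities $p_E(x)$, of which $s$ is a typical member. Write
$K(E)$ for the algorithmic information content of a description of
$E$, and
\[
  H(E) = -\sum_{x} p_E(x)\,\log_2 p_E(x)
\]
for the Shannon entropy of the ensemble. The two central quantities
are then
\begin{align*}
  Y(E)      &= K(E)        && \text{(effective complexity),}\\
  \Sigma(E) &= Y(E) + H(E) && \text{(total information),}
\end{align*}
so that $H(E)$ accounts for the information required to describe
the random aspects of $s$ within $E$. Among candidate ensembles,
the `best' set of regularities first minimizes $\Sigma(E)$ and
then, subject to that constraint and to bounds on computation
time, minimizes $Y(E)$.
\end{document}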