Information measures, effective complexity, and total information. (English) Zbl 1294.94011
Summary: This article defines the concept of an information measure and shows how common information measures such as entropy, Shannon information, and algorithmic information content can be combined to solve problems of characterization, inference, and learning for complex systems. Particularly useful quantities are the effective complexity, which is roughly the length of a compact description of the identified regularities of an entity, and total information, which is effective complexity plus an entropy term that measures the information required to describe the random aspects of the entity. Mathematical definitions are given for both quantities and some applications are discussed. In particular, it is pointed out that if one compares different sets of identified regularities of an entity, the ‘best’ set minimizes the total information, and then, subject to that constraint and to constraints on computation time, minimizes the effective complexity; the resulting effective complexity is then in many respects independent of the observer.
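A minimal notational sketch of the selection criterion described in the summary (the symbols $\mathcal{E}$ for a candidate set of identified regularities, $K$ for its effective complexity, $S$ for the entropy term, and $\Sigma$ for the total information are assumptions introduced here for illustration and are not taken from the record):

% Total information of a candidate set of regularities E:
%   effective complexity K(E) plus the entropy S(E) measuring the
%   information needed to describe the random aspects of the entity.
\[
  \Sigma(\mathcal{E}) \;=\; K(\mathcal{E}) + S(\mathcal{E})
\]
% The 'best' set first minimizes the total information; then, subject to
% that constraint (and to constraints on computation time), it minimizes
% the effective complexity itself:
\[
  \mathcal{E}^{*} \;\in\; \arg\min\bigl\{\, K(\mathcal{E}) \;:\;
      \Sigma(\mathcal{E}) = \min_{\mathcal{E}'} \Sigma(\mathcal{E}') \,\bigr\}
\]

Read under these assumed symbols: among all sets of regularities that achieve the minimal total information, one of minimal effective complexity is chosen; the summary argues that the resulting effective complexity is then in many respects independent of the observer.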

MSC:
94A15 Information theory (general)
68Q30 Algorithmic information theory (Kolmogorov complexity, etc.)
94A17 Measures of information, entropy