
zbMATH — the first resource for mathematics

Consolidation of probabilistic knowledge bases by inconsistency minimization. (English) Zbl 1366.68316
Schaub, Torsten (ed.) et al., ECAI 2014. 21st European conference on artificial intelligence, Prague, Czech Republic, August 18–22, 2014. Proceedings. Including proceedings of the accompanying conference on prestigious applications of intelligent systems (PAIS 2014). Amsterdam: IOS Press (ISBN 978-1-61499-418-3/pbk; 978-1-61499-419-0/ebook). Frontiers in Artificial Intelligence and Applications 263, 729-734 (2014).
Summary: Consolidation describes the operation of restoring consistency in an inconsistent knowledge base. Here we consider this problem in the context of probabilistic conditional logic, a language that focuses on probabilistic conditionals (if-then rules). If a knowledge base, i.e., a set of probabilistic conditionals, is inconsistent, traditional model-based inference techniques are not applicable. In this paper, we develop an approach to repairing such knowledge bases that relies on a generalized notion of a model of a knowledge base, one that extends to classically inconsistent knowledge bases. We define a generalized approach to reasoning under maximum entropy on these generalized models and use it to repair the knowledge base. This approach is founded on previous work on inconsistency measures, and we show that it is well-defined, provides a unique solution, and satisfies other desirable properties.
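The idea of measuring inconsistency as the minimal probability shift needed to restore satisfiability can be illustrated with a small linear program. The sketch below is an assumption for illustration only, not the paper's exact construction: it handles unconditional probabilistic statements over a finite set of worlds, and the function name `inconsistency` and the L1 (total-shift) objective are choices made here, not taken from the source.

```python
# Illustrative sketch (not the paper's algorithm): measure the
# inconsistency of a set of probabilistic statements as the minimal
# total shift of their probabilities that admits a satisfying
# probability distribution, encoded as a linear program.
import numpy as np
from scipy.optimize import linprog

def inconsistency(worlds, facts):
    """facts: list of (event, p), where event is a set of worlds.
    Returns the minimal total L1 shift of the stated probabilities
    for which some distribution over `worlds` satisfies them all."""
    n, m = len(worlds), len(facts)
    # Variables: P(w) for each world, then delta_i+ and delta_i- per fact.
    nvars = n + 2 * m
    c = np.zeros(nvars)
    c[n:] = 1.0                          # minimize sum of |delta_i|
    A_eq, b_eq = [], []
    row = np.zeros(nvars)
    row[:n] = 1.0
    A_eq.append(row); b_eq.append(1.0)   # probabilities sum to 1
    for i, (event, p) in enumerate(facts):
        row = np.zeros(nvars)
        for j, w in enumerate(worlds):
            if w in event:
                row[j] = 1.0
        row[n + 2 * i] = -1.0            # -delta_i+
        row[n + 2 * i + 1] = 1.0         # +delta_i-
        A_eq.append(row); b_eq.append(p) # P(event) - d+ + d- = p
    res = linprog(c, A_eq=np.array(A_eq), b_eq=np.array(b_eq),
                  bounds=[(0, None)] * nvars)
    return res.fun

# Two contradictory statements about the same event a:
worlds = ["a", "not_a"]
facts = [({"a"}, 0.7), ({"a"}, 0.2)]
print(inconsistency(worlds, facts))  # minimal shift is 0.5
```

A consistent base yields measure 0, and the repaired probabilities (the original values plus the optimal shifts) form a consistent base, which is the starting point for the maximum-entropy reasoning step described in the summary.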
For the entire collection see [Zbl 1296.68011].

MSC:
68T35 Theory of languages and software systems (knowledge-based systems, expert systems, etc.) for artificial intelligence
68T27 Logic in artificial intelligence