Perrier, Eric; Imoto, Seiya; Miyano, Satoru
Finding optimal Bayesian network given a super-structure. (English) Zbl 1225.68206
J. Mach. Learn. Res. 9, 2251-2286 (2008).

Summary: Classical approaches to learning a Bayesian network structure from data suffer from high complexity or low accuracy of their results. However, a recent empirical study has shown that a hybrid algorithm noticeably improves both accuracy and speed: it learns a skeleton with an independence-test (IT) approach and uses it to constrain the directed acyclic graphs (DAGs) considered during the search-and-score phase. We formalize this structural constraint by introducing the concept of a super-structure \(S\), an undirected graph that restricts the search to networks whose skeleton is a subgraph of \(S\). We develop a super-structure constrained optimal search (COS); its time complexity is upper bounded by \(O(\gamma_{m}^{n})\), where \(\gamma_{m}<2\) depends on the maximal degree \(m\) of \(S\). Empirically, the complexity depends on the average degree \(\tilde{m}\), and sparse structures allow larger graphs to be handled. Our algorithm is faster than an unconstrained optimal search by several orders of magnitude and even finds more accurate results when given a sound super-structure. In practice, \(S\) can be approximated by IT approaches; the significance level of the tests controls its sparseness, enabling control of the trade-off between speed and accuracy. For incomplete super-structures, a greedily post-processed version (COS+) still significantly outperforms other heuristic searches.

Cited in 6 Documents

MSC:
68T05 Learning and adaptive systems in artificial intelligence
62M45 Neural nets and related approaches to inference from stochastic processes

Keywords: Bayesian networks; structure learning; optimal search; super-structure; connected subset

Software: pcalg

Cite: \textit{E. Perrier} et al., J. Mach. Learn. Res. 9, 2251--2286 (2008; Zbl 1225.68206)
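The following Python sketch illustrates the constraint that the summary describes: candidate parent sets of each node are enumerated only among its neighbours in the super-structure \(S\) before an exact score-based search. It is not the authors' COS algorithm (which further exploits connected subsets of \(S\) to bound the subset enumeration); the BIC-style score, the adjacency-dictionary representation of \(S\), and all function names here are assumptions made for this example.

import math
from itertools import combinations


def bic_local_score(data, child, parents):
    """BIC-style local score for binary variables (illustrative, not the paper's score)."""
    n = data.shape[0]
    counts = {}
    for row in data:
        key = tuple(row[list(parents)]) if parents else ()
        c = counts.setdefault(key, [0, 0])
        c[row[child]] += 1
    loglik = 0.0
    for c0, c1 in counts.values():
        for c in (c0, c1):
            if c:
                loglik += c * math.log(c / (c0 + c1))
    n_params = 2 ** len(parents)  # one free parameter per parent configuration
    return loglik - 0.5 * n_params * math.log(n)


def optimal_dag_given_superstructure(data, S):
    """Exact search over DAGs whose skeleton is a subgraph of S (dict: node -> neighbours)."""
    n_vars = data.shape[1]
    # Local scores are computed only for parent sets drawn from the neighbours in S.
    local = {v: {} for v in range(n_vars)}
    for v in range(n_vars):
        nbrs = sorted(S[v])
        for k in range(len(nbrs) + 1):
            for ps in combinations(nbrs, k):
                local[v][frozenset(ps)] = bic_local_score(data, v, ps)

    def best_parents(v, allowed):
        # Best scored parent set of v among those contained in the candidate set 'allowed'.
        return max(((s, sc) for s, sc in local[v].items() if s <= allowed),
                   key=lambda t: t[1])

    # Dynamic programming over subsets of variables (Silander-Myllymaki style);
    # COS would restrict this step further using connected subsets of S.
    best = {frozenset(): (0.0, {})}
    all_vars = range(n_vars)
    for size in range(1, n_vars + 1):
        for subset in combinations(all_vars, size):
            W = frozenset(subset)
            entry = None
            for v in W:
                prev_score, prev_parents = best[W - {v}]
                ps, sc = best_parents(v, W - {v})
                total = prev_score + sc
                if entry is None or total > entry[0]:
                    entry = (total, {**prev_parents, v: ps})
            best[W] = entry
    return best[frozenset(all_vars)]


if __name__ == "__main__":
    import numpy as np
    rng = np.random.default_rng(0)
    a = rng.integers(0, 2, size=300)
    b = np.where(rng.random(300) < 0.9, a, 1 - a)   # b depends on a
    c = rng.integers(0, 2, size=300)                # c is independent of a and b
    data = np.column_stack([a, b, c])
    S = {0: {1}, 1: {0}, 2: set()}                  # super-structure: only the edge a-b
    score, parents = optimal_dag_given_superstructure(data, S)
    print(score, parents)

Passing a complete graph as \(S\) recovers an unconstrained optimal search, while a sparse \(S\), for example one estimated by independence tests at a chosen significance level, shrinks the parent-set enumeration and hence the runtime, which is the speed-accuracy trade-off discussed in the summary.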