
A dynamical state underlying the second order maximum entropy principle in neuronal networks. (English) Zbl 1366.62021

Summary: The maximum entropy principle is widely used in diverse fields. We address the question of why the second order maximum entropy model, which uses only the firing rates and second order correlations of neurons as constraints, captures the observed distribution of neuronal firing patterns in many neuronal networks so well. This success confers a great practical advantage: the complexity of analyzing neuronal activity data is reduced drastically from \(\mathcal{O}(2^n)\) to \(\mathcal{O}(n^2)\), where \(n\) is the number of neurons under consideration. We first derive an expression for the effective interactions of the \(n\)th order maximum entropy model, which uses all orders of correlations of neurons as constraints, and show that a recursive relation exists among the effective interactions of the model. Then, via a perturbative analysis, we explore a possible dynamical state in which this recursive relation makes the strengths of higher order interactions always smaller than those of lower orders. Finally, we invoke this hierarchy of effective interactions to provide a possible mechanism underlying the success of the second order maximum entropy model and to predict whether such a model can successfully capture the observed distribution of neuronal firing patterns.
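
For reference, the second order (pairwise) maximum entropy model referred to above has the standard Ising-type form; the symbols \(h_i\), \(J_{ij}\) and \(Z\) follow the usual convention and are not notation taken from the paper itself:
\[
P(\sigma_1,\dots,\sigma_n) = \frac{1}{Z}\exp\Big(\sum_{i=1}^{n} h_i \sigma_i + \sum_{i<j} J_{ij}\,\sigma_i\sigma_j\Big), \qquad \sigma_i \in \{0,1\},
\]
where the parameters \(h_i\) and \(J_{ij}\) are chosen so that the model reproduces the measured firing rates \(\langle \sigma_i \rangle\) and pairwise correlations \(\langle \sigma_i \sigma_j \rangle\), and \(Z\) is the normalization constant. The \(n(n+1)/2\) parameters account for the \(\mathcal{O}(n^2)\) scaling mentioned above.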

MSC:

62B10 Statistical aspects of information-theoretic topics
62M45 Neural nets and related approaches to inference from stochastic processes
92C20 Neural biology
94A17 Measures of information, entropy