
Computational capabilities of analog and evolving neural networks over infinite input streams. (English) Zbl 1410.68120

Summary: Analog and evolving recurrent neural networks are super-Turing powerful. Here, we consider analog and evolving neural networks operating over infinite input streams, and characterize the topological complexity of their \(\omega\)-languages as a function of the specific analog or evolving weights they employ. As a consequence, two infinite hierarchies of classes of analog and evolving neural networks, based on the complexity of their underlying weights, can be derived. These results constitute an optimal refinement of the super-Turing expressive power of analog and evolving neural networks, and show that such networks are natural models for oracle-based infinite computation.
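The underlying formal model can be illustrated with a short sketch. In the Siegelmann-Sontag setting, a first-order recurrent network updates its state as \(x_{t+1} = \sigma(W x_t + U u_t + b)\), where \(\sigma\) is the saturated-linear activation; rational weights yield Turing power, arbitrary real ("analog") weights or time-varying ("evolving") weights yield super-Turing power. The toy code below is a minimal illustration under these assumptions, not the paper's construction; the network dimensions, weights, and the input stream are made-up examples.

```python
import numpy as np

def sat(x):
    # Saturated-linear activation of the Siegelmann-Sontag model:
    # identity on [0, 1], clipped outside.
    return np.clip(x, 0.0, 1.0)

def run_net(W, U, b, stream, steps):
    """Iterate x_{t+1} = sat(W x_t + U u_t + b) over a finite prefix
    of an (in principle infinite) input stream; return the state
    trajectory. An evolving network would instead supply a
    time-dependent weight matrix W_t at each step."""
    x = np.zeros(W.shape[0])
    trajectory = []
    for t in range(steps):
        u = np.array([stream(t)], dtype=float)  # one input line
        x = sat(W @ x + U @ u + b)
        trajectory.append(x.copy())
    return trajectory

# Toy 2-neuron net with rational weights; an "analog" net would allow
# arbitrary real entries in W (e.g. encoding an oracle in the binary
# expansion of a single weight).
W = np.array([[0.5, 0.0],
              [0.25, 0.5]])
U = np.array([[1.0],
              [0.0]])
b = np.zeros(2)

# Feed the alternating infinite stream 0, 1, 0, 1, ... for 8 steps.
traj = run_net(W, U, b, stream=lambda t: t % 2, steps=8)
```

The \(\omega\)-language recognized by such a device is the set of infinite input streams whose induced state trajectories satisfy a given acceptance condition; the paper's hierarchies classify these languages by the descriptive-set-theoretic complexity of the weights.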

MSC:

68Q05 Models of computation (Turing machines, etc.) (MSC2010)
68Q45 Formal languages and automata
92B20 Neural networks for/in biological studies, artificial life and related topics
