
Reservoir computing with computational matter. (English) Zbl 1436.68115

Stepney, Susan (ed.) et al., Computational matter. Cham: Springer. Nat. Comput. Ser., 269-293 (2018).
Summary: The reservoir computing paradigm of information processing has emerged as a natural response to the problem of training recurrent neural networks. It has been realized that the training phase can be avoided provided a network has some well-defined properties, e.g. the echo state property. This idea has been generalized to arbitrary artificial dynamical systems. In principle, any dynamical system could be used for advanced information processing applications, provided that such a system has the separation property and the approximation property. To carry out this idea in practice, the only auxiliary equipment needed is a simple read-out layer that can be used to access the internal states of the system. In the following, several application scenarios of this generic idea are discussed, together with some related engineering aspects. We cover practical problems one might meet when trying to implement the idea, and discuss several strategies for solving such problems.
For the entire collection see [Zbl 1443.68021].
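The paradigm summarized above — a fixed dynamical system whose only trained component is a linear read-out layer over its internal states — can be illustrated with a minimal echo state network. The sketch below is not taken from the chapter; the network size, the spectral-radius scaling (a common heuristic for the echo state property), and the toy sine-prediction task are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Reservoir: a fixed random recurrent network that is never trained.
# Scaling the recurrent weights so their spectral radius is below 1 is a
# common heuristic for the echo state property (fading memory of inputs).
n_res, n_in = 100, 1
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.normal(size=(n_res, n_res))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))  # spectral radius 0.9

def run_reservoir(u):
    """Drive the reservoir with input sequence u; collect its states."""
    x = np.zeros(n_res)
    states = []
    for u_t in u:
        x = np.tanh(W @ x + W_in @ np.atleast_1d(u_t))
        states.append(x.copy())
    return np.array(states)

# Toy task (an assumption for illustration): predict the next sample
# of a sine wave from its current sample.
t = np.arange(1000) * 0.1
u, y = np.sin(t[:-1]), np.sin(t[1:])

X = run_reservoir(u)
washout = 100  # discard initial transient states

# The only trained component is the linear read-out layer, fitted here
# by ridge regression on the collected reservoir states.
A = X[washout:]
ridge = 1e-6
w_out = np.linalg.solve(A.T @ A + ridge * np.eye(n_res), A.T @ y[washout:])

pred = A @ w_out
nrmse = np.sqrt(np.mean((pred - y[washout:]) ** 2)) / np.std(y[washout:])
```

Note that the read-out sees only the reservoir states, never the reservoir weights: the dynamical system itself could in principle be replaced by any physical substrate with the separation and approximation properties, which is the generalization the chapter pursues.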

MSC:

68Q07 Biologically inspired models of computation (DNA computing, membrane computing, etc.)
92B20 Neural networks for/in biological studies, artificial life and related topics

References:

[1] Aguilera, Renato, Eleanor Demis, Kelsey Scharnhorst, Adam Z. Stieg, Masakazu Aono, and James K. Gimzewski (2015). “Morphic atomic switch networks for beyond-Moore computing architectures”. Interconnect Technology Conference and 2015 IEEE Materials for Advanced Metallization Conference (IITC/MAM), 2015. IEEE, pp. 165-168.
[2] Appeltant, Lennert, Miguel Cornelles Soriano, Guy Van der Sande, Jan Danckaert, Serge Massar, Joni Dambre, Benjamin Schrauwen, Claudio R. Mirasso, and Ingo Fischer (2011). “Information processing using a single dynamical node as complex system”. Nature Communications 2:468.
[3] Bennett, Christopher H. (2014). “Computing with Memristor Networks”. Master thesis. url: publications.lib.chalmers.se/publication/218659-computing-with-memristor-networks.
Footnotes:
1. For example, this requirement is what the Liquid State Machine model builds on (Maass et al., 2002; Natschläger et al., 2002).
2. This paradigm is also emphasized in the original Echo State Network formulation (Jaeger and Haas, 2004; Jaeger, 2001b).
[8] Bennett, Christopher, Aldo Jesorka, Göran Wendin, and Zoran Konkoli (2016). “On the inverse pattern recognition problem in the context of the time-series data processing with memristor networks”. Advances in Unconventional Computation. Ed. by Andrew Adamatzky. Vol. 2. Prototypes and algorithms. Springer.
[9] Berlekamp, E. R., J. H. Conway, and R. K. Guy (1982). Winning Ways for Your Mathematical Plays. Academic Press. · Zbl 0485.00025
[10] Bertschinger, N. and T. Natschläger (2004). “Real-time computation at the edge of chaos in recurrent neural networks”. Neural Computation 16(7):1413-1436. · Zbl 1102.68530
[11] Büsing, L., B. Schrauwen, and R. Legenstein (2010). “Connectivity, dynamics, and memory in reservoir computing with binary and analog neurons”. Neural Computation 22(5):1272-1311. · Zbl 1195.92002
[12] Chrol-Cannon, J. and Y. Jin (2014). “On the Correlation between Reservoir Metrics and Performance for Time Series Classification under the Influence of Synaptic Plasticity”. PLoS One 9(7):e101792.
[13] Cook, Matthew (2004). “Universality in elementary cellular automata”. Complex Systems 15(1):1-40. · Zbl 1167.68387
[14] Dai, Xianhua (2004). “Genetic Regulatory Systems Modeled by Recurrent Neural Network”. Advances in Neural Networks (ISNN 2004). Ed. by Fuliang Yin, Jun Wang, and Chengan Guo. Vol. 3174. LNCS. Springer, pp. 519-524.
[15] Dale, Matthew, Julian F. Miller, Susan Stepney, and Martin A. Trefzer (2016a). “Evolving Carbon Nanotube Reservoir Computers”. UCNC 2016. Vol. 9726. LNCS. Springer, pp. 49-61. · Zbl 1476.68095
[16] Dale, Matthew, Julian F. Miller, Susan Stepney, and Martin A. Trefzer (2016b). “Reservoir Computing in Materio: An Evaluation of Configuration through Evolution”. ICES 2016 at SSCI 2016. IEEE. · Zbl 1476.68095
[17] Dale, Matthew, Julian F. Miller, Susan Stepney, and Martin A. Trefzer (2017). “Reservoir Computing in Materio: a computational framework for evolution in materio”. Proc. IJCNN 2017. IEEE, pp. 2178-2185.
[18] Dambre, Joni, David Verstraeten, Benjamin Schrauwen, and Serge Massar (2012). “Information Processing Capacity of Dynamical Systems”. Sci. Rep. 2:514.
[19] Fernando, Chrisantha and Sampsa Sojakka (2003). “Pattern Recognition in a Bucket”. Advances in Artificial Life. Vol. 2801. LNCS. Springer, pp. 588-597.
[20] Gibbons, T. E. (2010). “Unifying quality metrics for reservoir networks”. Proc. IJCNN 2010. IEEE, pp. 1-7.
[21] Horsman, C., Susan Stepney, Rob C. Wagner, and Viv Kendon (2014). “When does a physical system compute?” Proceedings of the Royal Society A 470(2169):20140182. · Zbl 1353.68077
[22] Jaeger, H. (2001a). Short term memory in echo state networks. GMD Forschungszentrum Informationstechnik.
[23] Jaeger, H. (2007). “Echo state network”. Scholarpedia 2:2330.
[24] Jaeger, H. and H. Haas (2004). “Harnessing nonlinearity: Predicting chaotic systems and saving energy in wireless communication”. Science 304:78-80.
[25] Jaeger, Herbert (2001b). The “echo state” approach to analysing and training recurrent neural networks. Tech. rep. GMD Report 148. German National Research Center for Information Technology.
[26] Jones, Ben, Dov Stekel, Jon Rowe, and Chrisantha Fernando (2007). “Is there a liquid state machine in the bacterium Escherichia coli?” IEEE Symposium on Artificial Life, 2007. IEEE, pp. 187-191.
[27] Kendon, Viv, Angelika Sebald, and Susan Stepney (2015). “Heterotic computing: past, present, and future”. Phil. Trans. Roy. Soc. A 373:20140225. · Zbl 1330.68076
[28] Klampfl, S., S. V. David, P. Yin, S. A. Shamma, and W. Maass (2009). “Integration of stimulus history in information conveyed by neurons in primary auditory cortex in response to tone sequences”. 39th Annual Conference of the Society for Neuroscience, Program. Vol. 163.
[29] Konkoli, Zoran (2015). “A Perspective on Putnam’s Realizability Theorem in the Context of Unconventional Computation”. International Journal of Unconventional Computing 11:83-102.
[30] Konkoli, Zoran (2016). “On reservoir computing: from mathematical foundations to unconventional applications”. Advances in Unconventional Computation. Ed. by Andrew Adamatzky. Vol. 1. Theory. Springer.
[31] Konkoli, Zoran and Göran Wendin (2014). “On information processing with networks of nano-scale switching elements”. International Journal of Unconventional Computing 10:405-428.
[32] Langton, Chris G. (1990). “Computation at the edge of chaos: phase transitions and emergent computation”. Physica D: Nonlinear Phenomena 42(1):12-37.
[33] Larger, Laurent, Miguel C. Soriano, Daniel Brunner, Lennert Appeltant, Jose M. Gutiérrez, Luis Pesquera, Claudio R. Mirasso, and Ingo Fischer (2012). “Photonic information processing beyond Turing: an optoelectronic implementation of reservoir computing”. Optics Express 20(3):3241-3249.
[34] Legenstein, Robert and Wolfgang Maass (2007a). “Edge of chaos and prediction of computational performance for neural circuit models”. Neural Networks 20(3):323-334. · Zbl 1132.68568
[35] Legenstein, Robert and Wolfgang Maass (2007b). “What makes a dynamical system computationally powerful?” New directions in statistical signal processing: From systems to brain. Ed. by Simon Haykin, José C. Principe, Terrence J. Sejnowski, and John McWhirter. MIT Press, pp. 127-154. · Zbl 1190.62183
[36] Lukoševičius, M. and H. Jaeger (2009). “Reservoir computing approaches to recurrent neural network training”. Computer Science Review 3:127-149. · Zbl 1302.68235
[37] Lukoševičius, M., H. Jaeger, and B. Schrauwen (2012). “Reservoir Computing Trends”. KI - Künstliche Intelligenz 26:365-371.
[38] Maass, Wolfgang, Thomas Natschläger, and Henry Markram (2002). “Real-time computing without stable states: A new framework for neural computation based on perturbations”. Neural Computation 14(11):2531-2560. · Zbl 1057.68618
[39] Massar, Marc and Serge Massar (2013). “Mean-field theory of echo state networks”. Physical Review E 87(4):042809.
[40] Miller, J. F., S. Harding, and G. Tufte (2014). “Evolution-in-materio: evolving computation in materials”. Evolutionary Intelligence 7(1):49-67.
[41] Natschläger, T., W. Maass, and H. Markram (2002). “The “Liquid Computer”: A Novel Strategy for Real-Time Computing on Time Series (Special Issue on Foundations of Information Processing)”. TELEMATIK 8:39-43.
[42] Nikolic, Danko, Stefan Haeusler, Wolf Singer, and Wolfgang Maass (2006). “Temporal dynamics of information content carried by neurons in the primary visual cortex”. Advances in Neural Information Processing Systems, pp. 1041-1048.
[43] Norton, D. and D. Ventura (2010). “Improving liquid state machines through iterative refinement of the reservoir”. Neurocomputing 73(16):2893-2904.
[44] Rosenstein, M. T., J. J. Collins, and C. J. De Luca (1993). “A practical method for calculating largest Lyapunov exponents from small data sets”. Physica D: Nonlinear Phenomena 65(1):117-134. · Zbl 0779.58030
[45] Schiller, U. D. and J. J. Steil (2005). “Analyzing the weight dynamics of recurrent learning algorithms”. Neurocomputing 63:5-23.
[46] Sillin, H. O., R. Aguilera, H. Shieh, A. V. Avizienis, M. Aono, A. Z. Stieg, and J. K. Gimzewski (2013). “A theoretical and experimental study of neuromorphic atomic switch networks for reservoir computing”. Nanotechnology 24(38):384004.
[47] Soto, José Manuel Gómez and Andrew Wuensche (2015). The X-rule: universal computation in a non-isotropic Life-like Cellular Automaton. eprint: arXiv:1504.01434 [nlin.CG]. · Zbl 1344.68153
[48] Stepney, Susan (2012). “Nonclassical Computation: a dynamical systems perspective”. Handbook of Natural Computing, volume 4. Ed. by Grzegorz Rozenberg, Thomas Bäck, and Joost N. Kok. Springer. Chap. 59, pp. 1979-2025.
[49] Stieg, A. Z., A. V. Avizienis, H. O. Sillin, R. Aguilera, H. Shieh, C. Martin-Olmos, E. J. Sandouk, M. Aono, and J. K. Gimzewski (2014). “Self-organization and Emergence of Dynamical Structures in Neuromorphic Atomic Switch Networks”. Memristor Networks. Springer, pp. 173-209.
[50] Strogatz, S. H. (1994).Nonlinear dynamics and chaos. Westview Press.
[51] Vandoorne, K., J. Dambre, D. Verstraeten, B. Schrauwen, and P. Bienstman (2011). “Parallel reservoir computing using optical amplifiers”. IEEE Transactions on Neural Networks 22(9):1469-1481.
[52] Vandoorne, K., P. Mechet, T. Van Vaerenbergh, M. Fiers, G. Morthier, D. Verstraeten, B. Schrauwen, J. Dambre, and P. Bienstman (2014). “Experimental demonstration of reservoir computing on a silicon photonics chip”. Nature Communications 5:3541.
[53] Verstraeten, D. and B. Schrauwen (2009). “On the quantification of dynamics in reservoir computing”. Artificial Neural Networks - ICANN 2009. Springer, pp. 985-994.
[54] von Neumann, John (1966). Theory of Self-Reproducing Automata. Ed. by Arthur W. Burks. Champaign, IL, USA: University of Illinois Press.
[55] Wainrib, Gilles and Mathieu N. Galtier (2016). “A local Echo State Property through the largest Lyapunov exponent”. Neural Networks 76:39-45. · Zbl 1418.62358
[56] Wolfram, Stephen (1984). “Universality and complexity in cellular automata”. Physica D: Nonlinear Phenomena 10(1):1-35. · Zbl 0562.68040
[57] Yildiz, I. B., H. Jaeger, and S. J. Kiebel (2012). “Re-visiting the echo state property”. Neural Networks 35:1-9. · Zbl 1258.68129
[58] Yilmaz, Ozgur (2014). “Reservoir Computing using Cellular Automata”. CoRR, eprint: arXiv:1410.0162 [cs.NE]. · Zbl 1338.68238
[59] Yilmaz, Ozgur (2015).
This reference list is based on information provided by the publisher or from digital mathematics libraries. Its items are heuristically matched to zbMATH identifiers and may contain data conversion errors. In some cases that data have been complemented/enhanced by data from zbMATH Open. This attempts to reflect the references listed in the original paper as accurately as possible without claiming completeness or a perfect matching.