General recurrent neural network for solving generalized linear matrix equation. (English) Zbl 1373.92011

Summary: This brief proposes a general framework of nonlinear recurrent neural networks for solving the generalized linear matrix equation (GLME) online, with a global convergence property. When the linear activation function is used, the neural state matrix of the recurrent neural network converges globally and exponentially to the unique theoretical solution of the GLME. In addition, two specific types of nonlinear activation functions are proposed for the general nonlinear recurrent neural network model, which achieve convergence superior to that obtained with the linear activation function. Illustrative examples demonstrate the efficacy of the general nonlinear recurrent neural network model and the superior convergence attained with the proposed nonlinear activation functions.
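A minimal sketch of the kind of recurrent neural network the summary describes, under stated assumptions: a gradient-type design dX/dt = -gamma * A^T phi(AXB - C) B^T for the equation AXB = C, simulated with explicit Euler steps. The dynamics, the test matrices, and the use of tanh as a sample nonlinearity are illustrative assumptions, not the paper's exact model or its two proposed activation functions.

```python
# Hedged sketch (not the authors' exact model): a gradient-type recurrent
# neural network for the matrix equation A X B = C, integrated with
# explicit Euler steps from the zero initial state.
import numpy as np

def solve_glme_rnn(A, B, C, phi=lambda E: E, gamma=1.0, dt=0.01, steps=5000):
    """Integrate dX/dt = -gamma * A^T phi(A X B - C) B^T from X(0) = 0.

    phi is an elementwise, monotonically increasing odd activation;
    phi(E) = E recovers the linear-activation case."""
    X = np.zeros((A.shape[1], B.shape[0]))
    for _ in range(steps):
        E = A @ X @ B - C                       # residual error matrix
        X -= dt * gamma * (A.T @ phi(E) @ B.T)  # descent direction
    return X

# Fixed, well-conditioned matrices so the Euler step size is stable.
A = np.array([[2., 1., 0.], [0., 2., 1.], [0., 0., 2.]])
B = np.array([[3., 0., 1.], [1., 3., 0.], [0., 1., 3.]])
X_true = np.array([[1., 2., 0.], [0., 1., -1.], [2., 0., 1.]])
C = A @ X_true @ B

X_lin = solve_glme_rnn(A, B, C)                # linear activation
X_tanh = solve_glme_rnn(A, B, C, phi=np.tanh)  # one bounded nonlinearity
print(np.linalg.norm(X_lin - X_true))
print(np.linalg.norm(X_tanh - X_true))
```

With the linear activation this iteration is plain gradient descent on the squared residual, so for a stable step size the state matrix converges to the unique solution, mirroring the global exponential convergence claimed in the summary.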

MSC:

92B20 Neural networks for/in biological studies, artificial life and related topics
15A24 Matrix equations and identities
93C10 Nonlinear systems in control theory

References:

[1] Hernandez, V.; Quintana, E. S.; Marques, M., Solving linear matrix equations in control problems on distributed memory multiprocessors, Proceedings of the 33rd IEEE Conference on Decision and Control. Part 1 (of 4)
[2] Lev-Ari, H., Efficient solution of linear matrix equations with application to multistatic antenna array processing, Communications in Information & Systems, 5, 1, 123-130 (2005) · Zbl 1116.65050 · doi:10.4310/CIS.2005.v5.n1.a5
[3] Fang, Y.; Loparo, K. A.; Feng, X., New estimates for solutions of Lyapunov equations, IEEE Transactions on Automatic Control, 42, 3, 408-411 (1997) · Zbl 0866.93048 · doi:10.1109/9.557586
[4] Kwon, W. H.; Moon, Y. S.; Ahn, S. C., Bounds in algebraic Riccati and Lyapunov equations: a survey and some new results, International Journal of Control, 64, 3, 377-389 (1996) · Zbl 0852.93005 · doi:10.1080/00207179608921634
[5] Zhou, B.; Duan, G.-R., A new solution to the generalized Sylvester matrix equation AV-EVF=BW, Systems & Control Letters, 55, 3, 193-198 (2006) · Zbl 1129.15300
[6] Zhou, B.; Duan, G.-R., On the generalized Sylvester mapping and matrix equations, Systems & Control Letters, 57, 3, 200-208 (2008) · Zbl 1129.93018
[7] Li, S.; Li, Y., Nonlinearly activated neural network for solving time-varying complex Sylvester equation, IEEE Transactions on Cybernetics, 44, 8, 1397-1407 (2014) · doi:10.1109/TCYB.2013.2285166
[8] Zhang, Y.; Jiang, D.; Wang, J., A recurrent neural network for solving Sylvester equation with time-varying coefficients, IEEE Transactions on Neural Networks, 13, 5, 1053-1063 (2002) · doi:10.1109/TNN.2002.1031938
[9] Ding, F.; Chen, T., Gradient based iterative algorithms for solving a class of matrix equations, IEEE Transactions on Automatic Control, 50, 8, 1216-1221 (2005) · Zbl 1365.65083 · doi:10.1109/TAC.2005.852558
[10] Xie, L.; Ding, J.; Ding, F., Gradient based iterative solutions for general linear matrix equations, Computers & Mathematics with Applications, 58, 7, 1441-1448 (2009) · Zbl 1189.65083 · doi:10.1016/j.camwa.2009.06.047
[11] Ding, F.; Chen, T., Iterative least-squares solutions of coupled Sylvester matrix equations, Systems & Control Letters, 54, 2, 95-107 (2005) · Zbl 1129.65306 · doi:10.1016/j.sysconle.2004.06.008
[12] Chen, Y.; Yi, C.; Qiao, D., Improved neural solution for the Lyapunov matrix equation based on gradient search, Information Processing Letters, 113, 22-24, 876-881 (2013) · Zbl 1284.68492 · doi:10.1016/j.ipl.2013.09.002
[13] Chen, K., Improved neural dynamics for online Sylvester equations solving, Information Processing Letters, 116, 7, 455-459 (2016) · Zbl 1356.68267 · doi:10.1016/j.ipl.2016.03.004
[14] Zhang, S.; Constantinides, A. G., Lagrange programming neural networks, IEEE Transactions on Circuits and Systems II: Analog and Digital Signal Processing, 39, 7, 441-452 (1992) · Zbl 0758.90067 · doi:10.1109/82.160169
[15] Hopfield, J. J., Neural networks and physical systems with emergent collective computational abilities, Proceedings of the National Academy of Sciences of the United States of America, 79, 8, 2554-2558 (1982) · Zbl 1369.92007 · doi:10.1073/pnas.79.8.2554
[16] Wang, J., Recurrent neural networks for solving linear matrix equations, Computers & Mathematics with Applications, 26, 9, 23-34 (1993) · Zbl 0796.65058 · doi:10.1016/0898-1221(93)90003-E
[17] Zhang, Z.; Zhang, Y., Design and experimentation of acceleration-level drift-free scheme aided by two recurrent neural networks, IET Control Theory & Applications, 7, 1, 25-42 (2013) · doi:10.1049/iet-cta.2011.0573
[18] Zhang, Z.; Li, Z.; Zhang, Y.; Luo, Y.; Li, Y., Neural-dynamic-method-based dual-arm CMG scheme with time-varying constraints applied to humanoid robots, IEEE Transactions on Neural Networks and Learning Systems, 26, 12, 3251-3262 (2015) · doi:10.1109/TNNLS.2015.2469147
[19] Mead, C.; Ismail, M., Analog VLSI Implementation of Neural Systems, 80 (2012), Springer Science & Business Media
[20] Sastry, S.; Bodson, M., Adaptive Control: Stability, Convergence and Robustness (2011), Courier Corporation
[21] Liu, J.; Brooke, M. A.; Hirotsu, K., A CMOS feedforward neural-network chip with on-chip parallel learning for oscillation cancellation, IEEE Transactions on Neural Networks, 13, 5, 1178-1186 (2002) · doi:10.1109/TNN.2002.1031948
[22] Horn, R. A.; Johnson, C. R., Matrix Analysis (2012), New York, NY, USA: Cambridge University Press
[23] Zhang, Y.; Chen, K., Global exponential convergence and stability of Wang neural network for solving online linear equations, Electronics Letters, 44, 2, 145-146 (2008) · doi:10.1049/el:20081928
This reference list is based on information provided by the publisher or from digital mathematics libraries. Its items are heuristically matched to zbMATH identifiers and may contain data conversion errors. In some cases that data have been complemented/enhanced by data from zbMATH Open. This attempts to reflect the references listed in the original paper as accurately as possible without claiming completeness or a perfect matching.