
Neural networks. A comprehensive foundation. (English) Zbl 0828.68103
New York, NY: Macmillan College Publ. Co. xix, 696 p. (1994).
This book provides the most comprehensive and up-to-date treatment of neural networks (NN) currently available. It is written mostly from an engineering perspective and includes many worked-out examples, computer-oriented experiments and applications. The book is organized into 15 chapters which examine most of the important aspects of the NN field, including: (i) the learning process; (ii) the correlation matrix memory; (iii) perceptrons; (iv) radial-basis function networks and recurrent networks; (v) self-organizing systems; (vi) modular networks; (vii) temporal processing; (viii) neurodynamics; and (ix) VLSI implementations of NN. It also contains 4 appendices that provide background material on the following topics: (a) the pseudoinverse matrix memory; (b) convergence analysis of stochastic approximation algorithms; (c) statistical thermodynamics; and (d) the Fokker-Planck equation. The book also includes end-of-chapter problems for the student, some of which are of a challenging nature.
The author has striven to provide the classical or key references in all cases. The bibliography section is extensive (it extends over some 55 pages), and the book also includes well-written historical notes. In fact, the book is so well documented that the reader gets the impression that its author must have been affiliated, during the writing, with both the Communications Research Laboratory and the library at McMaster University, even though according to the back cover he was only a member of the former! (Otherwise the book is remarkably free of typographical errors.)
The book is written at a level suitable for use in a graduate course on NN in engineering, computer science and physics. Its size (some 700 pages) might suggest that the full text could hardly be covered in such a course, and hence that the instructor should leave out some topics. Alternatively, the instructor has the option of compressing the discussion of some of the topics. This should be quite possible, particularly for students in engineering. For such students, who generally have a good background in signals, systems and estimation theory, many NN topics can be presented in compressed form as modelling or parameter estimation problems. In fact, the book has already benefited, to some extent, from such a perspective. In summary, this is a well-organized book written in a fluent, flowing manner. Researchers on NN will certainly benefit from the treatment in this text, written from the perspective of a well-known and prolific author of works in the signal processing, modelling and parameter estimation areas.
Reviewer: P. Stoica (Uppsala)

MSC:
68T05 Learning and adaptive systems in artificial intelligence
68-01 Introductory exposition (textbooks, tutorial papers, etc.) pertaining to computer science
Keywords:
neural networks