Identification of parametric models from experimental data. Transl. from an upd. French version by the authors, with the help of John Norton.

*(English)* Zbl 0864.93014
Communications and Control Engineering Series. Berlin: Springer. xviii, 413 p. (1997).

The book grew out of the authors' many years of collaboration in research and teaching. It is aimed at two groups of readers: students wishing to learn the basic methods of system identification and parameter and/or state estimation, and researchers and engineers who have to squeeze parameters out of experimental data.

The book consists of seven chapters, covering the choice of a structure for the mathematical model, the choice of a performance criterion for comparing models, optimization of the performance criterion, evaluation of the uncertainty in the estimated parameters, experiment design, and critical analysis of the resulting model.

Chapter 2 is devoted to the choice of a structure for the mathematical model. In particular, the authors present methods for testing models for identifiability and distinguishability, covering linear and nonlinear, continuous- and discrete-time, and deterministic and stochastic models.

Chapter 3 introduces various types of criteria and their properties. Emphasis is placed on criteria other than least squares and on the importance of robustness.

Chapter 4 is devoted to a detailed treatment of parametric optimization, with far more attention to numerical aspects than is usual (evaluation of the effect of rounding errors, generation of derivatives of the cost function with respect to the parameters, global optimization \(\dots\)). It takes up more than one third of the book and presents most optimization methods in common use: the least-squares method, the Kalman filter, the gradient method, one-dimensional optimization, and the Newton, Gauss-Newton, Levenberg-Marquardt, quasi-Newton, and conjugate-gradient methods, among others.
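To give a flavor of the methods surveyed in this chapter, the following is a minimal sketch (not taken from the book) of the Levenberg-Marquardt iteration for nonlinear least squares, which blends the Gauss-Newton and gradient methods through an adaptive damping factor. The model, data, and function names here are illustrative assumptions.

```python
import numpy as np

def levenberg_marquardt(residuals, jacobian, theta0, lam=1e-3, n_iter=50):
    """Minimal Levenberg-Marquardt sketch for nonlinear least squares.

    residuals(theta) -> r, jacobian(theta) -> J with J[i, j] = dr_i/dtheta_j.
    """
    theta = np.asarray(theta0, dtype=float)
    cost = 0.5 * np.sum(residuals(theta) ** 2)
    for _ in range(n_iter):
        r = residuals(theta)
        J = jacobian(theta)
        # Damped normal equations: (J'J + lam*I) step = -J'r
        step = np.linalg.solve(J.T @ J + lam * np.eye(len(theta)), -J.T @ r)
        new_theta = theta + step
        new_cost = 0.5 * np.sum(residuals(new_theta) ** 2)
        if new_cost < cost:          # accept: shift toward Gauss-Newton
            theta, cost, lam = new_theta, new_cost, lam / 10
        else:                        # reject: shift toward gradient descent
            lam *= 10
    return theta

# Illustrative use: fit y = a * exp(b * t) to noise-free synthetic data
t = np.linspace(0.0, 1.0, 20)
y = 2.0 * np.exp(-1.5 * t)
r = lambda th: th[0] * np.exp(th[1] * t) - y
Jf = lambda th: np.column_stack([np.exp(th[1] * t),
                                 th[0] * t * np.exp(th[1] * t)])
theta_hat = levenberg_marquardt(r, Jf, [1.0, 0.0])  # approaches (2, -1.5)
```

The damping factor lam interpolates between the two regimes discussed in the chapter: small lam recovers the Gauss-Newton step, large lam a short gradient step.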

Chapter 5 is devoted to evaluating the uncertainty in the estimated parameters, which results from the uncertainty in the data and from numerical errors. The authors present both deterministic and statistical methods for characterizing how uncertainty in the data propagates to the parameters.
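As a simple illustration of the statistical approach (a sketch under standard linearization assumptions, not the book's own code), the covariance of least-squares estimates can be approximated from the Jacobian of the model output as \(s^2 (J^\top J)^{-1}\), with \(s^2\) the residual variance estimate:

```python
import numpy as np

def parameter_covariance(J, residuals):
    """Approximate covariance of least-squares estimates: s^2 * inv(J'J)."""
    n, p = J.shape
    s2 = np.dot(residuals, residuals) / (n - p)  # residual variance estimate
    return s2 * np.linalg.inv(J.T @ J)

# Illustrative use: straight-line fit y = theta0 + theta1 * t
t = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([0.1, 1.0, 2.1, 2.9, 4.0])
J = np.column_stack([np.ones_like(t), t])     # sensitivities dy/dtheta
theta, *_ = np.linalg.lstsq(J, y, rcond=None)
res = y - J @ theta
cov = parameter_covariance(J, res)
std = np.sqrt(np.diag(cov))  # standard errors on intercept and slope
```

The diagonal of the covariance matrix gives approximate variances of the individual parameters; the off-diagonal terms measure how errors in the estimates are correlated.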

Chapter 6 is devoted to experiment design (treated in much more detail than usual) for collecting the numerical data used to estimate the parameters of a given model. The authors take a scalar function of the Fisher information matrix as the design criterion and present local designs, robust designs, and designs for Bayesian estimation.
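A common scalar function of the Fisher information matrix is the D-optimality criterion, its log-determinant. The sketch below (illustrative; the model and sample times are assumptions, not the book's example) compares two candidate designs for an exponential model at a nominal parameter value, as a local design would:

```python
import numpy as np

def fisher_information(times, theta, sigma=1.0):
    """Fisher information for y(t) = theta[0]*exp(theta[1]*t) + Gaussian noise.

    Rows of the sensitivity matrix S are dy/dtheta at each sample time.
    """
    t = np.asarray(times, dtype=float)
    S = np.column_stack([np.exp(theta[1] * t),
                         theta[0] * t * np.exp(theta[1] * t)])
    return S.T @ S / sigma**2

def d_criterion(times, theta):
    """D-optimality criterion: log det of the Fisher information matrix."""
    sign, logdet = np.linalg.slogdet(fisher_information(times, theta))
    return logdet if sign > 0 else -np.inf

theta_nominal = [1.0, -1.0]
uniform = np.linspace(0.0, 2.0, 4)   # samples spread over the horizon
clustered = np.full(4, 1.0)          # all samples at a single instant
```

Sampling all four measurements at the same instant makes the information matrix singular (the criterion is \(-\infty\)), so the spread-out design is preferred, matching the intuition that informative experiments must excite all parameters.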

Chapter 7 is devoted to questioning the results of parameter estimation, i.e. validating the model produced. This step is of paramount importance. Most of the techniques are based on analysis of the residuals, which are tested for homogeneity, normality, stationarity, and independence.
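As one example of such a residual test (a sketch of a standard whiteness check, not code from the book), the sample autocorrelations of the residuals can be compared against the approximate 95% confidence band \(\pm 1.96/\sqrt{N}\) expected for white noise:

```python
import numpy as np

def whiteness_test(residuals, max_lag=10):
    """Return False if any sample autocorrelation of the residuals up to
    max_lag leaves the approximate 95% band +/- 1.96/sqrt(N)."""
    r = np.asarray(residuals, dtype=float)
    r = r - r.mean()
    n = len(r)
    c0 = np.dot(r, r) / n              # lag-0 autocovariance
    bound = 1.96 / np.sqrt(n)
    for k in range(1, max_lag + 1):
        ck = np.dot(r[:-k], r[k:]) / n
        if abs(ck / c0) > bound:       # significant correlation at lag k
            return False
    return True

# Illustrative use: a moving average of white noise is visibly correlated
rng = np.random.default_rng(0)
white = rng.standard_normal(500)
correlated = np.convolve(white, np.ones(5) / 5, mode="valid")
```

Residuals that fail such a test indicate structure left unexplained by the model, pointing back to the model-structure and criterion choices of the earlier chapters.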

In short, the book is practical and readable.

Reviewer: Yu Wenhuan (Tianjin)

##### MSC:

- 93-02: Research exposition (monographs, survey articles) pertaining to systems and control theory
- 93B30: System identification
- 93E12: Identification in stochastic control theory
- 93E10: Estimation and detection in stochastic control theory