Nonlinear regression.

*(English)* Zbl 0721.62062
Wiley Series in Probability and Mathematical Statistics. Chichester etc.: John Wiley & Sons Ltd. xx, 768 p. (1989).

One of the irritations of fitting nonlinear models is that model fitting generally requires the iterative optimization (minimization or maximization) of functions. Unfortunately, the iterative process often does not converge easily to the desired solution. Computational questions are therefore important in nonlinear regression, and we have devoted three chapters to this area, which form a largely self-contained introduction to unconstrained optimization.
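The iterative fitting the authors refer to can be illustrated by a Gauss-Newton step, one of the standard nonlinear least-squares schemes treated in the optimization chapters. The sketch below is not code from the book; the exponential-rise model and starting values are invented for illustration:

```python
import numpy as np

# Illustrative model (not from the book): f(x; theta) = theta1 * (1 - exp(-theta2 * x))
def model(theta, x):
    return theta[0] * (1.0 - np.exp(-theta[1] * x))

def jacobian(theta, x):
    # Analytic Jacobian of the model, one row per observation.
    e = np.exp(-theta[1] * x)
    return np.column_stack([1.0 - e, theta[0] * x * e])

def gauss_newton(x, y, theta0, tol=1e-10, max_iter=50):
    theta = np.asarray(theta0, dtype=float)
    for _ in range(max_iter):
        r = y - model(theta, x)                    # current residuals
        J = jacobian(theta, x)
        # Solve the linearized least-squares subproblem J @ delta ~ r.
        delta, *_ = np.linalg.lstsq(J, r, rcond=None)
        theta = theta + delta
        if np.linalg.norm(delta) < tol:            # step small: declare convergence
            break
    return theta

# Noise-free synthetic data with known parameters. A sensible starting
# value matters: the iteration need not converge from an arbitrary one,
# which is exactly the difficulty the text describes.
x = np.linspace(0.1, 5.0, 40)
true_theta = np.array([2.0, 1.3])
y = model(true_theta, x)

theta_hat = gauss_newton(x, y, theta0=[1.0, 1.0])
```

With clean data and a reasonable start the iteration recovers the true parameters; with a poor start or ill-conditioned Jacobian it can stall or diverge, motivating the damped and trust-region variants discussed later in the book.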

In Chapter 1, after discussing the notation, we consider the various types of nonlinear model that can arise. Methods of estimating model parameters are discussed in Chapter 2, and some practical problems relating to estimation, such as ill-conditioning, are introduced in Chapter 3. Chapter 4 endeavors to summarize some basic ideas about curvature and to bring to notice the growing literature on the subject. In Chapter 5 we consider asymptotic and exact inferences relating to confidence intervals and regions, and hypothesis testing. The role of curvature is again considered, and some aspects of optimal design close the chapter. Autocorrelated errors are the subject of Chapter 6, and Chapters 7, 8, and 9 describe in depth three broad families of popular models, namely growth-curve, compartmental, and change-of-phase and spline-regression models. We have not tried to cover every conceivable model, and our coverage thus complements D. A. Ratkowsky’s [Handbook of nonlinear regression models. (1988); for a review of the 1990 edition see Zbl 0705.62060] broader description of families of parametric models. Errors-in-variables models are discussed in detail in Chapter 10 for both explicit and implicit nonlinear models, and nonlinear multivariate models are considered briefly in Chapter 11. Almost by way of an appendix, Chapter 12 gives us a glimpse of some of the basic asymptotic theory, and Chapters 13 to 15 provide an introduction to the growing literature on algorithms for optimization and least squares, together with practical advice on the use of such programs.

The book closes with five appendices, an author index, an extensive list of references, and a subject index. Appendix A deals with matrix results. Appendix B gives an introduction to some basic concepts of differential geometry and curvature, Appendix C outlines some theory of stochastic differential equations, Appendix D summarizes linear regression theory, and Appendix E discusses a computational method for handling linear equality constraints.

##### MSC:

62J02 | General nonlinear regression

62-02 | Research exposition (monographs, survey articles) pertaining to statistics

65C99 | Probabilistic methods, stochastic differential equations