
Infinite horizon optimal control. Theory and applications. (English) Zbl 0649.49001

The main subject of the monograph is the qualitative analysis of the behavior of optimal trajectories, above all their convergence to the point of the phase space that solves an associated static optimization problem. Statements of this kind are usually called turnpike theorems.
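To fix ideas, a typical turnpike statement (written here in generic notation, not that of the book) concerns an autonomous problem
\[
\sup \int_0^T f^0(x(t),u(t))\,dt, \qquad \dot x(t)=f(x(t),u(t)), \quad u(t)\in U,
\]
and asserts that the optimal trajectories spend most of the time near the point \(\bar x\) given by the static problem
\[
\max_{x,u}\ f^0(x,u) \quad \text{subject to} \quad f(x,u)=0,\ u\in U,
\]
either as \(T\to\infty\) or on the infinite horizon itself.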
Chapter 1 gives examples of optimal control problems on unbounded time intervals from economics, ecology and engineering. Definitions of optimality are introduced that remain meaningful when the integral in the objective functional diverges.
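One well-known definition of this kind, given here only for orientation and not necessarily in the form adopted by the authors, is overtaking optimality: an admissible pair \((x^*,u^*)\) is overtaking optimal if
\[
\limsup_{T\to\infty}\left(\int_0^T f^0(x(t),u(t))\,dt-\int_0^T f^0(x^*(t),u^*(t))\,dt\right)\le 0
\]
for every admissible pair \((x,u)\) with the same initial state; replacing \(\limsup\) by \(\liminf\) yields the weaker notion of weakly overtaking optimality.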
Chapter 2 presents necessary and sufficient optimality conditions, in the form of a maximum principle, for trajectories on an infinite time horizon.
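Schematically, and again in generic notation rather than the book's, such a principle associates with an optimal pair \((x^*,u^*)\) an adjoint function \(\psi\) satisfying
\[
\dot\psi(t)=-\frac{\partial H}{\partial x}\bigl(x^*(t),u^*(t),\psi(t)\bigr),\qquad
H\bigl(x^*(t),u^*(t),\psi(t)\bigr)=\max_{u\in U}H\bigl(x^*(t),u,\psi(t)\bigr),
\]
where \(H(x,u,\psi)=\psi\cdot f(x,u)+f^0(x,u)\); the delicate point on an infinite horizon is that the usual terminal transversality condition must be replaced by a suitable condition at infinity.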
Chapter 3 presents problems in which the turnpike behavior of optimal trajectories can be established by simple means.
Chapter 4 treats convex autonomous problems by means of large variations of trajectories. Chapter 5 considers autonomous systems with a nonautonomous objective functional whose nonautonomous component is an exponentially decreasing factor.
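A typical objective of the kind treated in chapter 5 (the notation is illustrative, not the book's) is the discounted functional
\[
J(x,u)=\int_0^\infty e^{-\rho t} f^0(x(t),u(t))\,dt,\qquad \rho>0,
\]
in which the integrand and the dynamics \(\dot x=f(x,u)\) are otherwise independent of \(t\).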
Chapter 6 gives convergence results for optimal trajectories in nonautonomous and nonconvex problems.
Chapter 7 is devoted to the existence of solutions on infinite time intervals for nonautonomous control systems.
Chapter 8 analyses infinite-horizon optimal processes for linear equations with distributed parameters, interpreted as ordinary differential equations in a Hilbert space.
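In this abstract setting the state equation takes, schematically, the form
\[
\dot x(t)=Ax(t)+Bu(t),\qquad x(t)\in H,
\]
in a Hilbert space \(H\), with \(A\) the operator generated by the distributed-parameter system and \(B\) the control operator; again the notation is generic and not taken from the book.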
The book will promote the development of a new direction in optimal control, connected with the qualitative study of optimal processes of long and infinite duration [see also the reviewer, Avtom. Telemekh. 1983, No. 9, 58-66 (1983; Zbl 0562.93055); ibid. 1981, No. 8, 119-130 (1981; Zbl 0489.93036); and the reviewer together with A. I. Panasyuk, Prikl. Mat. Mekh. 49, 524-535 (1985; Zbl 0614.49015)].
Reviewer: V. Panasyuk

MSC:

49-02 Research exposition (monographs, survey articles) pertaining to calculus of variations and optimal control
49J15 Existence theories for optimal control problems involving ordinary differential equations
49K15 Optimality conditions for problems involving ordinary differential equations
93C15 Control/observation systems governed by ordinary differential equations
93C20 Control/observation systems governed by partial differential equations
93C25 Control/observation systems in abstract spaces
49J20 Existence theories for optimal control problems involving partial differential equations
49J27 Existence theories for problems in abstract spaces
49K20 Optimality conditions for problems involving partial differential equations
49K27 Optimality conditions for problems in abstract spaces