Quasidifferential calculus.

*(English)* Zbl 0712.49012
Translations Series in Mathematics and Engineering. New York: Optimization Software, Inc., Publications Div. xi, 289 p. (1986).

Given a function \(f:X\to \mathbb{R}\) and a point \(x\in X\), the statement “f is quasidifferentiable at x” means that both (a) and (b) hold, where

(a) For every \(v\in X\), the directional derivative \(f'(x;v):=\lim_{t\to 0^+}[f(x+tv)-f(x)]/t\) exists as a finite real number; and

(b) The resulting function \(f'(x;\cdot):X\to \mathbb{R}\) has a decomposition as \(f'(x;v)=p(v)-q(v)\) for all \(v\in X\), where \(p,q:X\to \mathbb{R}\) are (finite-valued) sublinear functions.

In this case, the pair of compact convex sets \((\partial p(0),\partial q(0))\) is called a quasidifferential of f at x. (Here, for example, \(\partial p(0)=\{x^*\in X^*: \langle x^*,v\rangle \leq p(v)\ \forall v\in X\}\) is the subdifferential of p at 0 in the sense of convex analysis.) Notice that every quasidifferentiable function admits infinitely many decompositions of the form (b), and hence has infinitely many quasidifferentials; thus the detailed theory involves equivalence classes of such pairs.
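As a concrete illustration (not taken from the book), consider \(f(x)=|x_1|-|x_2|\) on \(\mathbb{R}^2\): its directional derivative at the origin is \(f'(0;v)=|v_1|-|v_2|\), the difference of the sublinear functions \(p(v)=|v_1|\) and \(q(v)=|v_2|\). A short numerical sketch, using an assumed finite-difference helper `dir_deriv`:

```python
import numpy as np

def f(x):
    # f(x) = |x1| - |x2|: quasidifferentiable, but neither convex nor concave
    return abs(x[0]) - abs(x[1])

def dir_deriv(f, x, v, t=1e-8):
    # numerical one-sided directional derivative f'(x; v)
    return (f(x + t * v) - f(x)) / t

p = lambda v: abs(v[0])   # sublinear
q = lambda v: abs(v[1])   # sublinear

x0 = np.zeros(2)
rng = np.random.default_rng(0)
for _ in range(5):
    v = rng.standard_normal(2)
    # check the decomposition f'(0; v) = p(v) - q(v)
    assert abs(dir_deriv(f, x0, v) - (p(v) - q(v))) < 1e-6
```

Here the quasidifferential at 0 can be taken to be \((\partial p(0),\partial q(0)) = ([-1,1]\times\{0\},\ \{0\}\times[-1,1])\).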

This book concerns the calculus of quasidifferentiable functions defined on \(X=\mathbb{R}^n\). It starts slowly, with nine chapters (111 pages) of introductory material before quasidifferentiability is even defined. These chapters describe the fundamentals of multivalued mappings, convex functions and their subgradients, and directional differentiation, and introduce some new topics. For example, the authors describe an equivalence relation between pairs of compact convex sets which gives a geometric interpretation to the natural equivalence relation between pairs of sublinear functions suggested by (b) above, namely, that \((p_1,q_1)\sim (p_2,q_2)\) if and only if \(p_1-q_1=p_2-q_2\). They also discuss general upper convex and lower concave approximations to a given function f near a given point x, and describe the set-valued generalized gradients of Clarke in some detail. They note that Clarke’s directional derivative may differ from the classical directional derivative (see (a) above); indeed, this can happen even for some quasidifferentiable functions. In such situations the quasidifferential gives a more reliable first-order approximation to the function than does Clarke’s generalized gradient.
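The non-uniqueness behind this equivalence relation is easy to check numerically. In this hypothetical sketch, adding the same sublinear function \(r\) (here the Euclidean norm) to both members of a pair produces an equivalent pair, since the difference \(p-q\) is unchanged:

```python
import numpy as np

# Two decompositions of the same positively homogeneous function
# h(v) = |v1| - |v2| (the directional derivative of |x1| - |x2| at 0).
p1 = lambda v: abs(v[0])
q1 = lambda v: abs(v[1])

# Adding any sublinear r to both members yields an equivalent pair:
r  = lambda v: float(np.linalg.norm(v))   # Euclidean norm, sublinear
p2 = lambda v: p1(v) + r(v)
q2 = lambda v: q1(v) + r(v)

rng = np.random.default_rng(1)
for _ in range(5):
    v = rng.standard_normal(2)
    # (p1, q1) ~ (p2, q2) because p1 - q1 = p2 - q2 pointwise
    assert abs((p1(v) - q1(v)) - (p2(v) - q2(v))) < 1e-12
```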

The middle part of the book (Chapters 10-14) concerns the definition and elementary calculus of quasidifferentials. Since every quasidifferentiable function is in particular directionally differentiable, all the classical differentiation rules have their counterparts in this theory. However, the theory applies not only to Gâteaux differentiable functions but also to others: concave and convex functions, and maxima of finitely many quasidifferentiable functions, for example. Implicit mapping theorems are discussed, but the authors treat only a directional form: they describe hypotheses under which the equation \(f(x,y)=0\), with known solution \((x,y)=(0,0)\), can be solved for y as a function of x along a line segment. That is, they fix v, and then generate a function \(y_v(t)\) such that \(f(tv,y_v(t))=0\) for all \(t\geq 0\) sufficiently small.
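A minimal example of this directional viewpoint (an illustration, not from the book): for \(f(x,y)=y-|x|\) with known solution \((0,0)\), the implicit solution \(y(x)=|x|\) is not differentiable at 0, yet along each fixed direction \(v\) the function \(y_v(t)=t|v|\) solves the equation for all \(t\geq 0\):

```python
import numpy as np

# f(x, y) = y - |x|: f(0, 0) = 0, and y(x) = |x| is nonsmooth at 0,
# but along each fixed direction v the solution y_v(t) = t * |v| works.
f = lambda x, y: y - abs(x)

def y_v(t, v):
    # directional implicit solution along the ray x = t * v, t >= 0
    return t * abs(v)

for v in (-2.0, -0.5, 1.0, 3.0):
    for t in np.linspace(0.0, 1.0, 11):
        assert f(t * v, y_v(t, v)) == 0.0
```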

Chapters 15-20 describe various applications of quasidifferentials. Quasidifferential necessary and sufficient conditions are given for constrained optimization problems of the form \(\min\{f(x): x\in C\}\). The authors start from the well-known necessary condition that a solution point \(x_0\) must satisfy \(f'(x_0;v)\geq 0\) for every vector v in the contingent cone to C at the point \(x_0\), and then describe the contingent cone at \(x_0\) to the constraint set \(\{x\in \mathbb{R}^n: g(x)\leq 0\}\) in terms of the quasidifferential of g at \(x_0\). They treat constraints of the form \(h(x)=0\) similarly, but problems involving both sorts of constraint at once are not discussed. In particular, the book contains no result which reproduces the classical Kuhn-Tucker necessary conditions when all the functions involved are smooth. Problems of optimization also motivate a theoretical discussion of rates and directions of steepest descent. (This is not a book about computing, however, and numerical examples are not given.) The book closes with short discussions of the Minkowski gauge for nonconvex (but star-shaped) sets, quasidifferentials of saddle functions, and “approximate quasidifferentials”.
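The basic necessary condition can be illustrated with assumed data (a sketch, not an example from the book): for \(f(x)=|x_1|+|x_2|\) minimized over \(C=\mathbb{R}^2\) at \(x_0=0\), the contingent cone at \(x_0\) is all of \(\mathbb{R}^2\), and indeed \(f'(x_0;v)=|v_1|+|v_2|\geq 0\) for every direction v:

```python
import numpy as np

def f(x):
    # nonsmooth objective minimized over C = R^2 at x0 = 0
    return abs(x[0]) + abs(x[1])

def dir_deriv(f, x, v, t=1e-8):
    # numerical one-sided directional derivative f'(x; v)
    return (f(x + t * v) - f(x)) / t

x0 = np.zeros(2)
rng = np.random.default_rng(2)
# the contingent cone to C = R^2 at x0 is all of R^2, so the necessary
# condition f'(x0; v) >= 0 must hold for every direction v
for _ in range(20):
    v = rng.standard_normal(2)
    assert dir_deriv(f, x0, v) >= 0.0
```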


Reviewer: P.D.Loewen

##### MSC:

- 49J52 Nonsmooth analysis
- 49-02 Research exposition (monographs, survey articles) pertaining to calculus of variations and optimal control
- 46G05 Derivatives of functions in infinite-dimensional spaces
- 49K27 Optimality conditions for problems in abstract spaces