Title: MATH 685 / CSI 700 / OR 682 Lecture Notes
1. MATH 685 / CSI 700 / OR 682 Lecture Notes
- Lecture 9.
- Optimization problems.
2. Optimization
3. Optimization problems
4. Examples
5. Global vs. local optimization
6. Global optimization
- Finding, or even verifying, global minimum is difficult, in general
- Most optimization methods are designed to find local minimum, which may or may not be global minimum
- If global minimum is desired, one can try several widely separated starting points and see if all produce same result (see the sketch after this list)
- For some problems, such as linear programming, global optimization is more tractable
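As a concrete illustration of the multistart idea, here is a minimal Python sketch using scipy.optimize.minimize; the test function (Himmelblau's) and the number of starting points are illustrative choices, not taken from the slides.

```python
# Multistart strategy: run a local optimizer from several widely
# separated starting points and compare the local minima found.
import numpy as np
from scipy.optimize import minimize

def f(x):
    # Himmelblau's function: a standard test problem with four
    # distinct minimizers, all attaining the global minimum value 0.
    return (x[0]**2 + x[1] - 11)**2 + (x[0] + x[1]**2 - 7)**2

rng = np.random.default_rng(0)
starts = rng.uniform(-6, 6, size=(20, 2))   # widely separated starts
results = [minimize(f, x0) for x0 in starts]
best = min(results, key=lambda r: r.fun)
print(best.x, best.fun)
```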
7. Existence of minimum
8. Level sets
9. Uniqueness of minimum
10. First-order optimality condition
11. Second-order optimality condition
12. Constrained optimality
13. Constrained optimality
14. Constrained optimality
15. Constrained optimality
- If inequalities are present, then KKT optimality
conditions also require nonnegativity of Lagrange
multipliers corresponding to inequalities, and
complementarity condition
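For reference, in the standard textbook form (for min f(x) subject to g(x) ≤ 0; this statement is supplied here, not recovered from the slide), the KKT conditions read:

```latex
\begin{aligned}
\nabla f(x^*) + J_g(x^*)^T \lambda^* &= 0 && \text{(stationarity)}\\
g(x^*) &\le 0 && \text{(feasibility)}\\
\lambda^* &\ge 0 && \text{(nonnegativity of multipliers)}\\
(\lambda^*)^T g(x^*) &= 0 && \text{(complementarity)}
\end{aligned}
```

Here J_g denotes the Jacobian of the constraint function g, and the last condition says each multiplier can be nonzero only if its constraint is active.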
16. Sensitivity and conditioning
17. Unimodality
18. Golden section search
19. Golden section search
20. Golden section search
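A minimal Python sketch of the standard method, assuming f is unimodal on [a, b] (function names and tolerances are illustrative):

```python
# Golden section search: repeatedly shrink a bracketing interval
# [a, b] for a unimodal function, reusing one interior point per step.
import math

def golden_section(f, a, b, tol=1e-8):
    tau = (math.sqrt(5) - 1) / 2          # inverse golden ratio, ~0.618
    x1 = a + (1 - tau) * (b - a)
    x2 = a + tau * (b - a)
    f1, f2 = f(x1), f(x2)
    while b - a > tol:
        if f1 > f2:                        # minimum lies in [x1, b]
            a, x1, f1 = x1, x2, f2
            x2 = a + tau * (b - a)
            f2 = f(x2)
        else:                              # minimum lies in [a, x2]
            b, x2, f2 = x2, x1, f1
            x1 = a + (1 - tau) * (b - a)
            f1 = f(x1)
    return (a + b) / 2

# Example: minimum of f(x) = x**2 - 2x on [0, 2] is at x = 1
print(golden_section(lambda x: x*x - 2*x, 0.0, 2.0))
```

Because tau satisfies tau^2 = 1 - tau, one of the two interior points can be reused at each step, so each iteration costs only one new function evaluation.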
21. Example
22. Example (cont.)
23. Successive parabolic interpolation
24. Example
25. Newton's method
Newton's method for finding minimum normally has quadratic convergence rate, but must be started close enough to solution to converge
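A minimal one-dimensional sketch of this iteration (equivalently, Newton's root-finding method applied to f'; derivatives are supplied by the caller, and the test function is illustrative):

```python
# Newton's method for minimization: x_{k+1} = x_k - f'(x_k) / f''(x_k).
def newton_min(fprime, fsecond, x0, tol=1e-10, maxit=50):
    x = x0
    for _ in range(maxit):
        step = fprime(x) / fsecond(x)
        x -= step
        if abs(step) < tol:
            break
    return x

# Example: f(x) = x**4 - 3x**2 + x, started near its local minimum x ~ 1.13
print(newton_min(lambda x: 4*x**3 - 6*x + 1,
                 lambda x: 12*x**2 - 6,
                 x0=1.0))
```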
26. Example
27. Safeguarded methods
28. Multidimensional optimization: direct search methods
29. Steepest descent method
30. Steepest descent method
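A minimal sketch of steepest descent with a backtracking (Armijo) line search; the line search rule and the quadratic test problem are illustrative choices, not taken from the slides:

```python
# Steepest descent: step along the negative gradient, with a simple
# backtracking line search to guarantee sufficient decrease.
import numpy as np

def steepest_descent(f, grad, x0, tol=1e-6, maxit=1000):
    x = np.asarray(x0, dtype=float)
    for _ in range(maxit):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        alpha = 1.0
        # Backtrack until the Armijo sufficient-decrease condition holds.
        while f(x - alpha * g) > f(x) - 1e-4 * alpha * (g @ g):
            alpha *= 0.5
        x = x - alpha * g
    return x

# Example: mildly ill-conditioned quadratic, which produces the
# characteristic slow zigzagging of steepest descent.
f = lambda x: 0.5 * (x[0]**2 + 10 * x[1]**2)
grad = lambda x: np.array([x[0], 10 * x[1]])
print(steepest_descent(f, grad, [5.0, 1.0]))
```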
31. Example
32. Example (cont.)
33. Newton's method
34. Newton's method
35. Example
36. Newton's method
37. Newton's method
38. Trust region methods
39. Trust region methods
40. Quasi-Newton methods
41. Secant updating methods
42. BFGS method
43. BFGS method
44. BFGS method
45. Example
For quadratic objective function, BFGS with exact
line search finds exact solution in at most n
iterations, where n is dimension of problem
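This property can be checked numerically. The following sketch runs BFGS with exact line search on an illustrative 3-by-3 symmetric positive definite quadratic (the data are made up; for a quadratic the exact line search step has a closed form):

```python
# BFGS with exact line search on f(x) = 0.5 x'Ax - b'x terminates at
# the exact minimizer A^{-1} b in at most n iterations.
import numpy as np

A = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])   # symmetric positive definite
b = np.array([1.0, 2.0, 3.0])
grad = lambda x: A @ x - b

n = len(b)
x = np.zeros(n)
B = np.eye(n)                          # initial Hessian approximation
for k in range(n):
    g = grad(x)
    if np.linalg.norm(g) < 1e-12:
        break
    p = -np.linalg.solve(B, g)         # quasi-Newton search direction
    alpha = -(g @ p) / (p @ (A @ p))   # exact line search for quadratic
    s = alpha * p
    y = grad(x + s) - g
    # BFGS secant update of the Hessian approximation B
    B = (B + np.outer(y, y) / (y @ s)
           - (B @ np.outer(s, s) @ B) / (s @ (B @ s)))
    x = x + s

print(x, np.linalg.solve(A, b))        # the two should agree
```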
46. Conjugate gradient method
47. CG method
48. CG method example
49. Example (cont.)
50. Truncated Newton methods
- Another way to reduce work in Newton-like methods is to solve linear system for Newton step by iterative method
- Small number of iterations may suffice to produce step as useful as true Newton step, especially far from overall solution, where true Newton step may be unreliable anyway
- Good choice for linear iterative solver is CG method, which gives step intermediate between steepest descent and Newton-like step
- Since only matrix-vector products are required, explicit formation of Hessian matrix can be avoided by using finite difference of gradient along given vector (see the sketch below)
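A minimal sketch of one truncated Newton step along these lines, with a hand-rolled inner CG loop (the tolerances, iteration limit, and the negative-curvature safeguard are illustrative choices):

```python
# Truncated Newton step: solve H s = -g approximately by CG, using
# finite differences of the gradient for Hessian-vector products.
import numpy as np

def hessvec(grad, x, v, h=1e-6):
    # H v ~ (grad(x + h v) - grad(x)) / h, so H is never formed.
    return (grad(x + h * v) - grad(x)) / h

def truncated_newton_step(grad, x, maxcg=10, tol=1e-2):
    g = grad(x)
    s = np.zeros_like(x)
    r = -g.copy()                  # residual of H s = -g, with s = 0
    p = r.copy()
    for _ in range(maxcg):         # a few CG iterations often suffice
        Hp = hessvec(grad, x, p)
        if p @ Hp <= 0:            # negative curvature: stop early
            break
        alpha = (r @ r) / (p @ Hp)
        s = s + alpha * p
        r_new = r - alpha * Hp
        if np.linalg.norm(r_new) < tol * np.linalg.norm(g):
            break
        p = r_new + ((r_new @ r_new) / (r @ r)) * p
        r = r_new
    # If no CG progress was made, fall back to steepest descent.
    return s if s.any() else -g
```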
51. Nonlinear least squares
52. Nonlinear least squares
53. Gauss-Newton method
54. Example
55. Example (cont.)
56. Gauss-Newton method
57. Levenberg-Marquardt method
With suitable strategy for choosing μ_k, this method can be very robust in practice, and it forms basis for several effective software packages
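For concreteness, one common form of the Levenberg-Marquardt iteration with a simple doubling/halving heuristic for μ_k (the notes' actual strategy for choosing μ_k is not reproduced here):

```python
# Levenberg-Marquardt: solve (J'J + mu I) s = -J'r for the step,
# increasing mu when a step fails and decreasing it when one succeeds.
import numpy as np

def levenberg_marquardt(res, jac, x0, mu=1e-3, tol=1e-10, maxit=100):
    x = np.asarray(x0, dtype=float)
    for _ in range(maxit):
        r, J = res(x), jac(x)
        g = J.T @ r                          # gradient of 0.5*||r||^2
        if np.linalg.norm(g) < tol:
            break
        s = np.linalg.solve(J.T @ J + mu * np.eye(len(x)), -g)
        if np.sum(res(x + s)**2) < np.sum(r**2):
            x, mu = x + s, mu * 0.5          # success: closer to Gauss-Newton
        else:
            mu *= 2.0                        # failure: toward steepest descent
    return x
```

Large μ_k makes the step short and nearly parallel to steepest descent; small μ_k recovers the Gauss-Newton step.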
58. Equality-constrained optimization
59. Sequential quadratic programming
60. Merit function
61. Inequality-constrained optimization
62. Penalty methods
This enables use of unconstrained optimization methods, but problem becomes ill-conditioned for large ρ, so we solve sequence of problems with gradually increasing values of ρ, with minimum for each problem used as starting point for next problem
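A minimal sketch of this continuation strategy with a quadratic penalty, using scipy.optimize.minimize as the unconstrained solver; the objective, the constraint, and the schedule for ρ are all illustrative:

```python
# Quadratic penalty method: minimize f(x) + 0.5*rho*||g(x)||^2 for a
# sequence of increasing rho, warm-starting each solve from the last.
import numpy as np
from scipy.optimize import minimize

f = lambda x: x[0]**2 + x[1]**2              # objective (illustrative)
g = lambda x: np.array([x[0] + x[1] - 1.0])  # equality constraint g(x) = 0

x = np.array([0.0, 0.0])
for rho in [1.0, 10.0, 100.0, 1000.0]:
    phi = lambda x, rho=rho: f(x) + 0.5 * rho * np.sum(g(x)**2)
    x = minimize(phi, x).x                   # warm start from previous minimum
print(x)                                     # approaches (0.5, 0.5)
```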
63. Barrier methods
64. Example: constrained optimization
65. Example (cont.)
66. Example (cont.)
67. Linear programming
68. Linear programming
69. Example: linear programming
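A small illustrative LP solved with scipy.optimize.linprog; the data are made up, chosen only to show the standard inequality form:

```python
# Small LP:  minimize c'x  subject to  A_ub x <= b_ub,  x >= 0
import numpy as np
from scipy.optimize import linprog

c = np.array([-3.0, -2.0])         # maximize 3*x1 + 2*x2 via its negative
A_ub = np.array([[1.0, 1.0],
                 [2.0, 1.0]])
b_ub = np.array([4.0, 5.0])

sol = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
print(sol.x, -sol.fun)             # optimal vertex (1, 3), value 9
```

As the theory predicts, the optimum is attained at a vertex of the feasible polygon, here the intersection of the two constraint lines.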