Gradient Method - PowerPoint PPT Presentation

Transcript and Presenter's Notes
1
Gradient Method
  • Gradient of f(x)
  • For a nonlinear objective function f(x) of an
    n-dimensional vector x, the gradient ∇f(x) is
    defined as ∇f(x) = (∂f(x)/∂x1, ..., ∂f(x)/∂xn).
  • Gradient Method
  • If we choose d so that ∇f(x)d > 0 and change x to
    x + θd, then for small enough θ we move uphill.
  • Only useful when the objective function is
    differentiable.
  • However, L(μ) is not differentiable when the
    optimal solutions of the Lagrangian subproblem
    are not unique.
  • d: direction (n-dimensional vector)
  • θ: step length (scalar)

2
Subgradient Optimization Technique
  • Subgradient optimization technique
  • Suppose L(μ) = min {cx + μ(Ax − b) : x ∈ X} has a
    unique solution x̄ and is differentiable.
  • The solution x̄ remains optimal for small changes
    of μ.
  • μ ← μ + θ(Ax̄ − b)
  • Intuitive interpretation
  • When (Ax̄ − b)i = 0, the solution x̄ uses up
    exactly the required units of the ith resource, so
    we hold μi.
  • When (Ax̄ − b)i < 0, the solution x̄ uses up less
    than the available units of the ith resource, so we
    decrease μi.
  • When (Ax̄ − b)i > 0, the solution x̄ uses up more
    than the available units of the ith resource, so we
    increase μi.

Ax̄ − b: direction, θ: step size
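The three cases above collapse into one vector update: the violation Ax̄ − b of each constraint is the direction of change for the corresponding multiplier. A minimal sketch (the data A, b and the subproblem solution x below are illustrative, not from the slides):

```python
import numpy as np

def update_multiplier(mu, x, A, b, theta):
    """Move each mu_i in the direction of the i-th resource violation."""
    g = A @ x - b          # subgradient: violation of each constraint
    return mu + theta * g  # (Ax - b)_i > 0 raises mu_i, < 0 lowers it, = 0 holds it

A = np.array([[1.0, 2.0], [3.0, 1.0]])
b = np.array([4.0, 5.0])
x = np.array([2.0, 1.0])   # pretend this solves the Lagrangian subproblem
mu = np.array([1.0, 1.0])

print(update_multiplier(mu, x, A, b, theta=0.5))   # [1. 2.]
```

Here the first constraint is met exactly (violation 0, so μ1 is held) while the second is exceeded by 2, so μ2 increases.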
3
Subgradient (Ax − b)
  • Def: Subgradient
  • Let f be concave; then γ is called a subgradient
    of f at x̄ ∈ S
  • if f(x) ≤ f(x̄) + γᵀ(x − x̄) for
    all x ∈ S.
  • f need not be differentiable.
  • Meaning of Ax̄ − b
  • Subgradient ≠ steepest descent direction
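The defining inequality can be checked numerically. The function below is a made-up concave example: f(x) = min(2x, 4 − x) has a kink at x = 4/3, where it is not differentiable and every slope γ in [−1, 2] is a subgradient.

```python
# Check f(x) <= f(x_bar) + gamma * (x - x_bar) on a grid, for a concave f.
def f(x):
    return min(2 * x, 4 - x)

x_bar = 4 / 3                            # the kink: f is not differentiable here
for gamma in (-1.0, 0.5, 2.0):           # several valid subgradients at the kink
    ok = all(f(x) <= f(x_bar) + gamma * (x - x_bar) + 1e-12
             for x in [i / 10 for i in range(-20, 41)])
    print(gamma, ok)                     # each gamma satisfies the inequality
```

At a point where f is differentiable the only subgradient is the derivative; at the kink there is a whole interval of them, which is exactly the non-uniqueness the previous slide warns about for L(μ).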

4
Lagrangian Multiplier Updating
  • Lagrangian Multiplier Updating
  • μ^{k+1} = μ^k + θ_k (Ax^k − b)
  • μ^0: any initial choice of the Lagrangian
    multiplier
  • x^k: any solution to the Lagrangian subproblem
    when μ = μ^k
  • θ_k: step length at the kth iteration
  • Choice of Step Sizes (θ_k)
  • Conditions for convergence to an optimal solution
    of the multiplier problem: θ_k → 0 and Σ θ_k = ∞.
  • For example, θ_k = 1/k.
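The update rule with θ_k = 1/k can be run end to end on a toy problem. Everything below is a made-up illustration, not the problem in the slides: minimize 3x subject to x = 2 with x ∈ {0,…,5}; relaxing x = 2 gives L(μ) = min_x 3x + μ(x − 2), which is maximized at μ* = −3.

```python
# Subgradient iteration mu^{k+1} = mu^k + theta_k (A x^k - b), theta_k = 1/k.
def solve_subproblem(mu):
    # Lagrangian subproblem: pick x in {0,...,5} minimizing 3x + mu*(x - 2)
    return min(range(6), key=lambda x: 3 * x + mu * (x - 2))

mu = 0.0
for k in range(1, 5001):
    x_k = solve_subproblem(mu)
    mu += (1.0 / k) * (x_k - 2)    # subgradient step; here A x^k - b = x^k - 2

print(round(mu, 2))   # -3.0
```

The iterates overshoot μ* and oscillate, but because θ_k shrinks while Σ θ_k diverges, the oscillation dies out and μ converges to −3.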

5
Choice of Step Sizes (θ_k) (1)
  • An adaptation of Newton's method
  • L(μ^k) = cx^k + μ^k (Ax^k − b), where x^k solves the
    Lagrangian subproblem when μ = μ^k
  • L(μ) ≈ r(μ) = cx^k + μ(Ax^k − b): linear
    approximation
  • Suppose we know the optimum value L* of the
    Lagrangian multiplier problem
  • r(μ^{k+1}) = cx^k + μ^{k+1} (Ax^k − b) = L*, we hope
  • μ^{k+1} = μ^k + θ_k (Ax^k − b)
  • ⇒ r(μ^{k+1}) = cx^k + [μ^k + θ_k (Ax^k − b)](Ax^k − b)
    = L(μ^k) + θ_k ‖Ax^k − b‖² = L*
  • Solving for the step length gives
    θ_k = [L* − L(μ^k)] / ‖Ax^k − b‖².
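Setting r(μ^{k+1}) = L* and solving gives θ_k = [L* − L(μ^k)] / ‖Ax^k − b‖². A quick numeric check with made-up data (cx^k, the subgradient, μ^k, and L* are all illustrative):

```python
import numpy as np

# Verify that theta_k = (L* - L(mu^k)) / ||A x^k - b||^2 makes the
# linear approximation r(mu^{k+1}) land exactly on L*.
c_x = 5.0                          # c x^k for the current subproblem solution
g = np.array([2.0, -1.0])          # A x^k - b (the subgradient)
mu_k = np.array([1.0, 0.5])
L_mu = c_x + mu_k @ g              # L(mu^k) = c x^k + mu^k (A x^k - b)
L_star = 9.0                       # assumed known optimum of the multiplier problem

theta = (L_star - L_mu) / (g @ g)  # the Newton-style step length
mu_next = mu_k + theta * g
r_next = c_x + mu_next @ g         # linear approximation at mu^{k+1}
print(r_next)                      # 9.0, i.e. exactly L*
```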

6
Choice of Step Sizes (θ_k) (2)
  • Practically
  • However, we don't know the optimal objective
    function value L* of the Lagrangian multiplier
    problem.
  • Popular heuristic for selecting the step length:
    θ_k = λ_k [UB − L(μ^k)] / ‖Ax^k − b‖²
  • UB: upper bound on the optimal objective
    function value z* of the problem (P)
  • λ_k: scalar chosen (strictly) between 0 and 2
  • Choosing λ_k
  • Start λ_k at 2 and then reduce λ_k by a
    factor of 2 whenever the method fails to find a
    better solution.
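The heuristic replaces the unknown L* with the best known upper bound. The numbers below reproduce iteration 1 of the CSPP example in this deck (λ_0 = 0.8, UB = 24, L(μ^0) = 3, and a single relaxed constraint with Ax^0 − b = 4):

```python
def heuristic_step(lmbda, ub, L_mu, g):
    """theta_k = lambda_k * (UB - L(mu^k)) / ||A x^k - b||^2."""
    assert 0 < lmbda < 2, "lambda_k must lie strictly between 0 and 2"
    return lmbda * (ub - L_mu) / sum(gi * gi for gi in g)

t = heuristic_step(0.8, 24, 3, [4])
print(round(t, 4))   # 1.05
```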

7
Cases of Inequality Constraints
  • Subgradient Optimization and Inequality
    Constraints
  • The update formula μ^{k+1} = μ^k + θ_k (Ax^k − b)
    might cause μ to become negative. To avoid this
    possibility,
  • μ^{k+1} = [μ^k + θ_k (Ax^k − b)]⁺, where (y⁺)_i =
    max(y_i, 0).
  • The other steps (i.e., the choice of θ_k) with
    inequality constraints are the same as those with
    equality constraints.
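The projected update is a one-liner: take the usual subgradient step, then clamp each component at zero. A small sketch with illustrative numbers:

```python
def project_update(mu, g, theta):
    """mu^{k+1} = [mu^k + theta * (A x^k - b)]^+, componentwise max with 0."""
    return [max(m + theta * gi, 0.0) for m, gi in zip(mu, g)]

result = project_update([0.5, 0.2], [1.0, -2.0], 0.3)
print(result)   # [0.8, 0.0] -- the second multiplier is clamped at 0
```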

8
Overview of Lagrangian Technique
LS(μ^k): Lagrangian subproblem at μ = μ^k
  • Step 1. Solve LS(μ^k) → x^k.
  • Step 2. If x^k is feasible, update UB.
  • Step 3. If the stopping criterion holds, done;
    otherwise set μ^{k+1} = μ^k + θ_k (Ax^k − b) and go
    back to Step 1.
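The whole loop can be sketched in a few lines. The solver interface and the toy problem at the bottom (minimize 2x1 + 3x2 over x ∈ {0,1}² subject to x1 + x2 = 1) are illustrative assumptions, not the problem in the slides:

```python
def lagrangian_loop(solve_subproblem, mu0, lmbda=0.8, iters=100):
    mu, ub, best = list(mu0), float("inf"), None
    for k in range(iters):
        x, L_mu, g, feasible, cost = solve_subproblem(mu)   # Step 1: solve LS(mu^k)
        if feasible and cost < ub:                          # Step 2: update UB
            ub, best = cost, x
        if all(abs(gi) < 1e-9 for gi in g):                 # zero subgradient: done
            break
        if ub < float("inf"):                               # heuristic step length
            theta = lmbda * (ub - L_mu) / sum(gi * gi for gi in g)
        else:
            theta = 1.0 / (k + 1)                           # fallback before any UB exists
        mu = [m + theta * gi for m, gi in zip(mu, g)]       # Step 3: update mu
    return mu, ub, best

def toy(mu):
    # Lagrangian subproblem of: min 2x1 + 3x2, x in {0,1}^2, s.t. x1 + x2 = 1
    pts = [(0, 0), (1, 0), (0, 1), (1, 1)]
    cost = lambda p: 2 * p[0] + 3 * p[1]
    x = min(pts, key=lambda p: cost(p) + mu[0] * (p[0] + p[1] - 1))
    g = [x[0] + x[1] - 1]
    return x, cost(x) + mu[0] * g[0], g, g[0] == 0, cost(x)

mu, ub, best = lagrangian_loop(toy, [0.0])
print(ub, best)   # 2 (1, 0)
```

On this toy instance the multiplier drifts down until the subproblem itself produces the feasible point (1, 0), at which moment the subgradient is zero and the loop stops with UB = 2.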
9
Example of CSPP
  • CSPP with T = 14
  • Initial parameter setting
  • μ^0 = 0, λ_0 = 0.8

[Network figure: six nodes (1–6); each arc is labeled with a
(cost c_ij, time t_ij) pair: (1,1), (1,7), (1,10), (2,3), (1,2),
(10,1), (5,7), (2,2), (10,3), (12,3).]
10
Example of CSPP
  • CSPP with T = 14
  • Initial parameter setting
  • μ^0 = 0, λ_0 = 0.8
  • Iteration 1
  • Path 1-2-4-6, L(0) = 3
  • UB = 24 (path 1-3-5-6)
  • θ_0 = 0.8(24 − 3)/16 = 1.05
  • μ^1 = 0 + 1.05(4) = 4.2

[Network figure: the same six-node graph with μ^0 = 0, so each arc
length is just its cost c_ij.]
11
Example of CSPP
  • CSPP with T = 14
  • Initial parameter setting
  • μ^0 = 0, λ_0 = 0.8
  • Iteration 1
  • Path 1-2-4-6, L(0) = 3
  • UB = 24 (path 1-3-5-6)
  • θ_0 = 0.8(24 − 3)/16 = 1.05
  • μ^1 = 0 + 1.05(4) = 4.2
  • Iteration 2
  • Path 1-3-2-5-6, L(4.2) = −1.8
  • UB = 15 (path 1-3-2-5-6)
  • θ_1 = 0.8(15 + 1.8)/16 = 0.84
  • μ^2 = 4.2 + 0.84(−4) = 0.84
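A quick arithmetic check of the two iterations, taking L(4.2) = −1.8 and UB = 15 for iteration 2 (the values that make the step 0.8(15 + 1.8)/16 = 0.84 consistent):

```python
def theta(lmbda, ub, L_mu, g):
    # heuristic step length for a single relaxed constraint (scalar g)
    return lmbda * (ub - L_mu) / (g * g)

# Iteration 1: path 1-2-4-6 has time 18 > T = 14, so A x^0 - b = 4.
t0 = theta(0.8, 24, 3, 4)
mu1 = 0 + t0 * 4
print(round(t0, 2), round(mu1, 2))   # 1.05 4.2

# Iteration 2: path 1-3-2-5-6 has time 10 < T = 14, so A x^1 - b = -4.
t1 = theta(0.8, 15, -1.8, -4)
mu2 = mu1 + t1 * (-4)
print(round(t1, 2), round(mu2, 2))   # 0.84 0.84
```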

[Network figure: the same six-node graph with μ^1 = 4.2, so each arc
length is c_ij + 4.2 t_ij (e.g., the (1,1) arc becomes 5.2).]
12
Duality Gap of CSPP
  • Convergence in CSPP
  • Lagrangian obj. ftn. value → L* = 7
  • Lagrangian multiplier → μ* = 2
  • Decreasing θ_k → 0
  • Duality (relaxation) Gap
  • L* = 7 < the length of the shortest constrained
    path = 13
  • In this instance, we say that the Lagrangian
    relaxation has a duality (relaxation) gap.
  • To solve problems with a duality gap to
    completion
  • (i.e., to find an optimal solution and guarantee
    that it is optimal)
  • → use an enumeration procedure (e.g., branch and
    bound).