
Unconstrained Optimization
Consider the minimization problem $\min_{x \in \mathbb{R}^n} f(x)$, where $f:\mathbb{R}^n \to \mathbb{R}$.
• Definition: $x^* \in \mathbb{R}^n$ is said to be
• a global minimum of $f$ if $f(x^*) \le f(x)$ for all $x \in \mathbb{R}^n$,
• a strict global minimum of $f$ if $f(x^*) < f(x)$ for all $x \ne x^*$,
• a local minimum of $f$ if there exists a neighbourhood $N(x^*)$ of $x^*$ such that $f(x^*) \le f(x)$ for all $x \in N(x^*)$.
• If $f$ is a convex function, then every local minimum of $f$ is a global minimum of $f$.
• If $f$ is a strictly convex function, then $f$ has a unique global minimum.
• Descent direction: Suppose $f:\mathbb{R}^n \to \mathbb{R}$ is a continuous function. $d \in \mathbb{R}^n$ is said to be a
descent direction of $f$ at $x$ if there exists $\delta > 0$ such that

$f(x + \lambda d) < f(x)$ for all $\lambda \in (0, \delta)$.
Example: for a given objective $f$, a point $x$, and a candidate direction $d$, one verifies that $d$ is a descent direction of $f$ at $x$ by checking that

$f(x + \lambda d) < f(x)$ for all sufficiently small $\lambda > 0$

(a concrete numerical check is sketched after the gradient test below).
• If $x^*$ is a local minimum of $f$, then there does not exist any descent
direction of $f$ at $x^*$.
• Suppose $f:\mathbb{R}^n \to \mathbb{R}$ is a differentiable function. If $\nabla f(x)^T d < 0$ for some $d \in \mathbb{R}^n$, then $d$ is a descent
direction of $f$ at $x$.
In the example above, one computes $\nabla f(x)$ and the inner product $\nabla f(x)^T d$.
Clearly $\nabla f(x)^T d < 0$.
Hence $d$ is a descent direction of $f$ at $x$.
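A minimal numerical sketch of this gradient test in Python; the objective $f(x) = x_1^2 + x_2^2$, the point, and the direction below are assumptions made for the sketch, not necessarily those of the original slide example:

```python
import numpy as np

def f(x):
    # Assumed example objective: f(x) = x1^2 + x2^2
    return x[0]**2 + x[1]**2

def grad_f(x):
    # Gradient of the assumed objective
    return np.array([2.0 * x[0], 2.0 * x[1]])

x = np.array([1.0, 1.0])   # point at which the direction is tested
d = -grad_f(x)             # candidate direction: negative gradient

# Gradient test: if grad_f(x)^T d < 0, then d is a descent direction of f at x.
print("grad_f(x)^T d =", grad_f(x) @ d)      # -8.0 < 0

# Direct check of the definition: f(x + lam*d) < f(x) for small lam > 0.
for lam in [0.1, 0.05, 0.01]:
    print(lam, f(x + lam * d) < f(x))        # True for each small lam
```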
• First order necessary condition: Suppose $f:\mathbb{R}^n \to \mathbb{R}$ is a differentiable function. If $x^*$ is
a local minimum of $f$, then $\nabla f(x^*) = 0$.
Proof: Suppose $x^*$ is a local minimum of $f$ and $\nabla f(x^*) \ne 0$.
Let $d = -\nabla f(x^*)$. Then

$\nabla f(x^*)^T d = -\|\nabla f(x^*)\|^2 < 0.$

This implies $d$ is a descent direction of $f$ at $x^*$. Hence there exists $\delta > 0$ such that $f(x^* + \lambda d) < f(x^*)$ for all $\lambda \in (0, \delta)$. This
contradicts that $x^*$ is a local minimum of $f$.
• For example, for $f(x) = x^2$, the point $x^* = 0$ is a local minimum.
Clearly $f'(0) = 0$.
• Note that $\nabla f(x^*) = 0$ does not imply that $x^*$ is a local minimum of $f$.
• For example, let $f(x) = x^3$; then $f'(x) = 3x^2$. For $x^* = 0$ we have $f'(0) = 0$. But $x^* = 0$ is not a local minimum of $f$,
since $f(x) < f(0)$ for every $x < 0$.
• Suppose $f:\mathbb{R}^n \to \mathbb{R}$ is a differentiable function and $\nabla f(\bar{x}) = 0$ for some $\bar{x} \in \mathbb{R}^n$. Then $\bar{x}$ is said to
be a stationary point of $f$.
• Clearly every local minimum is a stationary point, but not every
stationary point is a local minimum.
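A small numerical sketch of this distinction, using the standard one-dimensional illustrations $f(x) = x^2$ (a stationary point that is a local minimum) and $g(x) = x^3$ (a stationary point that is not); both functions are assumptions chosen for illustration:

```python
import numpy as np

# f(x) = x^2: x* = 0 is a stationary point (f'(0) = 0) and a local minimum.
# g(x) = x^3: x* = 0 is a stationary point (g'(0) = 0) but not a local minimum,
#             since g(x) < g(0) for every x < 0.
f = lambda x: x**2
g = lambda x: x**3

# Central-difference estimates of the derivative at 0 tend to 0 for both.
for h in [1e-1, 1e-2, 1e-3]:
    df = (f(h) - f(-h)) / (2 * h)
    dg = (g(h) - g(-h)) / (2 * h)
    print(h, df, dg)

# But only f has a local minimum at 0.
xs = np.linspace(-1.0, 1.0, 101)
print(all(f(x) >= f(0) for x in xs))   # True
print(all(g(x) >= g(0) for x in xs))   # False
```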
• Second order sufficient condition: Suppose $f:\mathbb{R}^n \to \mathbb{R}$ is a twice differentiable
function. If $\nabla f(x^*) = 0$ and $\nabla^2 f(x^*)$ is positive definite, then $x^*$ is a local minimum of $f$.
• For example, suppose $f:\mathbb{R}^2 \to \mathbb{R}$ is twice differentiable with gradient $\nabla f$ and Hessian $\nabla^2 f$.
If at a point $x^*$ we have $\nabla f(x^*) = 0$, and $\nabla^2 f(x^*)$ is positive definite,
which for a $2 \times 2$ Hessian holds when both leading principal minors of $\nabla^2 f(x^*)$ are positive,
then $x^*$ is a local minimum of $f$.
If, in addition, $f$ is a convex function, then $x^*$ is a global minimum of $f$.
If $f$ is a strictly convex function, then $x^*$ is the unique global minimum of $f$.
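A sketch of the positive-definiteness check via leading principal minors (Sylvester's criterion); the $2 \times 2$ Hessian below is an assumed illustrative matrix, not the one from the slide:

```python
import numpy as np

def is_positive_definite(H):
    """Sylvester's criterion: a symmetric matrix H is positive definite
    iff every leading principal minor of H is positive."""
    n = H.shape[0]
    return all(np.linalg.det(H[:k, :k]) > 0 for k in range(1, n + 1))

# Assumed Hessian at a candidate point x* (illustrative 2x2 example).
H = np.array([[2.0, 0.0],
              [0.0, 2.0]])

print(is_positive_definite(H))   # True -> x* is a local minimum
print(np.linalg.eigvalsh(H))     # all eigenvalues > 0, an equivalent check
```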
• Descent methods:
Input: Unconstrained objective function $f:\mathbb{R}^n \to \mathbb{R}$ and initial approximation $x_0 \in \mathbb{R}^n$.
Output: An approximate stationary point of $f$.
Algorithm:
• Step 0 (Initialization): Supply $x_0$, a tolerance $\epsilon > 0$, and other related scalars. Set $k = 0$.
• Step 1 (Optimality check): If $\|\nabla f(x_k)\| \le \epsilon$, stop. Otherwise go to Step 2.
• Step 2 (Descent direction): Find a suitable descent direction $d_k$ of $f$ at $x_k$.
• Step 3 (Step length): Select a suitable step length $\lambda_k > 0$ such that

$f(x_k + \lambda_k d_k) < f(x_k).$

• Step 4 (Update): Update $x_{k+1} = x_k + \lambda_k d_k$. Set $k = k + 1$ and go to Step 1.
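A minimal sketch of this template with the steepest-descent choice $d_k = -\nabla f(x_k)$ and, for simplicity, a fixed step length; the objective, step length, and tolerance below are illustrative assumptions:

```python
import numpy as np

def descent_method(grad_f, x0, eps=1e-6, max_iter=1000, lam=0.1):
    """Generic descent method: iterate x_{k+1} = x_k + lam * d_k until
    ||grad f(x_k)|| <= eps.  Here d_k = -grad f(x_k) (steepest descent)
    and lam is a fixed step length; in practice the step length would
    come from a line search such as the ones described below."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad_f(x)
        if np.linalg.norm(g) <= eps:   # Step 1: optimality check
            break
        d = -g                          # Step 2: descent direction
        x = x + lam * d                 # Steps 3-4: step and update
    return x

# Usage with an assumed quadratic objective f(x) = x1^2 + 2*x2^2.
grad_f = lambda x: np.array([2.0 * x[0], 4.0 * x[1]])
print(descent_method(grad_f, [3.0, -2.0]))   # approximately [0, 0]
```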


Selection of step length
• Exact line search: Suppose $d_k$ is a descent direction of $f$ at $x_k$. The step length
$\lambda_k$ is chosen as

$\lambda_k = \arg\min_{\lambda > 0} f(x_k + \lambda d_k).$
• Suppose $d$ is a descent direction of $f$ at $x$. Using exact line search, define

$\varphi(\lambda) = f(x + \lambda d)$ for $\lambda > 0$.

At a minimum of $\varphi$, $\varphi'(\lambda) = \nabla f(x + \lambda d)^T d = 0$.
Solving this equation determines the step length $\lambda$, i.e. the exact minimizer along $d$.
Clearly $\lambda > 0$.
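For a quadratic objective the exact line-search problem has a closed-form solution. Assuming $f(x) = \tfrac{1}{2} x^T A x - b^T x$ with $A$ symmetric positive definite (an assumed illustration, not necessarily the slide's example), $\varphi'(\lambda) = \nabla f(x + \lambda d)^T d = \nabla f(x)^T d + \lambda\, d^T A d = 0$ gives $\lambda = -\nabla f(x)^T d / (d^T A d) > 0$ for a descent direction $d$. A sketch:

```python
import numpy as np

def exact_step_quadratic(A, b, x, d):
    """Exact line search for f(x) = 0.5*x^T A x - b^T x along direction d:
    phi'(lam) = 0 gives lam = -grad_f(x)^T d / (d^T A d), which is positive
    when d is a descent direction and A is positive definite."""
    g = A @ x - b                   # grad f(x)
    return -(g @ d) / (d @ A @ d)

# Assumed example data.
A = np.array([[3.0, 1.0],
              [1.0, 2.0]])
b = np.array([1.0, 1.0])
x = np.array([2.0, 2.0])

d = -(A @ x - b)                    # steepest-descent direction
lam = exact_step_quadratic(A, b, x, d)
print(lam)                          # positive step length

f = lambda z: 0.5 * z @ A @ z - b @ z
print(f(x + lam * d) < f(x))        # True: f decreases after the exact step
```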
• Inexact line search:
• In exact line search, we have to solve a one-dimensional optimization problem at every
iteration. This is computationally expensive.
• Inexact line search techniques are developed to overcome this limitation.
• We have to select the step length to ensure:
• sufficient decrease in the objective function, and
• a step length that is not too small.
• The following Armijo condition ensures sufficient decrease in the objective function:

$f(x_k + \lambda d_k) \le f(x_k) + c_1 \lambda \nabla f(x_k)^T d_k, \quad 0 < c_1 < 1. \qquad (1)$

• The Wolfe condition ensures that the step length is not too small:

$\nabla f(x_k + \lambda d_k)^T d_k \ge c_2 \nabla f(x_k)^T d_k, \quad c_1 < c_2 < 1. \qquad (2)$
• Backtracking line search:

• Set the initial trial step length $\lambda$ (for example $\lambda = 1$).
• While either (1) or (2) does not hold,
update $\lambda = \beta \lambda$ for some $\beta \in (0, 1)$.
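A minimal sketch of this backtracking scheme, checking conditions (1) and (2) at each trial step; the constants $c_1$, $c_2$, $\beta$ and the example objective are illustrative assumptions:

```python
import numpy as np

def backtracking_armijo_wolfe(f, grad_f, x, d, c1=1e-4, c2=0.9, beta=0.5,
                              lam=1.0, max_backtracks=50):
    """Backtracking line search following the scheme above: start from the
    trial step lam and shrink lam <- beta*lam while the Armijo condition (1)
    or the Wolfe condition (2) fails.  Assumes d is a descent direction,
    i.e. grad_f(x)^T d < 0."""
    fx, gxd = f(x), grad_f(x) @ d
    for _ in range(max_backtracks):
        armijo = f(x + lam * d) <= fx + c1 * lam * gxd       # condition (1)
        wolfe = grad_f(x + lam * d) @ d >= c2 * gxd          # condition (2)
        if armijo and wolfe:
            return lam
        lam *= beta
    return lam   # fallback if no acceptable step was found

# Usage with an assumed objective f(x) = x1^2 + 2*x2^2 and d0 = -grad f(x0).
f = lambda x: x[0]**2 + 2 * x[1]**2
grad_f = lambda x: np.array([2.0 * x[0], 4.0 * x[1]])
x0 = np.array([1.0, 1.0])
d0 = -grad_f(x0)
lam0 = backtracking_armijo_wolfe(f, grad_f, x0, d0)
print(lam0, f(x0 + lam0 * d0) < f(x0))   # accepted step, and f decreases
```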
• Example: suppose an objective $f$, an initial point $x_0$, and a direction $d_0$ are given, where $d_0$ is a descent direction of $f$ at $x_0$ since $\nabla f(x_0)^T d_0 < 0$.
• Set $\lambda = 1$ and fix the constants $c_1$, $c_2$ and $\beta$.
• For $\lambda = 1$:
the Armijo condition (1) does not hold. So update $\lambda = \beta \lambda$.
• For the updated $\lambda$:
the Armijo condition (1) again does not hold. So update $\lambda = \beta \lambda$.
• For this value of $\lambda$:
the Armijo condition (1) holds.
• Next verify the Wolfe condition (2) for this $\lambda$; it holds as well.
• Hence the Armijo-Wolfe conditions are satisfied for this $\lambda$.
• Select $\lambda_0 = \lambda$ and proceed further.
