
KIL1005: Numerical Methods for Engineering

Semester 2, Session 2023/2024

LECTURE 10: OPTIMISATION: ONE-DIMENSIONAL UNCONSTRAINED OPTIMISATION

24-04-2024

Dr. Ahmed Halilu


Department of Chemical Engineering
Universiti Malaya
Class schedule
Day   Time                    Location
Tue   2.00 pm – 4.00 pm
Wed   11.00 am – 12.00 noon
OPTIMISATION
Optimization, in various contexts, refers to the process of making
something as effective or efficient as possible. It's a fundamental
concept across many disciplines, from mathematics and engineering to
business and economics. In engineering, optimization might involve
designing systems or processes to achieve the desired outcome with
minimal resources or waste.

Mathematical perspective: finding the minimum or the maximum (the
optimum value) of a function of one or more variables.
The optimum is the location where the curve is flat:
Maximum: f′(x) = 0 and f″(x) < 0
Minimum: f′(x) = 0 and f″(x) > 0
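These two conditions can be checked numerically with central finite differences. The example function f(x) = −(x − 2)² + 5, which has a maximum at x = 2, and the step sizes are assumptions chosen for illustration:

```python
# Numerical check of the optimum conditions f'(x) = 0 and the sign of f''(x),
# using central finite differences on an illustrative function.

def d1(f, x, h=1e-5):
    """Central-difference estimate of f'(x)."""
    return (f(x + h) - f(x - h)) / (2 * h)

def d2(f, x, h=1e-4):
    """Central-difference estimate of f''(x)."""
    return (f(x + h) - 2 * f(x) + f(x - h)) / h**2

# Assumed example: f(x) = -(x - 2)**2 + 5 has a maximum at x = 2.
f = lambda x: -(x - 2)**2 + 5

x_star = 2.0
print(d1(f, x_star))   # ~ 0  -> stationary point
print(d2(f, x_star))   # ~ -2 -> negative second derivative, so a maximum
```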
OPTIMIZATION AND ENGINEERING PRACTICES
Optimization typically deals with finding the 'best result', or optimum
solution, of a problem. In engineering practice, the result is often
termed the best design or the prescriptive model.
However, in doing so, engineers are constrained by the limitations of the
physical world and also by cost.

[Figure: cost versus design parameters]


Examples of optimization in chemical engineering
problems:
1. Designing a pump or fan for maximum efficiency at
minimum cost.
2. Designing heat-transfer equipment with the minimum
surface area at a fixed efficiency.
3. Designing a wastewater treatment system that meets the
water quality standard at minimum cost.

For an optimization problem, certain elements need to be
specified. For the wastewater-treatment example above:
1. Objective function: minimize the total cost of the treatment
system.
2. Design variables: feed flow rate, output concentration, etc.
3. Constraints: the water quality standard must be met.
ONE-DIMENSIONAL UNCONSTRAINED
OPTIMIZATION (1D-UO)
One-dimensional unconstrained optimization, also known as line search, involves
finding the minimum (or maximum) of a univariate function along a given direction
without any constraints. This optimization problem typically arises in scenarios where
the objective function depends on a single parameter or variable.
General approach to perform 1D-UO:

1. Select an initial guess: choose an initial guess for the minimum (or
maximum) of the function. This guess can be based on prior knowledge of
the function or on any available information.

2. Choose a search direction: determine the direction along which to
search for the minimum (or maximum). The direction can be chosen based on
heuristics, analytical considerations, or optimization algorithms.

3. Perform the line search: iteratively explore the function along the
selected direction, adjusting the step size or increment until the search
converges to the optimal solution.

4. Evaluate the convergence criteria: define criteria that determine when
to terminate the optimization process. Common criteria include reaching a
specified tolerance on the function value or on the step size, or a
maximum number of iterations.

5. Repeat or terminate: if convergence has not been achieved, repeat the
line search with updated parameters; otherwise, terminate the
optimization process.
[Figure: one-dimensional versus multi-dimensional unconstrained
optimization]
In multimodal functions, both local and global optima can occur. In
almost all cases, we are interested in finding the absolute highest or
lowest value of a function (the global optimum).
ALGORITHMS FOR ONE-DIMENSIONAL UNCONSTRAINED OPTIMIZATION

1. Golden-section search: a simple method that repeatedly narrows the
search interval using the golden ratio.

2. Brent's method: combines the bisection method, the secant method, and
inverse quadratic interpolation for efficient convergence.

3. Newton's method: uses local quadratic approximations of the objective
function to iteratively refine the estimate of the minimum (or maximum).
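As an illustration of item 3, Newton's method updates the estimate by jumping to the stationary point of the local quadratic: x_{k+1} = x_k − f′(x_k)/f″(x_k). The test function and its derivatives below are assumptions chosen for the example:

```python
# A minimal sketch of Newton's method for 1-D optimization: each iteration
# applies x <- x - f'(x)/f''(x) until the step is below tolerance.

def newton_optimize(df, d2f, x0, tol=1e-10, max_iter=50):
    """Find a stationary point of f given its first and second derivatives."""
    x = x0
    for _ in range(max_iter):
        step = df(x) / d2f(x)
        x -= step
        if abs(step) < tol:
            break
    return x

# Assumed example: f(x) = x**4 - 3*x**2 + 2,
# so f'(x) = 4x**3 - 6x and f''(x) = 12x**2 - 6.
x_star = newton_optimize(lambda x: 4*x**3 - 6*x,
                         lambda x: 12*x**2 - 6,
                         x0=1.0)
print(round(x_star, 6))   # stationary point near sqrt(1.5) ~ 1.224745
```

Note that Newton's method converges to whichever stationary point is nearest; the sign of f″ there tells you whether it is a minimum or a maximum.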
GOLDEN-SECTION SEARCH
A search algorithm for finding an optimum on an interval containing a
single extremum (a unimodal interval).
It uses the golden ratio to place two interior points x1 and x2; because
of the golden ratio, one of the interior points can be re-used in the
next iteration.
For a unimodal function:
1. First pick two points [xL, xU] that bracket your extremum.
2. Pick a third point within this interval to determine whether an
extremum has occurred.
3. Then pick a fourth point to determine whether the extremum lies within
the first three or the last three points.
4. The key to making this approach efficient is choosing the intermediate
points wisely, minimizing the number of function evaluations by replacing
old values with new ones.
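The steps above can be sketched as follows for a minimum. The golden ratio R = (√5 − 1)/2 used here is derived on the next slide; the interval, tolerance, and test function are assumptions chosen for illustration:

```python
import math

# A minimal sketch of golden-section search for the minimum of a unimodal
# function on [xl, xu]. One interior point is re-used each iteration.

def golden_section_min(f, xl, xu, tol=1e-8):
    R = (math.sqrt(5) - 1) / 2              # golden ratio ~ 0.618034
    d = R * (xu - xl)
    x1, x2 = xl + d, xu - d                 # interior points, x2 < x1
    f1, f2 = f(x1), f(x2)
    while (xu - xl) > tol:
        d = R * d                           # interval shrinks by R each pass
        if f1 < f2:                         # minimum lies in [x2, xu]
            xl, x2, f2 = x2, x1, f1         # old x1 is re-used as the new x2
            x1 = xl + d
            f1 = f(x1)
        else:                               # minimum lies in [xl, x1]
            xu, x1, f1 = x1, x2, f2         # old x2 is re-used as the new x1
            x2 = xu - d
            f2 = f(x2)
    return (xl + xu) / 2

# Assumed example: f(x) = (x - 2)**2 + 1 is unimodal with a minimum at x = 2.
print(round(golden_section_min(lambda x: (x - 2)**2 + 1, 0.0, 5.0), 6))  # -> 2.0
```

Only one new function evaluation is needed per iteration, which is exactly the efficiency gain described in step 4.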
DERIVATION OF THE GOLDEN RATIO

Consider three length parameters (l0, l1 and l2), defined on the
bracketing interval [xL, xU] with interior point x2 as in the figure:

    l0 = xU - xL
    l1 = x2 - xL
    l2 = xU - x2

    l0 = l1 + l2                  (1)
    l1 / l0 = l2 / l1             (2)
    l1 / (l1 + l2) = l2 / l1      (3)

(1) ensures that l1 + l2 covers the entire span.
(2) ensures that the next iteration has the same proportional spacing as
the current iteration; (3) follows by substituting (1) into (2).
Letting R = l2 / l1, equation (3) becomes 1 / (1 + R) = R, i.e.
R^2 + R - 1 = 0, whose positive root is

    R = (sqrt(5) - 1) / 2 ≈ 0.61803,

the golden ratio.
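The result can be verified numerically: R satisfies R² + R − 1 = 0, which is exactly the condition that the interval keeps the same proportional spacing after each iteration.

```python
import math

# Numerical check that R = (sqrt(5) - 1) / 2 is a root of R**2 + R - 1 = 0.
R = (math.sqrt(5) - 1) / 2
print(R)               # ~ 0.618034 (the golden ratio)
print(R**2 + R - 1)    # ~ 0 (up to floating-point round-off)
```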
Block Diagram of Golden Section
Search to Find a Minimum
We shall solve some
examples in the next class
