Aravind Rangamreddy 500195259 cs4


Title –

Overview of optimization algorithms involving sensitivity analysis.

Introduction –

Sensitivity analysis is concerned with the uncertainty inherent in mathematical models whose input values can vary. It is the companion analytical tool to uncertainty analysis, and the two are often used together. All models composed, and studies executed, to draw conclusions or inferences for policy decisions are based on assumptions regarding the validity of the inputs used in the calculations.

Objectives/Research questions –

1) Discuss the purpose of sensitivity analysis in the context of optimization algorithms.
2) Discuss the class of optimization algorithms where sensitivity analysis can be applied.
3) Discuss the tools used for sensitivity analysis.
4) Provide a case study (your own example) of solving an optimization problem and performing a sensitivity analysis.

Findings/Analysis –

Purpose of sensitivity analysis in the context of optimization algorithms

Sensitivity analysis is used to determine how the values of independent variables will impact a particular dependent variable under a given set of assumptions. It studies how one or more input variables, varied within specific boundaries, affect an output, such as the effect that changes in interest rates have on a bond's price. Sensitivity analyses study how various sources of uncertainty in a mathematical model contribute to the model's overall uncertainty.

It is also known as what-if analysis, and it can be applied to any activity or system. It is used in the business world and in the field of economics; in general, sensitivity analysis is used in a wide range of fields, from biology and geography to economics and engineering.

There are various ways in which sensitivity analysis is helpful in solving optimization problems.

• A sensitivity analysis determines how different values of an independent variable affect a particular dependent variable under a predefined set of assumptions.
• It is used to make predictions about the share prices of publicly traded companies or about how interest rates affect bond prices.
• It allows for forecasting using historical, true data.

This includes analyzing changes in:

1. An Objective Function Coefficient (OFC)
2. A Right-Hand Side (RHS) value of a constraint

Objective Function Coefficient (OFC):

Often we solve an LP and then wish to solve another problem with the same constraints but a slightly different objective function. When we change the objective function, there are two cases to consider. The first case is a change in the coefficient of a non-basic variable, a variable that takes on the value zero in the solution. What happens to the solution if the coefficient of a non-basic variable decreases? For example, suppose that the coefficient of x1 in the objective function was reduced from 2 to 1.
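A minimal sketch of this case, using scipy.optimize.linprog on a hypothetical LP of our own (the LP referred to above is not reproduced here, so the numbers below are assumptions): with maximize 2*x1 + 3*x2 subject to x1 + x2 <= 10, the variable x1 is non-basic at the optimum, so reducing its coefficient from 2 to 1 should leave the solution unchanged.

# Objective-coefficient (OFC) sensitivity on a hypothetical LP:
#   maximize 2*x1 + 3*x2   subject to   x1 + x2 <= 10,  x1, x2 >= 0
from scipy.optimize import linprog

A_ub = [[1, 1]]          # x1 + x2 <= 10
b_ub = [10]

# linprog minimizes, so negate the coefficients to maximize.
base = linprog(c=[-2, -3], A_ub=A_ub, b_ub=b_ub, method="highs")
print("original optimum:", base.x, "objective:", -base.fun)       # x = [0, 10]

# Reduce the coefficient of the non-basic variable x1 from 2 to 1:
changed = linprog(c=[-1, -3], A_ub=A_ub, b_ub=b_ub, method="highs")
print("after OFC change:", changed.x, "objective:", -changed.fun)  # unchanged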

Right-Hand Side (RHS) value of a constraint:

The dual prices capture the effect of a change in the amounts of available resources. When we change the amount of resource in a non-binding constraint, increases never change the solution. Small decreases also change nothing, but if we decrease the amount of resource enough to make the constraint binding, the solution can change.
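A minimal sketch of this behavior, again on a hypothetical LP and assuming a recent SciPy whose HiGHS-based linprog exposes constraint marginals: the dual price of the binding constraint is nonzero, the non-binding constraint's is zero, and relaxing the non-binding constraint changes nothing.

# RHS sensitivity and dual (shadow) prices on a hypothetical LP:
#   maximize 3*x1 + 2*x2  s.t.  x1 + x2 <= 4  (binding)
#                               x1      <= 10 (non-binding)
from scipy.optimize import linprog

c = [-3, -2]                       # negated for maximization
A_ub = [[1, 1], [1, 0]]
b_ub = [4, 10]

res = linprog(c=c, A_ub=A_ub, b_ub=b_ub, method="highs")
print("optimum:", res.x, "objective:", -res.fun)            # x = [4, 0], 12

# HiGHS reports the marginals (duals) of the minimization problem; negate
# them to read off the dual prices of the original maximization.
duals = [-m for m in res.ineqlin.marginals]
print("dual prices:", duals)       # ~[3, 0]: the non-binding row is worth 0

# Relaxing the non-binding constraint (x1 <= 10 -> x1 <= 11) changes nothing:
relaxed = linprog(c=c, A_ub=A_ub, b_ub=[4, 11], method="highs")
print("after relaxing:", relaxed.x, "objective:", -relaxed.fun)  # unchanged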

Sensitivity analysis works on a simple principle: change the model and observe the behavior (a minimal sketch of this one-at-a-time approach follows the list below).

The parameters that affect a sensitivity analysis are:

1. Experimental design: the combination of parameters to be varied. This includes deciding which and how many parameters to vary at a given point in time, assigning values before the experiment, studying whether the correlations are positive or negative, and assigning values for the combinations accordingly.
2. What to vary: the different parameters that can be chosen to vary in the model could be:
a. the number of activities
b. the objective in relation to the risk assumed and the profit expected
c. technical parameters
d. the number of constraints and their limits
3. What to observe:
a. the value of the objective as per the strategy
b. the values of the decision variables
c. the value of the objective function between two strategies adopted
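As referenced above, here is a minimal one-at-a-time sketch of the "change the model, observe the behavior" principle; the toy profit model and the +/-10% variation ranges are assumptions for illustration only.

# One-at-a-time (OAT) sensitivity on a toy model (all values hypothetical).
def model(price, cost, volume):
    """Toy profit model: the 'objective' we observe."""
    return (price - cost) * volume

baseline = {"price": 10.0, "cost": 6.0, "volume": 100.0}
base_profit = model(**baseline)

# Vary one parameter at a time by +/-10% and record the change in the objective.
for name in baseline:
    for factor in (0.9, 1.1):
        varied = dict(baseline, **{name: baseline[name] * factor})
        delta = model(**varied) - base_profit
        print(f"{name} x{factor}: objective change = {delta:+.1f}")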

Class of optimization algorithms where sensitivity analysis can be applied –

Strength Pareto Evolutionary Algorithm (SPEA):

Evolutionary algorithms (EAs) are stochastic search optimization algorithms based on the idea of biological evolution; they are widely used and well suited for multi-objective optimization problems (MOOPs) when searching for the global optimum.

Many algorithms have been proposed based on the basic concept of the genetic algorithm (GA), such as the Vector Evaluated Genetic Algorithm (VEGA) and the following:

• Multiobjective Genetic Algorithm (MOGA)
• Nondominated Sorting Genetic Algorithm (NSGA)
• Niched Pareto Genetic Algorithm (NPGA)
• Strength Pareto Evolutionary Algorithm (SPEA)
• Nondominated Sorting Genetic Algorithm II (NSGA-II)
• Pareto Envelope-based Selection Algorithm (PESA)
• Pareto Archived Evolution Strategy (PAES)
• Micro-Genetic Algorithm (Micro-GA)

The core of the above methods is the GA; the differences lie primarily in the selection mechanism and the fitness evaluation.

1. Strength Pareto Evolutionary Algorithm (SPEA)
o Evolutionary algorithms (EAs) are stochastic search optimization algorithms based on the idea of biological evolution; they are widely used and well suited for MOOPs when looking for the global optimum.
o The genetic algorithm (GA), which simulates natural selection and survival of the fittest, is a typical example of an EA.


2. Spearman Rank Correlation Coefficient (SRCC)
o The correlation coefficient can be used for sensitivity analysis, as mentioned in the literature. Linear correlation coefficients change with the distances between variable values, whereas rank correlation coefficients depend only on the variable order; the latter is also called the sequential correlation coefficient.
o In the present investigation, the Spearman Rank Correlation Coefficient (SRCC) is used. The concept of SRCC is inherited from the Pearson Product-Moment Correlation Coefficient (PMCC). In statistics, both are frequently used as tools to analyze the correlation between an input variable X and an output variable Y. For PMCC, the X-Y pairs must also follow a normal distribution; however, this assumption is not feasible for each generation of optimization variables. SRCC obtains correlation coefficients based on the parameter ranks rather than the raw values used by PMCC. This operation is also described as rank transformation.
o Owing to this characteristic of SRCC, two straight lines with different slopes have the same sensitivity, as shown in the figure.

3. Rank Correlation Coefficient
o The correlation coefficient can be used for sensitivity analysis, as mentioned in the literature.
o Related evaluation methods have linear correlation coefficients that depend on the distances between variable values and rank correlation coefficients that depend on the variable order, the latter also being called the sequential correlation coefficient.
o Rank is defined as the ascending (or descending) sort position of the raw parameters.
o If two parameters have the same sort value, the average of their ranks is adopted. Table 1 gives a simple example of ranking.
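A minimal sketch of rank transformation (with tie averaging) and of the SRCC/PMCC contrast described above, using scipy.stats; the data points are made up for illustration.

# Rank transformation and the two correlation measures on toy data.
from scipy.stats import pearsonr, spearmanr, rankdata

x = [1.0, 2.0, 2.0, 4.0, 8.0]
y = [1.0, 4.0, 4.0, 16.0, 64.0]           # y = x**2: monotone but nonlinear

# Tied raw values receive the average of the ranks they would occupy.
print(rankdata(x))                        # [1.  2.5 2.5 4.  5. ]

# PMCC works on raw values; SRCC works on ranks, so it captures the
# monotone relationship perfectly even though it is nonlinear.
print("PMCC:", pearsonr(x, y)[0])         # < 1.0
print("SRCC:", spearmanr(x, y)[0])        # 1.0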


Tools used for sensitivity analysis –

• Sensitivity analysis helps determine which risks have the most potential impact on a project, program, or portfolio. It considers two variables at a time: an influencing variable and a dependent variable. For example, suppose the influencing variables are all the identified risks in the risk register and the dependent variable is the project's cost baseline. We take one risk at a time, analyze its impact on the cost baseline, and repeat for all the risks. This analysis of risk impact can be done in multiple ways, such as brainstorming sessions, expert judgment, interviews, and 3-point estimates. The result of a sensitivity analysis can be represented with a tornado graph. Imagine that on the horizontal axis of the graph we portray the cost baseline value in billions of dollars and on the vertical axis all the risks. The risk with the highest impact on the cost baseline value comes first, and so on. Plotted this way, the results of all the risk analyses on the cost baseline look like a tornado diagram (a minimal plotting sketch follows this list).
• This representation can also be considered a Pareto diagram (the 80:20 rule); the Pareto diagram is also called an ordered bar chart or ordered histogram. In a Pareto diagram we plot the influencing variable (risk) on the X-axis and the dependent variable (cost) on the Y-axis.
• In both representations, the length of a bar depicts the degree of impact on the dependent variable: the higher the impact, the bigger the bar. Remember once again that the biggest bar is plotted first.
• The tornado graph or Pareto diagram is later reviewed for the risk response plan. The risk at the top (or leftmost) of the graph is prioritized first for action.
• We can use sensitivity analysis as a tool to control cost, schedule, scope, quality, etc.
• The drawback of sensitivity analysis is that it only takes two variables at a time, so if we must analyze the risk impact on schedule, scope, quality, etc., we must plot a separate tornado diagram for each constraint of the project.
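As mentioned in the first bullet above, here is a minimal plotting sketch of a tornado diagram using matplotlib; the risks and their impact values are hypothetical.

# Tornado diagram with made-up risk impacts (in millions of dollars).
import matplotlib.pyplot as plt

risks = {"Supplier delay": 4.2, "Scope creep": 3.1,
         "Currency swing": 1.8, "Staff turnover": 0.9}

# Sort so the highest-impact risk is plotted at the top of the chart.
ordered = sorted(risks.items(), key=lambda kv: kv[1])
names = [k for k, _ in ordered]
impacts = [v for _, v in ordered]

plt.barh(names, impacts)
plt.xlabel("Impact on cost baseline ($M)")
plt.title("Tornado diagram (hypothetical risks)")
plt.tight_layout()
plt.show()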
