In Depth Critique
Ravi Ravindran
His presentation on supply chains discusses the impact of supply chain risks on global sourcing and presents multicriteria models and methods to address these risks. Dr. A. Ravi Ravindran is an Emeritus Professor and former Department Head of Industrial and Manufacturing Engineering, and he was at the University of Oklahoma for 15 years (1982-97). His research interests are in multicriteria optimization and supply chain risk management, and the talk focuses on how companies can make their supply chains more resilient. The presentation begins with the impact of supply chain disruptions and how consumers have come to realize the role supply chains play in their daily lives. He also sheds some light on the policies that governments have introduced as leaders realize the importance of supply chains, and he discusses the important sourcing decisions made by major corporations such as Walmart and General Motors. He then discusses the practices affecting supply chains, such as global sourcing, outsourcing of core and non-core functions, supplier consolidation, and the lean approach. He gives examples of key supply chain disruptions: an 8-day strike at the Delphi brake plant in 1996 idled 26 GM plants, costing $900 million; a 1997 fire at Toyota's sole-source brake supplier disrupted 20 assembly plants, costing Toyota $1.8B in sales; and the US West Coast ports lockout lasted 11 days in 2002, causing $11-22B in lost sales, airfreight, and spoilage.
He then discusses the need for a good vendor management strategy and the events that shape it: rare events such as a global pandemic or a natural disaster directly impact this strategy, while commonly occurring events affect vendor management over a prolonged run. He discusses how
supply risk management is the process of identifying, assessing, and mitigating risks
to a company's supply chain. The steps involved in supply risk management include
identifying and assessing supply risks, developing a supply risk map, mitigating
supply risks, and monitoring and reviewing the supply risk management program. In
his presentation he further discusses risk mitigation strategies, which classify risks by high or low impact and high or low likelihood of occurrence. He then discusses the objective of the study: incorporating supplier risk when making sourcing decisions. The study developed two
different risk models, Value-at-Risk (VaR) for rare events and Miss-the-target (MtT)
risk for others. The study also developed a two-phase risk-adjusted supplier selection
model. In Phase 1, suppliers are screened and shortlisted. In Phase 2, suppliers are
selected and their order quantities are determined. The solution methods were
demonstrated using case scenarios, with company staff as the decision makers. The work begins with the quantification of risk such that it
can be divided into the two strategies. Supply chain risks can be either natural or man-
made events. Value-at-Risk (VaR) is a rare but severe risk, such as a hurricane, strike,
fire, or terrorist attack. Miss-the-Target (MtT) risk is a more frequent but less severe risk, such as missing cost, quality, or delivery targets. By quantifying risks, companies can better understand the potential impact of these events and develop strategies to mitigate them.
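As a rough illustration of these two risk measures, the following Python sketch computes a VaR-style measure for rare, severe losses and an MtT-style measure for routine misses of an operational target; the loss figures, target, and probabilities are made up for illustration, since the talk does not give specific numbers.

    import random

    random.seed(42)

    # Hypothetical annual loss scenarios (in $M) for one supplier: mostly
    # loss-free years mixed with rare, severe disruptions (fires, strikes).
    losses = [0.0] * 950 + [random.uniform(50, 500) for _ in range(50)]

    # Value-at-Risk (VaR) at the 99% level: the loss exceeded in only about
    # 1% of scenarios -- it captures rare but severe events.
    var_99 = sorted(losses)[int(0.99 * len(losses))]

    # Miss-the-Target (MtT) risk: average shortfall against an operational
    # target, e.g. a 95% on-time delivery rate.
    target = 0.95
    deliveries = [random.gauss(0.93, 0.03) for _ in range(1000)]
    mtt = sum(max(target - d, 0.0) for d in deliveries) / len(deliveries)

    print(f"99% VaR of disruption loss: ${var_99:,.1f}M")
    print(f"Average miss of the {target:.0%} on-time target: {mtt:.3f}")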
Different suppliers were then selected to test multicriteria optimization methods that rank suppliers using a rating of each attribute on a 1-10 scale, pairwise comparison of attributes, and strength of preference on a 1-9 scale for the pairwise comparisons. He explains that Phase 1 includes the rating method,
where each criterion is rated on a scale of 1 to 10, with 10 being the most important.
The weights associated with each criterion are then obtained through normalization.
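A minimal sketch of this rating method, with hypothetical criteria and ratings, would be:

    # Rating method: each criterion is rated on a 1-10 scale and the weights
    # are obtained by normalizing the ratings so that they sum to one.
    ratings = {"cost": 9, "quality": 8, "delivery": 6, "risk": 7}   # hypothetical

    total = sum(ratings.values())
    weights = {c: r / total for c, r in ratings.items()}

    print(weights)   # roughly cost 0.30, quality 0.27, delivery 0.20, risk 0.23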
Second, the pair-wise comparison method using Borda count: this method is based on
pair-wise comparison of criteria. If there are P criteria, the most important criterion
gets P points, the second most important gets (P-1) points, and so on. The weights are
then calculated via normalization.
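A small sketch of the Borda count weighting, again with a hypothetical ranking of the same criteria, looks like this:

    # Borda count: with P criteria, the most important gets P points, the next
    # most important gets P-1 points, and so on; weights come from normalization.
    ranking = ["cost", "quality", "risk", "delivery"]   # hypothetical, most to least important
    P = len(ranking)

    points = {c: P - i for i, c in enumerate(ranking)}   # 4, 3, 2, 1
    total = sum(points.values())                          # P*(P+1)/2 = 10
    weights = {c: pts / total for c, pts in points.items()}

    print(weights)   # cost 0.4, quality 0.3, risk 0.2, delivery 0.1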
Third, the Analytic Hierarchy Process (AHP): this method is based on pairwise comparisons of the criteria using strength of preference on a 1-9 scale. The AHP method is more complex than the other methods, but it
is also more accurate. He explains that the choice of which MCDM method to use depends on the decision problem: if the problem is relatively simple, the rating method may be sufficient. If the decision-making problem
is more complex, the pair-wise comparison method using Borda count or the AHP
method may be more appropriate.
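As a rough sketch of the AHP weighting step mentioned above, the pairwise 1-9 judgments below are hypothetical and the principal eigenvector is approximated by plain power iteration rather than any dedicated AHP library:

    # AHP: pairwise 1-9 judgments form a reciprocal comparison matrix; the
    # criterion weights are taken from its (normalized) principal eigenvector.
    A = [
        [1.0, 3.0, 5.0],    # cost     vs (cost, quality, delivery)
        [1/3, 1.0, 3.0],    # quality  vs (cost, quality, delivery)
        [1/5, 1/3, 1.0],    # delivery vs (cost, quality, delivery)
    ]

    w = [1.0, 1.0, 1.0]
    for _ in range(50):     # power iteration to approximate the eigenvector
        w = [sum(A[i][j] * w[j] for j in range(3)) for i in range(3)]
        total = sum(w)
        w = [x / total for x in w]

    print([round(x, 3) for x in w])   # roughly [0.63, 0.26, 0.11]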
From the Phase 1 results, Phase 2 was conducted with the smaller set of shortlisted suppliers. For Phase 2, goal programming was used, and based on the results he notes that including conflicting criteria in supplier selection improves the quality of the sourcing decision.
Goal programming models provide multiple solutions that can be discussed, while the
Value Path Approach can effectively visualize tradeoff information.
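To make the Phase 2 goal programming idea concrete, here is a small weighted goal programming sketch in that spirit; the demand, costs, quality scores, capacities, goals, and weights are all hypothetical, and scipy's general-purpose LP solver stands in for whatever software the study actually used.

    # Allocate a 1,000-unit order between two shortlisted suppliers, penalizing
    # deviations from a cost goal and a quality goal (weighted goal programming).
    from scipy.optimize import linprog

    # Decision vector z = [x1, x2, d_cost, d_qual]: order quantities plus the
    # cost-overrun and quality-shortfall deviation variables.
    c = [0, 0, 1.0, 5.0]                  # penalty weights on the deviations

    A_eq = [[1, 1, 0, 0]]                 # demand must be met exactly
    b_eq = [1000]

    A_ub = [
        [10, 12, -1, 0],                  # cost goal: 10*x1 + 12*x2 - d_cost <= 11000
        [-0.90, -0.98, 0, -1],            # quality goal: 0.90*x1 + 0.98*x2 + d_qual >= 950
    ]
    b_ub = [11000, -950]

    bounds = [(0, 700), (0, 600), (0, None), (0, None)]   # supplier capacities

    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
    print(res.x)   # roughly [500, 500, 0, 10]: split the order, accept a small quality miss

A variant with more suppliers or more goals would follow the same pattern, with one deviation variable per goal.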
He concludes the presentation by giving key points with which resilient supply chains can be built.
N-person Games and Linear Programming by Allen L. Soyster
The presentation compares zero-sum games and n-person games, where zero-sum games are situations
where the gain of one player is directly offset by the loss of the other player. The total
payoff is constant, and any gain made by one player comes at the expense of the other.
These games are often used to model competitive situations, such as in sports or military conflicts. N-person cooperative games, in contrast, are situations where players can work together to achieve a common goal. The payoff depends on
the collective efforts of all the players, and the players can gain more by cooperating
than by acting alone. He explains how they can be used to model situations where
multiple parties must work together to achieve a common goal, such as in a supply
chain where different companies cooperate to produce and deliver a product to the
market. They can also be used to study the behavior of countries in international
relations, where multiple nations may need to cooperate to address shared global issues. He then defines these games formally: the number of players is n, a value function is defined on all 2^n possible subsets (coalitions) of players, and the game is required to be super-additive, meaning that the value of the union of two disjoint subsets is greater than or equal to the sum of their individual values. Thus, the condition ensures that the value of working together is at least the sum of the values of working separately. Dr. Soyster gives a
demonstration in class. In this example, he uses four political parties with varying
percentages of votes. There are 16 possible coalitions that can be formed among these
parties. But, to pass anything and gain control of $100 million, a coalition needs to
have at least 51% of the votes. The value function is defined such that if a coalition is
formed, it will receive a payout of $100 million. In this case, there are three coalitions
that can pass something: A-B, A-C, and A-D. The other coalitions do not have the
required 51% majority to pass anything. However, since A has the most votes and can
form a coalition with any other party, it has the most power. Therefore, A can dictate
the terms of the coalition and will receive the largest payout, while the other parties receive smaller shares. Each party thus has a certain power value, and parties can combine or merge to reach the threshold power value.
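The classroom example can be sketched directly in Python; the vote shares below are hypothetical (the talk only specifies that A is the largest party, that 51% of the votes are needed, and that the prize is $100 million), but they reproduce the three minimal winning coalitions A-B, A-C, and A-D.

    from itertools import combinations

    votes = {"A": 50, "B": 20, "C": 17, "D": 13}   # hypothetical vote percentages
    PRIZE = 100                                    # $ millions

    def value(coalition):
        """Characteristic function: $100M if the coalition holds at least 51% of the votes."""
        return PRIZE if sum(votes[p] for p in coalition) >= 51 else 0

    parties = list(votes)
    all_coalitions = [c for size in range(len(parties) + 1)
                      for c in combinations(parties, size)]    # 2^4 = 16 subsets
    winning = [c for c in all_coalitions if value(c) == PRIZE]

    # Minimal winning coalitions: winning, but no proper subset is winning.
    minimal = [c for c in winning
               if not any(value(s) == PRIZE
                          for size in range(len(c))
                          for s in combinations(c, size))]
    print(minimal)   # [('A', 'B'), ('A', 'C'), ('A', 'D')]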
After establishing the foundation of how n-person games work, he moves on to a real-life application of these games: the US House of Representatives, modeled as a game where N=435, the number of representatives in the House. The objective of the game is to form coalitions that reach a majority of 218 representatives and thereby gain control of the budget. The value of a coalition is zero if it has fewer than 218 members and 4 trillion dollars, roughly the US annual budget, if it has 218 members or more. In other words, a coalition's value reflects the power it can exert: any coalition with at least 218 members controls the House
and therefore the budget. He then explains what the core of the game will be. He
explains how the core of an n-person cooperative game refers to the set of imputations
that are stable and cannot be improved upon by any coalition of players. An
imputation is an allocation of payoffs to the players in the game. The core of the
game is “all those imputations whose allocations equal or exceed their respective sub-
coalitions”. For an imputation to be considered part of the core, it must satisfy two conditions: it is individually rational, meaning each player receives at least their minimum payoff, and it is collectively rational, meaning the sum of payoffs to all
players is at least the value of the grand coalition. This problem is then converted into
a linear program, where the core can be defined by a set of inequalities.
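A sketch of that linear program, reusing the hypothetical four-party game from the earlier sketch and using scipy's LP solver purely as a feasibility check, could look like this:

    from itertools import combinations
    from scipy.optimize import linprog

    votes = {"A": 50, "B": 20, "C": 17, "D": 13}   # same hypothetical game as before
    parties = list(votes)

    def value(coalition):
        return 100 if sum(votes[p] for p in coalition) >= 51 else 0

    # Core conditions: sum_i x_i = v(N) and sum_{i in S} x_i >= v(S) for every
    # sub-coalition S, rewritten as -sum_{i in S} x_i <= -v(S) for linprog.
    A_ub, b_ub = [], []
    for size in range(1, len(parties)):
        for S in combinations(parties, size):
            A_ub.append([-1 if p in S else 0 for p in parties])
            b_ub.append(-value(S))

    A_eq = [[1, 1, 1, 1]]
    b_eq = [value(tuple(parties))]        # grand coalition value, here 100

    res = linprog([0, 0, 0, 0], A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                  bounds=[(0, None)] * 4)   # zero objective: only feasibility matters
    print(res.status, res.x)   # status 0 (feasible), x = (100, 0, 0, 0)

In this particular hypothetical game the core turns out to be the single allocation in which A receives the full payout, matching the earlier observation that A can dictate terms; an infeasible LP (status 2) would signal an empty core.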
He further explains that if an imputation lies in the core, the outcome is stable, since every player is willing to accept its payout. If the core is empty, there is no stable outcome or agreed-upon strategy among the players in the game. In such
situations, other incentives can be used to reach a solution. He uses the work of Lloyd
Shapley, who proposed an idea to distribute the value of the grand coalition (all players
together) based on each player's marginal contribution to each coalition. This concept
is known as the Shapley value and has become an important tool for analyzing
cooperative games. The Shapley value is a way of distributing the total payoff of the grand coalition among the players according to their contributions to the game. It considers all possible ways of forming coalitions and
calculates each player's contribution to each coalition. This value provides a fair way
of dividing the payoff among the players, considering each player's contribution to the
game.
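A small sketch of the Shapley computation for the same hypothetical four-party game, averaging marginal contributions over all n! orderings as described above:

    from itertools import permutations

    votes = {"A": 50, "B": 20, "C": 17, "D": 13}   # same hypothetical game as before

    def value(coalition):
        return 100 if sum(votes[p] for p in coalition) >= 51 else 0

    players = list(votes)
    shapley = {p: 0.0 for p in players}

    orderings = list(permutations(players))        # n! = 24 orderings
    for order in orderings:
        so_far = []
        for p in order:
            # Player p's marginal contribution when joining the coalition built so far.
            shapley[p] += (value(so_far + [p]) - value(so_far)) / len(orderings)
            so_far.append(p)

    print(shapley)   # A: 75.0, B, C, D: about 8.3 each ($ millions)

The payoffs sum to the full $100 million and again reflect A's dominant bargaining position in this hypothetical game.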
In terms of linear programming, using the Shapley value can require 2^n LPs, with n calculations repeated n! times. Next steps for this research include formulating a special type of n-person cooperative game, the convex game, and understanding when an LP generates a convex game, to name a few.
Steven J. Greybush
Dr. Greybush studies weather patterns on Earth and Mars. His presentation is about the 10 steps that must be followed to perform data assimilation. Data assimilation plays a crucial role in generating accurate weather forecasts, which have significant implications for decision-making across various sectors. Each day, millions of new observations are ingested into numerical weather prediction models. The economic value from weather forecasting is estimated at $13 billion across sectors,
and the public values weather information at $280 per household per year, or $30
billion annually. Since 1980, there have been 258 weather/climate disasters exceeding
$1 billion, with a total cost exceeding $1.75 trillion. Weather also impacts many related areas, including risk to life and property, insurance and reinsurance, systems and infrastructure, and health and disease risk. He explains that data assimilation is a powerful technique that combines information from predictive models and new observations, where we give inputs to the model and get outputs. He also links this to the operations research field and emphasizes that the connection between the two fields lies in the fact that data assimilation can provide valuable input data for operations research models. By using assimilated data, analysts can create more accurate and reliable models that reflect real-world data. He then covers the basics of data assimilation, which include estimating the state of the system by spreading information across variables (and in space) using covariances, and he explains how data assimilation can answer three major questions: the best estimate of the current state of the system, the achievable forecast skill, and the state of the system in the past or predicted in the future (a forecast).
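The core idea can be illustrated with the simplest possible one-variable analysis update, blending a model forecast with a new observation according to their error variances; all numbers below are hypothetical.

    # One-variable analysis update: blend the model forecast (background) with a
    # new observation, weighting each by its error variance.
    background = 271.0    # model forecast of temperature (K)
    obs = 274.0           # new observation (K)
    var_b = 4.0           # background error variance
    var_o = 1.0           # observation error variance

    gain = var_b / (var_b + var_o)                 # how much weight the observation gets
    analysis = background + gain * (obs - background)
    var_a = (1 - gain) * var_b                     # the analysis error variance shrinks

    print(analysis, var_a)   # 273.4 and 0.8: the estimate moves toward the more accurate source

In a real system the same kind of update is applied to a high-dimensional state, with covariances spreading the influence of each observation across variables and locations, as the talk describes.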
He starts with the first step, which is to understand the goal of data assimilation. One goal
may be to create a reanalysis, which is an estimate of the past state of the system that
is consistent with both observations and a model. Another goal may be to improve
predictions of the future state of the system, by incorporating current observations into
a forecast model. The physical phenomena that data assimilation aims to capture will
also depend on the specific application. For example, in weather forecasting, the goal
is to capture the atmospheric state variables such as temperature, humidity, and wind,
which are critical for predicting future weather conditions. In oceanography, the goal
may be to estimate ocean currents, sea level, or the distribution of heat and salt. The
success of data assimilation depends on several factors, including the quality and
quantity of the observations, the accuracy of the model, the appropriateness of the
assimilation algorithm, and the ability to verify the results. DA is most successful
when the system being studied is well observed, and there is a set of known equations
that describe its time evolution. He takes the examples of a snowstorm on Earth and of weather on Mars to show how data assimilation can be used to predict and forecast weather. The second step includes coming up with prognostic equations and then defining the model state variables that describe the dynamic system behavior. He uses the Lorenz model, a mathematical model developed by meteorologist Edward Lorenz in 1963. The fifth step uses synthetic observations based on a known truth. This Observing System Simulation Experiment (OSSE) allows for more accurate comparisons between analyses and the true state. OSSE is a useful tool for evaluating the performance of data assimilation under different model and observation errors. Forecast skill improvement from assimilation analyses over free model simulations can be assessed using OSSE. The Lorenz model is based on a simplified form of the Navier-Stokes equations that describe fluid flow. One of the key insights from the
Lorenz model is that small differences in initial conditions can lead to vastly different
outcomes over time, a concept known as chaos. This helps understand predictability.
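A short sketch of the Lorenz (1963) model with its classic parameter values makes this sensitivity easy to see; the crude Euler time-stepping below is only for illustration.

    # Lorenz (1963) model: two runs from nearly identical initial conditions diverge.
    sigma, rho, beta = 10.0, 28.0, 8.0 / 3.0
    dt, steps = 0.01, 2000

    def step(state):
        x, y, z = state
        dx = sigma * (y - x)
        dy = x * (rho - z) - y
        dz = x * y - beta * z
        return (x + dt * dx, y + dt * dy, z + dt * dz)   # simple Euler update

    a = (1.0, 1.0, 1.0)
    b = (1.0 + 1e-6, 1.0, 1.0)    # perturb the initial condition by one part in a million
    for _ in range(steps):
        a, b = step(a), step(b)

    print(a)
    print(b)   # after 20 model time units the two trajectories no longer resemble each other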
From the models he extracts the observations, performance and ensemble diagnostics,
the correlation structure, and the forecasting skill. Finally, he concludes by discussing