Game theory is a branch of applied mathematics that is used in the social sciences, most notably in economics, as well as in biology (particularly evolutionary biology and ecology), engineering, political science, international relations, computer science, social psychology, philosophy and management. Game theory attempts to mathematically capture behavior in strategic situations, or games, in which an individual's success in making choices depends on the choices of others (Myerson, 1991). While initially developed to analyze competitions in which one individual does better at another's expense (zero-sum games), it has been expanded to treat a wide class of interactions, which are classified according to several criteria. Today, "game theory is a sort of umbrella or 'unified field' theory for the rational side of social science, where 'social' is interpreted broadly, to include human as well as non-human players (computers, animals, plants)" (Aumann 1987).
Traditional applications of game theory attempt to find equilibria in these games. In an equilibrium, each player of the game has adopted a strategy that
they are unlikely to change. Many equilibrium concepts have been developed (most famously the Nash equilibrium) in an attempt to capture this idea.
These equilibrium concepts are motivated differently depending on the field of application, although they often overlap or coincide. This methodology is
not without criticism, and debates continue over the appropriateness of particular equilibrium concepts, the appropriateness of equilibria altogether, and
the usefulness of mathematical models more generally.
Although some developments occurred before it, the field of game theory came into being with Émile Borel's researches in his 1938 book Applications aux Jeux de Hasard, and was followed by the 1944 book Theory of Games and Economic Behavior by John von Neumann and Oskar Morgenstern. This
theory was developed extensively in the 1950s by many scholars. Game theory was later explicitly applied to biology in the 1970s, although similar
developments go back at least as far as the 1930s. Game theory has been widely recognized as an important tool in many fields. Eight game theorists
have won the Nobel Memorial Prize in Economic Sciences, and John Maynard Smith was awarded the Crafoord Prize for his application of game theory
to biology.
Uses of game theory
As a method of applied mathematics, game theory has been used to study a wide variety of human and animal behaviors. It was initially developed in
economics to understand a large collection of economic behaviors, including behaviors of firms, markets, and consumers. The use of game theory in the
social sciences has expanded, and game theory has been applied to political, sociological, and psychological behaviors as well.
Game-theoretic analysis was initially used to study animal behavior by Ronald Fisher in the 1930s (although even Charles Darwin makes a few informal
game-theoretic statements). This work predates the name "game theory", but it shares many important features with this field. The developments in
economics were later applied to biology largely by John Maynard Smith in his book Evolution and the Theory of Games.
In addition to being used to predict and explain behavior, game theory has also been used to attempt to develop theories of ethical or normative
behavior. In economics and philosophy, scholars have applied game theory to help in the understanding of good or proper behavior. Game-theoretic arguments of this type can be found as far back as Plato.[5]
Economics and business
Economists have long used game theory to analyze a wide array of economic phenomena, including auctions, bargaining, duopolies, fair division, oligopolies, social network formation, and voting systems,[7] and to model across such broad classifications as mathematical economics, behavioral economics, political economy, and industrial organization.[8][9][10]
This research usually focuses on particular sets of strategies known as equilibria in games. These "solution concepts" are usually based on what is
required by norms of rationality. In non-cooperative games, the most famous of these is the Nash equilibrium. A set of strategies is a Nash equilibrium if
each represents a best response to the other strategies. So, if all the players are playing the strategies in a Nash equilibrium, they have no unilateral
incentive to deviate, since their strategy is the best they can do given what others are doing.
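To make the best-response condition concrete, here is a minimal sketch (my addition, not part of the original text) that checks whether a strategy pair is a Nash equilibrium in a two-player game given by a payoff table; the Prisoner's Dilemma payoffs used are illustrative assumptions.

```python
# Minimal sketch: checking pure-strategy Nash equilibria in a two-player game.
# The Prisoner's Dilemma payoffs below are illustrative assumptions.

# payoffs[(row_strategy, col_strategy)] = (row player's payoff, column player's payoff)
payoffs = {
    ("cooperate", "cooperate"): (3, 3),
    ("cooperate", "defect"):    (0, 5),
    ("defect",    "cooperate"): (5, 0),
    ("defect",    "defect"):    (1, 1),
}
strategies = ["cooperate", "defect"]

def is_nash(row_s, col_s):
    """A profile is a Nash equilibrium if neither player gains by deviating unilaterally."""
    row_payoff, col_payoff = payoffs[(row_s, col_s)]
    best_row = all(payoffs[(alt, col_s)][0] <= row_payoff for alt in strategies)
    best_col = all(payoffs[(row_s, alt)][1] <= col_payoff for alt in strategies)
    return best_row and best_col

for r in strategies:
    for c in strategies:
        if is_nash(r, c):
            print(f"Nash equilibrium: ({r}, {c}) with payoffs {payoffs[(r, c)]}")
# Prints only (defect, defect): each strategy is a best response to the other.
```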
The payoffs of the game are generally taken to represent the utility of individual players. Often in modeling situations the payoffs represent money, which
presumably corresponds to an individual's utility. This assumption, however, can be faulty.
A prototypical paper on game theory in economics begins by presenting a game that is an abstraction of some particular economic situation. One or more solution concepts are chosen, and the author demonstrates which strategy sets in the presented game are equilibria of the appropriate type. Naturally one might wonder to what use this information should be put. Economists and business professors suggest two primary uses: descriptive and prescriptive.
Queueing theory is the mathematical study of waiting lines, or queues. The theory enables mathematical analysis of several related processes, including arriving at the (back of the) queue, waiting in the queue (essentially a storage process), and being served at the front of the queue. The theory permits the derivation and calculation of several performance measures including the average waiting time in the queue or the system, the expected number waiting or receiving service, and the probability of encountering the system in certain states, such as empty, full, having an available server or having to wait a certain time to be served.
Queueing theory has applications in diverse fields, including telecommunications, traffic engineering, computing[1][2][3] and the design of factories, shops, offices and hospitals.[4]
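As a concrete illustration of such performance measures, the sketch below (my addition, not part of the original text) evaluates the standard closed-form results for the single-server M/M/1 queue; the arrival and service rates are assumed values.

```python
# Minimal sketch: closed-form performance measures for an M/M/1 queue
# (Poisson arrivals at rate lam, exponential service at rate mu, one server).
# The numerical rates are illustrative assumptions.

lam = 0.8   # average arrivals per unit time
mu = 1.0    # average services per unit time; the queue is stable only if lam < mu

rho = lam / mu                     # server utilization
L = rho / (1 - rho)                # expected number in the system
Lq = rho**2 / (1 - rho)            # expected number waiting in the queue
W = 1 / (mu - lam)                 # expected time in the system
Wq = rho / (mu - lam)              # expected waiting time in the queue
p_empty = 1 - rho                  # probability the system is empty

print(f"utilization={rho:.2f}, L={L:.2f}, Lq={Lq:.2f}, "
      f"W={W:.2f}, Wq={Wq:.2f}, P(empty)={p_empty:.2f}")
```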

Application to telephony
The public switched telephone network (PSTN) is designed to accommodate the offered traffic intensity with only a small loss. The performance of loss systems is quantified by their grade of service, driven by the assumption that if sufficient capacity is not available, the call is refused and lost.[8] Alternatively, overflow systems make use of alternative routes to divert calls via different paths; even these systems have a finite traffic carrying capacity.[8]
However, the use of queueing in PSTNs allows the systems to queue their customers' requests until free resources become available. This means that if traffic intensity levels exceed available capacity, customers' calls are not lost; customers instead wait until they can be served.[9] This method is used in queueing customers for the next available operator.
A queueing discipline determines the manner in which the exchange handles calls from customers.[9] It defines the way they will be served, the order in which they are served, and the way in which resources are divided among the customers.[9][10] Here are details of four queueing disciplines (a small sketch contrasting them follows the list):
First in first out
This principle states that customers are served one at a time and that the customer that has been waiting the longest is served first.[10]
Last in first out
This principle also serves customers one at a time; however, the customer with the shortest waiting time will be served first.[10] Also known as a stack.
Processor sharing
Customers are served equally. Network capacity is shared between customers and they all effectively experience the same delay.[10]
Priority
Customers with high priority are served first.[10]
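The following sketch (my addition; the waiting customers, arrival times and priorities are made-up values) shows how three of these disciplines would order the same set of waiting customers.

```python
# Minimal sketch: how different queueing disciplines pick the next customer to serve.
# The waiting customers, arrival times and priorities are illustrative assumptions.

waiting = [
    {"name": "A", "arrived_at": 0.0, "priority": 1},
    {"name": "B", "arrived_at": 1.5, "priority": 3},
    {"name": "C", "arrived_at": 2.0, "priority": 2},
]

# First in first out: longest-waiting (earliest arrival) customer is served first.
fifo_order = sorted(waiting, key=lambda c: c["arrived_at"])

# Last in first out: shortest-waiting (latest arrival) customer is served first (a stack).
lifo_order = sorted(waiting, key=lambda c: c["arrived_at"], reverse=True)

# Priority: highest-priority customer is served first.
priority_order = sorted(waiting, key=lambda c: c["priority"], reverse=True)

print("FIFO    :", [c["name"] for c in fifo_order])      # ['A', 'B', 'C']
print("LIFO    :", [c["name"] for c in lifo_order])      # ['C', 'B', 'A']
print("Priority:", [c["name"] for c in priority_order])  # ['B', 'C', 'A']
# Processor sharing has no ordering: capacity is divided equally among all waiting customers.
```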
Queueing is handled by control processes within exchanges, which can be modelled using state equations.[9][10] Queueing systems use a particular form of state equations known as a Markov chain that models the system in each state.[9] Incoming traffic to these systems is modelled via a Poisson distribution[8] and is subject to Erlang's queueing theory assumptions viz.
• Pure-chance traffic – Call arrivals and departures are random and independent events.[8]
• Statistical equilibrium – Probabilities within the system do not change.[8]
• Full availability – All incoming traffic can be routed to any other customer within the network.[8]
• Congestion is cleared as soon as servers are free.[8]
Classic queueing theory involves complex calculations to determine waiting time, service time, server utilization and other metrics that are used to measure queueing performance.[9][10]
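As one illustration of such calculations (my addition, using assumed parameter values), the sketch below solves the birth-death Markov chain state equations of a multi-server M/M/c queue numerically and reports the probability that an arriving call has to wait, the server utilization, and the expected queue length and waiting time.

```python
from math import inf

# Minimal sketch: solving the birth-death (Markov chain) state equations of an
# M/M/c queue numerically. lam, mu and c are illustrative assumptions.

lam, mu, c = 4.0, 1.0, 6        # arrival rate, per-server service rate, servers
N = 200                          # truncation point for the infinite state space

# Unnormalized state probabilities from the balance equations:
#   lam * p[n-1] = min(n, c) * mu * p[n]
p = [1.0]
for n in range(1, N + 1):
    p.append(p[-1] * lam / (min(n, c) * mu))

total = sum(p)
p = [x / total for x in p]       # normalize so probabilities sum to 1

prob_wait = sum(p[c:])                               # arriving call finds all servers busy
utilization = lam / (c * mu)                         # offered load per server
Lq = sum((n - c) * p[n] for n in range(c, N + 1))    # expected number waiting
Wq = Lq / lam                                        # expected waiting time (Little's law)

print(f"P(wait)={prob_wait:.4f}, utilization={utilization:.2f}, Lq={Lq:.3f}, Wq={Wq:.3f}")
```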
Queueing networks
Networks of queues are systems which contain an arbitrary, but finite, number m of queues.[11] Customers, sometimes of different classes, travel through the network and are served at the nodes. The state of a network can be described by a vector (x1, ..., xm), where xi is the number of customers at queue i. In open networks, customers can join and leave the system, whereas in closed networks the total number of customers within the system remains fixed.
The first significant result in the area was Jackson networks, for which an efficient product form equilibrium distribution exists.
Role of the Poisson process and exponential distributions
A useful queueing model represents a real-life system with sufficient accuracy and is analytically tractable. A queueing model based on the Poisson
process and its companion exponential probability distribution often meets these two requirements. A Poisson process models random events (such as
a customer arrival, a request for action from a web server, or the completion of the actions requested of a web server) as emanating from a memoryless
process. That is, the length of the time interval from the current time to the occurrence of the next event does not depend upon the time of occurrence of
the last event. In the Poisson probability distribution, the observer records the number of events that occur in a time interval of fixed length. In the
(negative) exponential probability distribution, the observer records the length of the time interval between consecutive events. In both, the underlying
physical process is memoryless.
Models based on the Poisson process often respond to inputs from the environment in a manner that mimics the response of the system being modeled
to those same inputs. The analytically tractable models that result yield both information about the system being modeled and the form of their solution.
Even a queueing model based on the Poisson process that does a relatively poor job of mimicking detailed system performance can be useful. The fact
that such models often give "worst-case" scenario evaluations appeals to system designers who prefer to include a safety factor in their designs. Also,
the form of the solution of models based on the Poisson process often provides insight into the form of the solution to a queueing problem whose
detailed behavior is poorly mimicked. As a result, queueing systems are frequently modeled as Poisson processes through the use of the exponential distribution.
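To make the memorylessness point concrete, here is a small sketch (my addition; the rate and run length are assumed values) that draws exponential inter-arrival times and checks that event counts in fixed-length windows behave like a Poisson distribution, whose mean and variance are equal.

```python
import random

# Minimal sketch: a memoryless arrival process. Inter-arrival times are drawn
# from an exponential distribution; counting events in fixed-length windows
# then gives Poisson-distributed counts. Rate and run length are assumptions.

random.seed(0)
rate = 2.0          # average number of events per unit time
horizon = 10_000.0  # total simulated time

# Generate arrival times by accumulating exponential inter-arrival gaps.
arrivals, t = [], 0.0
while True:
    t += random.expovariate(rate)
    if t > horizon:
        break
    arrivals.append(t)

# Count events in unit-length windows: mean and variance should both be close to the rate.
counts = [0] * int(horizon)
for a in arrivals:
    counts[int(a)] += 1

mean = sum(counts) / len(counts)
var = sum((c - mean) ** 2 for c in counts) / len(counts)
print(f"mean count per window ~ {mean:.2f}, variance ~ {var:.2f} (both close to {rate})")
```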
Limitations of queueing theory
The assumptions of classical queueing theory may be too restrictive to be able to model real-world situations exactly. The complexity of production lines
with product-specific characteristics cannot be handled with those models. Therefore specialized tools have been developed to simulate, analyze,
visualize and optimize time dynamic queueing line behavior.
For example, the mathematical models often assume infinite numbers of customers, infinite queue capacity, or no bounds on inter-arrival or service times, when it is quite apparent that these bounds must exist in reality. Often, although the bounds do exist, they can be safely ignored because the differences between the real world and theory are not statistically significant, as the probability that such boundary situations might occur is remote compared to the expected normal situation. Furthermore, several studies show the robustness of queueing models outside their assumptions. In other cases the theoretical solution may either prove intractable or insufficiently informative to be useful.
Alternative means of analysis have thus been devised in order to provide some insight into problems that do not fall under the scope of queueing theory, although they are often scenario-specific because they generally consist of computer simulations or analysis of experimental data. See network traffic simulation.
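As an example of such scenario-specific analysis (my addition; all parameters are assumed), the sketch below simulates a single-server queue with a finite waiting room, a bound that the classical infinite-capacity models ignore, and estimates the fraction of arrivals that are turned away.

```python
import random

# Minimal sketch: discrete-event simulation of an M/M/1 queue with a finite
# waiting room of size K, estimating the blocking probability. All parameters
# are illustrative assumptions.

random.seed(1)
lam, mu, K = 0.9, 1.0, 5          # arrival rate, service rate, waiting-room size
horizon = 100_000.0

t = 0.0
in_system = 0
blocked = arrived = 0
next_arrival = random.expovariate(lam)
next_departure = float("inf")

while t < horizon:
    if next_arrival <= next_departure:          # next event is an arrival
        t = next_arrival
        arrived += 1
        if in_system >= K + 1:                  # server busy and waiting room full
            blocked += 1
        else:
            in_system += 1
            if in_system == 1:                  # server was idle, start service
                next_departure = t + random.expovariate(mu)
        next_arrival = t + random.expovariate(lam)
    else:                                        # next event is a departure
        t = next_departure
        in_system -= 1
        next_departure = t + random.expovariate(mu) if in_system > 0 else float("inf")

print(f"estimated blocking probability: {blocked / arrived:.3f}")
```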
Social network analysis
Social network analysis has been used in epidemiology to help understand how patterns of human contact aid or inhibit the spread of diseases such as
HIV in a population. The evolution of social networks can sometimes be modeled by the use of agent based models, providing insight into the interplay
between communication rules, rumor spreading and social structure.
SNA may also be an effective tool for mass surveillance; for example, the Total Information Awareness program was doing in-depth research on strategies to analyze social networks to determine whether or not U.S. citizens were political threats.
Simulation is the imitation of some real thing, state of affairs, or process. The act of simulating something generally entails representing certain key characteristics or
behaviours of a selected physical or abstract system.
Simulation is used in many contexts, such as simulation of technology for performance optimization, safety engineering, testing, training, education, and video games.
Training simulators include flight simulators for training aircraft pilots. Simulation is also used for scientific modeling of natural systems or human systems in order to
gain insight into their functioning.[1] Simulation can be used to show the eventual real effects of alternative conditions and courses of action. Simulation is also used
when the real system cannot be engaged, because it may not be accessible, or it may be dangerous or unacceptable to engage, or it is being designed but not yet built, or it may simply not exist.[2]
Key issues in simulation include acquisition of valid source information about the relevant selection of key characteristics and behaviours, the use of simplifying
approximations and assumptions within the simulation, and fidelity and validity of the simulation outcomes.
Business simulation games have been finding favour in business education in recent years.[9] Business simulations that incorporate a dynamic model enable experimentation with business strategies in a risk-free environment and provide a useful extension to case study discussions.
Manufacturing simulation
Manufacturing represents one of the most important applications of simulation. This technique is a valuable tool used by engineers when evaluating the effect of capital investment in equipment and physical facilities like factory plants, warehouses, and distribution centers. Simulation can be used to predict the performance of an existing or planned system and to compare alternative solutions for a particular design problem.[39]
Another important goal of manufacturing simulations is to quantify system performance. Common measures of system performance include the following (a minimal sketch estimating several of them appears after the list):[40]
• Throughput under average and peak loads;
• System cycle time (how long it takes to produce one part);
• Utilization of resources, labor, and machines;
• Bottlenecks and choke points;
• Queuing at work locations;
• Queuing and delays caused by material-handling devices and systems;
• WIP storage needs;
• Staffing requirements;
• Effectiveness of scheduling systems;
• Effectiveness of control systems.
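The sketch below (my addition; the line layout, processing times and run length are assumed values) shows how a small discrete-event simulation of a two-station line can estimate a few of these measures, such as throughput, cycle time and station utilization.

```python
import random

# Minimal sketch: estimating throughput, cycle time and utilization for a
# two-station serial production line. Arrival pattern, processing times and
# run length are illustrative assumptions, not data from the text.

random.seed(2)
n_parts = 10_000
mean_interarrival = 1.0                              # time between raw parts entering the line
proc_time = {"machining": 0.7, "assembly": 0.85}     # mean processing time per station

arrival = 0.0
station_free = {"machining": 0.0, "assembly": 0.0}
busy_time = {"machining": 0.0, "assembly": 0.0}
cycle_times, last_finish = [], 0.0

for _ in range(n_parts):
    arrival += random.expovariate(1.0 / mean_interarrival)
    ready = arrival
    for station in ("machining", "assembly"):
        start = max(ready, station_free[station])    # wait if the station is still busy
        service = random.expovariate(1.0 / proc_time[station])
        finish = start + service
        station_free[station] = finish
        busy_time[station] += service
        ready = finish
    cycle_times.append(ready - arrival)              # time from entry to completion
    last_finish = ready

throughput = n_parts / last_finish
print(f"throughput ~ {throughput:.2f} parts per unit time")
print(f"average cycle time ~ {sum(cycle_times) / n_parts:.2f}")
for s in station_free:
    print(f"utilization of {s}: {busy_time[s] / last_finish:.2%}")
```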
Finance simulation
Main articles: Monte Carlo methods in finance and Mathematical finance
In finance, computer simulations are often used for scenario planning. Risk-adjusted net present value, for example, is computed from well-defined but not always
known (or fixed) inputs. By imitating the performance of the project under evaluation, simulation can provide a distribution of NPV over a range of discount rates and
other variables.
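A minimal Monte Carlo sketch of this idea follows (my addition; the cash flows, discount-rate range and volatility are assumed values): it samples uncertain inputs and builds up a distribution of net present value.

```python
import random

# Minimal sketch: Monte Carlo scenario analysis of a project's net present value.
# Cash-flow estimates, their uncertainty and the discount-rate range are
# illustrative assumptions, not figures from the text.

random.seed(3)
initial_outlay = 1000.0
expected_cash_flows = [300.0, 350.0, 400.0, 400.0]   # expected inflow per year

def sample_npv():
    rate = random.uniform(0.05, 0.12)                # uncertain discount rate
    npv = -initial_outlay
    for year, cf in enumerate(expected_cash_flows, start=1):
        realized = random.gauss(cf, 0.15 * cf)       # uncertain realized cash flow
        npv += realized / (1 + rate) ** year
    return npv

samples = sorted(sample_npv() for _ in range(20_000))
mean_npv = sum(samples) / len(samples)
p5, p95 = samples[int(0.05 * len(samples))], samples[int(0.95 * len(samples))]
prob_loss = sum(1 for s in samples if s < 0) / len(samples)

print(f"mean NPV ~ {mean_npv:.0f}, 5th-95th percentile ~ [{p5:.0f}, {p95:.0f}]")
print(f"probability NPV < 0 ~ {prob_loss:.1%}")
```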
Simulations are frequently used in financial training to engage participants in experiencing various historical as well as fictional situations. There are stock market simulations, portfolio simulations, risk management simulations or models and forex simulations. Using these simulations in a training program allows for the application of theory to something akin to real life. As with other industries, the use of simulations can be technology or case-study driven.

Sales process simulation
Main article: Sales process engineering
Simulations are useful in modeling the flow of transactions through business processes, such as in the field of sales process engineering, to study and
improve the flow of customer orders through various stages of completion (say, from an initial proposal for providing goods/services through order
acceptance and installation). Such simulations can help predict how improvements in methods might impact variability, cost, labor time, and the quantity of transactions at various stages in the process. A full-featured computerized process simulator can be used to depict such models, as can simpler educational demonstrations using spreadsheet software, pennies being transferred between cups based on the roll of a die, or dipping into a tub of colored beads with a scoop.[55]
Decision theory in economics, psychology, philosophy, mathematics, and statistics is concerned with identifying the values, uncertainties and other issues relevant in a given decision, its rationality, and the resulting optimal decision. It is very closely related to the field of game theory.
What kinds of decisions need a theory?
Choice under uncertainty
This area represents the heart of decision theory. The procedure now referred to as expected value was known from the 17th century. Blaise Pascal
invoked it in his famous wager (see below), which is contained in his ÿ , published in 1670. The idea of expected value is that, when faced with a
number of actions, each of which could give rise to more than one possible outcome with different probabilities, the rational procedure is to identify all
possible outcomes, determine their values (positive or negative) and the probabilities that will result from each course of action, and multiply the two to
give an expected value. The action to be chosen should be the one that gives rise to the highest total expected value. In 1738, Daniel Bernoulli
published an influential paper entitled Exposition of a New Theory on the Measurement of Risk, in which he uses the St. Petersburg paradox to show
that expected value theory must be normatively wrong. He also gives an example in which a Dutch merchant is trying to decide whether to insure a
cargo being sent from Amsterdam to St Petersburg in winter, when it is known that there is a 5% chance that the ship and cargo will be lost. In his solution, he defines a utility function and computes expected utility rather than expected financial value (see[1] for a review).
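To illustrate the distinction (my addition; the wealth, cargo value, premium and log-utility choice are assumed for the sketch and are not Bernoulli's actual figures), the example below compares the expected-value and expected-utility recommendations for insuring a cargo with a 5% chance of loss.

```python
import math

# Minimal sketch: expected value vs. expected (log) utility for an insurance
# decision with a 5% chance of total loss. Wealth, cargo value and premium are
# illustrative assumptions, not Bernoulli's actual figures.

wealth = 3000.0        # merchant's other wealth
cargo = 10000.0        # value of the cargo if it arrives safely
premium = 800.0        # cost of fully insuring the cargo
p_loss = 0.05

def expected(outcomes):
    """outcomes: list of (probability, final wealth) pairs."""
    return sum(p * w for p, w in outcomes)

def expected_utility(outcomes):
    return sum(p * math.log(w) for p, w in outcomes)

uninsured = [(1 - p_loss, wealth + cargo), (p_loss, wealth)]
insured = [(1.0, wealth + cargo - premium)]

print("expected value:   uninsured", expected(uninsured), "insured", expected(insured))
print("expected utility: uninsured", round(expected_utility(uninsured), 4),
      "insured", round(expected_utility(insured), 4))
# Expected value favours staying uninsured (12500 vs 12200), but the concave
# (risk-averse) log utility favours buying the insurance.
```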
In the 20th century, interest was reignited by Abraham Wald's 1939 paper[2] pointing out that the two central procedures of sampling-distribution-based statistical theory, namely hypothesis testing and parameter estimation, are special cases of the general decision problem. Wald's paper renewed and synthesized many concepts of statistical theory, including loss functions, risk functions, admissible decision rules, antecedent distributions, Bayesian procedures, and minimax procedures. The phrase "decision theory" itself was used in 1950 by E. L. Lehmann.[3]
The revival of subjective probability theory, from the work of Frank Ramsey, Bruno de Finetti, Leonard Savage and others, extended the scope of
expected utility theory to situations where subjective probabilities can be used. At this time, von Neumann's theory of expected utility proved that
expected utility maximization followed from basic postulates about rational behavior.
The work of Maurice Allais and Daniel Ellsberg showed that human behavior has systematic and sometimes important departures from expected-utility
maximization. The prospect theory of Daniel Kahneman and Amos Tversky renewed the empirical study of economic behavior with less emphasis on
rationality presuppositions. Kahneman and Tversky found three regularities in actual human decision-making: "losses loom larger than gains"; persons focus more on changes in their utility states than they focus on absolute utilities; and the estimation of subjective probabilities is severely biased by anchoring.
Castagnoli and LiCalzi (1996) and Bordley and LiCalzi (2000) recently showed that maximizing expected utility is mathematically equivalent to maximizing the probability that the uncertain consequences of a decision are preferable to an uncertain benchmark (e.g., the probability that a mutual fund strategy outperforms the S&P 500 or that a firm outperforms the uncertain future performance of a major competitor). This reinterpretation relates to psychological work suggesting that individuals have fuzzy aspiration levels (Lopes & Oden), which may vary from choice context to choice context. Hence it shifts the focus from utility to the individual's uncertain reference point.
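A small Monte Carlo sketch of this target-based reading follows (my addition; both return distributions are assumed): it estimates the probability that an uncertain fund return beats an uncertain benchmark.

```python
import random

# Minimal sketch: the target-based reading of expected utility, estimating the
# probability that a fund's uncertain return exceeds an uncertain benchmark.
# Both return distributions are illustrative assumptions.

random.seed(4)
n = 100_000

def fund_return():
    return random.gauss(0.07, 0.15)      # assumed fund: 7% mean return, 15% volatility

def benchmark_return():
    return random.gauss(0.06, 0.12)      # assumed benchmark: 6% mean return, 12% volatility

wins = sum(fund_return() > benchmark_return() for _ in range(n))
print(f"estimated P(fund outperforms benchmark) ~ {wins / n:.3f}")
```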
Pascal's Wager is a classic example of a choice under uncertainty. The uncertainty, according to Pascal, is whether or not God exists. Belief or non-
belief in God is the choice to be made. However, the reward for belief in God if God actually does exist is infinite. Therefore, however small the
probability of God's existence, the expected value of belief exceeds that of non-belief, so it is better to believe in God. (There are several criticisms of the
argument.)
Intertemporal choice
This area is concerned with the kind of choice where different actions lead to outcomes that are realised at different points in time. If someone received a
windfall of several thousand dollars, they could spend it on an expensive holiday, giving them immediate pleasure, or they could invest it in a pension
scheme, giving them an income at some time in the future. What is the optimal thing to do? The answer depends partly on factors such as the expected
rates of interest and inflation, the person's life expectancy, and their confidence in the pensions industry. However, even with all those factors taken into
account, human behavior again deviates greatly from the predictions of prescriptive decision theory, leading to alternative models in which, for example,
objective interest rates are replaced by subjective discount rates.
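A brief sketch of the point about discount rates follows (my addition; the windfall, payout, horizon and rates are assumed values): it compares the present value of a deferred pension payout under an objective market rate, a steeper subjective rate, and a hyperbolic-discounting alternative.

```python
# Minimal sketch: discounting a deferred payoff. The windfall, payout, horizon
# and rates are illustrative assumptions, not figures from the text.

windfall = 5000.0        # spend now, or invest for a pension payout later
payout = 20000.0         # promised pension payout
years = 30

market_rate = 0.04       # objective interest rate used in prescriptive models
subjective_rate = 0.10   # steeper rate often observed in actual behaviour
k = 0.25                 # hyperbolic discounting parameter (an alternative model)

pv_market = payout / (1 + market_rate) ** years
pv_subjective = payout / (1 + subjective_rate) ** years
pv_hyperbolic = payout / (1 + k * years)

print(f"present value at market rate:      {pv_market:8.0f}  (> {windfall:.0f}, so invest)")
print(f"present value at subjective rate:  {pv_subjective:8.0f}  (< {windfall:.0f}, so spend)")
print(f"present value, hyperbolic model:   {pv_hyperbolic:8.0f}")
```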
Social decisions
Some decisions are difficult because of the need to take into account how other people in the situation will respond to the decision that is taken. The
analysis of such social decisions is more often treated under the label of game theory, rather than decision theory, though it involves the same
mathematical methods. From the standpoint of game theory most of the problems treated in decision theory are one-player games (or the one player is
viewed as playing against an impersonal background situation). In the emerging field of socio-cognitive engineering, research is especially focused on the different types of distributed decision-making in human organizations, in normal and abnormal/emergency/crisis situations.
Signal detection theory is based on decision theory.
Complex decisions
Other areas of decision theory are concerned with decisions that are difficult simply because of their complexity, or the complexity of the organization
that has to make them. In such cases the issue is not the deviation between real and optimal behaviour, but the difficulty of determining the optimal
behaviour in the first place. The Club of Rome, for example, developed a model of economic growth and resource usage that helps politicians make real-life decisions in complex situations.