Operational Research in Defence
Almost all countries spend sizable portions of their budgets on the procurement and
development of weapons of increasing lethality. Each country seeks to enhance its
military power to maintain supremacy over its adversaries and deter them from
undertaking any provocative action. To meet this objective, it acquires more and
more weapon systems, either through procurement or through design and
development. Acquisition of weapons, however, involves enormous expenditure,
and it is therefore imperative for defense decision makers to assess the long-term
consequences of their acquisition plans well in advance.
Besides the issues pertaining to weapons acquisition, defense executives may be
concerned with the following types of questions: Can the threat to a country be
quantified? What are the chances of war between two countries? Who is likely
to win? What will be the consequences if the force levels are reduced on a
unilateral, bilateral or multilateral basis?
understand and interpret the scenario, visualize the consequences and communicate
appropriate decisions to their staff.
situations wherein certain weapons and equipment were not utilized in an optimal
manner, and it was found that better technical performance could have been
realized if their deployment had been judiciously chosen through scientific analysis.
One of the classical examples is that of the depth setting of a 'depth charge' (a kind
of bomb used for underwater explosion at a pre-specified depth) to be dropped by
aircraft against submarines. Initially the attacks were judged to be unsuccessful, as
not many submarines were reported sunk. As an alternative, E.J. Williams, a
member of Professor Blackett's team, was asked to consider the possibility of
designing a depth charge with an influence fuze that could be detonated by
proximity to the submarine rather than by water pressure (see McCloskey 1987b).
Thus the depth charge would explode, as it fell through the water near the
submarine, irrespective of its depth. The depth setting of the depth charge was based
on the assumption that a submarine would, on the average, sight the attacking
aircraft some two minutes before the instant of attack and that in this time it could
dive to a depth of about 150 ft. Consequently, an explosion at a depth of 150 ft.
would be fatal to the submarine.
The fallacy in the above argument was that when the submarine sighted the aircraft
a long way off, it disappeared from the aircrew's sight, and the aircrew therefore
could not know where to drop the depth charge, making the effective accuracy of
the attack very low. Also, in the few cases when the submarine failed to detect the
aircraft and was therefore still on the surface, an explosion of a depth charge at
150 ft. failed to damage the submarine, as the lethal radius of the depth charge was
only about 20 ft. Thus the existing method of setting the depth charge at 150 ft.
failed to sink deep submarines owing to the low bombing accuracy, and failed to
sink surfaced submarines owing to the small lethal radius.
Using available data and simple analysis, it became clear that if the depth setting
was reduced from 150 ft. to 25 ft. and the pilots were instructed not to drop the
depth charge if the submarine had already been submerged for more than half a
minute (see Morse and Kimball 1951), the average number of submarines sunk for
a given number of attacks could be expected to more than double. Thus there was
no need to develop an influence fuze to improve the effectiveness of the depth
charge. This recommendation was accepted and implemented.
Subsequently, enemy intelligence reported that perhaps a much more powerful
depth charge had been put into operation, which had more than doubled the
casualties of their submarines! In this case, two alternatives were available:
(i) developing a depth charge with an influence fuze, and (ii) changing the depth
setting of the existing depth charge. Obviously, option (ii) led to an inexpensive
and immediate solution.
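The effect of the recommendation can be illustrated with a small Monte Carlo sketch. Only the 20 ft lethal radius, the 150 ft and 25 ft depth settings, and the half-minute rule come from the text; the warning-time distribution, the dive rate and the growth of aiming error with time out of sight are illustrative assumptions, not the wartime data.

```python
import random

LETHAL_RADIUS_FT = 20.0   # lethal radius of the depth charge (from the text)
DIVE_RATE_FT_S = 1.25     # assumed dive rate: 2 min of warning ~ 150 ft of depth

def kill_rate(depth_setting_ft, max_submerged_s=None, n=100_000, seed=1):
    """Fraction of attack opportunities that sink the submarine.

    Crude Monte Carlo sketch; the warning-time distribution and the growth
    of aiming error with time out of sight are illustrative assumptions.
    """
    rng = random.Random(seed)
    kills = 0
    for _ in range(n):
        warning_s = rng.uniform(0.0, 120.0)       # how early the boat saw the aircraft
        if max_submerged_s is not None and warning_s > max_submerged_s:
            continue                              # doctrine: do not drop, no kill
        sub_depth = DIVE_RATE_FT_S * warning_s    # depth reached at the instant of attack
        horiz_err = rng.gauss(0.0, 20.0 + warning_s)  # aim degrades once the boat is unseen
        vert_err = depth_setting_ft - sub_depth
        if horiz_err**2 + vert_err**2 <= LETHAL_RADIUS_FT**2:
            kills += 1
    return kills / n

old = kill_rate(150.0)                       # wartime practice: 150 ft setting, always attack
new = kill_rate(25.0, max_submerged_s=30.0)  # recommendation: 25 ft, drop only if just submerged
print(f"kill rate at 150 ft setting: {old:.3f}")
print(f"kill rate at  25 ft setting: {new:.3f}")
```

Even with these invented distributions, the shallow setting combined with the half-minute rule concentrates the attacks on boats that are still near the surface, reproducing the qualitative result of the study.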
After the war, the scientists and engineers associated with Professor Blackett
moved to various sectors such as transport, health, industry, etc. Each of these
scientists and engineers was convinced that the operations under his control could
be analyzed scientifically and a better method for carrying out the operation could
be investigated.
While applying this concept to the operations under the control of managers in
different sectors, it became clear that the operations in different sectors had
commonalities. For example, problems of congestion in military workshops,
handling of traffic at dockyards and examining patients at hospitals constitute the
same queueing operation. Therefore, an operation in the defense sector, the
transport sector or the health sector may require the same type of analysis. This led
to the concept of Operations Research as a collection of tools and techniques which
can be used for improving systems under the control of a manager, irrespective of
his field of activity. Some of these techniques are: Queueing Theory,
Mathematical Programming, Inventory Control, PERT/CPM, Search Theory, Game
Theory, Simulation, etc.
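As a sketch of this commonality, the following simulates a single-server queue with Poisson arrivals and exponential service times (the classical M/M/1 model), which applies equally to a workshop bay, a dockyard berth or an outpatient clinic. The arrival and service rates are illustrative.

```python
import random

def mm1_mean_time_in_system(lam, mu, n_customers=200_000, seed=7):
    """Simulate an M/M/1 queue (Poisson arrivals at rate lam, exponential
    service at rate mu, one server) and return the mean time a customer
    spends in the system (waiting plus being served)."""
    rng = random.Random(seed)
    t_arrival = 0.0
    t_server_free = 0.0
    total_time = 0.0
    for _ in range(n_customers):
        t_arrival += rng.expovariate(lam)        # next arrival instant
        start = max(t_arrival, t_server_free)    # queue if the server is busy
        t_server_free = start + rng.expovariate(mu)
        total_time += t_server_free - t_arrival  # waiting time + service time
    return total_time / n_customers

# The same model serves lorries at a dockyard or patients at a clinic:
lam, mu = 0.8, 1.0           # e.g. 0.8 jobs arrive per hour, 1 job served per hour
sim = mm1_mean_time_in_system(lam, mu)
theory = 1.0 / (mu - lam)    # classical M/M/1 result for mean time in system
print(f"simulated: {sim:.2f}, theory: {theory:.2f}")
```

The simulated mean agrees with the closed-form M/M/1 result, and only the interpretation of a "customer" changes from sector to sector.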
not depend only on the specific requirement of a particular Service that sponsored
the proposal but needed to be analyzed in the wider context of national security.
For these reasons it became clear that the decision making problems of weapon
acquisition were no longer amenable to the structured analysis of OR. This led to
the birth of a new scientific discipline named Systems Analysis (SA).
The basic thought behind Systems Analysis can be explained as " a systematic
approach to helping a decision maker choose a course of action by investigating his
full problem, searching out objectives and alternatives, and comparing them in the
light of their consequences, using an appropriate framework - insofar as possible
analytic - to bring expert judgment and intuition to bear on the problem" (see
Quade and Boucher 1968). This suggests that a cycle consisting of definition of
objectives, exploration of alternatives, and evaluation of alternatives in terms of
their costs and effectiveness needs to be considered. The cycle may be repeated in
the light of new information, redefining the objectives and re-identifying and
re-evaluating the alternatives, till the total spectrum is clearly understood.
Consequently, during the early sixties, a large number of weapon evaluation and
cost-effectiveness studies were carried out, mainly in the USA, to advise decision
makers regarding weapon acquisition programs. Thereafter, Systems Analysis
became a recognized activity in the defense sector.
As in the case of OR, Systems Analysis, after its initial success in defense, found
applications in civil sectors. Much debate followed as to whether OR and SA were
the same or different. It is true that initially OR and SA evolved from short-range
(tactical) and long-range (strategic) planning perspectives respectively. However,
this distinction soon faded. We therefore make no distinction between these two
decision making sciences in what follows, and refer to both as OR.
1.3 Military OR
Another important aspect of war is that it occurs rarely and involves large
uncertainties and complexities. As a result, the analysis and planning of military
operations are more complicated and difficult than those of civilian operations.
The availability and reliability of war data pose another problem in analyzing
military operations.
[Figure: the cycle of an OR study: Problem Formulation → Model Development →
Data Collection and Model Solution → Model Validation → Evaluation and
Implementation]
(i) Problem Formulation: This is the first and the most important step of an OR
analysis, in which the analyst holds discussions with the management to understand
and appreciate the problem to be solved, identifies the objectives and generates the
alternatives (Keeney 1994). The analyst also selects the variables affecting the
problem and the constraints imposed by various limitations, and determines the
Measure of Effectiveness (MOE). The initial definition of the problem, the
measure(s) of effectiveness or the COAs may change after discussions.
For the example of AD gun system selection, the word 'best' may mean a gun
system that has the maximum Cumulative Kill Probability (CKP) against specified
enemy aircraft, averaged over a number of likely scenarios. Thus an average CKP
may be defined as the MOE for this problem. It may be emphasized that a number
of different MOEs can be defined. For example, one may define the MOE in
terms of the value of the surviving assets of the friendly forces, or in terms of the
cost of the enemy offensive weapons required to inflict a specified damage on the
defended assets.
(ii) Model Development: The word 'model' is familiar to all of us. Most of us
would be able to recall that we use toys as models to represent and explain real life
objects to our children. Our teachers used models of atoms and molecules to
explain molecular structures in our chemistry classes. These are called 'scalar' or
'iconic' models. Similarly, the maps, diagrams and charts used in various classes or
seminar presentations are 'schematic' models. In OR analysis, 'symbolic' models
are generally used which are in the form of mathematical equations and represent
relationships between the various 'uncontrollable' variables (which cannot be changed
by the management in the normal course) and the 'controllable' variables. The analyst aims at
determining the values of controllable variables so as to optimize the Measure of
Effectiveness. For the example of AD gun system selection, if the objective is
maximization of CKP, the model development involves establishing a mathematical
relationship between CKP and the probability of hit, number of rounds fired during
engagement, probability of detection, probability of fuze functioning, number of
barrels in the gun, etc. Such a relation can be easily established for this simple
problem and is given, for a specific threat scenario, by
CKP = 1 - Π_{i=1}^{b} (1 - SSKP_i),

where SSKP_i is the Single Shot Kill Probability of the i-th round and b is the
number of rounds fired during the engagement.
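The relation is straightforward to evaluate once the per-round kill probabilities are known; the function name and the sample SSKP values below are illustrative.

```python
def cumulative_kill_probability(sskp):
    """CKP = 1 - product over i of (1 - SSKP_i): the probability that at
    least one of the rounds fired during the engagement kills the target,
    assuming the single-shot kill probabilities are independent."""
    p_survive_all = 1.0
    for p in sskp:
        p_survive_all *= 1.0 - p
    return 1.0 - p_survive_all

# Illustrative (not measured) per-round SSKPs over a four-round engagement:
ckp = cumulative_kill_probability([0.05, 0.08, 0.08, 0.06])
print(f"CKP = {ckp:.3f}")   # about 0.244
```

Note that even modest per-round probabilities accumulate quickly as the number of rounds grows, which is why the number of rounds fired during the engagement enters the model alongside the hit and fuze probabilities.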
forces should be deployed so as to ensure its win over Blue while minimizing the
number of casualties suffered. For an interesting discussion on predictive and
prescriptive models, reference may be made to Simon (1990).
(iii) Data Collection and Model Solution: Once the model has been formulated,
relevant data are collected and the model is solved either analytically or through
simulation. The analytical method yields an explicit expression relating the MOE
to the system variables, and is preferable since it helps to study the effect of the
controllable variables on the system effectiveness. Simulation is an experimental
approach carried out on the model, and is preferred in situations where an analytic
solution is difficult or impossible.
For the AD gun problem, a mix of analytical and simulation methods may be
adopted. The probabilities of hit and damage could be evaluated either through
simulation or analytically. The probability of fuze functioning can be determined
from experimental data.
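A sketch of the two approaches side by side: for a circularly normal fall of shot, the probability of an impact within a given lethal radius has a closed analytic form, and a Monte Carlo run on the same model should reproduce it. The radius and dispersion values below are illustrative.

```python
import math
import random

def p_hit_analytic(radius, sigma):
    """Circular normal dispersion: P(impact within radius) = 1 - exp(-R^2 / (2*sigma^2))."""
    return 1.0 - math.exp(-radius**2 / (2.0 * sigma**2))

def p_hit_simulated(radius, sigma, n=200_000, seed=3):
    """Monte Carlo estimate of the same probability from sampled impact points."""
    rng = random.Random(seed)
    hits = sum(
        1 for _ in range(n)
        if rng.gauss(0.0, sigma)**2 + rng.gauss(0.0, sigma)**2 <= radius**2
    )
    return hits / n

R, s = 20.0, 25.0   # illustrative lethal radius and aiming dispersion
print(f"analytic:  {p_hit_analytic(R, s):.3f}")
print(f"simulated: {p_hit_simulated(R, s):.3f}")
```

When the two agree, the analytic form is preferred, since the dependence on the controllable variables (here R and sigma) can be read off directly; simulation remains the fallback when no such closed form exists.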
(iv) Model Validation: A model is a representation of the real world situation and
therefore the model solution is expected to closely predict the behavior of the real
world system. One may like to know the closeness of this prediction. If the results
of the system operation are available, we can statistically compare model results
with those obtained from system operation. Statistical analysis, therefore, plays an
important role in model validation. This includes analysis of variance, tests for
goodness of fit, and regression and correlation analysis. Military system studies,
however, suffer from a lack of historical data, and therefore in most cases model
validation is limited to the perception of military experts.
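As a sketch of such a statistical comparison (assuming trial data were available), one can test whether the hit fraction observed in field trials is consistent with the model's prediction using a simple normal-approximation z statistic; the trial numbers below are hypothetical.

```python
import math

def z_statistic(hits, trials, p_model):
    """Normal-approximation z statistic comparing the hit fraction observed
    in field trials with the hit probability predicted by the model."""
    p_hat = hits / trials
    std_err = math.sqrt(p_model * (1.0 - p_model) / trials)
    return (p_hat - p_model) / std_err

# Hypothetical trial record: 37 hits in 120 engagements; model predicts 0.30.
z = z_statistic(37, 120, 0.30)
print(f"z = {z:.2f}")   # |z| < 1.96: no significant disagreement at the 5% level
```

With so few engagements the test has limited power, which is one concrete reason why the scarcity of war data pushes military model validation toward expert judgment.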
The problem of model validation has been extensively discussed by Gass (1983)
who mentions the following concepts for validating models:
(a) Face validity or expert opinion: When the model is demonstrated to experts
who are aware of the system being modeled, do they feel satisfied with the behavior
of the model, i.e., is the model credible?
(b) Sensitivity analysis: When the input parameters of the model are varied, do the
model outputs change in a direction consistent with what is known of the real
world system?
(c) Hypothesis validity: If pairwise or higher level relationships of the model are
studied, do these correspond to similar relationships in the real world system?
Military systems may lead to casualties or destruction and therefore are difficult
to replicate. At best one can use data from field trials in which dummy targets are
used. However, the data from field trials may not have the detail required for a
scientific study. Presumably, expert opinion, sensitivity analysis and hypothesis
validity are the only possibilities for model validation in military systems.
For the example discussed above, the sub-models for detection, acquisition,
tracking and hit probabilities, fuze functioning and damage probability may be
validated individually using past data. Also, the model may be validated from field
trial results for existing AD gun systems.
(v) Evaluation and Implementation: Once the model has been validated, the
available alternatives or COAs are evaluated by determining their MOEs. The
analyst then recommends the preferred alternatives in the order of their suitability
for meeting the defined goal. The responsibility for accepting or rejecting the
preferred course of action, however, lies with the decision maker or the executive.
In this sense, Military OR provides an aid to the military executive in rational
decision making.
In this section, we emphasize some of the factors which should be kept in view
while organizing a military OR study.
(iii) Mutual Understanding between the Analyst and the Decision Maker:
Enhancing 'mutual understanding' between the analyst and the decision maker is
vital for the implementation of an OR study (see Churchman and Schainblatt 1965).
Transportability of software, interactive computing and the graphics features of
Personal Computers (PCs) have already played an important role in enhancing this
understanding.
(iv) Duration of an OR Study: Military OR studies are time bound and complex.
The pressure from decision makers on an OR analyst seems to be much greater in
the military sector than in other sectors. Current developments in parallel computing,
faster algorithms and heuristics should be fully utilized by military OR analysts in
meeting the deadlines set by the decision makers.
REFERENCES
Bell, Peter C., Visual Interactive Modelling: The Past, the Present, and the Prospects,
European Journal of Operational Research, Vol. 54, No. 3, 274-286, October 1991.
Brodheim, Eric, Herzer, Ivo and Russ, Laurence M., A General Dynamic Model for Air
Defense, Operations Research, Vol. 15, No. 5, 779-796, 1967.
Churchman, C. W. and Schainblatt, A. H., The Researcher and the Manager: A Dialectic of
Implementation, Management Science, Vol. 11, No. 4, 69-87, 1965.
Christopherson, Derman and Baughan, E. C., Reminiscences of Operational Research in
World War II by Some of its Practitioners: II, Journal of the Operational Research
Society, Vol. 43, No. 6, 569-577, 1992.
Cunningham, W. Peyton, Freeman, Denys and McCloskey, Joseph F., Of Radar and
Operations Research: An Appreciation of A. P. Rowe (1898-1976), Operations
Research, Vol. 32, No. 4, 958-967, 1984.
Falconer, N., On the Size of Convoys: An Example of the Methodology of Leading Wartime
OR Scientists, Operational Research Quarterly, Vol. 27, No. 2, 315-327, 1976.
Gass, Saul I., Decision-Aiding Models: Validation, Assessment, and Related Issues for Policy
Analysis, Operations Research, Vol. 31, No. 4, 603-631, 1983.
Keeney, Ralph L., Using Values in Operations Research, Operations Research, Vol. 42,
No. 5, 793-813, 1994.
Larnder, Harold, The Origin of Operational Research, Operations Research, Vol. 32, No. 2,
465-475, 1984.
Lovell, Sir Bernard, Blackett in War and Peace, Journal of the Operational Research
Society, Vol. 39, No. 3, 221-233, 1988.
McCloskey, Joseph F., The Beginnings of Operations Research: 1934-1941, Operations
Research, Vol. 35, No. 1, 143-152, 1987a.
McCloskey, Joseph F., British Operational Research in World War II, Operations Research,
Vol. 35, No. 3, 453-470, 1987b.
McCloskey, Joseph F., U.S. Operations Research in World War II, Operations Research,
Vol. 35, No. 6, 910-925, 1987c.
Morse, Philip M. and Kimball, George E., Methods of Operations Research, The MIT Press,
Cambridge, Mass. and John Wiley, New York, 1951.
Quade, E. S. and Boucher, W. I. (eds.), Systems Analysis and Policy Planning: Applications
in Defense, American Elsevier, New York, 1968.
Sawyer, F. L., Charlesby, A., Easterfield, T. E. and Treadwell, E. E., Reminiscences of
Operational Research in World War II by Some of its Practitioners, Journal of the
Operational Research Society, Vol. 40, No. 2, 115-136, 1989.
Simon, Herbert A., Prediction and Prescription in Systems Modeling, Operations Research,
Vol. 38, No. 1, 7-14, 1990.
Washburn, A. R., Military Operations Research, in Handbooks in OR and MS, Vol. 6, (eds.)
Pollock, S. M., Rothkopf, M. H. and Barnett, A., Elsevier Science BV, 1994.