
Reliability Engineering and System Safety 71 (2001) 201–208

www.elsevier.com/locate/ress

On the ALARP approach to risk management


R.E. Melchers*
Department of Civil, Surveying and Environmental Engineering, The University of Newcastle, Newcastle, NSW 2308, Australia
Accepted 8 September 2000

Abstract

There is an increasing trend by regulatory authorities towards the introduction of the as low as reasonably practicable (ALARP) approach in dealing with risk management of proposed or existing complex hazardous systems. For these, decisions about the acceptability or tolerability of risks and their consequences can have very significant financial, economic and other ramifications for the proponents. Conversely, there may be very significant social and socio-economic implications. ALARP as a guide to achieving a satisfactory outcome has a certain intuitive appeal for the practical management of industrial and other risks. However, as suggested herein, there are a number of areas of concern about the validity of this approach. These include representativeness, morality, philosophy, political reality and practicality. An important, and in some respects fundamental, difficulty is that the risk acceptance criteria are not fully open to public scrutiny and can appear to be settled by negotiation. © 2001 Elsevier Science Ltd. All rights reserved.

Keywords: ALARP; Risk; Probability; Safety; Hazard; Regulations; Morality; Decision-making

1. Introduction

The management of risks associated with potentially hazardous activities in society remains a matter of profound public and technical interest. There has been and continues to be considerable development in the range and extent of regulatory activity. Many new regulatory frameworks have been established. Except for public input to risk assessments for very specific and contentious projects, there appears to have been remarkably little public debate (and perhaps even understanding) of the more general and philosophical issues involved. This is despite the rather spectacular failure in recent years of electricity, gas and other services over large regional areas and the occurrence of several major industrial accidents.

One issue which might have been expected to have received some public discussion is how decisions about hazardous facilities and activities are to be regulated. Should it be through regulatory or consent authorities, and if so, what form and allegiances should such bodies have? Alternatively, should it be through `self-regulation', or should there be some other mechanism(s)? These options have been explored in an interesting discussion paper [1]. However, it appears largely to have been ignored in practice. Perhaps by default, the regulatory approach is the most common route in attempting to exert control over potentially hazardous activities. This trend is being followed in a number of countries. It is appropriate, therefore, to review some aspects of these directions. In particular, the present paper will focus on the use of the so-called as low as reasonably practicable (ALARP) approach [also sometimes known as the as low as reasonably attainable/achievable (ALARA) approach]. It will be viewed primarily from the perspective of so-called `Common Law' countries, that is, those with a legal system parallel to that of the USA or the UK. For countries such as Norway, where ALARP is also very extensively used, some of the comments to follow may not be completely applicable. However, it is considered that the bulk of the discussion is sufficiently general.

The ALARP approach grew out of the so-called safety case concept first developed formally in the UK [2]. It was a major innovation in the management of risks for potentially hazardous industries. It requires operators and intending operators of a potentially hazardous facility to demonstrate that (i) the facility is fit for its intended purposes, (ii) the risks associated with its functioning are sufficiently low and (iii) sufficient safety and emergency measures have been instituted (or are proposed). Since in practice there are economic and practical limits to which these actions can be applied, the actual implementation has relied on the concept of `goal setting' regulations. The ALARP approach

* Tel.: +61-4921-6058; fax: +61-4921-6991.
E-mail address: cerem@cc.newcastle.edu.au (R.E. Melchers).

0951-8320/01/$ - see front matter © 2001 Elsevier Science Ltd. All rights reserved.
PII: S0951-8320(00)00096-X

Fig. 1. Levels of risk and ALARP, based on UK experience [3].
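Since the figure itself is not reproduced here, the three-region scheme it depicts (risk intolerable above an upper limit, broadly acceptable below a lower limit, and to be reduced ALARP in between) can be sketched in code. The threshold values below are assumptions for illustration only; as the text notes, the actual limits appear in country-specific regulatory documents.

```python
# Illustrative sketch of the three-region ALARP scheme of Fig. 1.
# Threshold values are ASSUMED for illustration, not regulatory figures.

UPPER_LIMIT = 1e-3   # assumed annual fatality risk above which risk is intolerable
LOWER_LIMIT = 1e-6   # assumed annual fatality risk below which risk is broadly acceptable

def classify_risk(annual_risk: float) -> str:
    """Place an estimated annual risk into one of the three regions of Fig. 1."""
    if annual_risk >= UPPER_LIMIT:
        return "intolerable: cannot be justified in any circumstances"
    if annual_risk <= LOWER_LIMIT:
        return "broadly acceptable: of no practical regulatory interest"
    return "tolerable only if reduced as low as reasonably practicable (ALARP)"

print(classify_risk(1e-4))  # falls in the intermediate (ALARP) region
```

The interesting region is, of course, the middle one: the classification alone says nothing about what `reasonably practicable' reduction means there, which is the question the paper pursues.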

is the most well known of these. It is claimed by some as being a more `fundamental' approach to the setting of tolerable risk levels [3,4].

Conceptually the ALARP approach can be illustrated as in Fig. 1. This shows an upper limit of risk that can be tolerated in any circumstances and a lower limit below which risk is of no practical interest. Indicative numbers for risks are shown only for illustration; the precise values are not central to the discussion herein but can be found in relevant country-specific documentation. The ALARP approach requires that risks between these two limits must be reduced to a level `as low as reasonably practicable'. In relevant regulations it is usually required that a detailed justification be given for what is considered by the applicant to satisfy this `criterion'.

As a guide to regulatory decision-making the ALARP concept suggests both `reason' and `practicality'. It conveys the suggestion of bridging the gap between technological and social views of risk and also that society has a role in the decision-making process. In addition, it has a degree of intuitive appeal, conveying feelings of reasonableness amongst human beings. As will be argued in more detail below, these impressions are somewhat misleading. There are also considerable philosophical and moral shortcomings in the ALARP approach. Perhaps rather obliquely, the discussion will suggest what should be done to improve the viability of ALARP or what characteristics need to be embodied in alternatives. However, it is acknowledged that this is not a paper offering `solutions' but rather one which, it is hoped, will focus more attention on the issues and stimulate discussion in order to bring about solutions.

To allow attention to be focussed more clearly on the difficulties with the philosophy of ALARP, it is necessary first to review some matters fundamental to the interpretation and management of risk in society. These issues include: (i) risk definition and perception, (ii) risk tolerance, (iii) the decision-making framework, and (iv) its implementation in practice.

2. Risk perception

2.1. Risk understanding and definition

Increased levels of education, awareness of environmental and development issues and greater political maturity on the part of society generally have led to a much keener interest in industrial risk management practices, policies and effectiveness. Apart from hazardous industries, public interest derives also from notable public policy conflicts over the siting of facilities perceived to be hazardous or environmentally unfriendly. Despite this, `risk' as a concept perceived by the general public appears to be rather poorly defined, with confusion between probability, something involving both probability and consequences, and something implying monetary or other loss.

Vlek and Stallen [5] gave some ten different definitions of `risk' or riskiness, using various ways of `mixing' all or parts of the two main component ideas. Traditional decision analysis, of course, simply multiplies the chance estimate by the consequence estimate. This is only a `first-order' approach, with both the chance estimate and the consequence estimate being mean values. It is possible, at the expense of greater complexity in analysis, but perhaps reflecting more accurately personal and societal perception, to invoke measures of uncertainty, such as the standard deviation of each estimate [6]. Nevertheless, there is likely to remain some disagreement over a core definition of risk (as there appears to be in most sociological and psychological works about any term) depending on one's view-point and stake in the eventual outcome [1].
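The `first-order' combination just described (mean chance multiplied by mean consequence) and one possible uncertainty-aware variant can be sketched as follows. The particular penalty form used for the uncertainty-aware measure is an illustrative assumption, not the specific measure proposed in [6].

```python
from statistics import mean, stdev

# Chance and consequence treated as uncertain estimates (samples of
# expert or model estimates), rather than as single point values.
chance_samples = [0.010, 0.012, 0.008, 0.011]      # estimated event probabilities
consequence_samples = [90.0, 110.0, 100.0, 105.0]  # estimated losses (arbitrary units)

# 'First-order' risk: the product of the two mean estimates.
first_order_risk = mean(chance_samples) * mean(consequence_samples)

# One illustrative (assumed) way to acknowledge estimation uncertainty:
# inflate each point estimate by a weighted standard deviation.
k = 1.0  # risk-aversion weight (assumption)
uncertainty_aware_risk = (mean(chance_samples) + k * stdev(chance_samples)) * \
                         (mean(consequence_samples) + k * stdev(consequence_samples))

print(first_order_risk, uncertainty_aware_risk)
```

Because the uncertainty-aware measure always exceeds the first-order product (for any spread in the estimates), two facilities with identical mean risks can be ranked differently once estimation uncertainty is admitted, which is the point of the added complexity.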

In the mathematical/statistical literature and in most engineering-oriented probability discussions, risk is simply taken as another word for probability of occurrence or `chance', with consequences, however they might be measured, kept quite separate. Herein the approach will be adopted to use `risk' as a generic term, implying both probabilities and consequences without specifying how these are to be combined.

2.2. Risk as an objective matter

It has become increasingly clear that `risk' is not an objective matter. Thus all risk assessment involves both `objective' and `subjective' information. Matters generally considered to be capable of `objective' representation, such as physical consequences, are seldom completely so, since in their formulation certain (subjective, even if well accepted) decisions have had to be made regarding data categorization, its representation, etc. This also applies to areas of science once considered to be `objective', a matter which is now considered briefly.

In the development of mathematical and numerical models in science, model `verification' is the proof that the model is a true representation. It may be possible to do this for so-called `closed' systems. These are completely defined systems for which all the components of the system are established independently and are known to be correct. But this is not the general case, nor the case for natural systems. For these, `verification' is considered to be impossible [7].

Model `validation', on the other hand, is the establishment of legitimacy of a model, typically achieved through contracts, arguments and methods. Thus models can be confirmed by the demonstration of agreement between observation and prediction, but this is inherently partial. "Complete confirmation is logically precluded by the fallacy of affirming the consequent … and by incomplete access to natural phenomena … Models can only be evaluated in relative terms" [7]. Philosophical arguments also point to the impossibility of proving that a theory is correct: it is only possible to disprove it [8,9]. Moreover, in developing scientific work, models are routinely modified to fit new or recalcitrant data. This suggests that models can never be `perfect' [10]. It follows that for theories and models to be accepted, there is necessarily a high degree of consensus-forming and personal interplay in their development and in the scientific understanding underpinning them [11]. Some of this can be brought about by `peer' reviews of risk assessments and procedures, such as are widely practiced in the nuclear industry.

These concepts carry over directly to risk estimation, since risk estimates are nothing but models of expectation of outcomes of uncertain systems (i.e. `open' systems), couched in terms of the theory of probability. Thus, in the context of PSA, "… often the probabilities are seen as physical properties of the installation and how it is operated…", and while this view is useful for making comparative statements about riskiness or for comparison to standards, this interpretation is inconsistent with "all standard philosophical theories of probability …" [12].

2.3. Factors in risk perception

There are many factors involved in risk perception [1]. These include:

(i) the likely consequences should an accident occur;
(ii) the uncertainty in that consequence estimate;
(iii) the perceived possibilities of obviating the consequences or reducing the probability of the consequences occurring, or both;
(iv) familiarity with the `risk';
(v) level of knowledge and understanding of the `risk' or consequences or both; and
(vi) the interplay between political, social and personal influences in forming perceptions.

The last two items in particular deserve some comment. Knowledge and understanding of risk issues on the part of individuals and society generally implies that (risk) communication exists, that it is utilized to convey meaningful information and that the capacity exists to understand the information being conveyed and to question it. Perhaps the most critical issue is the actual availability of relevant and accurate information. For a variety of reasons, there has been an increasing requirement placed on governments and industry to inform society about the hazards to which its members might be exposed. There has developed also greater possibility for access to government and government agency files under `Freedom of information'-type legislation. Whether these developments have been helpful in creating a better informed public is not entirely clear, as it involves also issues such as truthfulness in communications and the trust which society is willing to place in the available information.

That there will be an interplay between individual and societal perceptions of risk follows from individuals being social beings. Their very existence is socially and psychologically intertwined with that of others. Formal and informal relationships and institutions "set constraints and obligations upon people's behavior, provide broad frameworks for the shaping of their attitudes and beliefs, and are also closely tied to questions both of morality and of what is to be valued and what is not. There is no reason to suppose that beliefs and values relating to hazards are any different from other more general beliefs and values…" [1].

3. Decision frameworks

3.1. New technology

Society as a whole is constantly faced with the need to

make decisions about existing hazardous or potentially hazardous projects. Usually these decisions are delegated to organizations with recognized expertise in the area. For existing technology, that expertise will rely on past experience, including accident statistics and `incident' (or `near-miss') statistics for hazardous facilities. In many cases hazard scenario and contingency planning also will be carried out. It is in this area that the techniques of probabilistic risk analysis are recognized to have validity in the sense of Section 2.2 [6].

For the potential risks associated with new technologies, however, the problem of management is more acute. This is because the basis for making decisions, that is, a base of accumulated knowledge and experience, is not available. The dilemma can be seen clearly in the earlier writings related to nuclear risks, prior to the occurrence of the accidents at Three Mile Island, Chernobyl and the like. For example, Stallen [13], in reviewing the works of Hafele and Groenewold, notes that the only solutions for the control of risks caused by new technology tend to involve extensive use of other (and older) forms of technology.

History suggests that a new technology will only survive if it has no major catastrophes early in its development. Thereafter, the risks are apparently small because: (i) the operating experience base is small; (ii) particular care tends to be taken; and (iii) there has not been enough time for in-service problems to become sufficiently evident. This may lead to the false sense that the actual risks involved are small. Further, for new technologies it is generally the case that the scientific understanding of the total socio-technical system, its limitations and assumptions, is rather incomplete, adding further to the difficulties of satisfactory risk estimation. The `trial-and-error' underpinning much of the understanding of conventional and well-developed technology is missing.

In connection with the development of science, Popper [8,9] has argued that only falsifications (i.e. failures) lead to new developments; verifications of existing ideas merely add to our apparent confidence in them, but they could be wrong. The inferences for risk analysis are not difficult to make [14].

3.2. A wider perspective

Under these circumstances, how can society deal with the evaluation of risks imposed by new technology? It is suggested that some light may be thrown on this question by an examination of the parallel issue of the rationality of science. The noted philosopher Habermas [15] has argued that the rationality of science stems not from any objective, external measures such as `truth' but from agreed formalisms (see also Section 2.2). This involves transactions between knowledgeable human beings and agreement between them about what can be considered to be `rational', given the base of available knowledge and experience. It presupposes a democratic and free society with equal opportunities for contributing to the discussion, for discourse and for criticism. It also requires truthfulness of viewpoint and the absence of power inequalities. Although these might seem like tall orders indeed, Habermas argues that there are very few situations where these conditions are not met or cannot be met eventually, since open and free discourse will uncover the limitations which might exist. The implication for risk analysis and evaluation is that the rationality of the criteria and the degree to which risk might be accepted should be based, ultimately, on the agreed position of society obtained through internal and open transactions between knowledgeable and free human beings.

Such a position has been put in different, but essentially analogous, ways by others [1]. The importance of giving consideration to public opinion underlies much writing on risk criteria. However, the practical difficulties of "arriving at consensus decisions over the question of acceptable risk in society" are considerable. According to Layfield [16], commenting on Britain's Sizewell B reactor: "The opinions of the public should underlie the evaluation of risk. There appears to be no method at present for ascertaining the opinions of the public in such a way that they can be reliably used as the basis for risk evaluation. More research on the subject is needed."

Moreover, society is a complex mix of sub-groups with differing aims, ambitions, views, opinions and allegiances. It is not surprising then that when faced with most matters about which profound decisions need to be made, society responds with a variety of view-points and courses of action. Although there are always inter-plays between short-term and longer-term self-interests and morally `high-ground' views, it appears in many cases that the diversity of views and the convictions with which they are held is inversely related to the knowledge sub-groups of society have about the matter being considered.

Layfield [16] noted: "As in other complex aspects of public policy where there are benefits and detriments to different groups, Parliament is best placed to represent the public's attitude to risks." In practice, of course, such a course of action might be taken only for major policy decisions, such as whether the nation should have nuclear power or not, etc.

However, Wynne [17] and others have argued that Parliament is ill-equipped both in time and expertise to fully appreciate the implications and changes likely to be brought about by the introduction or further development of new technologies. In his view, particularly for major new technology issues, the political process can only be considered to be defective.

A historical review of the introduction of any really new technology shows, however, just how ill-informed and ill-equipped parliaments tend to be, mostly being even unaware of the changes taking place around them. For most major technological innovations (irrespective of their hazard potential) parliamentary interest tends to follow well after the technologies have been introduced. There are many

examples of this in the developing Industrial Revolution [18], and more recent examples include IVF technology, gene technology, internet technology, etc.

Moreover, even within society more generally there is seldom much awareness of potential problems and hence little or no debate or detailed consideration of them. Usually only after the technology has been established and some of its problems have become evident does public perception become active. This suggests that risk assessment in general, and approaches such as ALARP, can deal only with the control of the further development of already established technology.

3.3. Practical decisions

Whatever the idealized situation ought to be, the need to make day-to-day decisions about lesser hazards in society has invariably led to regulatory approaches as more convenient substitutes for public or parliamentary debate. One reason sometimes given for leaving the decisions to public servants is that the public is uneducated, ill-informed and irrational in dealing with complex issues; arguments which can hardly be sustained as essential in a modern society. However, to invoke public debate and discussion ideally requires time and, for many individuals, much back-ground education when the discussion is about complex issues. None of these conditions tends to be met in practice, for a variety of reasons (see also Section 2.3). Often regulators will facilitate some form of public participation, such as through making available documents and through providing back-ground briefings. Unfortunately, in advancing along this line, there is a danger that there may no longer be much left of Habermas's vision of transactions between knowledgeable and free individuals in coming to a consensus.

The methods which have evolved for the solution of acceptable or tolerable risk problems in a bureaucratic setting may be categorized broadly to include (see [1, Chapter 5]):

1. professional judgement as embodied in institutionally agreed standards (such as engineering codes of practice) or as in commonly accepted professional skills;
2. formal analysis tools such as cost-benefit analysis or decision analysis, with or without public discussion opportunities; and
3. so-called `boot-strapping' approaches employing techniques such as `revealed preferences' as used in social psychology, or using extrapolations from available statistical data about risks currently accepted in other areas of endeavor.

Aspects of all three are commonly in use. As will be seen, the ALARP approach falls essentially in the third category.

4. Risk tolerability

The level of risk associated with a given facility or project that might be acceptable to, or tolerated by, an individual or society or sub-groups is an extremely complex issue, about which much has been written. It is not possible to deal with this matter here, but see Reid [19] for a useful summary and critique.

Of course, `tolerability' and `acceptability' are not necessarily the same, although it has been common in risk analysis to loosely interchange the words. According to the HSE [3], "`tolerability'… refers to the willingness to live with a risk to secure certain benefits and in the confidence that it is being properly controlled. To tolerate a risk means that we do not regard it as negligible or something we might ignore, but rather as something we need to keep under review and reduce still further if and when we can". Acceptability, on the other hand, implies a more relaxed attitude to risk and hence a lower level of the associated risk criterion. According to Layfield [16], in terms of the nuclear power debate, the term `acceptable' fails to convey the reluctance that individuals commonly show towards being exposed to certain hazardous activities.

Although the distinction between the terminology `acceptability' and `tolerability' is important, it is also the case that the term `acceptable' has been used in relation to consent or acceptance of a proposed risk situation on the part of regulatory authorities. This suggests by implication that the decisions of the regulatory authorities in some manner reflect `tolerability' on the part of society.

5. ALARP

5.1. Definition of terms

As noted, the ALARP approach has been advocated as a more fundamental approach to the setting of tolerable risk levels, particularly suitable for regulatory purposes [20]. Fig. 1 summarizes the approach, in which the region of real interest lies between the upper and lower limits. This is the region in which risks must be reduced to a level ALARP. Since this objective is central to the approach, a very careful discussion and explanation of terms might be expected. However, apart from appeals to sensible discussion and reasonableness and the suggestion that there are legal interpretations, there is little in print which really attempts to come to terms with the critical issues and which can help industry focus on what might be acceptable [3].

The critical words in ALARP are `low', `reasonably' and `practicable'. Unfortunately, these are all relative terms; standards are not defined. `Reasonably' is also an emotive word, implying goodness, care, consideration etc. However, as will be discussed below, what may be reasonable in some situations can be seen as inappropriate in others.

Regarding `practicable', the Oxford Dictionary refers to `that can be done, feasible…', i.e. what can be put into practice. Of course, many actions can be implemented, provided the financial rewards and resources are sufficient.
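In UK practice, `reasonably practicable' has commonly been given a `gross disproportion' reading: a risk reduction measure must be adopted unless its cost is grossly disproportionate to the benefit it buys. A minimal sketch of such a test follows; all numerical values, including the disproportion factor and the monetary valuation of a prevented fatality, are illustrative assumptions, not regulatory figures.

```python
# Sketch of a 'gross disproportion' reading of reasonable practicability.
# All numbers below are ASSUMED for illustration only.

def measure_required(cost: float,
                     risk_reduction_per_year: float,
                     value_of_preventing_fatality: float,
                     facility_life_years: float,
                     disproportion_factor: float = 3.0) -> bool:
    """Return True if, under this reading, the measure must be implemented:
    its cost does not exceed (disproportion factor) x (monetized benefit)."""
    benefit = (risk_reduction_per_year
               * value_of_preventing_fatality
               * facility_life_years)
    return cost <= disproportion_factor * benefit

# Example: a measure costing 2.0M that removes 1e-4/yr of fatality risk
# over an assumed 40-year life, with an assumed 2.0M valuation per
# statistical fatality prevented.
print(measure_required(cost=2.0e6,
                       risk_reduction_per_year=1e-4,
                       value_of_preventing_fatality=2.0e6,
                       facility_life_years=40.0))  # -> False
```

Note how completely the outcome depends on the assumed valuations and on the disproportion factor; this is precisely the kind of implied value judgement discussed in the text that follows.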

Thus there are very clear financial/economic implications: "`reasonable practicability' is not defined in legislation but has been interpreted in legal cases to mean that the degree of risk can be balanced against time, trouble, cost and physical difficulty of its risk reduction measures. Risks have to be reduced to the level at which the benefits arising from further risk reduction are disproportionate to the time, trouble, cost and physical difficulty of implementing further risk reduction measures" [3].

It is therefore clear that financial implications are recognized: "in pursuing any safety improvement to demonstrate ALARP, account can be taken of cost. It is possible, in principle, to apply formal cost-benefit techniques to assist in making judgement(s) of this kind." [3]. This assumes that all factors involved can be converted to monetary values. Unfortunately, it is well known that there are not inconsiderable difficulties, and hence implied value judgements, in evaluating or imputing monetary values for both benefits and costs. This problem is particularly acute for the analysis of hazardous facilities, where the value of human life and the (imputed) cost of suffering and deterioration of the quality of life may play a major role in the analysis.

Further, an approach based on cost analysis implicitly assumes equal weighting for each monetary unit, a proposition known to cause difficulties with cost-benefit analysis when applied to issues with social implications. It is considered that the selection of tolerable risk is of this type. Value judgements which society might make are subsumed in the valuations required for cost analysis.

In addition, there is also the problem that the optimum obtained in cost-benefit analyses is seldom very sensitive to the variables involved. This means that cost-benefit analysis alone is unlikely to provide a clear guide to the selection of appropriate policy.

Finally, it is unclear how value judgements such as `low', `reasonably' and `practicable' correlate with a minimum total cost outcome. The value judgements required involve issues well beyond conventional cost-benefit analysis, a matter well recognized in dealing with environmental issues [21].

5.2. Openness

In the expositions of the ALARP approach it appears that the specific tolerable probability levels which would qualify for acceptance by a regulatory authority are not always in the public domain. The tolerable risk criterion may not be known to the applicant, and some process of negotiation between the regulatory authority and the applicant is needed. Societal groups concerned about openness in government might well view this type of approach with concern.

A related problem with implementation of the ALARP approach can arise in the evaluation of two similar projects assessed at different times, possibly involving different personnel within the regulatory body and different proponents. How is consistency between the `approvals' or `consents' to be attained? Irrespective of the care and effort expended by the regulatory authority, there is a real danger that an applicant with a proposal which needs to be further refined, or which is rejected, will cry `foul'. Without openness and without explicit criteria, such dangers are not easily avoided. Is there not also a danger of corruption?

5.3. Morality and economics

The issue of morality and how it is addressed by the ALARP approach can be brought most clearly into focus by a discussion based around the nuclear power industry. That industry took a major blow in the USA with the Three Mile Island and other incidents. Currently there are no new facilities planned or under construction. This is possible in the USA because there are alternative sources of electric power with perhaps lower perceived risks, including political risks. Opposition to nuclear power and the potential consequences associated with it are clearly in evidence. Such open opposition may not always be tolerated in some other countries, nor may there be viable alternative power sources. Thus there may be pressures for public opposition to be ignored and discredited, and for access to information to be less easy to obtain. For example, there have been claims of `cover-ups', such as over UK nuclear accidents. Whatever the precise reasons, it is clear that in some countries the nuclear industry remains viable. Comparison to the US situation suggests that what might be considered `reasonable and practical' in some countries is not so considered in the US, even though the technology, the human stock and intellect and the fear of nuclear power appear to be much the same. The only matters which appear to be different are: (i) the economic and political necessities of provision of electrical power; and perhaps (ii) acquiescence to a cultural system, as reflected in the political authority and legal systems, which precludes or curtails the possibility of the protracted legal battles apparently only possible in Common Law countries. Do these matters then ultimately drive what is `reasonable and practical'? And if they do, is the value of human life the same?

The dichotomy between socio-economic matters and morality issues has other implications also. It is known that in some countries the nuclear power system is of variable quality, with some installations known to have a considerable degree of radiation leakage, far in excess of levels permitted under international standards. Even if, as is likely, the costs to bring the facilities to acceptable standards are too high, there will be economic pressures to keep the facilities in operation, despite the possibility that some plant workers would be exposed to excessive radiation. It is known that in some cases maintenance work in high radiation areas has been carried out by hiring, on a daily basis, members of the lowest socio-economic classes to do the work. Because the remuneration was good by local standards there was no shortage of willing workers, even

though it has come to be known that many develop radiation sickness and serious tumors within weeks of being exposed.

Although somewhat stark, this example illustrates that the criteria of 'reasonableness' and 'practicability' so essential in the ALARP approach are ultimately issues of morality. While for projects having the potential for only minor or rather limited individual or social consequences there is probably no need for concern, for other, more significant projects the question must be asked whether it is acceptable for decisions about such issues to be left for private discussion between a regulatory authority and project proposers.

5.4. Public participation

As noted earlier, for many systems in common usage there is a long and established base of experience (both good and bad) upon which to draw. This is not necessarily the case for all facilities and projects, particularly those subject to risk assessment requirements. It would seem to be precisely these projects for which risk analysis should be open to public scrutiny and debate so that the issue of their rationality in respect to society can be considered. As noted, the ALARP approach would appear to permit a small group of people to make decisions about a potentially hazardous project, away from public scrutiny, and in consultation with the proponents of the project. According to the Royal Society report [1, p. 93], "The (ALARP) approach has ...been criticised on the grounds that it does not relate benefits clearly enough to tolerability. More importantly, however, it does not address the critical issue of how public input to tolerability decisions might be achieved, beyond an implicit appeal to the restricted, and now much criticised ... revealed-preferences criterion", and "The question of how future public input to tolerability decisions might be best achieved is also closely related to recent work on risk communication...".

It is acknowledged that public debate and participation at a level leading to worthwhile input is not always practical. As noted earlier, only some participants will have the time, energy and capability to become fully acquainted with the technical intricacies involved in significant projects. There are also the dangers of politicizing the debate and perhaps trivializing it through excessive emotional input. Nevertheless, there are strong grounds for not ignoring non-superficial public participation and involvement in risk-based decisions [1].

5.5. Political reality

Risk tolerability cannot be divorced from wider issues in the community. It is intertwined with matters such as risk perception, fear of consequences and their uncertainty, etc., as well as various other factors which influence and change society with time. Societal risk tolerability would be expected to change also. Change can occur very quickly when there is a discontinuity in the normal pattern of events in society; a major industrial accident is one such event. The implication for the ALARP approach might well be as follows. What would have been considered sufficiently 'low' for a particular type of facility prior to an 'accident' might not be considered sufficient for other generally similar facilities after an accident. Yet there will be very considerable societal and political pressures for changing the acceptance criteria. Is it appropriate to do so?

Following an accident there is usually a call for an investigation, better safety measures, more conservative design approaches, better emergency procedures, etc. However, some accidents must be expected. The fact that it is admitted at the consent, approval or design stage of a project that there is a finite probability of failure associated with the project implies that an accident is likely to occur sooner or later. The fact that the probability might have been shown to be extremely low does not alter this. Perhaps unfortunately, probability theory cannot usually suggest when an event might occur. Rationality suggests that 'knee-jerk' political and regulatory responses may well be inappropriate; yet such responses are implicit in the 'reasonable' and 'practical' aspect of ALARP.

6. Discussion and possibilities

In science, it is recognized that progress comes in relatively slow steps, learning by trial and error and modifying the body of theory and understanding in the light of apparent contradictions. Similarly, in the more practical arts such as engineering, progress comes about through a slow progression, carefully learning from past mistakes. Major problems in engineering are likely when past observations and understanding appear to have been forgotten or ignored [22,23]. It may be that an appropriate strategy for risk management lies along these lines also. Moreover, it is increasingly being recognized that such matters are best treated using risk analysis and that risk analysis is best performed using probabilistic methods [24].

Even then, probability-based risk management faces an added problem when it must deal with low-probability, high-consequence events. These, morally and practically, do not allow the luxury of a trial-and-error learning process. There may be just too much at stake; hence the advocates of the 'precautionary principle'. Nevertheless, it is generally the case that the technology involved is not totally new but rather is a development of existing technology for which there is already some, or perhaps extensive, experience. Associated with that existing technology are degrees of risk acceptance or risk tolerance reflected in the behavior of society towards them. It is then possible, in principle, to 'back-calculate' [25,26] the associated, underlying tolerance levels, even if the analysis used for this purpose is recognized to be imperfect. The new technology should now be assessed employing, as much as possible, the information used to analyze the
existing technology and using a risk analysis methodology, as much as possible, similar in style and simplifications to that used to determine the previous tolerance levels.

The process sketched above is one which elsewhere has been termed 'calibration' [25,26], i.e. the assessment of one project against another, minimizing as much as possible the differences in risk analysis and databases and not necessarily attempting to closely anchor the assessment in societal tolerable risk levels. The risk levels employed are derived from previously accepted technology only, using admittedly simplified models, and are of a nominal nature, having no strong validity outside the framework in which they have been employed.

A somewhat similar approach is already implicit in the nuclear industry, with professionally agreed or accepted models being used for probability and other representations, and with a strong culture of independent ('peer') reviews of risk analyses. The resulting probability estimates are likely to be internally consistent and to have a high degree of professional acceptance, even though they may not relate very closely to underlying (but perhaps unknowable) probabilities of occurrence.

7. Conclusions

Risk management should embody fundamental principles such as societal participation in decision-making. It is recognized that this may be difficult for a variety of reasons and that alternative decision-making procedures are required. The current trend appears to be one of increasing involvement of regulatory authorities, with acceptance criteria not always open to the public or the applicants and in some cases settled by negotiation. This is also the case with the ALARP approach. It is suggested that there are a number of areas of concern about the validity of this approach. These include representativeness, morality, philosophy, political reality and practicality. It is suggested that risk assessments recognize peer review and the incremental nature of technological risks.

Acknowledgements

The support of the Australian Research Council under grant A89918007 is gratefully acknowledged. Some parts of this paper appeared in an earlier conference contribution. The author appreciates the valuable comments on a number of issues made by the reviewers. Where possible their comments have been addressed.

References

[1] Royal Society Study Group. Risk: analysis, perception and management. London: Royal Society, 1992.
[2] Cullen The Hon Lord. The public inquiry into the Piper Alpha disaster. London: HMSO, 1990.
[3] HSE. The tolerability of risk from nuclear power stations. London: Health and Safety Executive, 1992.
[4] Kam JCP, Birkinshaw M, Sharp JV. Review of the applications of structural reliability technologies in offshore structural safety. Proceedings of the 1993 OMAE, vol. 2, 1993. p. 289-96.
[5] Vlek CJH, Stallen PJM. Rational and personal aspects of risk. Acta Psychologica 1980;45:273-300.
[6] Stewart MG, Melchers RE. Probabilistic risk assessment of engineering systems. London: Chapman and Hall, 1997.
[7] Oreskes N, Shrader-Frechette K, Belitz K. Verification, validation, and confirmation of numerical models in the earth sciences. Science 1994;263(4):641-6.
[8] Popper K. The logic of scientific discovery. New York: Basic Books.
[9] Popper K. The growth of scientific knowledge. New York: Basic Books, 1963 (see also Magee B. Popper. Fontana Modern Masters, 1987).
[10] Kuhn TS. The structure of scientific revolutions. Chicago, IL: University of Chicago Press, 1970.
[11] Ravetz JR. Scientific knowledge and its social problems. Oxford: Clarendon Press, 1971.
[12] Watson SR. The meaning of probability in probabilistic safety analysis. Reliability Engineering and System Safety 1994;45:261-9.
[13] Stallen PJM. Risk of science or science of risk? In: Conrad J, editor. Society, technology and risk assessment. London: Academic Press, 1980. p. 131-48.
[14] Blockley DI, editor. Engineering safety. London: McGraw-Hill, 1990.
[15] Pusey M. Jurgen Habermas. Chichester, UK: Ellis Horwood/Tavistock, 1987.
[16] Layfield F. Sizewell B public inquiry: summary of conclusions and recommendations. London: HMSO, 1987.
[17] Wynne B. Society and risk assessment: an attempt at interpretation. In: Conrad J, editor. Society, technology and risk assessment. London: Academic Press, 1980. p. 281-7.
[18] Lischka JR. Ludwig Mond and the British alkali industry. New York: Garland, 1985.
[19] Reid SG. Acceptable risk. In: Blockley DI, editor. Engineering safety. London: McGraw-Hill, 1992. p. 138-66.
[20] Sharp JV, Kam JC, Birkinshaw M. Review of criteria for inspection and maintenance of North Sea structures. Proceedings of the 1993 OMAE, vol. 2, 1993. p. 363-8.
[21] Layard PRG. Cost-benefit analysis: selected readings. Harmondsworth: Penguin, 1972.
[22] Pugsley AC. The prediction of proneness to structural accidents. The Structural Engineer 1973;51(6):195-6.
[23] Sibley PG, Walker AC. Structural accidents and their causes. Proceedings of the Institution of Civil Engineers, Part 1, 1977. p. 191-208.
[24] Kirchsteiger C. On the use of probabilistic and deterministic methods in risk analysis. Journal of Loss Prevention in the Process Industries 1999;12:399-419.
[25] Melchers RE. Structural reliability analysis and prediction. 2nd ed. Chichester, UK: Wiley, 1999.
[26] Melchers RE. Probabilistic calibration against existing practice as a tool for risk acceptability assessment. In: Melchers RE, Stewart MG, editors. Integrated risk assessment. Rotterdam: Balkema, 1995. p. 51-6.
