


REFEREED ARTICLE
Evaluation Journal of Australasia, Vol. 9, No. 2, 2009, pp. 7–15

Benchmarking: a process for learning or simply raising the bar?

Michael J Cole

Michael J Cole is an Aid Effectiveness Adviser in the Bangkok Regional Office of the Australian Agency for International Development. Email: <michael.cole@dfat.gov.au>

Benchmarking has been a credible approach to quality improvement and program evaluation for more than 30 years. However, the term has been used and abused so much since the 1970s that now an author might be describing anything from the establishment of performance indicators, to competitor ranking, or gap analysis, through to a continuous quality improvement process. Equally confusing is the plethora of different benchmarking models published. A literature review was conducted in which 65 publications from 1998–2008 were analysed and synthesised in order to untangle the confusion surrounding the contemporary theory and practice of benchmarking. This article clarifies the terminology and concepts of benchmarking, synthesises a concise practical definition, examines various approaches in the current theory and application of benchmarking, and explicates the features most consistent with the goal of continuous quality improvement and organisational learning.

Introduction
Benchmarking is a commonly used term and an integral part of the modern
lexicon of evaluation; in spite of this, a recent review of the literature by
the author revealed a great deal of confusion about what is actually meant
by benchmarking. Originally it was conceived as a continuous quality
improvement process, in which an organisation striving for best practice
compares its performance with another in an effort to learn and progress.
However, it seems it is often operationalised now as a simple ranking process.
Nevertheless, its greatest potential remains as a vehicle for the sharing and
transfer of best-practice knowledge, along with its tangible contribution to
the development of learning organisations ‘… that can analyse, reflect, learn
and change based on experience’ (Bhutta & Huq 1999, p. 256).
The purposes of this exploration are to determine from the literature: how
the terminology of benchmarking is being used and to untangle some of the
confusion surrounding the concept; to synthesise a concise definition that
is inclusive and has utility; to examine and compare various approaches in
the current theory and application of benchmarking; and finally, to elucidate
the features that are most likely to fulfil the promise of continuous quality
improvement and organisational learning outcomes.




The search process

The area of enquiry was the theory and practice in benchmarking for program evaluation over the decade, 1998 to 2008. Publications were sought and collected primarily using Internet searches of multiple databases and online journals, such as those of the American Evaluation Association. The libraries of the University of Melbourne and Bangkok University also supplemented the electronic resources with a modest number of relevant hard-copy publications. Keywords used for the search were 'benchmark', 'benchmarking', 'program', 'assessment' and 'evaluation'. The publication dates were limited to 1998 to 2008 in order to select the most recent material for determining current practice, theory and trends. Publications representing divergent fields of practice were incorporated intentionally so that a broad view of current practice and opinion could be consulted. In all, 65 publications were reviewed and analysed. The rich descriptions in each document were compared, contrasted and then synthesised to produce the findings and discussion presented here.

Background

Originally championed in the private sector, first in manufacturing, then management and marketing, benchmarks and the benchmarking process are currently used in a vast and diverse array of fields including engineering, family and community services, financial services, higher education, human resources, information technology, insurance, hospitals, local government and public utilities (Alstete 2008).

There is evidence that the process was first used in Japan where it was termed 'Dantotsu'. In fact, during the post-World War II reconstruction period the USA often accused Japan of studying their products and processes, copying them, and then improving upon them. This was essentially the core of what later became known as benchmarking (Bank 2000; Boulter 2003).

Xerox is frequently credited with initiating the formal practice of benchmarking in 1979 (Anand & Kodali 2008; Ford & Hibbard 2000; Smith 2006). Moreover, Robert Camp at Xerox was certainly the first to formulate, test and publish a cohesive model of benchmarking, which he elucidates in his modestly titled publication, 'A Bible for Benchmarking, by Xerox' (1993). Challenges by Japanese firms had led to a sharp decrease in revenue and market share for Xerox and so prompted the drive for quality as a means of survival. Xerox studied its direct competitors to identify how their performance compared, and how they could improve. Furthermore, imaginatively, Xerox also studied similar processes of high achievers outside their own industry. In 1986 the decline in market share had begun to reverse and by 1991 Xerox had significantly revised its quality strategy, from meeting benchmarked competitors, to becoming the leader in the European market (Zairi & Whymark 2000a).

The adoption of benchmarking by private and public sector organisations seems to have grown rapidly from the mid-1980s to a peak in the mid-1990s, which is reflected in the volume of literature on the subject during this decade. From that period onwards its popularity seems to have waned to some degree, with the practice often either replaced by or absorbed into other evolving quality systems.

Terminology

The terminology used by organisations and researchers to discuss benchmarking is often unclear and confusing, as has been noted by various authors (Alstete 2008; Fine & Snyder 1999). Terms such as 'standard', 'target', 'benchmark', 'benchmarking', 'performance measurement', 'performance statistics' and 'performance indicators' were often used interchangeably (Bowerman et al. 2002; Hinton, Francis & Holloway 2000). Over the period of the literature reviewed, particular distinctions seem to have grown, quite organically as it were, from author to author about the meaning and context of these various terms.

It is worth spending a little time exploring this aspect because it will help refine the focus of this article, as well as illustrate some pertinent issues in the recent development of the theory and practice of benchmarking for program evaluation. It is also critical to untangling some of the confusion surrounding the topic. A quote from the AusAID Performance Assessment and Evaluation Policy seems pertinent at this point:

Confusion about the meaning of particular words is by no means the most significant challenge we face in implementing the performance agenda, but it can affect how we think about performance, how effectively we use … tools for monitoring and reporting, and the value we get from training and guidance in this field (AusAID 2007, p. 9).

The Australian Modern Oxford Dictionary (Ludowyk & Moore 2000, p. 67) defines a benchmark as 'a standard or point of reference'. It also defines a standard as both 'the specification by which something is measured' and as 'the required level of quality'.

The misuse of the term by some authors means that a benchmark could be a specification, a variable, a required level of quality or a reference point. When using common parlance, the required level of quality for a variable benchmarked as a target could translate as: 'the standard standard of a standard'. This is not meant to be glib, but simply to illustrate the potential for circular definitions and general confusion, which can occur over time in the absence of a common unifying framework and technical language, or where everyday language is used interchangeably with terms that have a precise meaning within a specific context. It also demonstrates the confusion that arises from the use of tautological constructions such as a benchmark standard, which simply add to the jargon rather than our clarity or understanding.

A 'benchmark' or 'benchmarking'

It is important to make the distinction between the establishment of a point of reference and undertaking a comprehensive continuous quality improvement process. This distinction is frequently lacking in the literature, where too often the establishment of a benchmark is presented as though a complete benchmarking process has been undertaken. To assist the reader in following the discussion, as well as to emphasise the distinction between a benchmark as a point of reference and Benchmarking as a distinct formalised model of quality improvement, henceforth the term 'Benchmarking' will be capitalised in this article.

More than a quarter of the publications reviewed simply discussed Benchmarking as the establishment of performance indicators or Benchmarking targets. In addition, some authors limited their description of Benchmarking to the stage of ranking their firm with competitors (Dawkins, Feeny & Harris 2007). Others equated Benchmarking only with the process of comparative or gap analysis (Kumar et al. 2008). In respect to customer service, for example, Bordley (2001) states that the benchmark is simply customers' product expectations, and describes determining the gap between customers' perceptions of the actual product and their expectation as effectively being Benchmarking. Meanwhile Halliday, Asthana and Richardson (2004, p. 286) refer to Benchmarking disparagingly as a simple 'health check'. They then go on to say, 'Ironically, it is the least satisfactory dimension, the ability to benchmark, that is often the most attractive to partnerships, rather than the more resource-intensive and time-consuming processes of reflection and development.' The true irony, as will be described later in this article, is that when understood and implemented as a whole, the Benchmarking process is indeed a time-consuming, reflective practice that promotes development.

The comparative analysis of performance is sometimes also confused with true Benchmarking. On this point Fine and Snyder (1999) clarify that performance measurement is one of the first steps in the quality improvement process involving the definition, selection and application of performance indicators, whereas Benchmarking is a continuous process including implementation of reforms, evaluation of outcomes and review of the previous performance indicators or benchmarks.

In his bluntly titled article 'Measurement benchmarks or "real" benchmarking?', Alstete (2008, p. 79) throws down the gauntlet and expounds unambiguously that '… organizations of all types today are concerned with comparing performance for internal assessment, publicity and marketing purposes, and perhaps pride. True benchmarking … [is] … focused on specific process improvements and increased organizational effectiveness'.

The remarkable variation in the definitions and descriptions of Benchmarking evident in the literature could be conceptualised as a linear progression of increasing comprehensiveness advancing through the stages of Benchmarking; that is, from the definition of performance indicators to the description of an institutionalised continuous quality improvement process. This could be represented visually (Figure 1) with the various terms used to define and describe Benchmarking taken from the literature and presented on a continuum.

Figure 1: The various terms used to define and describe Benchmarking, presented as a continuum: establishment of performance indicators; definition of performance indicators; setting targets; performance measurement; performance comparison; ranking competitors/comparison; gap analysis; implementation of reforms; continuous improvement process.




Benchmarking defined

At Xerox in the early 1980s, Benchmarking was described as '… the continuous process of measuring our products, services and practices against those of our toughest competitors or companies renowned as leaders' (Camp 1993, p. 23). Other definitions abound. As early as 1992, Spendolini (cited in Anand & Kodali 2008) had found 49 distinct definitions for Benchmarking in the literature.

Denkena, Apitz & Liedtke (2006, p. 191) observed that 'many regard benchmarking as a method to compare key figures, often financial ones, for the purpose of ranking the organization in relation to competitors or the industry average', but then go on to refute that, saying '… improvement is the ultimate objective of any benchmarking study'. It is essentially concerned with understanding how processes work, to learn from the observation of best practice, and to make improvements necessary to reach those standards of excellence (Smith 2006).

Definitions in the recent literature have also become increasingly sophisticated and comprehensive. The essential ingredients of any functional definition of the Benchmarking process are the ideas of measurement, comparison, identification of best practices, learning, implementation, review and continuous improvement (Anand & Kodali 2008; Bank 2000; Bhutta & Huq 1999; Evans 2008; Reider 2000).

A related issue is the use of the term 'best practice'. Best practice suggests an ultimate goal, and a static one at that. On the contrary, in reality so-called best practice is continually being transformed into, or surpassed by, better practice. While some might want to argue over the semantics of a comparative being used as a superlative and vice versa, it is conceptually more useful to describe the quest for continual quality improvement as searching for and trying to achieve 'better practice', or 'even better practice'.

Synthesising from the literature reviewed, a brief but complete and operational definition would be:

Benchmarking is a continuous quality improvement process. It is used to identify and understand the practices exhibited by the best in their field; to adapt and improve those practices, for the purpose of reaching the targeted level of excellence, and then surpassing it with even better practice.

Figure 2 shows the gap between the actual organisation performance and the desired performance of the benchmark, that is, the benchmark gap. An action plan is put in place to meet the benchmark. Note also that the benchmark is not static but continually reviewed and redefined (Price 2005). The diagram illustrates that ideally the aim of the organisation is to continue to improve performance and surpass the benchmark, and thus 'best practice' becomes 'even better practice'.

Figure 2: Representation of the Benchmarking process (adapted from Reider 2000). Quality is plotted against time: the organisation (current practice) sits below the benchmark (best practice), and the distance between the two is the benchmark gap. An action plan takes the organisation to meeting the benchmark, then to surpassing the benchmark, and ultimately to even better practice.
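To make the cycle in Figure 2 concrete, the brief sketch below models it in Python. It is only an illustrative reading of the diagram, not something specified in the article; the scores, the action gain and the recalibration margin are hypothetical.

```python
# Illustrative sketch only: one reading of Figure 2 as a gap-closing loop.
# 'performance' and 'benchmark' are hypothetical scores on a single indicator.

def benchmark_gap(performance: float, benchmark: float) -> float:
    """The benchmark gap: how far current practice sits below the benchmark."""
    return benchmark - performance

def improvement_cycle(performance, benchmark, action_gain):
    """One pass of the loop: measure the gap, act on the action plan, and
    recalibrate the benchmark once it has been met or surpassed
    (the benchmark is not static)."""
    if benchmark_gap(performance, benchmark) > 0:
        performance += action_gain          # implement the action plan
    if performance >= benchmark:
        benchmark = performance * 1.05      # 'best practice' becomes 'even better practice'
    return performance, benchmark

# Example: several review cycles against an initial benchmark of 80.
performance, benchmark = 60.0, 80.0
for cycle in range(5):
    performance, benchmark = improvement_cycle(performance, benchmark, action_gain=12.0)
    print(f"cycle {cycle + 1}: performance={performance:.1f}, benchmark={benchmark:.1f}")
```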




Purpose of Benchmarking

Benchmarking prevents the organisation or organisational unit from becoming insular and stagnant. Bhutta & Huq (1999, p. 254) call it '… a way to move away from tradition'. It enables organisations to look more objectively at their strengths and weaknesses (Evans 2008). Moreover, it facilitates the organisation to become a more open system, increasing interactions and information flows within its operating environment. This enhances identification of customer requirements, development of accurate measures of productivity and the establishment of effective goals and objectives (Bank 2000). Identifying that someone has solved a problem and obtaining ideas from those who are getting superior results also motivates higher levels of performance (Anderson & Fagerhaug 2000). Bank (2000) puts it very simply and powerfully when he says the purpose of Benchmarking is to be better than the best competitor.

Models and processes

The Benchmarking model developed at Xerox (Camp 1993) had four phases: Planning, Analysis, Integration and Action. Within these four phases the model listed a sequence of 10 essential steps:
1 Identify what is to be benchmarked.
2 Identify comparative companies.
3 Determine data collection method and collect data.
4 Determine current performance levels.
5 Project future performance levels.
6 Communicate benchmark findings and gain acceptance.
7 Establish functional goals.
8 Develop action plans.
9 Implement specific actions and monitor progress.
10 Recalibrate benchmarks.

A more recent analysis (Zairi & Whymark 2000b) of the continuing development of Benchmarking at Xerox did not increase the number of steps, but did add a fifth phase, Maturity, in which the company has attained a leadership position and Benchmarking has become an essential and embedded continuing process. The researchers also described the objectives of each phase. Their analysis and discussion have been reproduced for ease of examination in Table 1.

Table 1: Benchmarking at Xerox as a five-phase model with objectives of each phase described

Phase 1, Planning: The objective of this phase is to prepare a plan for Benchmarking.
Phase 2, Analysis: This phase will assist companies to understand competitors' strengths and assess their performance against these strengths.
Phase 3, Integration: The objective of this phase is to use the data gathered to define the goals necessary to gain or maintain superiority and to incorporate these goals into the company's formal planning process.
Phase 4, Action: During this stage, the strategies and action plan established through the Benchmarking process are implemented and periodically assessed (recalibrated), with reports of the company's progress in achieving them.
Phase 5, Maturity: The objective of this phase is to determine when the company has attained a leadership position and to assess whether Benchmarking has become an essential, ongoing element of its management process.

Over time, the original 10-step process established by Camp at Xerox appears to have been abbreviated. Most of the authors in the literature reviewed referred to models developed by previous researchers, usually one of various five-step models developed in the early to mid 1990s. The most commonly cited five-step model was that of Spendolini (1992, cited in Shen, Tan & Xie 2000; Anand & Kodali 2008; Bowerman et al. 2002; Denkena, Apitz & Liedtke 2006).

Of those authors who proposed their own adaptation or formulation, various interpretations were presented, ranging from Watson's (2005) three-step process, which concluded with a comparative analysis, through to Owen's (2006) eight-step model, which firmly embeds a practice of continuous quality improvement within the organisation.

A comparison of the steps in various models is presented in Table 2.


Table 2: A comparison of Benchmarking models in recent literature (each model's steps are listed in sequence and classified against the common stages: plan; collect and analyse; implement; review; institutionalise)

Reference | Sector | Number of steps
Watson 2005 | HVAC engineering | 3
Meszaros & Benson 2003 | University | 4
Smith 2006 | Human resources | 4
Bhutta & Huq 1999 | Manufacturing | 5
Bank 2000 | Management and business | 5
Evans 2008 | Business | 5
Bhutta & Huq 1999 | Kodak manufacturing (case study) | 6
Maire, Bronet & Pillet 2008 | Small to medium-size enterprises | 6
Ammons 1999 | Public administration | 7
Reider 2000 | Management and business | 7
Owen 2006 | General | 9
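As one concrete instance of the structures compared in Tables 1 and 2, the sketch below encodes Camp's Xerox model as data. The grouping of the 10 steps under the four phases (and the empty Maturity phase added later) is an assumption made here for illustration; the article lists the phases and the steps but does not spell out the grouping.

```python
# Illustrative sketch: Camp's Xerox Benchmarking model represented as data.
# The assignment of steps to phases is an assumption made for illustration.

CAMP_MODEL = {
    "Planning": [
        "Identify what is to be benchmarked",
        "Identify comparative companies",
        "Determine data collection method and collect data",
    ],
    "Analysis": [
        "Determine current performance levels",
        "Project future performance levels",
    ],
    "Integration": [
        "Communicate benchmark findings and gain acceptance",
        "Establish functional goals",
    ],
    "Action": [
        "Develop action plans",
        "Implement specific actions and monitor progress",
        "Recalibrate benchmarks",
    ],
    # Added later (Zairi & Whymark 2000b): an ongoing state rather than a step.
    "Maturity": [],
}

if __name__ == "__main__":
    step_number = 1
    for phase, steps in CAMP_MODEL.items():
        print(phase)
        for step in steps:
            print(f"  {step_number}. {step}")
            step_number += 1
```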

At times authors truncated the steps in the Benchmarking process and in some circumstances they compressed steps together. Meszaros and Benson (2003), for example, present a four-step Benchmarking process in which they actually compressed two distinctly disparate activities, 'compare and contrast data' and 'plan new goals for following year', into their third step. Predominantly though, the explanation for the difference in number of steps is quite simple, and again goes back to the authors' understanding and definition of Benchmarking. A number of them operated under the assumption that Benchmarking meant either the establishment of performance indicators, a comparative/gap analysis, or a ranking exercise, and so simply stopped describing the process at those points. Some also presented Benchmarking as a single event rather than a continuous process and so, even though they described it through to planning and implementation, they left off any recalibration of benchmarks or further review of the Benchmarking process.

Ultimately, it is when Benchmarking has become institutionalised as a continuous quality improvement process that it can achieve its potential in contributing to the development of a learning organisation. In this situation it could be viewed as an example of a practical participatory evaluation approach (Cousins & Earl 1995, cited in Alkin 2004).

Table 2 lists the publication, reference, sector if specified, and the steps described in the Benchmarking process. The various models reviewed were analysed for component stages; these were then synthesised to produce common stages in the Benchmarking processes. All steps in each model were then classified according to the following stages:
■ planning the Benchmarking process
■ the collection and analysis of data
■ implementation of the adapted practice
■ review of the new practice
■ institutionalisation of the Benchmarking process itself.
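As a rough illustration of this synthesis, the sketch below classifies the steps of a hypothetical model against those five common stages. The keyword rules and the example step names are assumptions made for illustration; the article's classification was done qualitatively from the rich descriptions in each publication.

```python
# Illustrative sketch: classifying a model's steps against the common stages
# synthesised in the article. The keyword rules and the example model are
# hypothetical.

STAGE_KEYWORDS = {
    "plan": ["identify", "determine what", "decide", "plan"],
    "collect and analyse": ["collect", "measure", "compare", "analyse", "gap"],
    "implement": ["adapt", "implement", "action"],
    "review": ["review", "monitor", "recalibrate"],
    "institutionalise": ["institutionalise", "integrate into the organisation"],
}

def classify_step(step: str) -> str:
    """Assign a step description to the first stage whose keywords it mentions."""
    text = step.lower()
    for stage, keywords in STAGE_KEYWORDS.items():
        if any(keyword in text for keyword in keywords):
            return stage
    return "unclassified"

# A hypothetical five-step model, in the spirit of those compared in Table 2.
example_model = [
    "Decide what is to be benchmarked",
    "Collect data and compare performance with partners",
    "Adapt and implement better practices",
    "Review outcomes and recalibrate benchmarks",
    "Integrate into the organisation as continuous improvement",
]

for step in example_model:
    print(f"{classify_step(step):<20} <- {step}")
```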
Anand and Kodali (2008) reviewed the best practices in Benchmarking and proposed a model that consists of 12 phases and includes 54 steps (see Figure 3). While they claim that this was a Benchmarking exercise on Benchmarking practice, they did not implement it and so by definition it was not a true Benchmarking process. However, they candidly acknowledged that '… one of the limitations of the model is that it is highly conceptual and has not been validated by implementing it in industries to assess its effectiveness' (p. 282). The model was not included in Table 2 for this reason, but this is not intended to imply that it does not have merit, and in fact it does warrant testing in the field.

Under each phase, several detailed steps are described. While its meticulous detail could be considered too prescriptive by veteran evaluators experienced in Benchmarking, conversely its inclusiveness would make it easy for a novice practitioner from virtually any field to saunter through the Benchmarking process, confident that they had not missed any critical component in the sequence.

Figure 3: Anand & Kodali's (2008) 12-phase conceptual model of Benchmarking
1 Team formation
2 Subject identification
3 Customer validation
4 Management validation
5 Self-analysis
6 Partner selection
7 Pre-benchmarking activities
8 Benchmarking
9 Gap analysis
10 Action plans
11 Implementation
12 Continuous improvement

Conclusions

The theory and practice of Benchmarking has grown rapidly and organically over the past decade. As a result, there is a great deal of diversity in approaches, along with confusion in the terminology used by organisations and researchers to discuss Benchmarking.

As it continues to progress, one of the challenges Benchmarking faces is whether or not it can evolve and adapt in diverse contexts, yet maintain a common language that facilitates the transfer of research findings and promotes professional development and organisational learning. More precise terminology should be promoted by management leaders, educators and researchers.

It seems that the evolution of Benchmarking has now passed the point where the development of a unifying model or common theory of Benchmarking is possible. However, an examination of the various models illustrates that a comprehensive process can and must be used if one is to undertake an authentic Benchmarking exercise that will result in improvement in both quality and learning outcomes.

A key question is whether Benchmarking will come to be used mechanically to raise higher targets that drive management and staff to report against, or whether it will maintain its integrity as a holistic approach to evaluation that can be a pathway towards collaborative continuous improvement and the development of a true learning organisation.

Ultimately, it has potential as a vehicle for objective feedback, challenging mental models, encouraging reflective practice and ensuring continuous improvements that all contribute to the development of learning organisations. Senge (1990, p. 3) envisioned organisations '… where people continually expand their capacity to create the results they truly desire, where new and expansive patterns of thinking are nurtured, where collective aspiration is set free, and where people are continually learning how to learn together'.




References

Alkin, MC (ed.) 2004, Evaluation roots: tracing theorists' views and influences, Sage, Thousand Oaks, California.
Alstete, JW 2008, 'Measurement benchmarks or "real" benchmarking? An examination of current perspectives', Benchmarking: an International Journal, vol. 15, no. 2, pp. 178–186.
Ammons, DN 1999, 'A proper mentality for benchmarking', Public Administration Review, vol. 59, no. 2, pp. 105–109.
Anand, G & Kodali, R 2008, 'Benchmarking the benchmarking models', Benchmarking: an International Journal, vol. 15, no. 3, pp. 257–291.
Anderson, B & Fagerhaug, T 2000, Root cause analysis: simplified tools and techniques, ASQ Quality Press, Milwaukee, Wisconsin.
AusAID 2007, Performance assessment and evaluation: policy, December 2007, Office of Development Effectiveness, AusAID, Canberra.
Bank, J 2000, The essence of total quality management, 2nd edn, Pearson Education, London.
Bhutta, KS & Huq, F 1999, 'Benchmarking—best practices: an integrated approach', Benchmarking: an International Journal, vol. 6, no. 3, pp. 254–268.
Bordley, RF 2001, 'Integrating gap analysis and utility theory in service research', Journal of Service Research, vol. 3, no. 4, pp. 300–309.
Boulter, L 2003, 'Legal issues in benchmarking', Benchmarking: an International Journal, vol. 10, no. 6, pp. 528–537.
Bowerman, M, Francis, G, Ball, A & Fry, J 2002, 'The evolution of benchmarking in UK local authorities', Benchmarking: an International Journal, vol. 9, no. 5, pp. 429–449.
Camp, RC 1993, 'A bible for benchmarking, by Xerox', Financial Executive, July/August, pp. 23–27.
Dawkins, P, Feeny, S & Harris, MN 2007, 'Benchmarking firm performance', Benchmarking: an International Journal, vol. 14, no. 6, pp. 693–710.
Denkena, B, Apitz, R & Liedtke, C 2006, 'Knowledge-based benchmarking of production performance', Benchmarking: an International Journal, vol. 13, no. 1/2, pp. 190–199.
Evans, JR 2008, Quality and performance excellence: management, organisation and strategy, 5th edn, Thompson South-Western, New York.
Fine, T & Snyder, L 1999, 'What is the difference between performance measurement and benchmarking?', Public Management, vol. 81, no. 1, pp. 24–25.
Ford, M & Hibbard, T 2000, 'Choosing transportation investments in Oregon: what role for the state's acclaimed benchmarks?', Public Works Management & Policy, vol. 4, no. 3, pp. 184–195.
Halliday, J, Asthana, SN & Richardson, S 2004, 'Evaluating partnership: the role of formal assessment tools', Evaluation, vol. 10, no. 3, pp. 285–303.
Hinton, M, Francis, G & Holloway, J 2000, 'Best practice benchmarking in the UK', Benchmarking: an International Journal, vol. 7, no. 1, pp. 52–61.
Kumar, M, Antony, J, Madu, CN, Montgomery, DN & Park, SH 2008, 'Common myths of Six Sigma demystified', International Journal of Quality & Reliability Management, vol. 25, no. 8, pp. 878–895.
Ludowyk, F & Moore, B (eds) 2000, The Australian Modern Oxford Dictionary, Oxford University Press, Melbourne.
Maire, JL, Bronet, V & Pillet, M 2008, 'Benchmarking: methods and tools for SME', Benchmarking: an International Journal, vol. 15, no. 6, pp. 765–781.
Meszaros, PS & Benson, K 2003, 'Benchmarking in family and consumer sciences', Journal of Family and Consumer Sciences, vol. 95, no. 4, pp. 95–96.
Owen, JM 2006, Program evaluation: forms and approaches, 3rd edn, Allen & Unwin, New South Wales.
Price, CP 2005, 'Benchmarking in laboratory medicine: are we measuring the right outcomes?', Benchmarking: an International Journal, vol. 12, no. 5, pp. 449–466.
Reider, R 2000, Benchmarking strategies: a tool for profit improvement, John Wiley & Sons, New York.
Senge, PM 1990, The fifth discipline: the art and practice of the learning organization, Currency Doubleday, New York.
Shen, XX, Tan, KC & Xie, M 2000, 'Benchmarking in QFD for quality improvement', Benchmarking: an International Journal, vol. 7, no. 4, pp. 282–291.
Smith, I 2006, 'Benchmarking human resource development: an emerging area of practice', paper presented at the Shanghai International Library Forum, August, Shanghai.
Watson, JJ 2005, 'Improving system performance through benchmarking', ASHRAE Journal, vol. 47, no. 3, pp. 56–61.
Zairi, M & Whymark, J 2000a, 'The transfer of best practices: how to build a culture of benchmarking and continuous learning—part 1', Benchmarking: an International Journal, vol. 7, no. 1, pp. 62–78.
Zairi, M & Whymark, J 2000b, 'The transfer of best practices: how to build a culture of benchmarking and continuous learning—part 2', Benchmarking: an International Journal, vol. 7, no. 2, pp. 146–167.

