
Guidelines for the Application of Best Practice in Australian University Libraries



Intranational and international benchmarks

00/11

Anne Wilson
Leeanne Pitman
Isabella Trahn


ISBN 0 642 44474 9


Evaluations and Investigations Programme
Higher Education Division
00/11

Department of Education,
Training and Youth Affairs
DETYA No. 6478.HERC 00A
ABN: 51 452 193 160


Guidelines for the Application of Best Practice in Australian University Libraries
Intranational and international benchmarks

Anne Wilson
Leeanne Pitman
Isabella Trahn

00/11 August 2000

Evaluations and Investigations Programme


Higher Education Division


© Commonwealth of Australia 2000

ISBN 0642 444749

ISBN 0642 444757 (Online version)

DETYA No. 6478.HERC 00A

This work is copyright. Apart from any use as permitted under the Copyright
Act 1968, no part may be reproduced by any process without permission
from Ausinfo. Requests and inquiries concerning reproduction and rights
should be addressed to the Manager, Legislative Services, Ausinfo, GPO Box 84,
Canberra ACT 2601.

The report is funded under the Evaluations and Investigations Programme of
the Department of Education, Training and Youth Affairs.

The views expressed in this report do not necessarily reflect the views of the
Department of Education, Training and Youth Affairs.
Contents

Acknowledgments . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .vii

Abbreviations and acronyms . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .ix

Council of Australian University Librarians—members . . . . . . . . . . . . .xi

Executive summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .xiii

1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .1
1.1 Origin and background of the investigation . . . . . . . . . . . . . . . . . . . .1
1.2 Project description . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .1
1.3 Terminology . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .2
1.4 Research methodology . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .2
1.4.1 Literature review . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .2
1.4.2 Surveys . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .3
1.4.3 Site visits . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .3
1.5 Results . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .4

2 Literature review . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .5
2.1 Overview . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .5
2.2 Benchmarking . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .6
2.2.1 Australasian sources . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .6
2.2.2 International sources . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .8
2.3 Performance indicators and measurement . . . . . . . . . . . . . . . . . . . .11
2.3.1 Australasian sources . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .11
2.3.2 International sources . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .13
2.4 Quality/best practice . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .16
2.4.1 Australasian sources . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .16
2.4.2 International sources . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .17

3 Benchmarking . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .21
3.1 Terminology . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .21
3.2 Methodologies for library benchmarking . . . . . . . . . . . . . . . . . . . . .21
3.2.1 CHEMS . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .23
3.2.2 Universitas 21 methodology . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .24
3.2.3 AQC benchmarking methodology . . . . . . . . . . . . . . . . . . . . . . . . . .24


3.3 Background . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .25


3.3.1 Early attempts . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .26
3.3.2 1995—Current situation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .27
3.4 Survey findings . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .29
3.4.1 Partners . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .30
3.4.2 Projects: scope and variables . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .31
3.4.3 Timeframe . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .32
3.4.4 Why benchmark: purpose? . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .32
3.4.5 Achievement of purpose/lessons for
successful outcomes . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .33
3.4.6 Training (see also Chapter 6) . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .36
3.5 Future directions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .37
3.5.1 McKinnon project . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .37
3.6 International comparisons . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .38
3.6.1 The large, traditional European university library—
University of Muenster . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .39
3.6.2 A centralised, highly progressive and flexible university
library: Danish Business School Library . . . . . . . . . . . . . . . . . . . . . . .40
3.6.3 The British defence university campus and
the SCONUL Benchmarking group . . . . . . . . . . . . . . . . . . . . . . . . . .41
3.6.4 The US scene: Purdue University Libraries . . . . . . . . . . . . . . . . . . . . .42

4 Performance indicators . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .45


4.1 Terminology . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .45
4.2 Background . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .45
4.2.1 Progress in Australia in the 1990s . . . . . . . . . . . . . . . . . . . . . . . . . .46
4.3 Survey findings . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .49
4.3.1 CAUL indicators . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .49
4.3.2 Ratings . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .53
4.3.3 Additional/important indicators . . . . . . . . . . . . . . . . . . . . . . . . . . . .53
4.3.4 Areas where CAUL could undertake further developmental work . . . . .54
4.3.5 Comments/changes to current CAUL indicators . . . . . . . . . . . . . . . . .56
4.3.6 Use of information gathered from performance indicators
in benchmarking, quality management . . . . . . . . . . . . . . . . . . . . . . .57
4.3.7 Training for performance measurement . . . . . . . . . . . . . . . . . . . . . .58
4.4 Future directions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .59
4.5 International comparisons . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .60
4.5.1 Practising what one preaches: University of Muenster Libraries . . . . . . .60


4.5.2 Performance indicators in a quality context:
Danish Business School Library . . . . . . . . . . . . . . . . . . . . . . . . . . . .61
4.5.3 The European researchers: Manchester Metropolitan
University. Centre for Research in Library and
Information Management (CERLIM) . . . . . . . . . . . . . . . . . . . . . . . . . .62
4.5.4 SCONUL Performance indicator project at Cranfield University . . . . . .63

5 Quality/best practice . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .65


5.1 Terminology . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .65
5.2 Background . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .66
5.3 Survey findings . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .68
5.3.1 Why ‘quality'? . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .68
5.3.2 Criteria for successful quality programs . . . . . . . . . . . . . . . . . . . . . . .69
5.3.3 Impact of programs on staff . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .71
5.3.4 Features of the organisation/structure . . . . . . . . . . . . . . . . . . . . . . . .72
5.3.5 Quality management frameworks . . . . . . . . . . . . . . . . . . . . . . . . . .73
5.3.6 Relationship between quality frameworks and tools . . . . . . . . . . . . . . .76
5.3.7 Informing future practice and integration into library operations . . . . . .77
5.4 Case studies . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .78
5.4.1 The Balanced Score Card (BSC)—Deakin University library . . . . . . . .78
5.4.2 Australian Quality Awards—Business Excellence
Framework—University of Wollongong and University of
Melbourne libraries . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .82
University of Melbourne Library . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .88
5.4.3 Swinburne Quality Management System (SQMS) . . . . . . . . . . . . . . . .89
5.4.4 Internally developed quality framework—
University of Queensland . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .94
5.5 Future directions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .98
5.6 International comparisons . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .99
5.6.1 University of Central Lancashire, Preston, UK,
Library and Learning Resources Services, LLRS . . . . . . . . . . . . . . . . .100
5.6.2 TQM/CQI program adapted to a large US university:
University of Michigan and the M-Quality program and
Libraries involvement . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .102

6 Staff competencies . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .107


6.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .107
6.2 Competency standards . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .108
6.3 Training for benchmarking . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .110
6.4 Training for performance measurement . . . . . . . . . . . . . . . . . . . . .111

6.5 Training for quality/best practice . . . . . . . . . . . . . . . . . . . . . . . . .113


6.6 Future directions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .114
6.7 International comparisons . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .115
6.7.1 Training for Quality (CQI) in a US university: Purdue: a case study . . .115
6.7.2 Using library competency profiles: the Danish Business School library .117
6.7.3 Large scale training using M-Quality program to
introduce leadership training: University of Michigan . . . . . . . . . . . .117

7 Conclusions and recommendations . . . . . . . . . . . . . . . . . . . . . .119


7.1 Conclusions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .119
7.2 Recommendations . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .119
7.2.1 Benchmarking . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .119
7.2.2 Performance measurement . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .120
7.2.3 Quality/best practice . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .123
7.2.4 Staff competencies and training . . . . . . . . . . . . . . . . . . . . . . . . . . .124
7.2.5 Overall . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .125

Appendix A: Benchmarking survey . . . . . . . . . . . . . . . . . . . . . . . .127

Appendix B: Performance indicators survey . . . . . . . . . . . . . . . . . . .129

Appendix C: Quality/best practice/performance measurement survey . . . .131

Appendix D: Institutions visited . . . . . . . . . . . . . . . . . . . . . . . . . . .133

Appendix E: Useful sources . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .135

Appendix F: Suggested outline for a ‘Best Practice
Handbook for Australian University Libraries’ . . . . . . . . . .163

Bibliography . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .167

Tables
Table 4.1 Use of CAUL and other performance indicators . . . . . . . . . . . . . . . . . . . .50
Table 4.2 Breakdown of ratings . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .53
Table 4.3 Additional and important performance indicators . . . . . . . . . . . . . . . . . . .54
Table 4.4 Further developmental work . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .55
Table 4.5 Use of performance indicators in benchmarking and
quality management . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .57


Acknowledgments
The project team thanks the representatives of the following institutions who
gave up their time to host the team in a series of site visits as part of the
information gathering process.

In particular we would like to thank the staff members responsible for
coordinating the visits, preparing the program and collecting and distributing
information packages for the team:
• Curtin University of Technology Library
• Deakin University Library
• Queensland University of Technology Library
• State Library of Victoria
• Swinburne University of Technology, Library and Information Resources
• University of Melbourne Library
• University of New South Wales, Division of Information Services
• University of Queensland Library
• University of Western Australia Library
• University of Wollongong Library

Thanks also to all the institutions who responded to the benchmarking,
performance indicator and best practice/quality/performance measurement
surveys and who have given permission for their comments/views to be
included in the Report. The information proved invaluable in the preparation
of this report.

Thanks to colleagues too numerous to mention by name from each of our
respective institutions, who provided advice and assistance in both the
information gathering and report preparation phases. We are extremely
grateful for their help and encouragement.

Finally we acknowledge Margaret Sparks at Queensland University of
Technology for her administrative support and advice.

We would also like to acknowledge the interest and support of the
Advisory Team, in particular our Project Director, Gaynor Austen, for
keeping us on track and coordinating the administrative components, and
Jenny Christmass and Catherine Moore from DETYA for advice on the
technical and reporting requirements.


Abbreviations and acronyms


AARL Australian Academic and Research Libraries
ABN Australian Bibliographic Network
ACLIS Australian Council of Libraries and Information Services
ALIA Australian Library and Information Association
AQA Australian Quality Awards
AQC Australian Quality Council
BSC Balanced Score Card
CASL Council of Australian State Librarians
CAUL Council of Australian University Librarians
CAVAL Cooperative Action for Victorian Academic Libraries
CSF Critical success factors
CHEMS Commonwealth Higher Education Management Services
IFLA International Federation of Library Associations
ILL Inter library loan
KPI Key performance indicator
KRA Key result area
LIS Library and information services
QULOC Queensland University Libraries' Office of Cooperation
SCONUL Standing Conference of National and University Libraries
SQMS Scottish Quality Management System
UNILINC A not-for-profit organisation which coordinates the provision of
library technologies in the higher education sector with the aim
of saving costs and facilitating resource-sharing
UNISON University librarians in the State of New South Wales
U21 Universitas 21
WAGUL Western Australian Group of University Libraries


Council of Australian University Librarians—members
ACU Australian Catholic University
Adelaide University of Adelaide
ADFA Australian Defence Force Academy
ANU Australian National University
Ballarat University of Ballarat
BOND Bond University
Canberra University of Canberra
CQU Central Queensland University
CSU Charles Sturt University
Curtin Curtin University of Technology
Deakin Deakin University
ECU Edith Cowan University
Flinders Flinders University of South Australia
Griffith Griffith University
JCU James Cook University
La Trobe La Trobe University
Macquarie Macquarie University
Melbourne University of Melbourne
Monash Monash University
Murdoch Murdoch University
Newcastle University of Newcastle
NTU Northern Territory University
Queensland University of Queensland
QUT Queensland University of Technology
RMIT Royal Melbourne Institute of Technology
SCU Southern Cross University
Swinburne Swinburne University of Technology
Sydney University of Sydney
Tasmania University of Tasmania
UNE University of New England
UOW University of Wollongong
UniSA University of South Australia
USQ University of Southern Queensland
UTS University of Technology, Sydney
UWA University of Western Australia
UWS University of Western Sydney
VUT Victoria University of Technology


Executive summary
The project ‘Best Practice for Australian University Libraries' has investigated
current ‘best practice' activities within Australian academic libraries and
compared these with those in selected overseas countries. Within the context
of this report, ‘best practice activities' are considered to encompass the
implementation of quality frameworks, and the use of benchmarking and
performance measurement as tools for the continuous improvement of
products, processes and services. Staff competencies required for the effective
application of these frameworks and tools were also investigated.

Through a combination of surveys, site visits and an extensive review of
Australian and overseas literature on the topic, the project team has been able
to evaluate the extent of implementation of quality frameworks in Australian
academic libraries, and the degree and range of use of quality management
tools such as benchmarking and performance measurement.

Information for the report was gathered from a number of sources,
which included:
• an extensive review of the published literature on the topics of quality/best
practice, benchmarking and performance measurement;
• findings from three surveys conducted on the aforementioned topics;
• site visits to institutions identified through the survey responses as having
best practice frameworks in place; and
• site visits to overseas academic libraries in the United Kingdom, United
States of America and Europe in 1998 as part of an earlier and separate
exercise undertaken by one of the project team from the University of
New South Wales Library.

The survey population included all Australian (38) and New Zealand (6)
university libraries, and the major Australian non-academic research
libraries (State Libraries, National Library of Australia, CSIRO).

The project identifies a number of strategies for the pursuit of quality/best
practice in Australian academic libraries, through a review of the literature,
discussion of the survey findings and the inclusion of Australian and
overseas case studies. The report also evaluates currently available
methodologies for library benchmarking, academic library performance
indicators, the applicability of quality management principles to academic
library management, and the application and usefulness of library
staff competencies.


To enhance the practical value of the project, guidelines for the application of
best practice will be developed in the form of a practical manual which
Australian academic libraries can use to assist them in implementing ‘best
practice' initiatives. A number of recommendations, formulated as a result of
this investigation, are referred to the Council of Australian University
Librarians (CAUL), DETYA and the Australian higher education community.


1 Introduction

1.1 Origin and background of the investigation


Today many Australian academic libraries are actively involved in the
implementation of quality frameworks and are utilising quality management
tools such as benchmarking and performance measurement. However, much
of what is happening within these libraries is neither well known nor
communicated outside the institution. There is generally a lack of published
literature within the Australian scholarly journals on these topics, with the
result that libraries may be duplicating effort in the development of
performance indicators, undertaking projects in isolation unaware of work
already being done elsewhere, or being deterred from action by the
considerable task of investigating what is available and appropriate for the
institutional situation. Time is at a premium within university libraries and
there is a perception that such work is all too hard to be worth the
effort. There are obvious benefits in a greater sharing of experience and
ideas, particularly for the smaller, regional and often less generously
funded institutions.

The Council of Australian University Librarians (CAUL) has been concerned
to facilitate access by Australian university libraries to information which
would assist them with the implementation of best practice initiatives. The
CAUL Executive therefore developed a proposal to the Department of
Education, Training and Youth Affairs (DETYA) for funding to carry out an
investigation into such areas. This proposal was accepted, and resulted in the
conduct of this project.

1.2 Project description


The project has investigated current examples of ‘best practice’ in Australian
academic libraries. The investigation has encompassed an examination of the
selection and implementation of quality frameworks, and the extent of
benchmarking and performance measurement activity. Comparisons with
selected overseas institutions in the form of case studies have also been
included. Through the data gathered, an analysis of best practice activity in
Australia and comparable overseas countries has been possible. A number of
key issues have been addressed:


• The importance of, and methodologies for, library benchmarking;
• The identification of library activities appropriate for benchmarking;
• The development and use of academic library performance indicators,
including the continuing applicability of the current CAUL performance
indicators;
• The applicability of total quality management (TQM) and quality
management processes to academic library management; and
• The application and usefulness of library staff competencies.

1.3 Terminology
One of the challenges of the project investigation has been achieving
consensus on terminology. There are no universally accepted definitions for
best practice, benchmarking and performance measurement. However, it was
considered important that terminology be agreed at the outset to ensure that
survey respondents were all working with the same definitions. The literature
also attests to the difficulty of reaching agreement on terms, but stresses the
importance of finding and using an accepted/endorsed definition. The
problem surrounding the identification of a definition of benchmarking
highlights the whole problematic area of ‘quality’ terminology. Garrod and
Kinnell state ‘Many LIS managers balk at the use of jargon and view it as
cant. It is often regarded as the way in which professed ‘experts’ describe
what is basic common sense and good practice. Jargon tends to alienate the
uninitiated by combining the elements of exclusion and exclusivity. There is
therefore a need to find vocabulary to describe ‘quality’ methods and
practices that is acceptable to the majority of the LIS profession’ (1997,
p. 115). Experience has shown, both within Australia and internationally, that
once terms become more widely used and applied in a practical sense, they
become more acceptable.

1.4 Research methodology


The information required to address the key issues was gathered from a
number of sources.

1.4.1 Literature review


An extensive review of both Australian and overseas case studies and
literature published on the topics of best practice, quality management,
benchmarking and performance measurement was undertaken.


The references included in the literature review were gathered from national
and international databases, the Internet, contacts with colleagues, visits both
within Australia and overseas, email discussion lists and other appropriate
sources. (Please note, URLs cited in the Literature Review, Useful Sources list
and Bibliography may change over time. They were last verified in June 1999).

In reviewing this material, emphasis has been on providing some insight into
the current level of activity of academic and research libraries involved in the
development and implementation of quality frameworks and tools, both
within Australia and internationally. For this reason, the focus for the
material reviewed and included in both the full Bibliography and Useful
Source List has been, with the exception of some key resources, on material
published within the literature over the last five years.

1.4.2 Surveys
Three surveys were developed and distributed to all Australian and
New Zealand university libraries and the major Australian non-academic
research libraries. The surveys aimed to identify best practice activities being
undertaken within this sector, and focused on the identification of quality,
benchmarking and performance measurement implementation, and
self-identified areas of best practice.

1.4.3 Site visits


On the basis of the survey findings, a series of site visits was undertaken to
those sites identified as exemplars of quality, performance measurement
implementation and best practice activity. The visits allowed for the
exploration and clarification of issues raised in the survey findings, and
enabled the documentation of best practice activities within these libraries
which may either be transferable to other sites or appropriate for inclusion in
a best practice Handbook.

As a result of an unrelated institutional initiative of the University of
New South Wales, a member of the project team undertook a series of visits
to a number of overseas sites in the United Kingdom, Europe and the
United States in 1998. Through this exercise, information on existing
initiatives including the development and use of academic library
performance indicators (ISO, IFLA, EU funded projects), library benchmarking
(U21, CHEMS) and the applicability of quality assurance processes to
academic library management was fed into this project. This provided an
up-to-date summary which mirrored the Australian investigations. International
comparisons have been presented in the form of case studies in each of the
following chapters.


1.5 Results
From the investigations described above, the report has been able to:
• provide a detailed evaluation of currently available methodologies for
library benchmarking;
• analyse the effectiveness of the current CAUL performance indicators with
recommendations for the amendment and extension of these indicators.
(A report on the matter will be submitted to CAUL later in 1999); and
• demonstrate the applicability of TQM, quality assurance processes and
quality management frameworks to academic library management.

From the overseas case studies and the survey findings, it is evident that
Australian and New Zealand university libraries are pursuing similar initiatives
to overseas libraries, and implementing in many cases similar frameworks and
programs. Both sides of the globe could benefit from a greater sharing of
these experiences. The project team will present a paper on the project and its
outcomes at the 3rd International Conference on Performance Measurement in
Libraries and Information Services, in England in August 1999.


2 Literature review

2.1 Overview
In undertaking a review of the literature related to the key focus areas of this
project, the project team has concentrated on locating material to provide
contextual information and background. Information has also been identified
which examines the relationship between theory and practice in the areas
of university library best practice, quality management, benchmarking,
performance indicators and measures, and the use of competency standards.
Where possible, appropriate electronic resources are included.

In addition, the team also aimed to identify key material of benefit to those
requiring a concise introduction to the practical frameworks, resources and
the experiences of others in implementing best practice. The focus for the
material reviewed and included in both the full Bibliography and Useful
Source List (Appendix E) has therefore been, with the exception of some key
resources, on material published within the literature over the last five years.
In reviewing this material, emphasis has been placed on providing some
insight into the current level of activity. Material provided by those
involved in the development and implementation of quality frameworks and
tools, both within Australia and internationally, has also been included.

Those wishing to learn more of the historical background to the development
of quality systems and processes within Australian and overseas university
libraries will find some context in the chapters following. In addition, there
already exist a number of substantial reviews of the literature, which trace
the full development of these areas over the past thirty years.

Some of the most significant Australian items include the article by
Williamson and Exon (1996), which provides an overview of the quality
movement in Australian university libraries dating back to the late 1980s.
The use of TQM in libraries, both in Australia and overseas, is reviewed
in Cooper (1996), whilst Harman (1998) focuses on quality assurance
mechanisms for higher education, and their use as policy instruments by
Australian governments over the last decade. This overview includes an
outline of the national Australian quality assurance program adopted by the
Commonwealth government over the period 1993–1995, and subsequent
changes to this program by the current Coalition Government.

Internationally, some sources provide access to retrospective material. Garrod
and Kinnell (1995) compiled annotated entries on Total Quality Management,
Benchmarking, Quality Assurance, Quality Management, Business Process
Re-engineering, Performance Indicators/Measurement, Database quality
measurement, Quality systems, Self-assessment, SERVQUAL, Statistical Process
Control, and Training for quality. Sumsion (1996, 1997), in addition to
compiling a comprehensive matrix of performance indicators from major
sources to 1997, also provides an annotated listing of major reports,
frameworks and toolkits. The Poll and te Boekhorst IFLA manual of 1996
includes a comprehensive bibliography of sources. The Association of College
and Research Libraries has a list of Sources of Information on Performance and
Outcomes Assessment, prepared by its Standards and Accreditation
Committee, at http://www.ala.org/acrl/sacguid.html

Even such a selective review of the literature may omit useful sources. This
review, and the chapters which follow, provide additional references to key
theoretical papers, standards and frameworks. References cited within the
report are included in the full bibliography. The annotated material in the
Useful Source List, (Appendix E) is included to provide those interested with
additional content from a small number of sources of particular interest.

A very real difficulty with evaluating the literature in this field is the
interchangeable and, to the novice, potentially confusing use of the
terminology. Terms ‘are sometimes used almost as synonyms’ (Calvert
1995, p.438). Each source needs to be approached with caution to clarify
exactly what the author defines as ‘statistic’, ‘metric’, ‘benchmark’,
‘performance indicator’, ‘performance measure’, ‘performance management’,
‘best practice’, etc.

2.2 Benchmarking

2.2.1 Australasian sources


There is a small but growing amount of material related to benchmarking in
the Australian and overseas literature. An understanding of benchmarking
practice, and the basic processes involved, can be readily obtained from a
number of general introductory sources. In the Australian context, the
Australian Best Practice Demonstration Program aimed to encourage
Australian companies to adopt international best practice, to identify how it
might be effectively implemented, and to share the knowledge gained with
wider industry groups (Byrne 1995). Benchmarking was seen as an integral
part of the process of identifying excellence and using it to drive
improvement and, to facilitate this, a ‘Benchmarking Self-help Manual’ (1995)
for use by participating organisations was developed. This manual still
provides a useful step-by-step introduction to benchmarking for the novice
and, although its case studies focus on manufacturing processes, the relationship
to (for example) library technical services operations is clear.

Anne Evans' 1994 text ‘Benchmarking: Taking your Organisation Towards
Best Practice’ has also provided library managers with an authoritative and
readable guide to the basic principles and practice of benchmarking within
organisations, and introduces a five-step model that organisations can use to
integrate benchmarking into organisational practice. Building on this is the
useful Australian web site ‘Benchmarking in Australia’
(http://www.ozemail.com.au/~benchmrk/), the home of Benchmarking Plus,
Evans' consulting group, which provides links to a database of information
about best practice and benchmarking, with a focus on Australia and New
Zealand. A browse through this site provides some insight into the current
level of interest and participation in benchmarking across a number of
industry groups. Several articles are currently available for downloading from
this site, including ‘The Nuts and Bolts of Benchmarking’ (Evans & Coronal
1999)—a useful overview of benchmarking practice.

The other key Australian web site is that of the Australian Quality Council
(http://benchnet.com/aqc/), which provides access to information on
Australian benchmarking networks and also restricted access to the
benchmarking data repository Benchmarking Edge. University libraries are,
as yet, only small-scale participants in these activities.

That benchmarking is being actively considered and used as a quality tool
within the Australian university library sector is evidenced by the level of
involvement reported in survey responses and subsequent site visits
undertaken during the course of this project. Both individual libraries and
consortium bodies such as the Western Australian Group of University
Libraries (WAGUL) and the national Council of Australian University Librarians
(CAUL) have initiated recent and ongoing projects.

Some of these activities have been reported in both the print literature
(Robertson & Trahn 1997; Neumann 1996) and electronically via the World
Wide Web (Curtin cataloguing project: http://www.unilinc.edu.au/curtin.html;
NTU acquisitions, cataloguing and processing project:
http://www.ntu.edu.au/library/bench1.htm).

One of the most comprehensive reports available in the Australian literature
to date describes a project undertaken by Queensland University of
Technology (QUT) and the University of New South Wales (UNSW) during
1995/1996. Initiated by QUT, the project focussed on several areas broadly
defined as technical services, document delivery and research support
services (Robertson & Trahn 1997). This case study provides a useful insight
into the development, methodology, implementation and outcomes of a major
benchmarking exercise from the perspective of both the benchmarker (QUT)
and the partner library (UNSW).

What is evident in this report and others is that, in order to be successful
and to ensure positive outcomes for all partner libraries, benchmarking must
be approached with some insight into the potential pitfalls and problems that
may arise during the course of the exercise. A number of sources (Evans
1994; Evans & Coronal 1999; Robertson & Trahn 1997) outline some of these,
including the need to ensure agreed outcomes for all partners, participative
training and awareness for all staff involved, the need for benchmarking to sit
within a broader framework of quality management and improvement, and
the choice of an appropriate partner. A more comprehensive view of the
potential difficulties involved in benchmarking library services, based on the
experiences of several benchmarking projects involving UNSW Library, is
outlined in Trahn (1998).

The current (1999) activities of select working parties, in consultation with
the McKinnon Consulting group, to produce concise benchmarks for
Australian university functions across the spectrum are being watched with
great interest. A number of these benchmarks will be applicable to library
and information services, and any publicly available report or literature will
become required reading.

An obvious gap in the literature reporting benchmarking experiences within
Australian university libraries relates to many of the smaller institutions within
the sector. Surveys undertaken as part of this project indicate, however, that
the sector. Surveys undertaken as part of this project indicate, however, that
there is a greater involvement than evidenced in the literature, and that many
smaller libraries are keen to utilise benchmarking as part of wider quality and
continuous improvement programs. It is a characteristic of benchmarking that
much useful data resides on University Library intranets and in filing cabinets.
If we are to see real energy within this area it may be that some of these
experiences need to be shared in the literature, in order to reassure others
that successful benchmarking exercises, leading to positive improvements, can
be achieved with limited resources.

2.2.2 International sources


Benchmarking activity in North America beyond the university sector is
dominated by large commercial organisations, which focus on broad-picture
industry metrics gathered by electronic surveys, rather than the detailed
process benchmarking to improve specific practices at which Australian
activity tends to aim. Examples of large industry metrics, focussing
on organisation and resource allocation benchmarking, can be seen at the
Benchmarking Exchange (TBE) web site at http://www.benchmarking.org/.
It is interesting to note the large number of Australian organisations registered
as members with the associated electronic benchmarking network of this US
resource. Few of these topics, or the survey approaches, are specifically
applicable to the university context, or to library and information services.
The definitive commercial handbook on benchmarking was written by Robert
Camp (1989), currently of The Best Practice Institute, who has also been
associated with TBE activities in recent times.

As in other countries, it seems to be the special libraries, with their small size
and frequent organisational links with large commercial organisations, that
have undertaken useful work in the area. The Canadian Health Libraries
Association/Association des bibliotheques de la sante du Canada (CHLA/ABSC)
1998 Benchmarking Task Force Benchmarking Tool Kit is a model of
practicality, and one which will assist these mainly small libraries
to benefit from benchmarking. The U.S. Special Libraries Association also
has an active interest in benchmarking and useful publications on its
web site.

US academic libraries, at least the larger research libraries, are part of
extremely well endowed organisations with strong traditions of individual
independence. Concerns about the outcomes of any disclosure of their
internal operations have meant that these libraries have not been active
participants in benchmarking exercises to date. Changing economic circumstances may
alter this outlook. This said, the medium term outcomes of the ARL Access
and Technology Program/ILL/DD Related Resources Measuring the
Performance of Interlibrary Loan and Document Delivery Services
http://www.arl.org/access/illdd/illss.shtml, which has included subsequent
symposiums and workshops on strategies to redesign ILL/DD services
using the identified characteristics of low cost, high performing ILL
operations, may be of international significance. No other library process
improvement exercise has worked from such a large databank of performance
data. If the outcomes of this project include widespread adoption of best
practice, as one of the stated purposes indicates, then this national exercise
between university libraries will have been of global importance.

Whilst the overall focus of the National Association of College and University
Business Officers Benchmark Program is necessarily on the full range of
university administrative functions, amongst their List of Benchmarks and
Processes, located at http://www.nacubo.org/website/benchmarking/
program.html, is information under the heading Library. This is one of the
rare sources of publication on benchmarking activities in North American
universities. Many US library administrators see the NACUBO approach as too
costly, or not appropriate for them. However, a methodology has been tested
over a number of years, and the comprehensive range of university
administrative functions analysed within the same approach could provide
some interesting perspectives.

For snapshots of the state of the art in 1995 and 1997 in all the areas
covered by this report, including benchmarking, albeit with a European focus,
the best sources are the two published proceedings of the Northumbria
International Conferences on Performance Measurement in Libraries and
Information Services. Articles by Stephen Town in both proceedings to date
cover British developments in benchmarking. Town undertook, at the
Shrivenham campus of Cranfield University, the first major benchmarking
exercise within the UK university library scene in 1993, and his
publications since refer to that and subsequent exercises. The 1999
conference will include the first English language report of a noteworthy
benchmarking exercise in the Netherlands. This will be useful, as English
language reports on European activity are rare.

Penny Garrod and Margaret Kinnell Evans, of Loughborough University
Department of Information and Library Studies, have published a series of
articles (1996–1998) arising from their BLRD report of 1995 on benchmarking
best practice. In this, different models of benchmarking, and their suitability
to different types of libraries, were discussed. Demonstrator projects were set
up, focusing on the academic and special library sectors. Even the most
recent (1998) monograph, edited by Brockman, and the Garrod and Evans
article seem to indicate that benchmarking is actually not yet widely
understood or practised in the UK higher education sector.

Some of the extensive benchmarking activities of the Standing Conference of
National and University Librarians (SCONUL) Benchmarking Group, chaired
by Stephen Town, will be reported on at the third Northumbria Conference
on Performance Measures in Libraries in August 1999. Benchmarking in the
UK is dominated by the new universities. Most British academic libraries, it
would seem, continue to rely on "informal" benchmarking-related activities.
A benchmarking handbook in relation to this round of activities is also
planned by SCONUL for 1999.

University libraries and information services around the world have recently
(1998) participated in an Association of Commonwealth Universities
University Management Benchmarking Club exercise, which examined
libraries using a specifically developed framework. Detailed results are not
publicly available, but some information exists about the CHEMS approach on
the ACU web site. The EFQM framework mentioned in the quality framework
sections of this report forms a part of the CHEMS approach. See
http://www.acu.ac.uk/chems/benchmark.html and Fielden (1995).

2.3 Performance indicators and measurement

2.3.1 Australasian sources


The history of performance indicator use and development within the
Australian university library sector really dates back to the development of
statistics contributed to Australian Academic and Research Libraries (AARL)
from 1953, published as an annual supplement to the journal of the same
name. These statistics have since been used as a benchmark of service
provision by libraries within Australia and New Zealand. Despite the relative
abundance of research into the development of specific indicators for the
sector since, the AARL statistics, like those collected by the Association of
Research Libraries in the US and HECFE in Britain, have been slow to
develop into a group of universally accepted indicators.

Taking the AARL statistics as a starting point, an index and description of the
performance indicators described in the literature published up until the end
of 1995 was compiled into ‘Performance Indicators Database: Selecting the
Best Performance Indicators for your Library’ (Exon & Williamson 1996). This
database includes indicators suitable for measurement of the performance of
libraries, and incorporates a spreadsheet describing each indicator, a
reference to the source(s) of the indicator, a statement of how it is calculated,
and an index of applications for which it might be used. Whilst not
specifically focussing on indicators developed within Australia, it does
provide a useful insight into the level of development in this area up until
this time. The Exon paper in the first Northumbria Conference proceedings in
1995, Developing Performance Indicators for an Australian University Library,
covered the same territory.

Out of the 1993–1995 Australian university quality audits (see below,
Quality/best practice) there developed an increased interest in the use of
performance indicators as a measure of both the effectiveness and quality of
Australian university library services. At this time the Council of Australian
University Librarians undertook a project to develop a suite of indicators that
would have relevance across the sector (Byrne 1997). Three indicators were
developed—the Library/Clientele Congruence Indicator (Gorman & Cornish
1995), Document Delivery Quality Indicator (Robertson 1995), and Materials
Availability Indicator (Taylor 1995). These indicators have been applied in a
number of Australian university libraries. A 1996 survey identified the level of
usage of indicators, suggested improvements to existing indicators and
potential areas for development of additional indicators. This survey was
updated in 1998/1999 as part of this project and provides a useful summary of
the development and use of performance indicators and related measures
across the sector. The 1996 survey results are available on the CAUL website
and the 1999 survey results are due to be posted there mid 1999
(http://www.anu.edu.au/caul/).
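As a purely illustrative sketch (the function name and sample figures below are hypothetical, and the published CAUL methodology defines its own data elements and sampling rules), an availability-style indicator of the kind described above reduces to a simple ratio computed from a sample of user searches:

```python
def availability_rate(items_sought: int, items_found: int) -> float:
    """Proportion of user-sought items immediately available (illustrative only)."""
    if items_sought <= 0:
        raise ValueError("the sample must contain at least one sought item")
    return items_found / items_sought

# Hypothetical survey sample: 400 titles sought at the shelf, 322 found.
rate = availability_rate(400, 322)
print(f"Materials availability: {rate:.1%}")  # Materials availability: 80.5%
```

A real application would also need to specify the sampling period and what counts as 'found', which is precisely what formal indicator definitions standardise.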

A key area for development of specific indicators identified by survey
respondents is that of university library reference services. The most recent
contribution to this area comes from the work of the CAVAL Reference
Interest Group (CRIG) Working Party on Performance Measures for Reference
Services. Established in 1994, the Working Party has now produced three
reports, the final one consolidating the previous work done to identify
performance measures and indicators currently used to evaluate reference
services in Victorian academic libraries. The ‘Final Report’ (CAVAL Reference
Interest Group 1999) outlines a model of reference service effectiveness
derived from the data collected from user groups, and identifies twelve key
performance indicators used by library staff and customers. Three dimensions
of service evaluation are described—Attributes, Support, Knowledge (ASK
model). Although it is too early to evaluate the usefulness of this model, interest
in its potential application is likely to be high.

The search for adequate measures of library effectiveness has been carried
out extensively in New Zealand by Calvert and Cullen, and will be referred to
in the following chapters. Of the many articles and reports published
describing their work, ‘Stakeholder perceptions of University Library
Effectiveness’ (1995) provides a good overview of the research methodology,
and outlines subsequent indicators developed from responses to the study
questionnaire (based on a literature search of previously identified indicators).
Within this study, libraries are viewed as social agencies that must be
responsive to the needs and wishes of various constituencies. The authors
suggest that in the process of analysing library effectiveness, it is necessary to
determine not only what was done but also which of these tasks the library
was supposed to do. Calvert has also been active in the area of service
quality measurement and has applied the service quality framework
developed by Hernon and Altman (1996) to a study focussed on ‘moving
theory into practice’, with the adoption and testing of the framework in seven
university libraries in New Zealand (Calvert 1997; 1999).

There is to date a large gap in the Australian literature on the identification
and development of performance indicators for the electronic library. Most
interest has been focussed on the international work of Charles McClure,
Peter Brophy and EQUINOX, and its related and ancestor projects described
below. The inherent difficulties associated with measuring electronic resource
use and delivery, in particular web based resources, provides a challenge for
library managers. A useful outline of the key challenges, together with a
summary of key indicators developed at the University of New South Wales to
measure database and web use, is provided by D’Avigdor, Cargnelutti et al.
(1996/7/8), and is one of the few local examples in this area
(http://www.library.unsw.edu.au/~eirg/eirg.html).

2.3.2 International sources


In searching for the past and current state of performance measurement for
North American research libraries, the Association of Research Libraries web
site is a key source, especially the performance measures, special projects
and statistical database areas (http://www.arl.org/stats/perfmeas/). Historical
sources can be found in the Sources of Information on Performance and
Outcome Assessment on this site. There are also links to summary information
on SERVQUAL as a mechanism for assessing service quality, and to the
Wisconsin-Ohio Reference Evaluation program, a quasi-standard in relation to
reference service assessment. All of these documents are useful.

Whilst the impressive ARL statistics site provides longitudinal data and a
sophisticated ability to manipulate the data for customised reports, Jewell
(1998) at http://www.arl.org/stats/specproj/etrends.htm summarises the results
of an ARL exploration of possible financial indicators for electronic resources
and services. This was a pilot project attempting to move the statistics
database into the electronic age. In the tradition of the ARL statistics, the
initial focus is on resource allocation in the new arena.

For electronic services indicators, the seminal paper remains Assessing the
Academic Networked Environment by Charles McClure and Cynthia Lopata
(1996), http://istweb.syr.edu/~mcclure/network/toc.html. The Coalition for
Networked Information project of the same name is also useful for an outline
of reports on a series of activities in a number of institutions attempting to
put into use some of the indicators suggested by McClure. Virginia State
University, Brown University and University of Washington are among the
participants. (http://www.cni.org/projects/assessing/reports/)

McClure is also involved in two current U.S. projects of potential
interest. One, co-directed with John Bertot, has just released a number
of working papers on Developing National Public Library and Statewide
Network Electronic Performance Measures and Statistics at
http://www.albany.edu/~imlsstat/. Another is Performance Measures for
Federal Agency Websites, by Sprehe and McClure.

Older material in the information technology area specifically related to
university campuses includes Self assessment for Campus Information
Technology Services, CAUSE Professional Paper No. 12
(http://cause-www.colorado.edu/information-resources/ir-library/abstracts/pub3012.html),
and the HEIR Alliance Evaluation Guidelines for Institutional Information Resources
(http://cause-www.colorado.edu/information-resources/ir-library/abstracts/hei2000.html).
The HEIRA guidelines are particularly
important, as they have been endorsed by ARL, CAUSE and EDUCOM and
thus have the joint imprimatur of the library and IT organisations. There are
also sample data collection forms and references to IT surveys, including a
link to the Indiana University Information Technology Services IUTS/IUPUI
annual survey summary 1998 at http://www.indiana.edu/~uitssur/survey/
iupui_summary98.html. A related HEIR Alliance document which outlines
what amounts to a picture of a ‘best practice’ or leading edge university
IT environment, An Example of the Information Technology Environment
at an information-resource-intensive institution is available at
http://www.educause.edu/collab/heirapapers/hei1061.html.

A range of articles and editorials on performance measures has appeared
regularly over the past few years in the US publication Journal of Academic
Librarianship, including articles by Altman (1998) and Hernon (1999), and a
browse of recent issues will yield a dozen articles worth exploring.

One area in which the US is taking a lead is in the digitisation of elements of
the huge US research library collections. Having committed large resources to
such projects, some libraries are articulating performance measures in relation
not just to the technical development of the project process, but to the quality
of the resources and their methods of delivery to the user, as seen by those
users. One impressive evaluation exercise is contained in the report of analytic
principles and design of the Online Books Evaluation Project of the Columbia
University Digital Libraries Collections, at http://www.columbia.edu/cu/
libraries/digital/texts/protocol/index.html.

The single most useful site for up to date information on current European
work is the new EQUINOX Library Performance Measurement and Quality
Management System site at http://equinox.dcu.ie/, which brings together
unfinished aspects of earlier European Union projects, including CAMILE,
which was to publicise the outcomes of EQLIPSE, MINSTREL, DECIMAL and
DECIDE. There are direct links to each project.

The newly posted list of potential performance indicators for electronic
library services is also essential reading. On this site there are also links to the
MIEL2 final report at http://www.ukoln.ac.uk/dlis/models/studies/mis/mis.rtf.
The MIEL2 Final Report ranges across possible sources of appropriate
management information for the electronic library which can be used for
planning, evaluation and day-to-day management. MIEL 3, 4 and 5 are due to
report in 2000. Professor Peter Brophy has an involvement with most of these
projects and has also given an excellent progress summary paper in regard to
performance indicators for electronic services at the 1998 VALA Conference in
Melbourne. ELINOR software development (De Montfort University, at
http://www.iielr.dmu.ac.uk/Research/index.html) focuses on usage of
electronic resources, with further development promised in close liaison with
the implementation of the Effective Academic Library (see below).

The electronic age in libraries has brought new areas into consideration.
Measuring quality in databases, electronic sources and services projects, web
sites and so on has emerged as new roles and mechanisms become
important. The Centre for Information Quality Management, originally seeded
by the British Library Association and On-Line Users Group, has developed a
useful role over the past few years
(http://www.la-hq.org.uk/liaison/ciqm/ciqm.html). An article by Armstrong
on Metadata, PICS and Quality, which gives an indication of current work in
this area, can be found at http://www.ariadne.ac.uk/issue9/pics.

A very useful summary of international work up to 1995, and a single source
of detailed information on indicators, is the Sumsion Matrices of Performance
Indicators at http://www.staff.dmu.ac.uk/~camile/matrices/intro.htm
(Commissioned as part of the DECIDE project).

There are a number of key works which serve as a starting point for much
current work. In Britain, Brophy and Sumsion draw on the Joint Funding
Councils' Ad-hoc Group on Performance Indicators for Libraries publication,
The Effective Academic Library (EAL), 1995. Another refinement of this work
appeared in 1998 in the Cranfield project, reported as Academic library
effectiveness: a comparative approach by Jane Barton and John Blagden. Their
listing of key management statistics appropriate for university administrators is
also worth examination.

All librarians should refer to the 1996 International Standards Organisation
ISO/DIS 11620, Information and documentation—Library performance
indicators. The indicators are conservative but tested, in common use and
applicable in most libraries.
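Many of the ISO/DIS 11620 indicators are ratios of this kind. As a hedged illustration only (the standard itself specifies the exact definitions, data elements and collection methods; the function name and figures below are invented), a turnover-style indicator might be computed as:

```python
def collection_turnover(annual_loans: int, documents_held: int) -> float:
    """Loans per document held over a year (illustrative ratio indicator)."""
    if documents_held <= 0:
        raise ValueError("collection size must be positive")
    return annual_loans / documents_held

# Invented figures: 180,000 loans against a collection of 600,000 documents.
print(round(collection_turnover(180_000, 600_000), 2))  # 0.3
```

The value of the standard lies less in the arithmetic than in fixing what counts as a 'loan' and a 'document', so that figures are comparable between libraries.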

University library staff also need to refer to Poll and te Boekhorst’s 1996 IFLA
publication Measuring Quality: international guidelines for performance
measurement in academic libraries. With their international authority and
thorough approach to definition and methods of compilation, both
publications are key references.

Some international cross-pollination is in evidence in the Pickering and
Crawford BLRD report on The stakeholder approach to the construction of
performance measures, in which fifteen British academic libraries used a
Calvert and Cullen-inspired method, in the tradition of Van House, to design
and administer a survey to ten stakeholder groups. Comparisons with Follett,
the Effective Academic Library, ISO and IFLA publications are included. The
report recommends that the stakeholder approach, based on interview and
focus groups, should be used in relation to measurement of electronic
services and in future information planning strategy.

As with each of the areas discussed in this project, there are papers in both
proceedings of the Northumbria International Conference on Performance
Measurement in Libraries and Information Services (1995 and 1997) in relation
to:
• Performance measures and indicators;
• Qualitative measurement;
• Electronic/digital library measurement; and
• Managing information services (the role of performance measurement in
changing styles, structures and procedures).

The Willemse 1997 survey report on Performance Assessment in IFLA and
United Kingdom Academic Libraries found that only a minority of European
libraries were undertaking and disseminating evaluative activities in relation to
their services.

2.4 Quality/best practice

2.4.1 Australasian sources


Only within the last five years has the Australian literature started to reflect a
level of interest in, and implementation of, best practice activity in university
libraries. As outlined by Williamson and Exon (1996), much of the impetus to
explore quality frameworks and tools within university libraries has come
from the 1993-1995 Quality Audits undertaken by the then Commonwealth
Labour Government. In seeking to evaluate and subsequently reward the
level of quality assurance processes in place in Australian Higher Education
institutions at the time, the government provided university libraries with the
necessary environment to promote and develop quality management systems
and frameworks.

Australian university librarians were, however, aware of, and actively
applying, the concepts of TQM to the management of libraries prior to this
period, as outlined in Cooper (1996) and by Williamson and Exon (1996).
The latter's 1994 survey, conducted after the second round of university
quality audits, sought university librarians’ ‘perceptions of, and participation
in, the quality audit process’ (p. 526). The results of this survey indicated that
a significant number of university libraries were involved in the development
of quality assurance processes and management, both during, and prior to,
this period, and this activity has provided a base upon which subsequent
progress can now be measured.

The Australian Best Practice Demonstration Program, which operated prior to,
and during, the quality audit period referred to above, was successfully used
by the Northern Territory University Library to ‘improve research information
services from a client perspective’ (Byrne 1995, p. 17). A number of specific
projects were developed within the broader program. NTU library was one
of only a few service organisations to adopt and work with guidelines
largely aimed at commercial operations, and their experience remains a
leading case study in the adoption of best practice and quality improvement
processes and programs within the Australian academic library sector.
A comprehensive report of their experience (Wilson and Byrne 1996) outlines
the program background, guidelines, principles and methodology, and
offers some lessons for others. The final report of this program is available
at (http://www.ntu.edu.au/admin/isd/qsdc/bppage.htm).

Other examples in the literature of quality management programs in practice
include an outline of Monash University Library's introduction of TQM and
Quality Assurance programs (Groenewegan 1996), useful for its honesty in
documenting the problems encountered, and the implementation of Total
Quality Service (TQS) at Victoria University of Technology (Parker 1996), a
project based on the work of Karl Albrecht and extended by external
consultants. The University of Melbourne is actively adopting the Australian
Business Excellence Framework (Phillips 1998; Presser and Garner 1999). The
‘quality journey’ undertaken by the University of Wollongong Library has seen
the successful adoption of the Australian Business Excellence Framework
(Australian Quality Council 1998), and a subsequent Achievement in Business
Excellence Award from the Australian Quality Council (McGregor 1997). The
University of Wollongong’s experiences are further explored in Chapter 5 of
this report.

2.4.2 International sources


The overseas literature reflects an international pattern, which places North
America more in the TQM/CQI tradition, and in which the UK has had a
number of government sponsored alternative frameworks available to give
process improvement a coherent setting. Use of the ISO9000 series has been
localised but interesting, and the current international trend is towards a real
role for adaptations of national quality award frameworks, sometimes
customised for public services, education and training.


The most prolific North American proponent of TQM in relation to libraries is
St Clair, who has published a range of material over the past five years.
St Clair visited Australia a few years ago, giving seminars in Canberra
and Melbourne.

A number of US academic libraries became participants in institution-wide
quality programs launched a few years ago, based on TQM concepts.
Examples of such programs still in existence include Excellence 21 at Purdue
University at http://thorplus.lib.purdue.edu/ex21/ and the M-Quality program
at University of Michigan. Other academic libraries such as the University of
Arizona have retained a strong emphasis on teams as the approach to
Continuous Quality Improvement (http://www.library.arizona.edu/library/
teams/prf/measurements.html). All of these developments are mentioned in a
summary source, the Summer 1993 Special Libraries issue Benchmarking,
Total Quality Management and the Learning Organisation.

The interest of US academic libraries in the national quality awards
framework of the Baldrige criteria has received a boost in the last couple
of years with the publication and promotion by the US government of a
Baldrige framework tailored for educational organisations.
universities, where interest in the TQM/CQI approach has stalled, are
considering using this framework to improve internal processes.

The simplest ‘how to’ source still in print in relation to implementing
ISO9000, in a special library context, is the Ellis and Norton 1993 ASLIB
publication Implementing BS5750/ISO9000 in Libraries.

A more recent and comprehensive publication from ASLIB (1996), by Brophy
and Coulling, is Quality management for information and library managers,
which covers an introduction to quality frameworks including ISO9000,
performance measurement from the customer’s perspective, national quality
awards and TQM, as well as the British Citizen’s Charter approach.
Benchmarking and performance measurement are dealt with as tools which
contribute towards TQM.

British universities have utilised a range of quality
frameworks, a number of which have the imprimatur of the government, such
as the Charter Mark in the service arena, and IiP for human resource practice.
The British human resources focused framework Investors in People in
Higher Education http://www.lboro.ac.uk/service/sd/iipihe/iipinh1.1htm has
excited some interest beyond Britain. A number of British university libraries,
including the Universities of Central Lancashire and Wolverhampton, have
already had their library human resources practices assessed against this
framework. A link at http://www.lboro.ac.uk/service/sd/iipinhe/matrix.html
leads to documentation used by the Pilkington Library at Loughborough
University for an internal assessment using the IiP framework. The site
includes useful links to the IiP home page.


The range of self-assessment frameworks deemed appropriate
for UK public libraries is investigated by the Assessment Tools for Quality
Management in Public Libraries Project (1998-1999), by Jones and
Usherwood. These included:
1. The Democratic Approach (from the UK Institute of Public Policy Research
1991)
2. The Quality Framework (from the UK Local Government
Management Board, 1989)
3. The Business Excellence Model (from the European Foundation for
Quality Management 1997) http://www.efqm.org/.
The project report will be posted at http://info.lboro.ac.uk/departments/
dils/research/qualman.htm. Customised frameworks and a tool kit are due to
be issued during 1999.

A number of university libraries in Europe are currently working to adapt the
Business Excellence Model framework for quality management purposes in
their own libraries. These include the Copenhagen Business School and
libraries in the Netherlands.


3 Benchmarking

3.1 Terminology
Library benchmarking has been described as ‘a friendly competitive
intelligence activity’ (Gohlke, 1997, p. 22). In trying to provide a universally
acceptable definition of benchmarking it is useful to describe its
characteristics. Although there are different types of
benchmarking and various models or approaches have been tried and tested,
a general consensus as to what benchmarking is, and what it involves, has
gradually emerged. The language which is used may vary but the principles
are the same:
• A structured or systematic approach to finding improvements and
implementing best practice;
• A continuous process of measuring products, services and practices
against leaders;
• A focus on processes (individual processes, which are deemed vital to
customer satisfaction, are suitable choices for benchmarking programmes);
• An emphasis on learning. Benchmarking should not be regarded simply as
a comparative exercise, or be totally results oriented (Garrod & Kinnell,
1996, pp. 142–143); and
• A foundation of sound measurement and comparison.

Benchmarking involves examining current services, identifying inefficient
practices and processes, and learning from those who have achieved
success. It is ‘a formal process of measuring and comparing an existing
process, product or service against that of recognised top performers’
(Allen, 1993, p. 123).

3.2 Methodologies for library benchmarking


To implement benchmarking successfully, a well-structured methodology
should be followed. The benchmarking process is normally documented as a
series of steps which may range from six to twelve depending on the way
each step is described and the level of detail included. Whilst language may
vary from one text to another, the following components have been distilled
from the literature as describing a standard methodology to be applied to a
benchmarking project:


• recognise the need for change, gain commitment and set the scope;
• identify process to be benchmarked (subject) and how the process will
be performed (approach);
• select team and train members;
• analyse own processes within the broad area already defined;
• define and understand the process to be benchmarked;
– identify measures and collect process data;
• establish (call for) benchmarking partner(s);
• seek background information and process data;
• analyse and compare data against own internal process;
• finalise partner(s);
• conduct visits;
• analyse results;
– calculate measures and define within partner organisations, practices
in use in own organisation;
– compare values and identify differences;
– quantify effect of difference in practices and measures between
own organisation and partners;
– relate quantifiable differences to the practices employed and
determine which are significant to the goal of improving the
benchmarking process;
• develop action plans;
– determine cost effective means of achieving desired improvement
in benchmarked process and produce plan to be used to implement
the improvement;
• implement and monitor;
– put action plan to work and improve process;
– measure the improvement and identify causes, if any, for differences
between expected level of improvement and level attained; and
• benchmark again if necessary.
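The quantitative core of these steps, comparing one's own process measures against partners' and flagging differences large enough to act on, can be sketched in code. This is an illustrative sketch only: the measures chosen (cost per item, turnaround days), the partner data and the 10 per cent significance threshold are hypothetical and are not drawn from the report.

```python
from dataclasses import dataclass


@dataclass
class ProcessMeasure:
    """Measures collected for one benchmarked process at one library."""
    library: str
    cost_per_item: float    # e.g. dollars per interlibrary loan transaction
    turnaround_days: float  # average days to complete the process


def benchmark_gaps(own, partners, threshold=0.10):
    """Compare own measures against the best partner value and flag
    proportional differences above `threshold` as worth investigating."""
    best_cost = min(p.cost_per_item for p in partners)
    best_days = min(p.turnaround_days for p in partners)
    ratios = {
        "cost_gap": (own.cost_per_item - best_cost) / best_cost,
        "turnaround_gap": (own.turnaround_days - best_days) / best_days,
    }
    significant = [name for name, gap in ratios.items() if gap > threshold]
    return {"ratios": ratios, "significant": significant}
```

Results of this kind feed the ‘analyse results’ and ‘develop action plans’ steps: only the gaps flagged as significant would normally justify the cost of a site visit or a process redesign.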

Variations have been observed in the approaches to benchmarking applied
within Australia, for example, by participants in programs under the auspices
of the Commonwealth Higher Education Management Services (CHEMS)
University Management Benchmarking Club, Universitas 21 and the Australian
Quality Council Benchmarking Network. All of these, however, aim to
achieve the rigorous and systematic approach which characterises formal
benchmarking. In these instances, it has been necessary to vary the standard
methodology because the exercises undertaken have usually been far broader
in scope and not limited solely to quantitative, process based activities. In
these exercises also, partner selection has, to a certain extent, been
predetermined by virtue of institutional membership of the organisations or
benchmarking network.

3.2.1 CHEMS
CHEMS is the management consultancy service of the Association of
Commonwealth Universities based in London. In 1996 CHEMS launched an
international ‘University Management Benchmarking Club’ for universities
primarily, but not exclusively, from the Commonwealth. This inheritance
explains the considerable level of Australian involvement in their activities.
The Club offers participating institutions the opportunity to compare their key
management processes with those of a range of comparable institutions.
Unlike most benchmarking initiatives, this Club focuses on the effectiveness
of university wide processes and not narrow departmental functions. It is
developmental in the sense that each year project members review the
methodology and refine it further (CHEMS, 1999).

The CHEMS methodology works as follows:


• Determination of topic by members;
• Development of a framework of open questions—each topic is divided
into five sub-topics, covering aspects such as policy and strategy,
implementation, monitoring and review, communication etc;
• Written submission provided by members responding to the questions and
highlighting any strengths or weaknesses on their part;
• Assessors distill the responses to ascertain the appropriateness of the
elements of the management processes adopted by each member
university, based solely on facts as declared by the university;
• Banding of performance of members by assessors against each sub-topic
using ‘approach, application and outcome analysis'—banding aims to
identify members who might be regarded as best in group;
• Workshop for members to discuss elements of good practice gleaned from
responses; and
• Distribution of final report including sections setting out the key features of
what members and assessors agreed to be best practice—each university
may self-assess against each best practice element—provides a guide
enabling members to make contact and, if appropriate, conduct further
benchmarking against best practitioners.


The Club believes this encapsulates true benchmarking: in the absence of
predetermined benchmarks, the aim is to establish benchmarks through the
process itself, and it has proved possible to derive agreed benchmarks which
can themselves be used in future to guide management in the quest for
continuous improvement (CHEMS 1999).

3.2.2 Universitas 21 methodology


Universitas 21 provides a model for a new kind of network of universities to
advance the central scholarly purposes of its member institutions and to
enrich the essential educational and research enterprises at the heart of the
idea of a modern university. It is a small, tight-knit association of kindred
universities, and provides a framework for benchmarking of performance and
quality of its members. Universitas 21 offers its members operational links
around the world, via a network that functions as a loosely-coupled system
through which the member universities will derive substantial benefits from
organised, targeted collaboration.

U21 does not espouse a particular benchmarking methodology, but rather,
through its membership, provides the potential for engaging in benchmarking
between partners of similar standing, size and activity. The Association will
provide international benchmarks relating to quality, funding levels, fee
requirements, research outcomes, student support services, infrastructure
provision, management efficiency and related matters for member institutions
(Universitas 21, 1999).

3.2.3 AQC benchmarking methodology


Through its benchmarking networks, the Australian Quality Council provides
forums for the exchange of ideas and processes. In 1999, the networks will
focus on various topics such as Leading, Managing Improvement and
Change, Senior Executive Leadership, Balanced Scorecard Performance
Measurement. Participants often include organisations that have been
recognised through the Australian Quality Awards framework, thus
providing an opportunity to benchmark performance with those considered
to be ‘best in class’. All benchmarking projects are structured using a standard
approach and observing the Australian Quality Council: Benchmarking
Code of Conduct. Results of benchmarking activities form part of the AQC
Members Benchmarking Edge internet site, which contains a databank of
leading edge benchmarking information concerning outstanding practices,
structured by business process and the Australian Quality Awards criteria
(http://www.benchnet.com/aqc/).


3.3 Background
An examination of the literature prior to 1994 reveals that little has been
published on benchmarking. However, benchmarking as a term has been
used for a long time in the context of establishing ‘benchmarks’. It is the
concept of a standard to measure against that has been retained in
applications in quality management. ‘Benchmarking techniques were applied
in the early eighties but were not well documented until the first fully
descriptive book about benchmarking was published by Camp in 1989.
Camp's book was the first attempt to provide focus and solidify the concept’
(Allen 1993, p. 123).

Up until the mid 1990s, library benchmarking activities tended to be informal
rather than formal. The intent behind these informal activities was usually to
exchange ideas on particular aspects of operations. Robertson and Trahn note
that libraries have a tradition of sharing information about inputs, outputs,
processes, practices and policies. Cooperation, not competition, has been a
strong ethos. Information has been exchanged informally at professional
meetings, visits to other institutions, study tours and talking to colleagues at
other libraries in the course of daily work. Formal statistics on selected inputs
and outputs have been shared via cooperative collection and reporting in the
AARL Library Statistics and provide statistical summaries which librarians have
used, and continue to use, to establish how well they are performing in
relation to other institutions (1997, p. 128).

Information exchange has also been encouraged by professional groupings
(ALIA, regional and national groups such as UNISON, CAVAL, CAUL). In
addition, ‘the academic library community, through the use of the CAUL
questionnaire, has provided a structured format by means of which a
university librarian can survey all CAUL members on a topic of interest and
share the results with all CAUL members’ (Robertson & Trahn, 1997, p. 128).

To summarise, benchmarking in the early part of the decade was less likely
to focus on a particular process or subprocess and lacked the systematic
approach which characterises formal benchmarking. The similarity between
the formal and informal approach has been in the philosophy behind the
activity—the desire to improve products, processes and services by
comparing performance with others, usually but not always in the
same industry.

Until the mid 1990s formal benchmarking activity was largely confined to the
manufacturing industry, and was associated with the measurement of
operations used in the manufacture of tangible products. ‘In both the United
States of America and the United Kingdom, the health sector was probably
the first service based industry to utilise benchmarking to improve
performance and thus demonstrated for the first time that benchmarking
techniques could be adapted to suit the needs of the service environment’
(Garrod, 1996, p. 143). As a result ‘benchmarking, as a proven practical
management tool, is now spilling into the not-for-profit arena and is being
embraced by local, state, and federal governments and by academic and
other types of institutions’ (Gohlke, 1997, p. 24).

3.3.1 Early attempts


In June 1995 the University Librarian at Queensland University of Technology,
on behalf of the CAUL Executive, formally surveyed all members of CAUL to
assess the level of activity in formal benchmarking projects within the
Australian academic library community. The results indicated that only five
university libraries had undertaken any form of benchmarking at all. The
reasons, scope and nature of the projects mentioned varied:
• Required by the institution to set benchmarks for projects being
undertaken (Swinburne);
• As a precursor to seeking an institutional ISO9000 certification, the Library
was encouraged to develop systematic quality processes including
benchmarking (RMIT);
• As part of a university wide benchmarking exercise which included the
allocation of funding by the parent institution for self improvement,
benchmarking was also viewed as a logical extension to the library's
adoption of TQM and the introduction of performance indicators
(QUT); and
• Benchmarking was an integral component of the Australian Best Practice
Demonstration Program through which the Library had been funded for its
best practice project (NTU, http://www.ntu.edu.au/admin/isd/qsdc/
bppage.htm.).

(The fifth respondent (SCU) gave no details as to what had initiated the
activity.) The remaining respondents indicated that they were either intending
to investigate benchmarking in the near future or were interested in finding
out more about the process. A few stated that they were engaged in what
could be termed informal benchmarking. From the limited information
available from this survey it appears that the purpose and the outcomes of
these early projects were markedly similar to the later EIP survey findings,
for example:

To improve the overall quality of library service … as a method of
attracting clients, funding and resources. The project has enhanced the
profile of the Library both within and outside the University community
(NTU).


Improve our services and provide comparisons of resourcing
levels (QUT).

RMIT is seeking ISO9000 certification in 1998. The Library is therefore
being encouraged to develop systematic, quality processes, including of
course, benchmarking (RMIT).

To get an understanding of where we stand in relation to resources and
service provision, to determine areas where improvement can be made
and to argue a case for funding (Swinburne).

What has changed since 1995 is the substantial increase in the number of
academic libraries using benchmarking successfully in the formal sense as a
tool for continuous improvement. Although this has not been accompanied
by a corresponding increase in the literature on library benchmarking it does
indicate an increase in the focus on quality improvement generally, and a
desire on the part of librarians to undertake a more systematic and rigorous
approach to the improvement of products, processes and services.

3.3.2 1995—Current situation


The EIP survey results indicate that the use of benchmarking as a quality
improvement tool is increasing. Why have academic libraries begun to use
benchmarking to measure and improve performance? What has occurred
since 1995 to account for this increase in benchmarking activity?

'Interest in benchmarking in the LIS sector is gradually growing as a
consequence, stemming from the need to find new ways of improving
effectiveness in the face of constraints on resources, and increasing demands
on services. LIS managers need new tools to help in the management of
change, and are under pressure to operate like businesses. Interest in
benchmarking forms part of the general ‘quality imperative’ which aims to
find, and use, new ways to measure and improve customer satisfaction’
(Kinnell, 1995, p. 269).

The above statement from a British practitioner has been echoed in the
current project survey responses, and in discussions with senior library
managers where the following additional reasons emerged:
• Benchmarking challenges and opportunities include the enhanced role
of the information professional; impact of technology—multiple
mediums; direct access to information sources by clients; operating as
a business unit; budgets stretched—multiple formats; budgets decreasing;
corporate downsizing; contestability in government sector; outsourcing
(Robertson, S., 1998, p. 120);


• Competition within institutions for scarce resources, declining budgets and
the need to justify the existence of the library in measurable terms;
• An increase in the focus on quality generally is evidenced by participation
in consortia/strategic cooperatives at the institutional level eg involvement
in CHEMS, Universitas 21, Australian Quality Awards framework and most
recently the McKinnon Benchmarking Project;
• Following government and industry trends, the adoption of a more formal
method of measurement and continuous improvement of the quality of
products, services and work processes has worked to enhance the
reputation of libraries within their organisations;
• Programs such as the Commonwealth Government’s Australian Best
Practice Demonstration Program with emphasis on a step by step, formal
benchmarking process as an integral component and the additional
requirement to demonstrate/publish to industry;
• The publication in Australia in 1994 of benchmarking texts by Anne
Evans and Johanna Macneil which provided useful if general guides to
benchmarking;
• The launch of quality/benchmarking websites such as Benchmarking in
Australia run by benchmarking consultant Anne Evans and Library
Benchmarking International run by Annette Gohlke (President and
co-owner of LBI);
• The adoption/implementation of quality frameworks/systems such as ISO
9000, and Australian Quality Awards either by the parent institution or the
library itself;
• The appearance of benchmarking clubs and networks such as the AQC
Benchmarking Network and BenchNET, which offer a number of services
including training, networking opportunities, online benchmarking surveys,
all of which help in the sharing of ideas and information and do much to
advertise benchmarking as a quality management tool; and
• The release of practically based toolkits such as the Benchmarking self help
manual and the International benchmarking sourcebook.

Two additional factors relating to processes for quality improvement which
had some influence on minds within the Australian university library sector at
that time, were the awarding of extensive Australian government Quality
Assurance funding in 1994 and 1995, and Quality assessment visits which
required institutions to demonstrate that quality management processes were
in place.

In 1997 Annette Gohlke wrote ‘Librarians in all types of libraries are finding
themselves in the position where they must build a solid and effective case
on how their library adds significant value to the organisation or institution
that pays the bills. An increased focus on efficiency requires an examination
of work processes, points to the need to measure productivity and to look
outside their own libraries to external sources for ‘best practices’ (p. 22).
Librarians in the Australian academic community have echoed Gohlke’s
sentiments and, as the survey findings below indicate, cost savings, increased
cost efficiencies and process improvement feature most often as the major
reasons for engaging in benchmarking activity.

3.4 Survey findings


In November 1998, the Project team surveyed all Australian and New Zealand
academic libraries, together with major non academic research libraries in
Australia (all State Libraries, CSIRO and the National Library of Australia) to
determine the nature and extent of benchmarking activity in the academic
and research library community. The survey largely replicated the QUT CAUL
survey of 1995, with minor adjustments to the questions and layout
(Appendix A).

The results indicated that between 1995 and 1998 eighteen Australian
university libraries, one New Zealand academic library, two state libraries and
the National Library of Australia participated in benchmarking either as
invited partners or project initiators. Benchmarking activities were undertaken
at library, institutional and external level (eg through library participation in
CHEMS and Universitas 21 and through the AQC benchmarking network). To
date, benchmarking has not been utilised by academic libraries in New
Zealand to the same extent as in Australia, although the University of Otago
intends to benchmark information skills with the University of Queensland in
1999. (The University of Auckland is a member of U21; no New Zealand
universities are, as yet, members of CHEMS.) Useful data was also received
from the non academic research library respondents and demonstrated the
usefulness of comparing the approaches between academic and other
research libraries. A visit was undertaken to the State Library of Victoria and
provided useful insights into the challenges of responding to a different set of
priorities, client groups etc.


3.4.1 Partners
Benchmarking partners included:
• other university libraries, primarily Australian with some activity about to
be initiated by a New Zealand partner (University of Otago);
• Universitas 21 (participating libraries from Australia, New Zealand,
Singapore, China, Canada, USA, UK);
• CHEMS (participating libraries from Australia, Canada, Hong Kong,
United Kingdom, Africa);
• state, national and TAFE libraries;
• non library partners such as enquiry services eg Student Union,
Student Services, Personnel, Buildings and Grounds; and
• other industries eg law firms, pharmaceutical companies, Telstra,
Australian Consumer Association, and a hospital through the Australian
Quality Council Benchmarking Network.

Participation in benchmarking projects occurred through a number of
initiatives including:
• institutional membership of CHEMS or Universitas 21,
AQC Benchmarking Network;
• activities initiated by consortia or groups such as WAGUL, UNILINC,
and CAUL;
• expressions of interest/invitations sought through discussion lists; and
• professional contacts, university alliances and internal quality coordinators.

The level of involvement varied depending on the role of partners, eg as
initiators or participants. It sometimes involved recording of data/information which
was fed to an external consultant who collated, analysed and reported the
findings. Completing questionnaires/providing information to the initiating
library, followed by a site visit to observe procedures in practice, was the
most common approach and fits most closely with standard benchmarking
methodology.

CHEMS benchmarking was initiated by CHEMS; the libraries did not make
direct contact. In the case of U21, partners are still working out potential
projects so the roles have not yet been defined. Because partners come from
around the world with different political, economic, educational and social
cultures, it is anticipated that relationships will be complex. In Australia, U21
activities are currently informal but it is anticipated that some will progress to
the formal benchmarking stage. To date, Australian partners have reached
agreement on the scope and methodology of projects, followed by agreement
to undertake selected activities. The level of activity between the Australasian
partners will probably continue to be more intensive than with the overseas
partners for the immediate future. The AQC benchmarking project required
participants to deliver presentations, participate in forums, collect and provide
data on activities.

3.4.2 Projects: scope and variables


Survey responses indicate that there are a number of activities suitable for
benchmarking. For the most part these are process-based activities.
Benchmarking between library partners, where the standard benchmarking
methodology was utilised, tended to be fairly specific in the choice of
process/sub-process, and quantitative rather than qualitative. Specific
processes or sub-processes selected for benchmarking included:
• interlibrary loans;
• copy cataloguing;
• original cataloguing;
• shelving;
• acquisitions/cataloguing/processing;
• acquisition of core texts;
• document delivery;
• technical services throughput;
• library system costs;
• research support;
• information skills;
• materials availability;
• staff perceptions;
• customer satisfaction;
• organisational comparisons (U21);
• costing core processes (U21);
• university enquiry points; and
• leading and managing improvement and change (AQC).

CHEMS and U21 benchmarking projects tended to be broader in scope
and/or more qualitative in their content, and slightly different methodologies
were deployed. In the case of CHEMS, 'all library functions' were
benchmarked. This was largely a qualitative exercise in which member
institution libraries were required to complete a questionnaire that addressed
the following areas:

• Strategy, policy, planning and good management;
• Library services;
• Access;
• Collections;
• Support and training; and
• Human resources management.

Scoring was undertaken by CHEMS. A four-day workshop followed, with
representatives from member institution libraries attending. A series of
statements of good practice was developed. Self-assessment against the
statements followed. From this, member institution libraries were able to
identify and prioritise areas for improvement. Final endorsement of the
statements occurred in March 1999.

3.4.3 Timeframe
Project timeframes varied from one week to eighteen months; some projects are
still ongoing (U21). Responses indicated that project length does impact on
outcomes (see lessons/outcomes below).

3.4.4 Why benchmark: purpose?


Respondents provided a number of reasons for initiating and/or participating
in a benchmarking project, ranging from the very specific, eg turnaround times,
to the more global 'achieve best practice'. Specifically, respondents indicated
the following reasons:
• cost comparisons (2), estimate unit costs; cost identification associated with
system purchase;
• reduction in turnaround times (2);
• reduction in error rates;
• establish meaningful performance indicators/realistic output measures;
• explore feasibility of collaboration to achieve cost savings and increased
efficiency;
• investigate insourcing, outsourcing and collaborative opportunities;
• establish individual performance targets/self improvement;
• explore appropriate roles and activities for cataloguers;
• develop improved outcomes for customers;
• achieve process improvement/foster commitment to ongoing
process improvement;

• pilot benchmarking/instil understanding of the value of benchmarking for
quality improvement/develop a culture of improvement and
comparison/make improvements in performance and quality;
• as an instrument to achieve change/restructuring/confirmation of
direction/information exchange;
• identify best practice (2)/develop best practice model/achieve best
practice/benchmark best practice/determine examples of best practice;
• as a validation measure—potential to verify what is already
known/identify and act on areas in need of improvement (CHEMS);
• develop statements of good practice (CHEMS); and
• as a framework for benchmarking of performance and quality (U21).

3.4.5 Achievement of purpose/lessons for successful outcomes
Respondents indicated that the benefits of a project often varied between
different areas; projects tended to be more successful in areas with clearly
defined processes in place, eg cataloguing, compared with research services.
The similarity of partners also affected projects. It was felt there was less
likelihood of achieving major improvements if partners were too similar,
although the benefit here was often confirmation that they were on the
right track. Difficulties in benchmarking qualitative processes were also
highlighted: gaining agreement on how to measure reference transactions, for
example, proved a particular challenge.

Respondents were very clear on the lessons learnt from participation in
benchmarking projects. Comments provide an insight into the different
priorities, which often depended on whether the library initiated or simply
participated in the project. Participants stressed the following criteria as
essential for a successful project outcome:
• clearly defined objectives;
• full commitment from participants;
• use of effective external consultants;
• effective use of existing knowledge and expertise;
• tight timeframe;
• sufficient resources to achieve objectives in timeframe;
• fewer questions, less repetition;
• clarification of terminology; time to follow up;
• understanding of benchmarking and a common view of processes before
starting out;

• clear definition of expectations of project at the outset;
• more two-way interaction;
• clear objectives and methodologies;
• effective internal promotion of the project;
• management support;
• partner involvement in code of ethics;
• awareness of disclosure and expectations;
• consistency of methodology to ensure positive outcomes;
• methodology must be agreed and tested;
• information gathering process from all sides;
• participation of all partners in site visits;
• visits should involve staff actually working in the process/areas
being benchmarked;
• compatibility between operations/processes;
• impetus or desire to change operations/processes; and
• project highlighted the lack of publicly available performance data and the
lack of a common methodology for measuring performance in specific
research support services, acquisitions and cataloguing.

CHEMS
Feedback from CHEMS participants on the 1998 benchmarking exercise varied
from ‘limited value’ to ‘confirmation of best practice approach’, and ‘rigour in
completing questionnaire and accompanying benefits’.

The CHEMS framework offered a 'holistic' approach to the
benchmarking process rather than the perceived 'fragmented' approach
of other quality models. In addition it offered the benefit of a peer-
related framework with the underlying assumption that improving
processes will by default enable improved delivery of library services
and lead to increased client satisfaction (Swinburne).

Because of the external context of CHEMS, weaknesses identified against the
statements of good practice were likely to be addressed sooner. Participants
found the discipline of completing the questionnaire and rating against
statements of good practice to be an excellent method for identifying and
addressing areas for improvement. It was felt that external benchmarking
provided a greater motivation to improve performance.

Universitas 21
In the case of U21 benchmarking activities, the focus is on comparable
activities and is externally imposed. Activities between Australasian partners
to date have included quantitative (materials availability) and qualitative
(client satisfaction) projects, comparative profiling and activity-based costing.
In the area of client satisfaction surveys, results are being input into the
Rodski database, to create comparative profiles and provide useful partner
information for future benchmarking against other institutions and non-library
sector companies. U21 benchmarking is not a formal, process based
benchmarking model.

It is important that there is equal involvement of the parties, whereby all
parties collect and analyse data, identify issues and make recommendations. In the
costing core processes exercise, the aim was to cost core processes and then
look at areas of considerable divergence. The methodology involved
individuals allocating time against specific categories. Difficulties arose in
matching categories against work already done at partner institutions, due to
organisational differences. The value of the project was more obvious within
the institution. Questions were also raised about process ownership.

Australian Quality Council projects—benchmarking outside LIS


Respondents indicated that cost, and inability to locate appropriate
internal or external funds, might hinder participation. Challenges also
included the establishment of a level of credibility by the library within an
external network of organisations. The University of Wollongong Library
described participation in the AQC benchmarking network as ‘Taking on a
business approach’.

This has been a major achievement of the Leading and Managing
Improvement and Change Project. Benchmarking visits to partner
businesses helped to place the role of the library into a financial/profit
oriented framework. Within the library context, profit was identified as
maximising the value of the investment (students) and as looking at the
full range of what the library does and drawing more out of it. There is
value in visiting other libraries to place your own work in context.
Improving internal processes can lead to more cost-effective practices
and negate impetus for outsourcing (for example).

However, benchmarking in the formal sense was not always viewed as the
most appropriate method for improving performance:

It can also be seen as an exercise that has the potential to fix an
organisation at a particular point in time. It has the potential to fix
sights in one direction while ignoring customer interest/need in another.
It is preferable to focus on continuous improvement through informal
internal benchmarking rather than institutional/library comparisons
(Deakin).

3.4.6 Training (see also Chapter 6)


Generally, the approach to training has been somewhat ad hoc. Survey
findings indicate a range of differing approaches:
• Awareness sessions and seminars for senior managers; limited to those
immediately involved in projects; briefings on data collection requirements
for participants;
• Workshops covering all aspects of benchmarking, working through the
chosen model, internally or externally facilitated. Generally, with the
exception of the University of Wollongong library, the focus has not been
on formal training; and
• A more formal approach to training was evident in the larger, institutional
level projects (CHEMS) and external projects (AQC).

The benchmarking survey has provided an extremely comprehensive insight
into the role, level and scope of benchmarking in the Australian academic
library community. The detailed responses from survey participants have
helped to provide an accurate picture of library benchmarking in this sector.
Activities suitable for benchmarking have been identified and, combined with
lessons learned, provide clear guidance for future project participants. The
diversity of approaches and differing methodologies contribute to the
flexibility of options available to potential benchmarkers. CAUL’s interest in
benchmarking has been viewed as positive in terms of the potential for the
development of material that libraries can then adapt and shape to suit
individual institutional and organisational contexts.

3.5 Future directions


Where to from here? It seems fairly clear that the academic library community
has adopted benchmarking as one of a suite of quality improvement tools.
Survey participants have indicated that they will continue to use
benchmarking as a means to compare and improve performance in products,
processes and services. Participation in CHEMS and U21 activities will be
watched with interest by non-participating libraries and other institutional
initiatives may well be explored. Already, the higher education sector has
invested funds in a study to investigate the development of high-level
indicators of performance.

3.5.1 McKinnon project


The McKinnon Walker/IDP Education Australia project is an initiative funded
by DETYA, which aims to identify those measures or reference points needed
to enable university executives to assess whether the university is making
progress in a particular area or activity, either in relation to previous
performance or in relation to peer universities. The challenge of the project is
to define more relevant benchmarks, to define these precisely so that any
comparisons are of like-with-like, and to use the resulting benchmarks both
to improve universities and to inform the public. The underlying motivations
for the project are to provide universities with quality assurance indicators
which have been developed specifically for the industry. There is also a
desire to develop benchmarks which suit universities, rather than to have
them imposed by an outside authority.

The Library and Information Services Working Party of this project identified
eight benchmarks which sought to answer the general question: 'How would a
Vice-Chancellor know that the library/information technology service was
performing relative to good practice?'
• Effectiveness of planning process;
• Contribution to the quality of research;
• Contribution to the quality of teaching and learning;
• Contribution to the quality of corporate information systems;
• Effectiveness of staff resources;
• Effectiveness of collaborative alliances;
• Effectiveness of networks and communications infrastructure and
services; and
• Efficiency of help desk services.

The draft benchmarks are being trialled in August and September 1999. At
this stage it is difficult to know whether the final outcome of the project will
be a set of benchmarks which will be used by an institution to monitor its
own progress or be used more systematically to compare institutions. The
project is in collaborative dialogue with the CHEMS Benchmarking Club but
differs in concentrating on outcomes rather than processes. The academic
library community will monitor progress with interest.

3.6 International comparisons


In introducing this section it is important to note, as with the corresponding
sections in the two other core chapters, that the information contained here is
presented with the following limitations. This EIP project contained no
allowance to parallel the approach to information gathering that was taken
within Australia. There were no surveys of international activity or visits to
interesting sites identified by such surveys.

One member of the EIP team had, however, in June/July 1998 undertaken a
series of international visits to libraries, financed by her home library, UNSW,
to examine a select few interesting 'quality' sites in order to investigate how
quality frameworks had been implemented and related activities undertaken.
This was done in order to better position the UNSW Library in
terms of improving its own quality management processes. Small case studies
have been outlined here, drawn directly from those face to face meetings and
observations, to indicate some international practices in these areas. In no
way do they give a comprehensive outlook but they do illuminate
international opinions and work situations. The literature review (Chapter 2)
and useful source materials (Appendix E) provide a better overall picture of
international activity.

Case studies

3.6.1 The large, traditional European university library—University of Muenster
The University of Muenster Libraries sit within a conservative tradition
common in German universities. The University is large (50 000 students) and
consists almost entirely of full time local students. There are very few
overseas students, except in the specialised subject areas for which the
university is renowned. The conservative history of the Library is indicated
by the fact that Muenster operated only as a closed stack library until ten
years ago.

On the European scene there is a great divide between the countries of
Western Europe and those of the former Soviet bloc. The latter are seen as the
new 'underdeveloped' regions, and library 'aid' activities to the immediate east
absorb most spare resources and educational assistance. The attitude of
former communist areas to concepts and movements such as quality and
benchmarking reflects a deep suspicion derived from their political background,
and the considerable differences in resourcing make the former borders also
borders for co-operative activities between like institutions. Much of the
current activity in the EU is aimed at
simply raising awareness in the eastern European countries of current notions
such as quality principles, accepted in Western Europe but still ‘foreign’ to the
East. Eastern bloc countries may be unable to participate in co-operative
activities on an even footing for some time.

For many European libraries the real leadership and focus remains at the
state and not the national level, with the exception of the small European
nations. The German National Library, for example, focuses solely on
producing records for books published in Germany. Librarians seldom meet
on a national basis, and energies are fed into regional groupings, which are
the natural focus for benchmarking activities. Westphalia/Rheinland (the
region in which Muenster lies) has a group of thirteen university librarians
who meet monthly and carry out co-operative initiatives, including some
benchmarking under the dynamic leadership of Roswitha Poll, the Muenster
University Librarian and European expert on performance indicators,
benchmarking and activity costing. The group has a reputation for energy and
new initiatives within Germany. Its activities also include running a regional
co-operative cataloguing centre and a staff development centre where most
staff development activities for the librarians of the group are held.

A regional benchmarking project, in which all the university and polytechnic
libraries input data for their individual institutions relating to a core set of
performance indicators, was recently completed. The analysed data was fed
back to participants as charts which showed, for each indicator:
• maximum figure for the group;
• minimum figure for the group;
• mean figure; and
• individual institutional record in relation to the above.

Suspicion about levels of disclosure is a real issue. Even with individual
responses not identified, some participants fear that extremes can still be
identified and difficulties may follow. Some institutions returned only partly
completed sets of figures, deeming the remainder to be too sensitive.
Publication of their pioneering work for the benefit of the profession is
therefore difficult.

There is a perception that the more conservative European libraries focus too
much on statistics per se and cannot see the role of useful indicators in
performance improvement.

A huge problem in developing comparative metrics for benchmarking within
and between European universities is the enormous number of libraries on
traditional campuses. At Muenster there is the central library and there are
some 200 other campus ‘libraries’ running with all sorts of combinations of
library and other resources, including staffing and materials processing of
various sorts. One faculty may have more than fifteen libraries of its own.
Every professor considers that his or her prestige rests on the size of a
(virtually personal) library. Even modest reductions in the number of these
will take over a decade. This makes comparisons and definitions particularly
complex both for internal and for external benchmarking.

3.6.2 A centralised, highly progressive and flexible university library:
Danish Business School Library
The DBS Library Director Michael Cotta Schoenberg explained at the first
Northumbria International Conference on Performance Measurement in
Libraries and Information Services in 1995 how quality management measures
were being integrated at the DBS library. The Library has undertaken a series
of progressive initiatives including looking at using the European Quality
Model (similar to the Baldridge and AQA frameworks) and its related Danish
Public Administration Quality Model, administering staff satisfaction surveys,
participating as a partner in the EU-financed EQLIPSE project, and practising
both local and international benchmarking.

One of the key areas of interest for potential benchmarking for DBS has been
materials availability, since there are 15 000 students, of whom half are full
time and half are part time, a very small library materials budget until
recently, and teaching and research in an area where most material quickly
becomes out of date.

The DBS involvement in European/Danish benchmarking started out with a
range of European business schools filling in a suite of background data over
a couple of years (similar to AARL data collection). The project was known
as the EBSLG Quality Management Project. This project has now been closed,
as DBS, being 2–5 times larger than the others in the group, was getting little
from the comparative exercise. The more recent phase of local benchmarking
involves six of the twelve Danish university libraries. There were test sites for
sampling activities and follow-on surveys. As a result of some interesting
patterns being identified, three areas were selected for which the actual
processes are being analysed and benchmarked with a view to process
improvement:
• speed of processing;
• supplier book supply performance; and
• cataloguing error rate.

3.6.3 The British defence university campus and British benchmarking
initiatives—Cranfield Shrivenham campus, Stephen Town and the SCONUL
Benchmarking group
Regionalism is flourishing in Britain also. Libraries across sectors but within
regions are opening up more and more to co-operative activities, and view
benchmarking as particularly relevant to the regional concept. Regionalism
seems strongest away from London and the Southeast.

The SCONUL-sponsored British benchmarking groups outlined in the
literature went about their tasks in a variety of ways, despite initial briefings
by Stephen Town on a common approach, and took very varied periods of
time to reach a conclusion. Almost all participants are from the ‘new’
universities. There are plans for a benchmarking clearinghouse to record the
projects, their data and outcomes, a little like Benchmarking Edge or the AQC
benchmarking database. The first published results will be presented at the
next Northumbria Conference on Performance Measures in August 1999.

Views were expressed that the university libraries really needed some type of
defined quality management process/framework in place in order to translate
the results of the gap analysis identified by benchmarking into tangible
returns. Few university libraries in the UK currently operate in this manner.
The handful who do are the new, smaller universities keen for a competitive
advantage, or even survival. Town is also convinced that customer focus is at
the basis of process improvement, but this is not a view held universally.

Both the SCONUL group and the Cranfield benchmarking exercises had
extended time frames, because of the complexity of local demands and the
scarcity of resources which could be dedicated to the work. Since their first
experiences in 1993, all of which have been documented in the literature in
articles by Town, Shrivenham has undertaken a range of less usual
benchmarking activities including:
• Strategic Benchmarking study which encompassed:
– buildings (housing integrated services/electronic libraries);
– electronic libraries (matrix of who had, and had done, what in the whole
spectrum); and
– flexible teaching and learning.
• IT support (10 partners/3 in libraries). This was done by a consultant and
may have been more superficial than some participants would have liked.
Some non-library organisations fared well because of their Help Desk
systems;
• Document delivery/help desk with Surrey Institute of Art & Design, plus
three academic and one non-library partner; and
• ILL/Inquiries (with Loughborough and a couple of other participants).

3.6.4 The US scene: Purdue University Libraries


Purdue University in rural Indiana is a land-grant (1874) university with a
total of 65 000 students, 35 000 of whom are located at West Lafayette. There
are a dozen libraries in the Libraries portfolio.

The attitude to benchmarking displayed by the University managers (not the
Libraries) who run the quality management process, Excellence 21, based on
TQM principles but customised to Purdue, seems to be a common one in
large US universities. On benchmarking, Purdue management opinion is that
they already have very close co-operation with peers, meet as a group
regularly and connect by function frequently. They suggested that the amount
of co-operation, the effort put in to make ‘clean’ data and the risks of
exposure are currently seen as not worthwhile in terms of actual return
on investment.

In May 1998 Purdue convened a first meeting of eighteen senior managers
involved in some type of quality improvement/quality management initiatives
in their own universities. The group included the University of California,
Berkeley, plus the CIC institutions, etc. The group discussed benchmarking
but the consensus was similar to the Purdue attitude that, at this point in time,
detailed benchmarking or comparative performance measures are politically
unacceptable. Less formal exchange of views and co-operation on new
initiatives or joint lobbying for a greater good were considered acceptable.

Within the Libraries, some small-scale benchmarking has occurred as part of
the team approach to process improvement encouraged by Excellence
21. For example, one of the Purdue Excellence 21 TQM/CQI Library
improvement teams (the mail room/loading dock) used benchmarking as
part of their improvement process. They used a team approach, benchmarked
against comparable public service and commercial equivalents, and also devised
client surveys adapting SERVQUAL TQM methodology to their requirements,
flow-charting all processes.

Prior to benchmarking visits or conference calls, staff, on their own initiative,
had devised structured response questions in advance, illustrated processes
and people through floor plans of areas, photos of the team and working
areas, maps of campus, flow charts etc. Benchmarking on the micro scale has
been of considerable benefit. Many staff at the lower level in such institutions
come from a stable, conservative working situation, and have found the
broader perspectives brought by communicating beyond their workplaces
exciting and beneficial.

4 Performance indicators

4.1 Terminology
Unlike the topic of benchmarking, the literature abounds with articles on
performance measurement across the whole library sector. However, like
benchmarking, there has been little agreement to date on a standard
definition of what is meant by the term performance measurement. ‘In the
ever-growing literature on library performance measurement, no
standardization of terminology has been established’ (Cullen, 1995, p. 438).
In selecting a useful definition, Cullen and Calvert state a preference for
Lynch’s definition which makes careful distinctions between terms which are
sometimes used almost as synonyms by writers in the field. ‘The results of
measurement can be used to evaluate the performance of a library, and
thereby determine whether or not it is effective’ (Lynch, 1983, p. 388).
Another simple but useful definition is provided by te Boekhurst:
'performance measurement is comparing what a library is doing
(performance), with what it is meant to do (mission), and wants to achieve
(goals). The extent to which goals are reached can be determined by using
performance indicators’ (1996, p. 279).

To be effective, performance indicators must be developed in context, not in
isolation. They must be firmly rooted within a strategic management and
planning framework. ‘Performance indicators should emerge from the
definition of strategic objectives, and the results of performance measurement
should influence further strategic planning and strategic decisions’ (Abbott,
1994, p. 10).

4.2 Background
Performance measurement has become a major issue within the Australian
library community. As the discussion progresses, there is an accelerating
emphasis upon the need to use such measures. Not infrequently, embedded
within this approach, is the hope that resulting performance indicators will be
comparable, and this in turn rests upon the assumption that one can look at
a 'performance indicator' and understand what it means. 'Performance
measures should not be viewed as ends in themselves; rather they are part of
the process of evaluation, and only within this context do they have meaning'
(Novak, 1992, p. 263).

'Recent pressures on academic library managers, such as declining budgets
and calls for accountability to both the parent organisation and to library
users, have led them to take a renewed interest in performance measurement
and to look more thoroughly at theoretical work on the measurement of
effectiveness’ (Cullen & Calvert, 1995, p. 438). Although the project survey
findings did not indicate a high degree of activity from New Zealand
academic libraries in performance measurement, both Peter Calvert and
Rowena Cullen have published a number of influential articles dealing with
the evaluation of academic library performance during the last decade.

Calvert and Cullen’s more recent work focuses on stakeholder perceptions of
library effectiveness and uses the ‘constituency satisfaction model’, whereby
an organisation is assessed by the degree to which its constituents, or primary
stakeholders, have been satisfied. ‘This model has been tested by other
eminent researchers in the field (Van House, Childers), and has been found to
be a sturdy and useful model for assessing library organisational effectiveness’
(1995, p. 439). It is believed that this model has the potential to become a
useful approach because it ‘takes account of the fact that effectiveness is a
multi-dimensional construct’ (ibid, p. 440). The focus of Calvert’s more recent
research has been on gap reduction. His research ‘builds on Hernon and
Altman’s exploratory, generic framework with the intention of refining it for
practical use. The intended outcome of Calvert and Hernon’s research is to
provide university librarians in New Zealand, (and hopefully elsewhere), with
a flexible tool for analyzing and measuring customer expectations of service,’
(1997, p. 408).

4.2.1 Progress in Australia in the 1990s


The Council of Australian University Librarians (CAUL) has had a longstanding
commitment to measurement of academic library performance. This has been
demonstrated in the annual statistical compilation which is published as a
supplement to Australian Academic Research Libraries (AARL Statistics). ‘In
the early 1990’s, CAUL, in responding to a growing interest in performance
measurement and quality, undertook to identify possible areas for
collaboration. Although some concern was expressed about the danger posed
by invalid comparisons, members agreed that it would be useful to have
standard measures for key aspects of service delivery’ (Byrne, 1997, p. 252).
Progress towards this end is summarised below:
• Survey of CAUL members on priorities for selection and implementation of
performance indicators, conducted by Colin Taylor, University of South
Australia in 1992, (thirteen key indicators identified);
• Development and publication of three indicators in 1995:
– Library/clientele congruence (ie satisfaction), Indicator A
– Document delivery, Indicator B
– Proportion of sought material obtained at time of visit (Materials
Availability), Indicator C
• Review of priority areas undertaken in late 1995, to identify further
indicators for development and publication;
– Suggestions indicated a focus on international developments by
organisations such as IFLA, SCONUL and ISO
• The original list of thirteen indicators was revisited in light of the overseas
developments, and decisions made as to whether it was worth
adopting/monitoring work being done overseas;
– Agreement to negotiate with CAVAL Reference Interest Group (CRIG) to
progress their work on performance measures for reference services
(see below)
• In May 1996, all CAUL members were surveyed again to determine the
extent to which the indicators had been used and the need for revision;
– Responses indicated that experience with the indicators had been
limited
• Possibilities for future developments included:
– Costing methodologies
– Market penetration
– Support for CRIG project
– Adequacy of retrieval software and/or data.
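
Each of the published indicators is, at root, a simple ratio. As a purely
illustrative sketch, not the CAUL instrument itself (which prescribes its own
survey forms and analysis software), Indicator C reduces to the proportion of
sought items obtained at the time of visit; the outcome labels and sample
data below are invented:

```python
# Illustrative only: Indicator C (materials availability) as the proportion of
# sought material obtained at the time of visit. The CAUL instrument defines
# its own survey forms; these outcome labels and data are invented.

def materials_availability(outcomes):
    """Return the fill rate: successful searches divided by total searches."""
    total = len(outcomes)
    found = sum(1 for outcome in outcomes if outcome == "found")
    return found / total if total else 0.0

# Each entry records the outcome of one user search for a sought item.
sample = ["found", "found", "not_on_shelf", "found", "on_loan", "found"]
print(f"Materials availability: {materials_availability(sample):.0%}")  # 4 of 6
```

In practice the non-‘found’ outcomes would be subdivided (not on shelf, on
loan, lost) so that the causes of failure can be analysed as well as the
overall rate.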

The CRIG Working Party on Performance Measures for Reference Services
was established in 1994 to identify performance measures and indicators
currently used to evaluate reference services in Victorian academic libraries
(CRIG, 1999, p. 1). At the same time, CAUL commissioned the development
of selected performance indicators for Australian university libraries. Out of
this, a suite of indicators for general user satisfaction (library/clientele
congruence), document delivery, and materials availability was published.

As a result of a review of priority areas undertaken by CAUL in 1995,
reference fillrate was identified as an area of particular interest, and one in
which there was the potential for benchmarking. ‘The CRIG project therefore
complemented the CAUL indicators by focusing specifically on the analysis
and production of performance indicators for reference and information
services’ (Paton, 1996, p. 102). The Working Party utilised the multiple
constituencies model favoured by the New Zealand researchers, Calvert
and Cullen.

There have been a number of challenges associated with the CRIG project:
‘there are serious difficulties in developing a reliable and valid methodology
as has been extensively documented in the literature’ (Byrne, 1997, p. 255).
This was supported by Robertson and Trahn in their report on the
QUT/UNSW benchmarking project: ‘in the area of research support, processes
are more diffuse than in the more procedural areas of library activity, and
there is very little agreement on appropriate output and performance
measures. A symptom of this is the absence of any AARL statistics for
reference transactions, or education, advice and information activities’ (1997,
p. 134). The QUT/UNSW project has been one of the few recorded attempts
to benchmark reference/research support services, and highlighted the lack of
publicly available performance data and the lack of a common methodology
for measuring performance in specific research support services, acquisitions
and cataloguing (ibid, p. 140).

Little additional work on the development of performance indicators for
academic libraries has occurred on a national level since the CAUL survey of
1996. Taylor’s earlier survey was utilised, with some modification, by the
current project team (Appendix B) and the findings are reported below. Paton
makes a telling point when she says ‘it needs to be recognised that changes
in library services mean that, over time, performance indicators which are
developed can become out of date. A process to redevelop indicators,
appropriate to the environment at a given time, is required’ (1996, p. 107).

CRIG’s Final Report on Performance Measures for Reference Services was
released in May 1999. The report presents the data from the user
constituencies of postgraduate students and academic staff, and
undergraduate students. Through the course of the study, twenty-one
indicators for evaluating reference service were identified from focus group
data. Analysis of the data enabled the Working Party to derive a model of
reference service effectiveness that identifies twelve key performance
indicators used by library staff and users. These indicators cluster into three
broad dimensions of service evaluation: Attributes, Support and Knowledge
(ASK Model) (1999, pp. 1, 4). The CRIG Working Party made the following
recommendations:
• the proposed model for evaluation of reference service effectiveness be
taken up by the profession;
• the twelve performance indicators for reference services be adopted by
Australian university libraries, and additional performance measures be
developed where necessary; and
• prioritisation of the twelve performance indicators and attendant
performance measures be addressed at a national level by Australian
university libraries.

At the time of writing, the CRIG Report had been distributed to all
CAUL members for consideration of the recommendations made by the
Working Party.

As the survey responses indicate, there has been a shift and consequent
reprioritising of indicators deemed to be important by the academic library
community. The need for indicators for the electronic library has emerged as
a key issue, and impacts to some extent on the work done to date by CRIG.
Fortunately, some excellent work has already been undertaken in this area
internationally by McClure, Brophy and the European Commission sponsored
EQUINOX project, which are described in some detail in the Useful Sources
list (Appendix E). Survey responses also illustrate the large amount of effort
that libraries have invested in the development of in-house indicators. There
has not been an across the board adoption of the CAUL indicators, and it is
quite likely that there has been some duplication of effort in the development
of in-house indicators within various institutions. Exploration of the possibility
of sharing this effort, and the indicators developed to date, is worthy of
further investigation.

4.3 Survey findings


Survey responses indicate that almost all academic libraries are using
indicators in some form to measure performance. In addition to the CAUL
Indicators, other measures of performance have either been adapted from
existing indicators, or developed in-house. Emphasis has been placed on the
need to have performance indicators which directly respond to institutional
and library key result areas in strategic and operational plans. The trend
appears to have been to develop in-house indicators in response to this
requirement, rather than to utilise the CAUL Indicators. Some use/adaptation
of external indicators has been mentioned eg SCONUL, IFLA, ISO 11620
generally and Van House, Hernon and Altman and Parasuraman with respect
to client satisfaction.

4.3.1 CAUL indicators


Survey responses indicate that there has been an increase in the number of
Australian academic libraries using either all or a selection of the CAUL
indicators since the review of priority areas conducted in 1995. Of the three
indicators, Materials Availability (Indicator C) is used by twelve institutions;
survey responses indicate that academic libraries are generally satisfied
its format, content and process of analysis. Some minor suggestions for
improvement have been documented (see below). Indicator C has also been
used by Universitas 21 partners University of Melbourne, Queensland and
New South Wales Libraries, for benchmarking performance in availability
of materials.

Document Delivery Indicator B is used least of the three indicators, being
used by only six institutions. However, survey responses indicate that many
academic libraries are
measuring document delivery fillrate and turnaround time through indicators
developed in-house. Some adaptation of Indicator B has been made to
measure turnaround time on requests from external/off-campus students.
Additional information on why there has not been greater use of Indicator B
would be useful, eg are there inherent problems in the process of analysis
which could be rectified? Unfortunately, no feedback on improvements to
Indicator B was offered by those institutions currently using it, which may
indicate that it is proving to be an adequate measure of performance in this
area for those libraries.
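
The two quantities behind such in-house measures, fill rate and turnaround
time, can be derived from a simple request log. A minimal sketch; the
request records and dates below are invented, and this is not the Indicator B
procedure itself:

```python
# Illustrative document delivery measures: fill rate and turnaround time.
# Request records are invented; the CAUL Indicator B instrument prescribes
# its own data collection procedure.
from datetime import date
from statistics import median

requests = [
    {"placed": date(1998, 3, 2), "supplied": date(1998, 3, 9)},
    {"placed": date(1998, 3, 4), "supplied": date(1998, 3, 18)},
    {"placed": date(1998, 3, 5), "supplied": None},  # request not yet filled
    {"placed": date(1998, 3, 6), "supplied": date(1998, 3, 11)},
]

filled = [r for r in requests if r["supplied"] is not None]
fill_rate = len(filled) / len(requests)
turnaround_days = [(r["supplied"] - r["placed"]).days for r in filled]

print(f"Fill rate: {fill_rate:.0%}")                        # 3 of 4 requests
print(f"Median turnaround: {median(turnaround_days)} days")
```

The median is used rather than the mean so that a few long-delayed requests
do not dominate the figure; the same calculation applies directly to the
off-campus request adaptations mentioned above.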

Responses indicate that there is a high level of activity in measuring client
satisfaction through a variety of methods. A number of suggestions for
improving Indicator A have been made (see below). There has been
additional work done by individual institutions in altering this survey to
better suit specific organisational conditions and services.

Table 4.1 Use of CAUL and other performance indicators


UNIVERSITY Q. 1 Use of CAUL indicators Q.2 Others in use or planned;
adapted or developed in-house

ADFA No ISO11620, Van House,
CAUL under review
ANU No Planned for 1999
Bond "2 used" –
Central Queensland A, C –
Charles Sturt A, C –
Curtin A In-house & ISO compliant
Client focus; Education; Scholarly
Information; Resources
In-house
External alliances, Promotion
Deakin C Balanced Scorecard
In-house
Staff attitude survey;
internal customer survey
Flinders A, C In-house
Client surveys which target specific
aspects of services/operations
eg photocopying

Griffith A, B, C Adapted
Facilities use rate, service point
use rate surveys
In-house
Activity Costing—unit and total cost
of activities and services
JCU A, B (Irregular), C –
Macquarie A –
Monash B Adapted
Staff perception (U/Melbourne survey)
Client satisfaction (CAUL A &
Van House)
In-house
Distance education students survey
Murdoch A In-house
For selected core services
(acquisitions, cataloguing, serials)
ABN ILL management statistics
ILL statistics for WAGUL
NTU A, B, C ISO 11620 under consideration
In-house
Off campus requests
(fillrate & turnaround time), info skills,
service points
QUT B, C In-house
Reference, lending, technical services
Others listed in Service Charter
RMIT A, C (Infrequent) –
SCU No –
Swinburne No –
U/Ballarat C (once) In-house
Information desk satisfaction; ILL
(Exon & Williamson database)
U/Canberra No In-house
Accessibility, Availability, Adaptability,
Assistance from professional staff
ie all staff performing professionally
U/Melbourne C Adapted
Client satisfaction (CAUL A)
In-house
Aligned with Strategic Plan
UNE No –

UNSW A (once only), C Adapted
Parasuraman client satisfaction
(developed by Rodski Associates
for U/Melbourne for U21).
In-house
Goals with a number of indicators
attached to each for:
timely linkage to info (7); client
information skills (1); links with scholarly
community (4); effective resource
management (3)
U/Newcastle B, C (from 1999) –
UQ A (from 1999), B, C Adapted
Client satisfaction (from 1999 CAUL A)
In-house
Client satisfaction; Document delivery;
Shelving
USQ A, B, C Adapted
Off-campus request turnaround time
(CAUL B)
In-house
Shelving; subject search turnaround
time;
suppliers turnaround time;
accessioning/processing/
cataloguing turnaround time
U/Sydney A (once only) –
U/Tasmania No –
UTS C In-house
Client satisfaction; staff performance
indicators
UWA B (trialling) In-house
Client survey; materials availability
UWS No –
UOW B (1999), C In-house
Service desk satisfaction; client
feedback; document delivery;
information literacy level; bookvote use
VUT A SCONUL indicators being investigated

4.3.2 Ratings
Client satisfaction, and the need to extend performance indicators into the
electronic information environment, were viewed as the most important.
Document delivery was also viewed as very important. Not all institutions
provided ratings for all the indicators.

Table 4.2 Breakdown of ratings


Very important Important Not important

Market penetration 11 12 5
Opening hours compared to demand 14 12 4
Collection quality review by
expert checklists (viz. conspectus) 5 11 12
Collection use (viz. IFLA) 14 12 2
Catalogue quality—known item search 10 15 4
Catalogue quality—subject search 7 17 4
Extend to electronic information services? 23 4 2
Acquisition speed 12 15 2
Book processing speed 15 13 2
Document delivery 22 7 –
ILL speed 20 10 –
Reference fillrate 5 19 2
User satisfaction 26 4 –
Cost efficiency 18 11 –
Costing methodologies 13 13 3
Electronic resources—Quality 20 7 1
Electronic resources—Availability 24 5 –
Adequacy of retrieval software and/or data 16 11 1

4.3.3 Additional/important indicators


In addition to the original range of indicators identified by Colin Taylor in the
1992 survey, and following the review of the priority areas in 1995,
institutions have reprioritised the list generated in the original survey, and
identified a number of additional and important indicators. Reasons stated for
the need for these additional indicators include:
• an increased reliance on electronic resources and services;
• increased participation by librarians in information literacy training;
• a more flexible approach to the delivery of courses and students opting for
alternative modes of study; and
• the need for effectiveness and efficiency measures in view of shrinking
budgets and competition for resources.

Table 4.3 Additional and important performance indicators


ANU Campus viability; innovation; efficient/effective web navigation
CQU Information Literacy (effectiveness of teaching); overall P/Is for library effectiveness
Curtin Electronic Library (see: McClure & Lopata and Brophy & Wynne’s work)
Deakin DETYA library indicator at top level for institutional performance
Flinders Support for off-campus students especially interstate and overseas
Monash Serials usage survey
Murdoch User acceptance of non-traditional sources of information;
User adoption of electronic information retrieval
NTU Information skills (program effectiveness & penetration)
Staff training/skills effectiveness
QUT Quality of information literacy training
RMIT Proportion of resources available electronically 24 hours per day;
SCU Items return to shelf turnaround time
U/Ballarat Performance of IT services in converged operations
U/Canberra Accessibility, Availability, Adaptability, Assistance
U/Melbourne Identifying indicators and measures through use of AQC Business
Excellence framework
UNSW Facilities use indicators
Choice and construction of indicators requires flexibility in terms of current need
UQ Efficiency and effectiveness of service; indicators that aid in identifying areas
where savings can be made or determination of service levels
U/Sydney Access relative to need and time (more relevant now than opening hours)
UTS Electronic resources access by remote users; reader education programs—
penetration and use
UWS-M AARL ratio statistics
UOW Costing methodologies for key processes common to a number of libraries—copy
and original cataloguing; ILL/doc delivery costs; electronic resources.

4.3.4 Areas where CAUL could undertake further developmental work
Survey responses indicate that institutions believe CAUL should undertake
further developmental work on electronic resources/services. Information
literacy has also been identified as another area where more work would be
welcomed. Librarians are looking beyond a numbers/quantitative measure for
information literacy ie not limited to percentage attending sessions, but rather
extended to measure the achievement of outcomes/competencies, based
on participation. Cost efficiency and costing methodologies were also
identified as important, which is not surprising in view of current and future
budget restrictions.

Table 4.4 Further developmental work by CAUL


ADFA Should be based on ISO 11620; cost efficiency
ANU Client satisfaction
CQU Electronic resources; opening hours compared to demand
CSU Electronic services/evaluation
Curtin Electronic resources—quality, availability
Adequacy of retrieval software and/or data
Deakin Electronic information—quality and availability
Flinders Electronic resources—quality and availability;
Support for off campus students
Griffith Standardised costing methodology;
Client satisfaction with specific services which enables libraries to link satisfaction
level with specific aspects of service being studied
JCU Market penetration; Cost efficiency
Monash Client satisfaction
Murdoch Costing methodologies
NTU Electronic services—quality and availability; Cost efficiency; Information skills
QUT Resource access (including collection) quality eg extension of conspectus to include
electronic resources;
Costing—development of suitable methodologies
RMIT Information literacy training—% staff, students trained
SCU Document supply
Swinburne Those related to the electronic environment
U/Ballarat Qualitative—provision of information literacy services and Information Desk
services (but not limited to CRIG focus group methodology)
U/Melbourne Learning organisation; Continuous improvement; Accommodation
UNSW Use of electronic resources; Other electronic services related indicators
UQ Shelving; Interaction with computerised searching
USQ Collection usage; Cost efficiency; Electronic resources—availability/accessibility
U/Sydney Cost effectiveness in management; Use of collections/facilities
UTS Electronic resources—quality and use;
Document delivery and ILL turnaround time and use;
Computer literacy program—quality/effectiveness
UWS-M Electronic resources
U/Wollongong Collection use/relevance
VUT Electronic services

4.3.5 Comments/changes to current CAUL indicators

General
• Should be aligned with ISO 11620;
• Need to be adapted for the high volume of electronic resources available;
• Improve efficiency of data analysis techniques eg through use of
scannable survey forms; and
• More coordination of sharing of results of use.

Library/Client congruence indicator A


• Students misinterpreted some of the terms;
• Option to calculate and display frequency distribution tables;
• Extend to cover indirect service to clients;
• Extend to cover multi branch operations;
• Measure user satisfaction level in more electronic environment;
• Too long; and
• Would be more useful if clients could rank the importance of various
services and programs in addition to their perception of the library’s
performance in the delivery of services eg Hernon and Altman’s tools.
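
The last suggestion, having clients rank importance as well as rate
performance, is essentially the gap analysis associated with Hernon and
Altman and with Parasuraman’s SERVQUAL work. A minimal sketch, with
invented services and mean ratings, of how such gaps might be computed and
ranked:

```python
# Illustrative importance-performance gap analysis in the spirit of Hernon &
# Altman and SERVQUAL. Services and mean ratings (1-7 scale) are invented;
# a positive gap flags a service that clients value more highly than they
# rate the library's delivery of it.
ratings = {
    # service: (mean importance, mean performance)
    "opening hours":        (6.1, 5.8),
    "document delivery":    (5.9, 4.7),
    "electronic resources": (6.5, 4.9),
    "shelving accuracy":    (4.8, 5.2),
}

gaps = {service: imp - perf for service, (imp, perf) in ratings.items()}

# Rank services by gap, largest shortfall first, to set improvement priorities.
for service, gap in sorted(gaps.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{service:22s} gap = {gap:+.1f}")
```

Ranked this way, a survey yields improvement priorities directly, rather than
a single satisfaction score.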

Document delivery indicator B


No suggestions made

Materials availability indicator C


• Upgrade software;
• Expansion of missing category to more clearly define the various missing
categories eg not on shelf, lost, not on shelf in correct location etc; and
• Too much emphasis on physical collections.
Since the survey was conducted late in 1998, it has been discovered that
software relating to two of the three CAUL indicators is not Y2K compliant.
In view of the fact that the software may need some upgrading, it would be
useful to revisit the indicators in more detail and seek further feedback from
CAUL members as to improvements.

4.3.6 Use of information gathered from performance indicators in benchmarking, quality management
Most of the survey respondents indicated that they were using the
information gathered from performance measurement to improve the quality
of resources and services provided. Methods and the extent of use varied
between institutions. However, it appears that there is further scope for
interlibrary benchmarking, based on some of the measures in place, and it
would be worth exploring this in more detail.

Table 4.5 Use of performance indicator information in benchmarking and quality management
CQU Benchmarking with other University Libraries
Quality improvement
CSU Process improvement
Curtin Data from Indicator A has been collated, tabulated and analyzed and a report
prepared with recommendations for follow up actions. These are being considered
in Unit operational planning and upcoming LIS-wide strategic planning
Deakin Collection development based on Conspectus levels. P/Is driving staffing and
resource allocation decisions and review of work practices.
Flinders Modified library access policies, student support policies and procedures. Process
improvements made
Griffith Increased number of CD ROM workstations as a result of Indicator A feedback;
Quarterly costing figures fed into Technical Services continuous improvement
processes;
Cultural awareness training for all service points staff as a result of feedback from
international students in 1997 Satisfaction survey
JCU Process improvement
Monash Plans to benchmark with U/Melbourne on staff perceptions and client satisfaction
Murdoch PI’s are used as one indicator of processes requiring improvement. Our Document
Supply work flow and processes associated with that were recently restructured.
The PI flowing from the ABN monthly management report indicated significant
improvement in our performance as a supplier.
NTU Indicators A, B used in Best practice project. Improvement plans are developed
based on feedback from client satisfaction surveys
QUT To improve workflows and services
To gain support for the Library eg additional funding for the collection
RMIT Incorporated into ongoing planning
U/Ballarat Information gathered is fed back to the teams to enable them to examine means of
process improvement
U/Canberra Used in the University’s Annual Report and in report to DETYA on
Quality Assurance
U/Melbourne Continuous improvement projects are always linked back to performance indicator
information eg shelving, costings

UNSW Target measures routinely used as part of annual planning review. In 1999
emphasis will be on improvement and relevance of current indicators as part of a
drive to enhance the quality management framework.
Indicator C used to focus on problem areas and attempt new strategies to improve
Client satisfaction data is now used to identify the perceived gaps and client
priorities
UQ Currently benchmarking with U21 partners—Indicator C 1998, Indicator A 1999.
Library constantly pursues process and quality improvement initiatives. Where
appropriate, comparisons are made between branches, other university libraries
and innovative sites
USQ Collection improvement; shelving procedures; change in procedures for handling
requests from external students
UTS We have conducted a review of virtually all of our functional units and services in
the past two years to achieve process improvements, cost efficiencies and overall
quality improvement. The reviews usually incorporate a literature review, customer
and other stakeholder focus group sessions, site visits to organisations providing
similar products or services. Information derived from these is considered in
developing review recommendations and implementation strategies.
UWA We have not used the CAUL indicators in this way prior to trialling Indicator B
U/Wollongong Teams monitor their performance indicators and develop improvement actions
within their Team Action Plans. Also feedback from clients is incorporated in
improvement activities. A KPI report is prepared annually on overall Library
performance within the KPI framework and provides a foundation for improvement
activities. Internal benchmarks have been established with most teams and some
external benchmarking has commenced, eg KPIs process improvement and
communication success were benchmarked with other organisations as part of the
Leading and Managing Improvement and Change benchmarking network. The
KPIs of bookvote use, materials availability and budget utilisation were examined
in an acquisitions benchmarking activity.
We also scan the Benchmarking Exchange (an electronic benchmarking bulletin
board) on a regular basis to identify potential benchmarking activities that are in
alignment with our KPIs and existing measures and data.
VUT User satisfaction survey results have been used as input to process and
service review

4.3.7 Training for performance measurement


Generally, the approach to performance measurement training has been
rather ad hoc. The most common approach has been one where staff who
are immediately involved in performance measurement activities are given
training on the mechanics of the CAUL surveys. With one or two exceptions,
there has been little attempt to train staff in why performance indicators
are used—ie the role of performance measurement in the quality
management framework generally. In the majority of libraries, the
training/awareness has been conducted in-house. (Issues relating to training
are explored in more detail in Chapter 6).

4.4 Future directions


Responses to the Performance Indicator survey provide the academic library
community with a comprehensive up to date overview of the use of CAUL
Indicators specifically, and the approach to performance measurement
generally. The majority of survey responses indicate that performance
measurement is an integral part of the quality management system within the
respective institutions. Best practice institutions have firmly integrated
performance measurement into the improvement cycle. Development of
in-house indicators has been more widespread than anticipated. It would be
useful to share these developments within CAUL so as to avoid duplication of
effort, and to provide a more common basis for benchmarking. The results of
the QUT/UNSW benchmarking project are important for a number of reasons,
not least of these being that it highlighted the enormous complexity of
making fair and meaningful comparisons between libraries, even on the basis
of superficially straightforward indicators. ‘If libraries wish to benchmark in a
rigorous and fruitful manner, the continued development of performance
indicators by CAUL and similar bodies is necessary, together with
implementation of the indicators by libraries, and a willingness to share the
results they obtain’ (Robertson and Trahn, 1997, p. 140).

The measurement of electronic media and services presents significant
challenges for libraries, and the evolving information services field. The rapid
pace of technological change, together with the early adoption of electronic
information and digital communications technologies by the global library
community, add to the critical importance of these concerns. The impact of
this rapid technology adoption challenges conventional library statistics and
measurement concepts (Young, 1998, pp. 157–158). There has been limited
progress in the Australian academic library community in addressing these
challenges to date. This is an area where it could well be worth waiting until
the outcomes of the most recent EU-funded research project, EQUINOX, are
assessed. To date, an initial set of performance indicators for electronic library
services has been developed, and feedback is currently being sought by the
EQUINOX project team, (http://equinox.dcu.ie/). Additional information
about the EQUINOX project is included in the Useful Sources list in
Appendix E.
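
Pending those outcomes, the general shape of an electronic-service indicator
can still be sketched. The measures below (sessions per member of the
population to be served, and the share of remote sessions) are illustrative
assumptions about the kind of indicator under discussion, not EQUINOX
definitions, and all figures are invented:

```python
# Illustrative electronic-service indicators; not EQUINOX definitions, which
# were still in draft at the time of writing. All figures are invented.
population = 18_500        # students and staff to be served
sessions = 296_000         # electronic service sessions logged in the year
remote_sessions = 74_000   # sessions initiated from off campus

sessions_per_capita = sessions / population
remote_share = remote_sessions / sessions

print(f"Sessions per capita: {sessions_per_capita:.1f}")
print(f"Remote sessions:     {remote_share:.0%}")
```

Much of the difficulty lies less in the arithmetic than in defining countable
units such as a ‘session’ consistently across systems, which is the
definitional ground such projects are attempting to cover.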

4.5 International comparisons


In introducing this section it is important to note, as with the corresponding
sections in the two other core chapters, that the information contained here is
presented with the following limitations. This EIP project contained no
allowance to parallel internationally the approach to information gathering
that was taken within Australia: there were no surveys of international
activity, and no visits to interesting sites identified by such surveys.

One member of the EIP team had, however, undertaken a series of
international visits to libraries in June/July 1998, financed by her home
library, UNSW, to examine a select few interesting ‘quality’ sites and to
investigate how quality frameworks and related activities had been
implemented, in order to better position the UNSW Library to improve its
own quality management processes.

Small case studies have been outlined here, drawn directly from those face to
face meetings and observations, to indicate some international practices in
these areas. In no way do they give a comprehensive outlook, but they do
illuminate international opinions and work situations. The literature review
and useful source materials provide an additional overall picture.

4.5.1 Practising what one preaches: University of Muenster Libraries
The Librarian at the University of Muenster is Roswitha Poll, a key figure in
the literature of performance measurement, and personally influential in
Europe in raising the professional profile of the theory and practice of
performance indicators for process improvement. Fr Poll edited the
IFLA Guidelines on Performance Indicators and sits on the ISO technical
panel for the same area. The European Union is an important sponsor of
projects in the area. Pieter te Boekhorst, the co-editor of the IFLA Guidelines,
was the Muenster contact person for the European Union sponsored
EQLIPSE (Evaluation and Quality in Library Performance: System for Europe)
project, and is now working on the successor project, EQUINOX which,
amongst other things, is attempting to develop performance indicators for
electronic libraries.

The Muenster Library has put theory into practice with a comprehensive
suite of thirty performance indicators. Its use of indicators is an integral part
of planning structures, including the regular articulation of mission, goals,
objectives, performance indicators and performance targets for the Libraries.
These processes were regarded as quite radical when initiated within the
Library, but other parts of the University have since been influenced by
this thinking.

One special Libraries project examined the relationship between external
loans, in-house use and photocopying (every 7 pages copied was counted as
one article). The finding was that for every 1 million loans there were an
additional 600 000 equivalent in-house uses. This relationship may hold good
for any research library.
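The Muenster relationship above amounts to a simple estimation rule. A minimal sketch follows; the ratio of 0.6 in-house uses per loan and the 7-pages-per-article convention come from the study as reported, but the function names and the idea of packaging the rule this way are illustrative only:

```python
def equivalent_articles(photocopied_pages: int) -> int:
    """Convert photocopied pages to equivalent article uses.

    The Muenster study counted every 7 photocopied pages as one article.
    """
    return photocopied_pages // 7


def estimated_in_house_uses(external_loans: int, ratio: float = 0.6) -> int:
    """Estimate equivalent in-house uses from external loans.

    The study found roughly 600 000 equivalent in-house uses per
    1 million loans, i.e. a ratio of about 0.6.
    """
    return round(external_loans * ratio)


# For 1 million loans the study's ratio predicts 600 000 in-house uses.
print(estimated_in_house_uses(1_000_000))  # 600000
print(equivalent_articles(70))             # 10
```

Whether the 0.6 ratio transfers to other research libraries is, as the text notes, only a conjecture; a library would need to repeat the in-house use count locally before relying on it.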

Muenster has also regularly devised and run surveys such as materials
availability, and staff satisfaction. The staff satisfaction survey was internally
developed from a project run by one of the junior librarians. It was
developed with full union consultation. Overall results were published, and
the results for each department discussed internally within that department.

Muenster also participated in a trial of the use of prototype software and
other means, to collect data on 59 selected performance indicators as part
of the EQLIPSE project. The software was problematic but the original
project has fed into EQUINOX, which aims to produce software ready for
market exploitation.

A recent initiative in Europe is the use of activity based costing as a
management tool in libraries. Fr Poll intends to publish a German language
manual on activity based costing in libraries in 1999.

4.5.2 Performance indicators in a quality context: Danish Business School Library
The Danish Business School Library defined a set of performance indicators a
few years ago which dovetails with set annual targets. It is the role of the
Library Quality Committee to consider annual performance against those
targets, and to report on progress. The Quality Committee members are
appointed by the Librarian but include a range of staff at all levels, and from
varied functions. There is a solid core of enthusiastic young staff whose
professional development is encouraged in this way.

DBS has a substantial set of spreadsheets published internally monthly and
yearly (English language list) with breakdowns on performance in all areas.
The favourite measure at present is one which shows:
a. the percentage of reservations which become available within the
specified time period; and
b. the percentage of available reservations that are actually collected.

The overall indicator is obtained by applying b to a to give the real
reservations performance.
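The combined reservation indicator described above multiplies the two percentages: the share of all reservations that become available on time (a), and the share of those that are actually collected (b). A minimal sketch of that calculation follows; the function and example figures are illustrative, not drawn from the DBS spreadsheets:

```python
def reservation_performance(total_reservations: int,
                            available_on_time: int,
                            collected: int) -> float:
    """Overall reservation performance in the style described for DBS:

    a = fraction of reservations available within the specified time
    b = fraction of those available reservations actually collected

    The overall indicator applies b to a (i.e. a * b), which reduces to
    collected / total_reservations.
    """
    a = available_on_time / total_reservations
    b = collected / available_on_time
    return a * b


# Illustrative figures: 200 reservations, 160 available on time,
# 120 of those collected -> a = 0.8, b = 0.75, overall = 0.6.
print(round(reservation_performance(200, 160, 120), 4))  # 0.6
```

The point of keeping a and b separate, rather than reporting only the combined figure, is that each identifies a different failure: a low a points to supply problems (recalls, shelving), while a low b points to users no longer wanting the item by the time it arrives.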

New indicators are trialled by using the committee to undertake a number of
small investigations on current issues as a matter of priority. For example,
DBS intends to examine patterns of student internet use, and one or two
other priority areas to find additional indicators, and to consider strategies for
dealing with emerging issues such as making best use of equipment (eg
dealing with frivolous internet use), and making best use of the catalogue
(automatic truncation and automatic synonym searching are to be included in
the catalogue design).

The DBS was also a partner in the EQLIPSE project mentioned above, and
outlined in the Useful Sources list (Appendix E). Difficulties encountered in
this project included:
• the level of bureaucracy required for an international project;
• the instability of the trial software;
• the indicators list was conservative and perhaps not the most useful
listing; and
• the original project was based on the premise that library systems could
produce data for management purposes with a little work. This proved not
to be the case.

4.5.3 The European researchers: Manchester Metropolitan University, Centre for Research in Library and Information Management (CERLIM)
The research staff at CERLIM have been connected in some way with all the
European Union sponsored projects with some bearing on performance
measurement. These include DECIDE, MINSTREL, DECIMAL, EQLIPSE and
CAMILE (Concerted Action on Management Information in Europe). Details of
all of these are in the Useful Sources list (Appendix E).

The EQLIPSE project was useful in terms of:


• summarising the current situation in Europe in regard to performance
measurement, and quality management;
• presenting a range of indicators;
• trying these indicators out in the field; and
• trying to present library management information requirements in terms of
a specification and prototype for a software package.

The successor EQUINOX project will take the outcomes, add in electronic
performance indicators and a requirement to come up with software that is
more attuned to user needs, and much closer to commercial reality.

The EQUINOX group will also consider the use of broader quality
frameworks such as the relevant Quality Awards. Initially the focus was on
ISO 9000, but project surveys produced a surprising strength of feeling in the
European library world that individual institutions want to pick and choose
what suited them, and not be tied to inflexible programs which cost a
great deal in time and/or money, as is the current perception in the case of
ISO 9000.

4.5.4 SCONUL Performance indicator project at Cranfield University
Jane Barton was the principal investigator for this 1998 project, under the
supervision of John Blagden. The aim of the PI project was to use existing
SCONUL (Standing Conference of National and University Libraries) data in
large part, to develop a small suite of performance indicators which might be
acceptable across the university sector. These would be used to monitor
performance, and to consider comparative performance from a senior
institutional management perspective. There were suggestions from vice-
chancellors that large swathes of ratios and comparisons (such as those in the
Effective Academic Library report) either do not tell the management what
they are interested in, or are so operational in nature and esoteric, that
university management find them virtually incomprehensible.

The aim of the new slim-line PIs was to allow individual institutions to see
how they compared with other institutions, and to check whether this is what
they want in each instance. It would also make it easier to identify sector
leaders in certain areas, and allow a clearer picture of each institution, so
that individual institutions which share some common trait may follow up
with certain others for benchmarking purposes.

The recommendations in the report which arose from the project, included
some matters of interest:
• expenditure on information provision includes both acquisition and access
(i.e. cost to access via ILL or electronically);
• seat hours and seat hours occupied per week are used to measure access;
• user education and information services are dealt with as a combined
activity expressed in terms of staff hours (very similar to a recent
Australian Universitas 21 approach);
• the recommendations are set alongside contextual information (library
context/institutional context; this is also similar to a U21 approach);
• impact was mentioned, but with the rider that the British Library is about
to select a number of tenders for research in this area, and that this should
be included further down the track;
• measurement of in-house use compared to borrowings each time there is a
significant change in library service is recommended, but these data are
not comparable so this would be for operational use only;
• there are recommendations on the liberalisation of access to holdings
nationally, and the consideration by libraries of this type of access over
holdings strategy for marginal areas;
• availability studies should include consideration of electronic publications;
• specific availability studies should be conducted for specific categories of
users only initially; and
• data sets for the electronic library should be developed in concert
with UCISA.

Europe itself, for all the collective efforts of the European Union, is still very
much preoccupied with regional concerns. European approaches are still
very localised and vary widely according to the region, and any move
towards cooperative initiatives such as benchmarking is unlikely to arise
from national initiatives.

European libraries are interested in what Australians are doing since
developments in the United States tend to dominate any international
perspectives for them. The ability of the Australian libraries to organise on a
national basis is also of interest to European librarians.

5 Quality/best practice

5.1 Terminology
One of the difficulties that becomes evident when discussing quality and best
practice activity in academic and research libraries is that of determining a
shared understanding of the terminology. A common theme of discussions
during site visits to specific libraries throughout the course of this project has
been that words such as ‘customer’ and ‘quality’ (perhaps because of their
more general association with the business environment) may elicit a degree
of cynicism amongst staff, and create a barrier to quality program
implementation. This is particularly evident for those further removed from
the actual implementation of ‘quality’ management programs and frameworks.
At both the University of Melbourne and the University of Wollongong, the
adoption of quality language and terminology is seen as integral to the
success of programs. ‘The use of some of the terminology such as ‘client’ or
‘customer’ may be unfamiliar at first, but it is vital in sharpening the service
focus and inculcating a professional view of the relationship between those
providing and those receiving the service’ (McGregor 1997).

Generally accepted definitions of quality include:
• ‘Quality…means a predictable degree of uniformity and dependability at
low cost, with a quality suited to the market’ (Deming 1986);
• ‘the extent of discrepancy between customers’ expectations or desires, and
their perceptions of the service’ (Zeithaml 1986); and
• ‘The totality of features and characteristics of a product or service that bear
on the library’s ability to satisfy stated or implied needs’ (ISO 11620).

Irrespective of which definition is preferred, it is clear that ‘quality’ cannot be
achieved without consideration of, and focus on, the customer or client. It is
perhaps for this reason, as much as any other, that quality management has
become widely recognised as having a significant place in contemporary
university and research libraries. The traditional service ethic of libraries,
which, in practice, has often meant the provision for users of what libraries
judged to be appropriate services, has been taken into a new dimension in
some libraries through appreciation of quality management principles, and
the planned implementation of these principles over a number of years.

Best practice and quality are often used synonymously, and, whilst there are
similarities, best practice has engendered its own definitions. The EIP ‘Best
practice for Australian university libraries’ project team has adopted the
Australian Best Practice Demonstration Program (ABPDP) definition that
defines best practice as:

The pursuit of world class performance. It is the way in which the most
successful organisations manage and organise their operations. It is a
moving target. As the leading organisations continue to improve the
‘best practice’ goalposts are constantly moving. The concept of
continuous improvement is integral to the achievement of best practice.
(ABPDP 1994).

This definition encompasses comments made by respondents to the project
team in an informal discussion with delegates at a 1998 Benchmarking
conference. Most defined best practice in terms of ‘a list of
characteristics/principles, rather than a general statement’ or ‘an ideal which
the library is pursuing, or to which it is aspiring. It is somewhat of a moving
target’. Some however, used ‘best practice’ in the plural to describe ’sites of
excellence where the lessons on how and why the site(s) work(s) so
effectively in the specified area(s) can be adapted to one’s own operation’
(Trahn, e-mail correspondence, December 1998). These comments were
echoed strongly by delegates at the Australian Library and Information
Association Conference, Best Practice Issues Session in Adelaide in October
1998.

5.2 Background
The ‘quality’ movement in university libraries in Australia developed out of
the climate surrounding the then Commonwealth Labor Government’s
Quality Audit of the higher education system during the period 1993–1995.
This process, rather than focussing on educational outcomes, assessed the
level of quality assurance policies and practices in place, resulting in a
‘ranking’ of universities that was subject to intense criticism (Williamson &
Exon 1996). Nevertheless, the audit period created an impetus for the review
and adoption of quality management programs both broadly across the
universities, and also within individual university libraries.

Although the principles of Total Quality Management (TQM) as espoused by
Deming and others during the 1980s had been enthusiastically adopted and
built into a number of library planning programs, the Quality Audit period
saw the formalisation of ‘quality’ into university and library management
documentation and terminology. Activity during this period included the
development of the current suite of three CAUL performance indicators. Some
institution specific initiatives at this time included the following examples:

• Monash University Library adopted a Total Quality Management program in
1992/3;
• Victoria University of Technology Library and Macquarie University Library
implemented a Total Quality Service (TQS) program during 1993/4; and
• Northern Territory University Library was awarded a grant to participate in
the Australian Best Practice Demonstration Program in 1994.

A Council of Australian University Librarians/Australian Institute of
Management TQM seminar in 1994 provided an opportunity for librarians to
share these experiences.

Subsequently, the provision of quality management training to librarians has
extended significantly, with a number of programs in place including those
sponsored by the Australian Library and Information Association (ALIA) and
the AIMA Training and Consultancy Services. The Australian Quality Council,
through its training programs, provides support and guidance for a national
quality framework—the Australian Business Excellence Framework, and,
through this, the Australian Quality Awards for Business Excellence.
A number of libraries have successfully participated in the assessment process
associated with the Australian Quality Awards since 1996, including Northern
Territory University, the University of Melbourne and University of
Wollongong libraries. (Issues related to the training aspect are explored in
more detail in Chapter 6).

Responses to the ‘EIP Quality/Best Practice/Performance Measurement Survey’
indicate that current interest and activity in quality and best practice across
the Australian university library sector may be influenced, not only by the
successful adoption of formal frameworks such as those described above, but
also by the structural settings of libraries within universities, which have
exposed some of these library services to other units within the institution
which practise or encourage frameworks, such as the ISO 9000 series of
standards or quality frameworks designed for vocational education and
training. Mergers between higher education and TAFE, and the convergence
of library and computing services, have both played such a role in some
libraries. Some Australian universities such as RMIT University, Swinburne and
Curtin University have dedicated quality offices or units which encourage and
support the implementation of quality practices across the university.

5.3 Survey findings

5.3.1 Why ‘quality’?

Without exception, university library managers see formal quality
management occupying a significant place in contemporary academic and
research libraries. The use of words such as ‘essential’, ‘important’ and
‘integral’ typify the responses received. A number of responses elaborated
and outlined why programs had been implemented, or what quality
management could or had contributed to individual libraries:

Formal quality management programs can act as catalysts to ensure
that excellent quality management and continuing improvement
practices and processes are planned, developed, implemented,
maintained and improved, and that these operate consistently across
the Library. (UNSW).

A formal quality management program provides concrete evidence of
the organisation’s commitment to client service, continuous
improvement and provision of quality service. (Griffith).

At Macquarie a formal quality service initiative implemented in 1994
proved an excellent way to focus on the current teaching and research
needs of the university, and enabled the library to reposition itself to
focus on the customer.

The perceived need to ‘demonstrate to the funding authority that services are
of high quality’ (Monash), and to provide ‘effective management of programs
in a climate of decreasing resources’ (Ballarat), were cited by a number of
libraries implementing quality management programs. Perhaps these
arguments are best summed up by Felicity McGregor from the University of
Wollongong Library:

The changing nature of universities is one of accountability for the
management of resources. Quality management will help ensure a
systemic approach is taken in assessing the effectiveness and efficiency
of resource management …We have found the application of a quality
management program has led to: innovative solutions to problems with
systems and resource limitations… (McGregor 1997, p. 83).

Of the twenty-nine CAUL member responses to the survey, ten libraries
specifically identified improving and maintaining client focus as a
primary outcome of either intended or implemented quality programs. This is
not unexpected given the traditional service ethos that transcends service
delivery across all sectors. More unexpected were the few responses that
indicated that whilst ‘quality’ should be considered in planning management
programs, it could be extremely difficult to implement. Central Queensland
University responded:

We have tried to implement a formal quality management program as
part of a larger university wide project. This received special funds and
was only implemented in two sections of the Library. The formal
program is very expensive and not feasible to maintain.

Given that, for a number of libraries, implementation of ‘formal’ quality
management is linked to the imperative to provide a basis for arguing the
maintenance or increase of funding levels, it is ironic that for others the lack
of available funding is the very reason formal programs cannot be developed.

5.3.2 Criteria for successful quality programs


Survey responses from libraries that have either implemented or are in the
process of implementing formal quality programs, indicate a number of
common elements that make such programs successful:
• Integration with the library’s (and wider university) strategic planning
processes leading to ‘clear articulation of mission or purpose, values and
goals’ (Deakin, ADFA, Curtin), and support from the institution;
• Planning system functions effectively to ensure implementation of
improvements (UQ). Align with business, marketing and HR plans of the
institution as well (Griffith);
• Commitment from senior, (Flinders and others) middle management and
supervisory staff (Curtin);
• Senior manager responsible for oversight of program (Curtin);
• Senior and middle managers prepared to relinquish some of their
traditional decision-making responsibilities (Macquarie);
• Integrated within management processes (Swinburne);
• Realistic goals and expectations set (Melbourne);
• Flexible enough to be tailored to meet individual library requirements
(Flinders, Macquarie);
• Adequate resourcing with stakeholders clear on the level of
implementation, and the appropriate resources required (CQU);
• Participation of all stakeholders including clients, in the process (RMIT);
• Need to provide a quantitative basis to support submissions and any
funding recommendations (Griffith);
• Ensuring continuous improvement by appropriate feedback and
implementation mechanisms (Curtin);
• Willingness of library to change as part of the process (CSU);
• The importance of a good internal presence working with an external
facilitator (Macquarie);
• Make sure use of frameworks does not detract from existing client focus
and feedback mechanisms, and on time and within budget service delivery
(Deakin);
• Effective training programs to ensure commitment to continuous
improvement (QUT);
• Communication of the underlying principles to all staff (Griffith);
• ‘Ownership’ of the program by staff at all levels, achieved through staff
involvement in the development and implementation of the program
(Griffith). This in turn leads to opportunities for staff to see ‘practical
application and benefit of such a program to their work areas, and the
library as a whole’ (Swinburne, ADFA);
• Development of processes and support mechanisms integrated into day to
day ways of working, (especially dealing with information management)
(UNSW);
• Enhancement of staff satisfaction and well being (Melbourne); and
• Development and maintenance of effective teams, armed with knowledge
of the principles of quality management and the tools (USQ).

Above all, commitment of senior staff, together with the ability to look
beyond the immediate and focus on future outcomes, was seen by a majority
to be the key to successfully adopting quality management. The following
comment, again from Felicity McGregor perhaps best sums this up when
discussing the motivation behind Wollongong’s adoption of the Australian
Quality Council’s Business Excellence Framework:

A major factor in selecting a program emphasising performance
measurement was strategic; to ensure that the library would be
equipped to meet future challenges…Quality management was
adopted…as a comprehensive and integrating framework which was
applicable to the library’s particular stage of development and to the
successful management of current and perceived future environments
(McGregor 1997, p. 83).

5.3.3 Impact of programs on staff


Generally, survey responses indicated that the impact of the introduction of
quality management initiatives/programs has been positive. A summary of
comments below describe some of the more significant outcomes:
• Improved client focus;
‘These programs have resulted in services which meet the needs of users,
awareness of staff to the needs of users, and more efficient management
of policies and processes’ (CQU).
• Staff satisfaction, improved morale, empowerment, improved
work practices;
‘In general, the effect has been positive and empowering for staff. The
focus on measurable outcomes has allowed staff to see the results of
initiatives, and changed work practices’ (Griffith).
• Strategic advantages for the staff and the library;
‘staff have a sense of where we are going (Strategic directions); we know
if we are achieving what we set out to achieve (operational plans, BSC);
we know the priorities’ (Deakin).
‘Clearer alignment of individual action plan objectives with the Library’s
operational and strategic plans’; ‘Increased profile within the University,
now more closely aligned with the University’s goals’ (Melbourne).
• Heightened awareness of quality issues/quality consciousness;
‘A large proportion of library staff have an increased understanding of, and
commitment to improving quality. Library service has improved in several
areas due to information gathered from quality programs being used to
redesign processes’ (QUT).
• Resource implications.
‘The Library has also benefited from being able to use information from
quality programs to lobby the University for more resources’ (QUT).

Not all institutions have had totally positive experiences. Difficulties have
been encountered with jargon and relating abstract concepts to actual work.
There are cases where some staff have remained unconvinced, or opposed to
the quality management approach, as it questions many assumptions and
work methods which staff have often used for many years (Melbourne).
Where there has not been full integration of quality initiatives into work
processes, quality can be viewed as ‘extra’ work (NTU) and the amount of
time for full implementation can affect how staff perceive the program (UTS).

5.3.4 Features of the organisation/structure


Survey respondents also identified a number of features within their
organisational structure/management which they believed had facilitated the
successful implementation of quality programs. Responses did not highlight
any particular feature/structure over the rest. Of interest were the differing
views on whether a designated staff member should be assigned the role of
Quality Coordinator—views were quite polarised on this issue.

A member of the Senior Management Team is the Quality Coordinator;
this has facilitated the successful implementation of these programs
(Curtin).

The Library’s Research & Development Officer has a coordinating role
in the implementation of the Library’s quality management programs by
researching, developing and overseeing the implementation of programs
(QUT).

The original Quality Steering group was replaced by a part-time Quality
Coordinator and Quality Coordination Group in 1997. The Quality
Coordinator works closely with the Manager, Planning & Projects
(Melbourne).

In contrast:

Because the culture is one of continuous improvement and delivering
client focussed services, there is no Quality Coordinator. A Strategic
Initiatives Coordinator keeps tabs on the teams we use to implement
aspects of our strategic plan, as well as undertaking project work.
We favour a model that has ‘quality issues’ mainstreamed, rather than
set up ‘quality management’ separately, and run the risk of staff seeing
quality and continuous improvement as something extra. (Deakin).

There is no separate Quality Committee because the decision has been
made that the existing team leadership groupings should integrate
quality issues into their normal functioning. The Quality Coordinator
position is simply an initial catalyst. There will be some ongoing
monitoring and coordinating functions, but these should not require a
separate position in the longer term. (UNSW).

Although at the time the survey was conducted, more than 50 per cent of
the respondents had assigned responsibility for quality to one person, the
intention was not always to maintain the position indefinitely. Many
respondents also indicated that a team based culture and a relatively flat
organisational structure were crucial to the successful implementation of the
programs, as was visible and ongoing support from senior management.
Specific examples include:

A Management Advisory Team made up of all team coordinators and
level 7 staff and above which meets monthly to review and develop
policies and communicate broader university issues to all staff.
(Wollongong).

The ‘7 Up’ group, which consists of all library staff of HEW7 and above.
All members of ‘7 Up’ are also members of a Priority Area taskforce
and are responsible for promoting and propelling the Priority Area quality
initiatives throughout the library and involving other staff in the
initiatives. (Queensland).

5.3.5 Quality management frameworks


Survey responses indicate that the adoption of standard external quality
frameworks (eg. Australian Business Excellence Framework, Balanced
Scorecard, ISO 9000) to support quality initiatives in Australian university
libraries, is not yet widespread. Many libraries however, have adopted or
developed their own programs, either based on standard frameworks as
above, or on the broader concepts of TQM. A number have indicated that
although formal programs may not be in place, the library actively utilises
quality tools such as benchmarking, performance measurement and
continuous improvement, usually in an effort to achieve best practice in
specific service areas. Both library and institutional imperatives influence the
choice of framework and/or quality improvement tools. The level of adoption
of standard quality management frameworks by Australian university libraries
is summarised below:

Table 5.1 Standard frameworks implemented in the past or currently implemented

CHEMS: Queensland (1998, assessed against libraries framework)

Balanced Score Card: Deakin (has added a 5th perspective—Information Resources—to the standard Learning & Growth, Internal Business Processes, Client/Customer and Financial Management perspectives)

Australian Quality Awards for Business Excellence: Melbourne (applied for recognition 1994; 1996 winner of Achievement in Business Excellence Award; self-assessment against framework planned in 1999); NTU (1998, institution winner of Progress towards Business Excellence Award); Wollongong (via internal ‘Quality and Service Excellence’ program; 1996 award at achievement level for Award for Business Excellence; 1998 finalist in Outstanding Achievement category)

ISO 9000: Ballarat (TAFE arm plans for Information Services Branch, which includes the Library, but with focus initially on IT accreditation—see below)

ISO 11620: Used only to inform performance measures chosen (Curtin, ADFA)

NT Quality Framework for Vocational Education and Training: NTU (1998, institution awarded Quality Endorsed Training Organisation status)

Swinburne Quality Management System (SQMS)—based on the Scottish Quality Management System with links to AQA and ISO 9000: Swinburne (required by the University to self-assess on one or more SQMS key processes annually)

TQM: Elements of TQM in Wollongong QSE Program and use of TQM training and tools; QUT

TQS: Macquarie, UTS, VUT

The frameworks formally adopted (Balanced Score Card, AQA, SQMS) have a
number of features in common. For example, all are multi-faceted and utilise
a number of similar techniques and tools. Both AQA and SQMS are heavily
reliant on self-assessment, and all have been adopted for their perceived fit to
the particular library’s/institution’s needs/strategic directions. Both the BSC
and AQA frameworks have their roots in the business environment, and have
undergone some adjustment or reinterpretation to encompass aspects of
library service delivery. A thorough and sustained implementation of a quality
framework, whether the result of a library only initiative or a university wide
initiative is not an easy process, as the relatively small number of fully
implemented frameworks indicates. How many of the institutions that
indicate they have ‘partially implemented’, are ‘in progress’, or are
‘considering initiatives’ will achieve full and successful implementation of a
quality framework could, in part, be influenced by the ease with which

74
CONTENTS

Guidelines for the Application of Best Practice in Australian University Libraries

libraries can make informed decisions about the most appropriate path to
follow, and the steps to take.

A number of reports and articles in the literature outlining the adoption and implementation of these frameworks, in particular the AQA model (McGregor 1997, Presser & Garner 1999), have helped to increase awareness of their potential usefulness and applicability across the sector. It is significant that whilst only six of the twenty-nine CAUL member respondents to the survey indicated that they had formally adopted these programs, a further eleven libraries indicated that they were either making use of AQA criteria, or were interested in working towards or gaining ISO certification; in all, 18 libraries reported some level of activity or intent, as shown in Table 5.2.

Table 5.2 Partial implementation or progress towards/intention to implement

ADFA: Intended use of ISO 11620, using CAUL indicators where they are not inconsistent with ISO 11620
ANU: Quality assurance about to be implemented as recommended by a review
Ballarat: Interest in AQA but no decision as yet; ISO 9000 possible for IT sector as well
Curtin: ISO 11620 has been used to develop key performance indicators; University Program & Planning Review aligned with AQA criteria
Deakin: Looking at AQA and ISO models as part of 1999 exploration of frameworks as a strategic imperative
Griffith: Investigating adoption of AQA with a view to assessment either late 1999/early 2000
Melbourne: Embarking in 1999 on an organisational self-assessment project based on the Australian Business Excellence Framework
Monash: Consideration being given to ISO 9002 certification of digital library
Newcastle: Plan to use ISO in 1999
NTU: Considering extension of ISO 9001 to Information Resources
RMIT: Successfully participated in audit of RMIT quality management system for teaching and learning (ISO 9001 certification); used aspects of ISO 9000 and AQA through RMIT Quality Office/AQC sponsored quality review and improvement process for two process flows
SLSA: Considering using AQA framework for self-assessment
Swinburne: May participate in ISO 9000 certification being undertaken by TAFE arm in 1999
UNSW: Working towards extended Divisional compliance with ISO and self-assessment and application for AQA award in 2000
USQ: Use the principles of AQA, Deming and ISO but no formal application for assessment
UTS: AQA criteria used to describe long term priorities for Library in a matrix
VUT: ISO 9000 being introduced to describe selected library procedures; one amalgamated campus successfully audited against ISO 9000 (WMIT)


As the survey responses clearly indicate, recent mergers between Higher Education and TAFE institutions have led to interest in the level of adoption of quality frameworks within TAFE, and how they might be extended across the new larger institutions. Specific examples include the University of Ballarat and Swinburne University of Technology.

5.3.6 Relationship between quality frameworks and tools


Libraries have described the relationship between the quality tools
(benchmarking, performance measurement) and overall quality management
programs in two ways; firstly as a means to improve processes internally,
and secondly to improve the strategic position of the Library within the
university community:

Information gathered from performance indicators has been used in two ways:

In a practical sense it has been used to improve workflows and services.


Politically, information has been used to gain support for the Library eg
a ‘poor’ result in conspectus has been used to argue successfully for
funds for the collection. (QUT).

Process benchmarking is seen as one tool in the quality toolbox; it is linked to the strategic plan, process improvement and overall involvement in the University’s organisational goals in Universitas 21 and Victorian relationships. We also link it to our application of the AQC Business Excellence framework. To this end, there is an intention to participate in the AQC benchmarking networks. (Melbourne).

Since benchmarking is a tool to enhance the ability to manage by fact and to contribute to process improvement, it particularly contributes to the enhancement of those elements in a quality framework related to those areas. It has a significant role in contributing to quality information as part of the quality management system (QMS) of the Division of Information Services. The Library target measure results are routinely used as part of annual planning review. Next year there will be particular emphasis on the improvement and relevance of current indicators as part of a drive to enhance our quality management framework. (UNSW)

An integral part of the Library’s Quality and Service Excellence program adopted in 1994, modelled on the AQ Awards assessment criteria, which focuses on continuous improvement and innovation, and encourages regular self-assessment and benchmarking. (Wollongong).


5.3.7 Informing future practice and integration into library operations
Irrespective of whether a framework is a standard, external model or has
been modified or developed internally, or whether it has been fully or
partially implemented, respondents all provided a number of positive
outcomes with respect to the way the framework and/or tools had
informed subsequent practice and the extent to which the tools such as
benchmarking and performance measurement had been integrated into the
way the library works:

As a result of these programs and subsequent to analysis of the results, recommendations are made to the senior management of the library. In some instances changes can be implemented straight away, and in others working groups needed to be established to examine possible solutions to issues or problem areas. Many positive improvements have resulted from the programs we have conducted. (CQU).

Setting up the BSC has caused us to totally revise the way we present
our Strategic Plan, our Annual Report, what statistics are collected etc.
It has had profound influence at the management level and, hopefully,
once all bedded down, to all levels within the organisation. (Deakin).

Improved, simpler, more effective planning process. At the micro level, improvements to specific operational processes (eg. shelving, ordering, lending) as a result of continuous improvement projects. (Melbourne).

Integrally. Results of quality management initiatives are fed back into the quality management programme. (Queensland).

Prior to QSE, many of our decisions and evaluation practices were based on intuition or gut feel. We have now developed a more sophisticated approach to process and service management, have a better understanding of variation and its impacts, trend data has been established and assists us in planning for future events. Our staff have also become strong champions of quality (and interestingly see it less as an additional thing to do) and would be reluctant to return to pre-QSE practices. (Wollongong).


5.4 Case studies

5.4.1 The Balanced Scorecard (BSC)—Deakin University Library
(Information source: Sue McKnight, University Librarian; Cate Richmond,
Strategic Planning Manager; Helen Livingston, Deputy University Librarian)

Framework description
The Balanced Scorecard (BSC) is, at its core, a way of grouping performance indicators that also provides a strategic management system. Developed at Harvard Business School by Robert S. Kaplan and David P. Norton, it was designed primarily for businesses as a means of focussing beyond financial measures, incorporating criteria that measure performance from three additional perspectives: customer satisfaction (Clients), internal business processes, and the organisation’s innovation and improvement (learning and growth) activities.

From our client focus groups we ascertained the hierarchy of values (or value
models) of our clients, and these have been used to define the objectives
within the five perspectives. For each Objective there are a number of high
level performance indicators that are relevant in our environment. The high
level performance indicators cascade down to Unit level indicators, and by
the end of 1999, we hope, into individual performance indicators in the
Performance & Planning review process.

Why BSC?
BSC provides a framework that the Library believes can be easily explained
and understood by staff and others. It has been given increased relevance
within the library environment through the addition of a fifth perspective—
Information Resources (‘satisfying demand for information from Library and
other resources') to the existing categories specified by the Library as Clients
('providing value to clients to help them achieve their goals’), Financial
Resources (‘building financial strength to develop Library services and assets’),
Internal Processes (‘excelling at processes for fast, effective delivery of services
and resources’) and Learning and Growth (‘enabling staff to lead and
innovate’). ‘The BSC is a tool for monitoring all facets of our work and service
delivery. It is really a “quality tool” not just performance measurement’.
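The five-perspective structure and the cascade from high-level to unit-level indicators can be sketched in miniature. The perspective names and objective statements are taken from the text above; the sample indicators and the helper function are illustrative assumptions only, not Deakin’s actual scorecard.

```python
# Illustrative sketch of Deakin's five-perspective Balanced Scorecard.
# Perspective names and objectives are quoted from the text; the sample
# indicators are hypothetical placeholders for the real high-level measures.
scorecard = {
    "Clients": {
        "objective": "providing value to clients to help them achieve their goals",
        "indicators": ["client satisfaction rating", "focus group feedback score"],
    },
    "Information Resources": {
        "objective": "satisfying demand for information from Library and other resources",
        "indicators": ["document delivery turnaround", "collection use"],
    },
    "Financial Resources": {
        "objective": "building financial strength to develop Library services and assets",
        "indicators": ["cost per service transaction"],
    },
    "Internal Processes": {
        "objective": "excelling at processes for fast, effective delivery of services and resources",
        "indicators": ["shelving turnaround time"],
    },
    "Learning and Growth": {
        "objective": "enabling staff to lead and innovate",
        "indicators": ["training days per staff member"],
    },
}

def indicators_for(perspective):
    """Return the high-level indicators that would cascade to unit level."""
    return scorecard[perspective]["indicators"]
```

In practice each of these high-level indicators would be broken down further into unit-level and, eventually, individual indicators, as the text describes.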


Background
Approximately three years ago the Library began to focus on the library
strategic plan as a means of developing future direction. Involving (and
engaging) stakeholders in the process was seen as a key issue—this was
achieved through the use of focus groups targeting key customer
groups/levels of students. Groups were externally facilitated, and from the
results the strategic plan was developed, and arranged to reflect what the
stakeholders identified as key or important. There was some difficulty
identifying measures for strategic directions—the Balanced Scorecard (BSC)
was recommended by the consultants engaged to drive the process and after
review was subsequently adopted.

Criteria for successful integration


• Clear articulation of mission or purpose, and values and goals is required
so that the continuous improvement process takes place within a
framework that everyone understands and can buy into.
• In addition it is really important to instil a culture of improvement
and change.

Training commitment
The management team has had many days of training about the BSC,
performance measurement, strategic planning etc. All professional staff have
had an overview of the BSC and have contributed to setting high level
performance indicators. All staff were given a presentation of the BSC and
what it means for the Library. Unit managers have discussed with their own
staff the BSC, strategic directions and have collectively set up operational
plans and targets.

Organisational features
Three teams have been formed (Information Skills, Information Resources,
Access and Delivery) to advise library management on various issues, and to
facilitate implementation of the Strategic Plan. All professional level staff are
involved in various working groups related to implementation. Non-
professional staff are not generally expected to participate but are given both
encouragement and the opportunity if interested.


The Library has a strong emphasis on keeping client needs in focus.


• Annual focus groups.
• Strong liaison with student association and other groups.
• Extensive School based committee representation.
• Well-established academic staff liaison programme.

In addition, the Library has a dynamic internal library intranet that aims to provide staff with access to the tools and information they need, together with internal communication and information exchange. It is arranged under four areas:
• Units—governance (annual reports, working copies of strategic and
operational plans, policies and procedures, BSC).
• Staff—(teams, training and development).
• Tools—(online library resources, project management resources,
templates).
• Help.

How well is the quality management system integrated?


The BSC is being built into all our planning and decision making. Setting up
the BSC has caused us to totally revise the way we present our Strategic Plan,
our Annual Report, what statistics are collected etc. It has had profound
influence at the management level and hopefully once all bedded down, to
all levels within the organisation. A Strategic Planning Manager monitors all
the teams we use to implement aspects of our strategic plan.

Criteria for successful integration


Teams (cross-campus and cross-divisional, often including a member from another area of the University, such as academic staff) are the key to getting ownership and excellent results.

Relationship between framework and tools


BSC has been the main impetus for development of performance
measures/indicators within the Library. Customer satisfaction is measured in
several ways. A University wide internal client satisfaction survey has been
developed. University staff consistently rate highly the services provided by
the Library. This survey may not be continued beyond 1999. Focus groups
have been used successfully to identify a hierarchy of customer values as part
of the BSC development and implementation. The intention is that focus groups will remain the main method of identifying client needs. Internal customer feedback processes are in place via online and written feedback.

Indicators to measure electronic access and use are seen as a priority area for
development but it is extremely difficult to find effective measures. BSC
includes general measures and targets related to ‘Using physical and
electronic information resources,’ however these measures do not indicate
what is used or the quality of the resource. Some work may be done in
linking electronic measures to client satisfaction measures.

Benchmarking generally is not used as a tool within the BSC framework. Benchmarking is generally viewed as having the potential to fix an organisation at a particular point in time or in one direction whilst ignoring customer interest and need in another. The Library is far more interested in the potential for continuous improvement through informal internal benchmarking rather than in institutional/library comparisons.

Challenges
A major challenge for Deakin may be the linking of BSC to other quality frameworks. We are investigating quality frameworks such as ISO, AQC and other models. We are revising all our policies and procedures using a template that will stand us in good stead with these accrediting bodies. However the prime motivation is to get consistent policies and procedures rather than go for an award.

The University Quality Facilitator is interested in supporting the Library to focus on AQC accreditation. The Library may use this as a way of looking at the ‘gap’ between BSC and AQC frameworks.

Achievements
Staff have a sense of where we are going (strategic directions); we know if
we are achieving what we set out to achieve (operational plans, BSC); we
know the priorities (strategic plan). Because we can see where we are at any
given moment in time we can easily communicate our achievements etc.

As a result staff morale is high and we get many accolades from the
University who can see that we are doing what they want and doing it well.
The University can also see how our strategic plan aligns with the University’s
strategic plan, and therefore how the Library is contributing to the
achievement of the University’s strategic objectives.


Summary
Like Wollongong (see below), Deakin University Library has successfully
adopted and applied a program primarily aimed at commercial business
operations to a service environment. The extent to which BSC will continue
to influence the management planning process will be dependent on staff
support, and continuing tangible improvement of planning and operational
processes. Indications are however, that BSC has provided Deakin with the
means to focus activity without losing sight of customer and client values.

An important aspect of the adoption of BSC has been its ability to provide a quality management and continuous improvement tool that can be ‘incorporated into all aspects of library practice’, allowing quality issues to be ‘mainstreamed’, rather than setting up quality management separately and running the risk of staff seeing quality and continuous improvement as ‘something extra’.

5.4.2 Australian Quality Awards—Business Excellence Framework—University of Wollongong and University of Melbourne libraries
(Information Source: Felicity McGregor, University Librarian; Margie Jantti, Quality Coordinator; members of University of Wollongong Library focus group).

Framework description
The Framework provides a roadmap for business improvement and long term
success. It is both an evaluation tool for the Australian Quality Awards for
Business Excellence, and a business improvement tool. It can be used for
internal self assessment by any organisation wishing to improve its business
results and ensure long term viability. It is also a useful tool to take stock of
where your organisation is now, and to involve your staff in getting to where
you want to be in the future. It focuses on key elements underpinning
effective management practices. The 1998 framework required organisations
to provide evidence using the ADRI (Approach, Deployment, Results,
Improvement) model against the following categories:
• Leadership
• Strategy and planning
• Information and analysis
• People
• Customer focus
• Processes, products and services
• Organisational performance
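As a rough sketch of how the ADRI model intersects these seven categories, the fragment below records, for each category, whether evidence has been documented against each ADRI dimension, and lists the remaining gaps. The category and dimension names come from the Framework as described above; the matrix representation and gap check are hypothetical illustrations, not part of the AQA assessment method itself.

```python
# Hypothetical evidence matrix: rows are the seven 1998 Framework categories,
# columns the four ADRI dimensions. A real assessment scores narrative
# evidence; here a cell simply records whether evidence is documented yet.
CATEGORIES = [
    "Leadership", "Strategy and planning", "Information and analysis",
    "People", "Customer focus", "Processes, products and services",
    "Organisational performance",
]
ADRI = ["Approach", "Deployment", "Results", "Improvement"]

def empty_matrix():
    """Start with no evidence recorded for any category/dimension cell."""
    return {c: {d: False for d in ADRI} for c in CATEGORIES}

def gaps(matrix):
    """Return the (category, dimension) cells still lacking evidence."""
    return [(c, d) for c in CATEGORIES for d in ADRI if not matrix[c][d]]

matrix = empty_matrix()
matrix["Customer focus"]["Approach"] = True  # eg a documented client survey programme
```

A self-assessment coordinator could then report `gaps(matrix)` to see which category/dimension cells still need documented evidence before an award submission.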


Why quality?
‘The changing nature of universities is one of accountability for the management of resources. Quality management will help ensure a systemic approach is taken in assessing the effectiveness and efficiency of resource management.’ (McGregor 1997, p. 83).

Choice of framework
‘A major factor in selecting a program emphasising performance measurement was strategic; to ensure that the library would be equipped to meet future challenges…Quality management was adopted…as a comprehensive and integrating framework which was applicable to the library’s particular stage of development and to the successful management of current and perceived future environments’ (McGregor 1997, p. 83). ‘The adoption of a formal quality management model has proven to be beneficial for us. We have adopted the Australian Quality Council’s Australian Business Excellence Framework for monitoring and measuring organisational performance. This model was chosen as opposed to other management models for its organisational fit, and congruence with existing principles and practices’.

Background
A formal quality management program, Quality and Service Excellence (QSE), was implemented in 1994. The objectives of QSE include:
• Development of excellent Library services through the implementation of a
total quality management program: Quality and Service Excellence;
• Development of a systematic approach to documenting the improvements
in client service which have been achieved to date, as well as providing a
basis for measuring future improvements;
• Library-wide commitment and priority to the application of quality
management principles to all processes and services;
• Establishment of a framework for regular self-assessment of the Library’s
activities and results;
• Library-wide focus on delivering increasing value to clients; and
• Staff who are empowered to use their individual and combined skills and
experience to improve processes and their outputs through needs-based
training and development.

A Critical Success Factor (CSF) framework was established in 1996 to identify areas that are critical to the ongoing success and sustainability of the Library. Key process and key performance indicators (KPIs) were also identified. Teams within the Library have developed performance indicators for their key processes and services within the CSF/KPI framework. KPI reports are produced annually; however, teams monitor their processes and services regularly throughout the year. Benchmarking commenced in 1997 and we have just embarked on our third project.

Organisational self-assessment based on the Australian Business Excellence Framework criteria commenced in 1996, and was repeated in 1998. Assessment will be an ongoing management tool. Client and staff surveys and focus groups are regularly conducted to measure the integration and demonstration of the values, and to ascertain the gaps between needs and performance outcomes.

Criteria for successful implementation


• Full support from executive management—a quality champion;
• Vision for the future—where does the Library see itself or want to see itself
in the future;
• Team-based culture;
• Training in the use of TQM tools and principles, facilitation skills and basic
statistical analysis;
• Opportunity to practise the skills, eg Quality Improvement Teams;
• Ongoing communication of quality activities, initiatives and results;
• Reinforcement of the model, eg in planning, communication media to
demonstrate it is not a one-off activity or a management fad;
• Recognition that leadership and initiative is not the sole responsibility of
management; and
• Perseverance.

Training commitment
(1994) QSE program. A consultant provided training in basic TQM tools and
principles; some staff received training in facilitation skills. In-house
workshops on TQM Awareness were developed, and refresher training was
also provided for staff who had not had the opportunity to participate in a
quality improvement team, and to introduce new staff to QSE. These
workshops are repeated on a needs basis. A number of these programs
would form part of a staff development program in any organisation, but the
focus of the Wollongong sessions is totally informed by the QSE approach
and its requirements.


Examples of in-house workshops include:


• TQM Awareness
• TQM Plus
• Quality Client Service
• Telephone Techniques
• Dealing with Difficult Clients
• Feedback Skills
• KPI Workshops
• Benchmarking
• Data gathering and statistical awareness, use of Excel
and statistical packages
• Participation in external workshops is also encouraged
(AQC short courses etc)

Organisational features
• Team structures were introduced in the early 1990s and training was
provided to facilitate the process. The Library now has a relatively flat
structure and is team-based;
• Multiskilling and job enrichment opportunities have improved overall
operations knowledge for many staff;
• A Management Advisory Team, made up of all team coordinators and level
7 staff and above, meets monthly to review and develop policies and
communicate broader university issues to all staff;
• Staff empowerment is encouraged by supporting judgement and decision
making throughout the Library;
• Supervisors are now called coordinators;
• A Quality Steering Committee was formed in 1995; and
• A Quality Coordinator was appointed in 1996—the key aims of this dedicated
position were to: drive QSE and facilitate the integration of QSE
principles throughout the Library; provide a resource to staff; take
responsibility for QSE documentation; and evaluate quality programs.

Criteria for successful integration


To overcome cynicism about the introduction of a new management fad, ongoing commitment and reinforcement of the program is vital. The role of quality champions cannot be under-estimated. Initially the champions may be members of the senior executive; however, to sustain a commitment to continuous improvement and value-adding activities, champions must be nurtured through all levels of the organisation. The organisation must develop an infrastructure that encourages and reinforces the values and behaviours that are congruent with how the organisation wants to operate.

Quality improvement programs were identified by analysing client feedback and external assessment results. The QSE program has been an evolutionary process and is continually refined as we develop our knowledge of quality management principles and develop recommendations for improvement. As we continue with organisational self-assessment and the application of the ADRI (Approach, Deployment, Results, Improvement) model, improvement activities have become part of the ongoing improvement cycle. We have demonstrated our commitment to the QSE program in the following ways:
• Quality management principles have been incorporated into the strategic
planning process, and our key strategies and actions are organised under
our Critical Success Factor (CSF) framework;
• Teams develop action plans which support continuous improvement
activities and strategic objectives;
• Annual and monthly reports are prepared within the CSF framework;
• Regular communications to staff reporting on outcomes and improvements;
• Framing both the Annual report and monthly unit reports under CSF
headings, to show the relationship of team functions to focus on areas
such as customer satisfaction;
• Involving all staff in the process;
• Surveys (staff and client) are incorporated in the annual calendar
of activities;
• Inclusion in core skills training for all staff, eg TQM Awareness training,
Client Service training, team building, Key Performance Indicators; and
• Annual development reviews include a section on staff member’s
contribution to quality.

Relationship between framework and tools

‘Prior to QSE, many decisions and evaluation practices were based on intuition or gut feel. We have now developed a more sophisticated approach to process and service management, and have a better understanding of variation and its impacts. Trend data has been established and assists us in planning for future events. Our staff have also become strong champions of quality (and interestingly see it less as an additional thing to do), and would be reluctant to return to pre-QSE practices’ (McGregor, 1998).


Challenges
• Initially, financial planning aspects were difficult to relate to work done
within the Library. With the cultural change that has occurred, there is a
feeling that this may no longer pose a problem as the Library has become
more business focused, and it is easier to see links and relationships
within this context. There has been an uptake of statistical measurement
and skills.
• Staff resistance was overcome as team skills were built up using tools such
as Myers-Briggs to identify individual strengths within teams. Cross-functional
teams support the framework, and enable the creation of links
between good ideas and subsequent change or improvement.
• Articulating high level concepts (leadership etc) and placing these into a
process flowchart.
• Process thinking applies less readily to some areas, eg information literacy
as opposed to serials data entry.
• AQC project (benchmarking outside LIS): cost and the inability to locate
appropriate internal or external funds might hinder participation.
Challenges also included establishing the credibility of the library
within an external network of organisations.

Achievements
• Innovative solutions to problems with systems and resource limitations;
• Improved client focus from ALL staff—not just Information Services;
• Improved alignment of strategic objectives and team actions;
• Leadership demonstrated at all levels within the Library and commitment
and participation by staff in the development of improvement goals
and strategies;
• Benchmarking visits to partner businesses helped to place the role of the
library in a financial/profit-oriented framework. Within the library context,
profit was identified as maximising the value of the investment made by
students for their education, and looking at the full range of what you do
and drawing more out of it. There is value in visiting other libraries to
place your own work in context. Improving internal processes can
lead to more cost-effective practices and negate the impetus for outsourcing
(for example).


Advice/summary
We have to get away from the concept that ‘quality’ is something extra we
have to do. We expect, and often demand, quality service and products in
our day-to-day interactions, and if we are stakeholders in other businesses or
groups, we expect that the resources will be managed well and provide value
to others. These expectations are also valid for libraries, and will become
increasingly important in a climate of economic uncertainty, technological
innovation and changing supplier services/products, and relationships
(McGregor, 1999).

University of Melbourne Library


(Information Source: Angela Bridgland, Deputy University Librarian,
Andrea Phillips, Quality Coordinator, Liz Neumann, Manager Systems and
Standards, Tony Arthur, Associate Librarian, Information Resources and other
library staff)

The University of Melbourne Library also uses the Australian Business Excellence Framework, and has been through the Quality Award process in 1994 and 1996 (receiving recognition at Achievement in Business Excellence level). The Library is now using the framework for organisational self-assessment. The Library’s quality program has evolved into a suite of integrated activities which includes strategic planning, continuous improvement of process projects, benchmarking, HR planning and related HR initiatives, organisational self-assessment, and development of KRAs/KPIs and related performance measures. Quality improvement is seen as an ongoing journey of organisational improvement and cultural change.

In 1998, the Library completed an internal self-assessment against two of the AQ categories: Information and Analysis, and Customer Focus. Each Division’s performance against these categories was assessed using the ADRI model, and detailed reports were produced. Self-assessment resulted in recommendations for improvement which will now be acted upon. In 1999 assessment will be undertaken across all seven AQ categories, on a library-wide basis. There will be some shortcutting of the process via questionnaires designed to define perceived weaknesses. A commitment to implementing the recommendations arising from the self-assessment process underpins staff commitment. Participation in the AQC Benchmarking Network has been deferred until all seven approaches have been clearly articulated. Implementation of quality management has increased:


• awareness of the need to look beyond the University and beyond the
profession for ideas and opportunities;
• reflection on the nature of the organisation and its business; and
• use of information for decision-making and recognition of the need to
question the status quo, and constantly seek to improve.

Use of the Australian Quality Council’s framework has also informed the
approach of the Australian Universitas 21 libraries to information sharing and
improving performance.

5.4.3 Swinburne Quality Management System (SQMS)


(Information source: Fran Hegarty, Director, Information Services; Denise
Doherty, Manager, Information Support Services; Rose Humphries, Library
Manager (Prahran Campus Librarian); Rob Carmichael, Head, Office for
Quality Education)

Whilst other case studies describe quality management systems and
frameworks reviewed and adopted by university libraries, the SQMS is an
institution-wide system. In adopting the system to manage library planning
and quality service delivery, Swinburne University Library has had the
support and advice of the University Quality Unit and staff, who have worked
actively with library staff to successfully implement the system.

Framework description
The SQMS model was originally based on the Scottish Quality Management
System, a system built around the needs of educational institutions
(see http://www.sconto.demon.co.uk/sqms.htm for details on the system).
It is defined within Swinburne documentation as ‘A documented management
system that meets the requirements of a defined standard, and designed to
ensure that the quality of the programs and services provided by the
University meet the goals and objectives. This management system also
includes the documented process for continuous improvement’ (Glossary—
SQMS Formal Review Information Pack, quoted in Swinburne University of
Technology, ‘Overview of SQMS’).

Swinburne has adapted what is largely an externally assessed system to one
that is used in-house, driven by a self-assessment process using the
ADRI (Approach, Deployment, Results and Improvement) model. Self-
assessment is undertaken against one of 15 criteria on an annual basis
(outlined at http://www.swin.edu.au/qed/overview.htm). These criteria have
been adapted from the 14 standards that form the basis of the Scottish
system, with the addition of ‘Research—the achievement of high standard
research activity’ as a fifteenth criterion specific to Swinburne.
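The ADRI cycle described above can be pictured as a simple record kept per criterion, per year. The following Python sketch is purely illustrative: the class, method names and the numeric scoring scale are assumptions made for the example, not part of SQMS documentation.

```python
# Illustrative sketch only: recording an annual ADRI self-assessment against
# one of the 15 SQMS criteria. The 0-100 scale and example scores are invented.
from dataclasses import dataclass, field

ADRI_DIMENSIONS = ("Approach", "Deployment", "Results", "Improvement")

@dataclass
class CriterionAssessment:
    criterion: str                              # e.g. a SQMS criterion name
    year: int
    scores: dict = field(default_factory=dict)  # dimension -> score (0-100)

    def record(self, dimension, score):
        if dimension not in ADRI_DIMENSIONS:
            raise ValueError(f"unknown ADRI dimension: {dimension}")
        self.scores[dimension] = score

    def complete(self):
        # An assessment covers all four ADRI dimensions before it is reported.
        return all(d in self.scores for d in ADRI_DIMENSIONS)

assessment = CriterionAssessment("Premises, Equipment and Resources", 1995)
for dim, score in zip(ADRI_DIMENSIONS, (70, 55, 60, 50)):
    assessment.record(dim, score)
print(assessment.complete())
```

Any workflow built on such a record would treat an assessment as incomplete until all four ADRI dimensions have been scored.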


In order to achieve improvements in key processes as described in SQMS, the
Swinburne Quality Review Program (SQRP) was developed. This consists of
continuous improvement through self-assessment and validation review.
Organisational units are required to self-assess on ‘one or more of the SQMS
processes annually’. This is followed by validation reviews of selected units
conducted by staff from other areas.

Why SQMS?
From the Library perspective, SQMS is seen largely as providing a framework
that has, unlike others, been developed specifically for educational
institutions; is flexible; lines up with ISO and AQC frameworks, and yet is a
quality management system in its own right. It has enabled the Library to
assess and improve a number of services formally by identifying
gaps/needs/areas for improvement, establishing the action needed to
improve the service, and then setting targets for improvement. Overall,
quality management is seen as a ‘high priority’, and senior staff are now
committed to seeing the adopted model work, and appear to be active
‘sellers’ to other staff.

Criteria for successful implementation
SQMS is viewed as a flexible approach to quality improvement. The Program
started with a series of pilot audits that were formal and not well received
due to lack of integration with core business. There is now a closer alignment
with the strategic planning process, and a movement away from corrective
action requests, to a more developmental focus through internal
benchmarking. This reflects the impression that quality at Swinburne is
moving to a continuous improvement/self-assessment model, in line with the
changed emphasis of the new ISO 9000 (2000) standard, and the Australian
Quality Awards for Business Excellence Framework, rather than a model
based purely on quality assurance/documentation.

From the library perspective, successful quality management includes the
following criteria:
• Understanding by senior staff charged with implementation of the
importance and value of a quality management program; and
• A need for all staff to see the practical application and benefit of such a
program to their work areas, and the Library as a whole.


Training commitment
• On-going ‘quality training’ programs, both in relation to the overview of
SQMS and Australian Quality Council programs, and in areas identified as
deficient, e.g. customer service training;
• A number of university staff, including members of library staff, have
received internal auditor training;
• Quality facilitators also have ISO 9000 training;
• Benchmarking training has been conducted by an external trainer; and
• Self-assessment teams are cross-functional and have been trained in best
practice for teams.

Organisational features
SQMS is seen as a process model for organisational change, working up from
unit level to corporate level. Self-assessment scoring occurs at unit level. It is
a top-down and bottom-up approach that encourages change from below,
linked to organisational and institutional strategic priorities. Recently, the
focus has changed from validation review, to validation through internal
benchmarking, on the basis that potential outcomes are likely to be better,
particularly in relation to continuous improvement and exchange of ideas. It
is useful therefore when more than one area undertakes assessment of the
same criteria in the same year.

To date the library has assessed performance against Criteria 13—Premises,
Equipment and Resources (1995), and is currently preparing for assessment
against Criteria 14—Communications and Administration. Once every five
years a full organisational audit is undertaken. Criteria are nominated for
review on an annual basis. Developmental Action Plans (DAP) are developed
as a result of feedback from the assessment process. These are reviewed by
the Validation team which then generates Corrective Action Requests (CAR).
The earlier exercise involved a large amount of work re-analysing and
re-interpreting criteria to match library needs at a micro rather than macro
level. The subsequent 1996 Validation Review process saw a team of four
validators, from all areas of the university, review documentation and
processes and pick up on gaps. Questions from this team were submitted in
advance of the site visit that focussed on one campus library only.


How well is SQMS integrated into Library management and planning processes?
Full integration has not happened as yet. Quality management processes are
still not central in all undertakings. Cultural change is happening slowly.
‘Quality’ has become a standing item at all meetings of management/sectional
groups.

Relationship between SQMS and other quality frameworks and tools—CHEMS and benchmarking
In 1998 CHEMS addressed benchmarking in library and information resources.
This was largely a qualitative exercise in which member institution libraries
were required to complete a questionnaire that addressed the following areas:
• Strategy, policy, planning and good management;
• Library services;
• Access;
• Collections;
• Support and training; and
• HR management.

Scoring was undertaken by CHEMS. A four-day workshop attended by
representatives from member institution libraries identified statements of good
practice. Self-assessment against the statements followed. From this, member
institution libraries were able to identify areas and prioritise for improvement.

Swinburne found that the CHEMS framework offered a ‘holistic’ approach
to the benchmarking process rather than the perceived ‘fragmented’
approach of other quality models. In addition, it offered the benefit of a peer
related framework with the underlying assumption that improving processes
will, by default, enable improved delivery of library services, and lead to
increased client satisfaction. Closing the quality loop and stressing benefits to
clients are seen as fundamental to the success of the approach.

The combination of CHEMS and SQMS has allowed Swinburne to assess
performance at both organisational and unit level. Because of the external
context of CHEMS, weaknesses identified against the statements of good
practice are likely to be addressed sooner. The Library found the discipline of
completing the questionnaire, and rating against statements of good practice,
to be an excellent method for identifying and addressing areas for
improvement—in this instance, external benchmarking provided a greater
motivation to improve performance.


ISO 9000
The TAFE division at Swinburne will be seeking ISO 9000 certification in
early 1999. The Library (which serves both TAFE and higher education
sectors) is considering the possibility of being party to this.

Service Level Agreements (SLAs)
Service level agreements are being developed with the Divisions. SLAs have
three components:
• Corporate Standard Services, ‘the minimum services provided by
Information Resources to all staff and students of the University’. These
include services such as library facilities open at each campus in core
hours, lending service, reference service, off-campus library services etc.
• Common needs/Common Services (Core Services)—’These are services that
are subject to annual agreement by Divisional management that are
common to all teaching divisions’. These include services such as funded
opening hours, varying loan periods according to demand, information
literacy key competencies, training, etc.
• Customised Services/Division Specific Services (Non-core)—‘services that
are …either Divisional specific and/or not able to be provided within the
Information Resources’ recurrent budget’. These include services such as
inter-library loans, research and consultancy, training—specific
subjects/classes etc.

SLAs include KPIs/targets and costs for some components, with some
utilisation of the SCONUL performance indicators. It is intended that during
the period of the SLA, an internal review group will review performance
against targets within the SLA. This self-assessment will also cover program
delivery within the Library, and may be a better way of assessing reference
services. Participation in self-assessment activity is seen as a priority in normal
work, and staff involvement has helped communication and enthusiasm for
the process.
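As an illustration of the three-tier structure described above, the SLA components might be modelled as follows. The tier names are taken from the report; the Python representation, service names and KPI text are assumptions made for the example.

```python
# Hypothetical sketch: one way to represent the three SLA components.
# Tier names follow the report; everything else is invented for illustration.
from dataclasses import dataclass
from typing import Optional

TIERS = (
    "Corporate Standard Services",     # minimum services for all staff and students
    "Common Services (Core)",          # agreed annually with all teaching divisions
    "Customised Services (Non-core)",  # division-specific or separately funded
)

@dataclass
class Service:
    name: str
    tier: str
    kpi_target: Optional[str] = None   # optional KPI/target text in the SLA
    costed: bool = False               # whether a cost is attached in the SLA

sla = [
    Service("Lending service", TIERS[0]),
    Service("Funded opening hours", TIERS[1], kpi_target="core hours per campus"),
    Service("Subject-specific training", TIERS[2], costed=True),
]

# Services that an internal review group could assess against targets:
reviewable = [s.name for s in sla if s.kpi_target is not None]
print(reviewable)
```

Filtering on the presence of a KPI/target, as here, mirrors the intended internal review of performance against targets during the period of the SLA.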

Challenges
Implementing any quality management system implies a number of
challenges. Those encountered by Swinburne include:
• Implementation within a multi-campus/TAFE/Higher Education
organisational environment;
• Communication—Effort has been put into communication through staff
meetings, in order to encourage all library staff to see how and where the
self-assessment process (for example) sits within both the overall program
and library-specific services. In addition, multi-campus challenges exist
with respect to communication, and may include making connections back
to senior management based at a specific campus, perceived over-servicing
at campuses, etc.;
• Overcoming resistance to, and encouraging, ‘cultural change’; and
• Difficulties dealing with recommendations and corrective actions (as a
result of the SQMS review process), when there are budgetary implications
or lack of university resources, policies or processes in place to address
the problem effectively.

Achievements
• Identified areas for corrective action; and
• Heightened awareness of the continuous improvement cycle.

5.4.4 Internally developed quality framework—University of Queensland

(Information source: Janine Schmidt, University Librarian, Jennifer Croud,
Financial Services and Projects Coordinator, Mary Lyons, Manager, Corporate
Services and other library staff)

A formal quality management program is an important management tool for
contemporary academic and research libraries. It provides information on
whether or not services perform as expected and contributes to improved
delivery of services and the efficient and effective use of resources.

Framework description
Priority areas
Priority Areas for the year are determined at the end of the previous year
during an Annual Review undertaken at a two-day session involving all levels
of staff, some of whom are present by virtue of seniority, and others in a
representative capacity. At the Review an ‘appreciative inquiry process’ is
used to consider the progress achieved by the Library during the last year,
and to position the Library for the future. User needs for integration into
service and information access goals are identified, and taskforces formed to
address issues and prepare possible implementation strategies. Then,
throughout the next year, the taskforces address the Priority Areas,
implementing new services and improving existing ones. TQM processes are
being implemented in key areas, for example document delivery.


Benchmarking
Areas identified for improvement are integrated into the Library’s Priority
Areas.

In 1998, benchmarking exercises through the Commonwealth Higher
Education Management Service (CHEMS), which manages the Commonwealth
University Management Benchmarking Club, and with Australian
Universitas 21 partners, the University of Melbourne and the University of
New South Wales, were carried out. In 1999 the Library will undertake a
benchmarking exercise with the University of Otago. Internal benchmarking
is also carried out, and the performance of various branch libraries against
specific criteria is compared.

Performance measures
Both quantitative and qualitative performance measures are in place.
Additional measures are being developed. With regard to process measures,
many statistical details are kept, and measures developed from these as
required. The perspective of client satisfaction is the basis for measurement;
activity-based costing is also being carried out.

The overall program of quality management is ongoing, but selected
components may either be one-off exercises, or only repeated every so often.
Priority Areas for address are determined each year as part of the overall
program of quality management. Selected areas may continue as priorities
from one year to another but, generally, areas vary, with newly implemented
and improved services being incorporated into standard operating practices
and no longer identified as requiring ‘special’ attention. While the
commitment to achieving best practice is ongoing, specific benchmarking
exercises are completed, and others initiated. Performance measurement
is ongoing.

Criteria for successful implementation
• clearly defined mission and goals;
• clear relationship of library goals to university goals;
• understanding of client needs;
• awareness of how services are currently offered;
• specified outcomes;
• planning system to implement improvements; and
• a vision of where you want to be, knowledge of where you are and a plan
to get from where you are to where you want to be.


Training commitment
• The ‘appreciative inquiry process’ of the Annual Review is explained to
staff at the beginning of the Review and a consultant facilitates the process.
• With regard to specific Priority Areas, taskforces are led by team leaders
who arrange awareness/training for team members as required.
• All staff are informed of the results of quality management
initiatives/benchmarking exercises through library publications, and in
information sessions.
• Staff are trained in the collection of data, both quantitative and qualitative,
that is part of standard operating procedures.

Organisational features
The ‘7 Up’ group
This group consists of all library staff of HEW 7 and above. All members of
‘7 Up’ are members of a Priority Area taskforce. ‘7 Up’ is responsible for
promoting and propelling the Priority Area quality initiatives throughout the
Library, and involving other staff members in initiatives. Through this group
quality initiatives reach all sections of the Library. A system of communication
is in place whereby all staff are informed of initiatives and the results of
programs. Also, the Library has a Projects Coordinator who:
• coordinates library projects related to benchmarking and best practice;
• coordinates quality assurance projects, including the development of
quality assurance policies and procedures, and the development of
performance measures; and
• maintains a database to facilitate the collection, analysis and reporting of
library performance data, and provides reports on library performance to
internal and external bodies as required.
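The report does not describe the database itself, so the following Python sketch is hypothetical: the record layout, branch names and figures are invented solely to illustrate how performance data might be aggregated for internal and external reporting.

```python
# Hypothetical sketch of aggregating library performance data for reporting.
# The records, field names and figures are invented for illustration only.
from collections import defaultdict

records = [
    {"branch": "Central", "year": 1998, "measure": "loans", "value": 120000},
    {"branch": "Central", "year": 1998, "measure": "ill_requests", "value": 4200},
    {"branch": "Ipswich", "year": 1998, "measure": "loans", "value": 15000},
]

def report(records, year):
    """Total each measure per branch for the given year."""
    totals = defaultdict(int)
    for r in records:
        if r["year"] == year:
            totals[(r["branch"], r["measure"])] += r["value"]
    return dict(totals)

print(report(records, 1998))
```

A structure of this kind also supports the internal benchmarking mentioned earlier, since branch-by-branch totals for the same measure can be compared directly.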

Criteria for successful integration
At the planning level through:
• the leadership of senior management; and
• The Annual Review ‘appreciative inquiry process’ and determination of
Priority Areas for address in the next year.

At the operational level through:
• membership of Priority Area taskforces being open to all;
• the collection of data, both quantitative and qualitative, being part of
standard operating procedures;


• results of quality management initiatives/benchmarking exercises being
disseminated to all staff through library publications; and
• staff being encouraged to scan the environment for opportunities for
improvement, and staff from all branches and sections being able to attend
conferences, read, monitor/scan webpages and conduct research on
specific topics.

Results of quality management initiatives are fed back into the quality
management program.

Challenges
• Would have benefited from more staff involvement and time (CHEMS);
• Difficulties in establishing the priority, change in focus (Benchmarking
document delivery project);
• Balance between keenness and knowledge, and the need to involve those
whose job responsibility it is (Planning and Appreciative Inquiry
processes); and
• Effective use of time: are managers spending too much time on the less
important, operational aspects of their jobs, and not enough on the
visionary, forward-focused aspects?

Achievements
Staff
• increased participation in strategic planning activities;
• opportunities for involvement in cross-sectional initiatives, and to work
with colleagues from other sections of the Library; and
• increased awareness of comparative data.

Library
• improved information on whether or not services are performing
as expected;
• improved delivery of existing services and implementation of
new services; and
• more efficient and effective use of resources.


For example, amongst other items over the last few years, the quality
management program has resulted in/aided:
• extension of the virtual library;
• improvement in customer service, particularly with regard to information
skills programs and service at ‘non-service’ points;
• refurbishment of library buildings and development of a
new library (Ipswich);
• implementation of a new integrated Library Management System;
• development of the Library’s Home Page as an interface to all
library services;
• improved shelving practices;
• improved document delivery services; and
• development of plans on flexible delivery and service options.

External frameworks
University of Queensland is a member of the Commonwealth University
Management Benchmarking Club managed by the Commonwealth Higher
Education Management Service (CHEMS). Each year the Benchmarking Club
reviews different areas of member universities. In 1998, one of the areas was
Library and Information Services. The Library was surveyed about many
activities in which it participates, and then compared with other
Commonwealth Universities. The Library scored the maximum rating in all
categories, and was the only library to do so. The Library is also considering
using the AQC and ISO frameworks.

5.5 Future directions


From the survey responses it appears that there is a growing trend towards
the adoption and application of quality frameworks within the Australian
university library community. Although less than 50 per cent of the
respondents have adopted and applied a framework to date, the majority
indicated their intention to investigate, implement or further develop partially
implemented frameworks in the next twelve months. Benchmarking and
performance measurement have been utilised as continuous improvement
tools in most of the libraries surveyed, and indicate a growing familiarity and
ease with what were viewed, early in the 1990s, as tools for the business,
profit/product-driven sector of the economy. In view of the competition for
resources both in funds and clients, and the growing focus on accountability,


productivity and competitiveness, it seems likely that the academic library
community will continue to implement and develop quality and best practice
management programs.

Programs of this nature will become increasingly important in a climate of
economic uncertainty, technological innovation and changing supplier
services/products and relationships. Continuous improvement and quality
management can be applied in non-profit, service organisations, and tangible
and relevant results can be achieved. It is far better for the organisation to
initiate performance measures, rather than have irrelevant or unrealistic
performance indicators imposed by bodies that have limited knowledge of
the role and function of libraries and the value they provide (Felicity
McGregor, UOW).

5.6 International comparisons


In introducing this section it is important to note, as with the corresponding
sections in the two other core chapters, that the information contained here is
subject to the following limiting factors. This EIP project contained no
allowance to parallel the approach to information gathering that was taken
within Australia: there were no surveys of international activity, or visits to
interesting sites identified by such surveys.

One member of the EIP team had, however, in June/July 1998 undertaken a
series of international visits, financed by her home library, UNSW, to examine
a select few interesting ‘quality’ sites, in order to investigate how the quality
frameworks were implemented, and how related activities had been
undertaken. The intention was to better position the UNSW Library in terms
of improving its quality management processes.

Two of those 1998 observations are recorded here to demonstrate two very
different approaches, by two very different university libraries, in two
continents, using different quality frameworks. The team does not claim this
is a balanced representation of all aspects of the international scene, but
simply a set of snapshot case studies which illuminate extremes and
represent implementation scenarios not paralleled within Australia. The
literature review in Chapter 2 reveals a fuller picture.


5.6.1 University of Central Lancashire, Preston, UK, Library and Learning Resources Services (LLRS)

General
University of Central Lancashire (UCLAN) is an example of a small ‘new’ UK
university, implementing innovative ways to support its non-traditional
student body, which tends to consist of part-time, mature-age, first-generation,
international, and minority students. Although at the bottom of UK
university ‘league tables’ in terms of resources, it rates highly in terms of
student satisfaction with the university experience.

Quality journey
UCLAN is a central city campus which is now one of the main employers in
the Preston area in this post-industrial era. The LLRS workforce is
exceptionally female in composition (even for a library), and has a high part-
time staff component. Library and computing arms of the university were
‘converged’ in 1995, with a significant impact on services and their quality
approach. As in Australia, converging technology and reduced resources have
ensured a continuing interest in convergence throughout Britain. Two well-
published librarians, Peter Brophy and Kate Coulling, were the key staff
initially involved at the LLRS. The LLRS Quality Coordinator oversaw the ISO
9002 certification process from 1992, and made a significant personal
contribution towards its success.

UCLAN LLRS has been ISO certified for almost five years, and has a mature
and simplified system. The perceived benefits of auditors’ feedback have
diminished as fewer improvements to the quality management system are
recommended. At the time implementation began, there were compelling
reasons why a quantum leap in the quality and consistency of services was
crucial to the survival and positioning of UCLAN.

The university-wide orientation program covers topics such as quality
assessment for teaching and learning, and quality issues in areas such as
campus services. Part of the program is called ‘A Day in the Life of a Student’.
This is used to highlight how quality related issues are real for the user, and
relate to treatment received from staff at all points of library and learning
services, and university facilities. Within the LLRS there is an informal
induction session on ISO. All individuals have access to Quality Workbench
software as their ISO 9000 documentation software and desktop information
system. This type of training is really a simple, ‘this is how you use our
quality documentation system, and this is what it is used for’ type session.
There are no separate formal sessions for LLRS staff, but initially there were

sessions on writing procedures, with practical components. Some staff are
trained externally in the Internal Audit process. There is a contact person in
each unit: a staff member, at base level or above, with a particular aptitude
or interest in quality issues.

Potential changes are simply directed through the update documentation
mode. Documentation is simpler than it was initially, usually less than one
page per document. The entire LLRS has fewer than one hundred processes
and policies. Quality Workbench was originally given to UCLAN as part of the
EEC-funded EQLIPSE project, and customised to meet their needs. Other
EQLIPSE project partners included Munster University and the Danish
Business School, which found less use for the software, as their quality
approaches did not include ISO certification.

UCLAN as a whole has also been certified against the IiP program
(Investors in People). LLRS staff were part of the random group selected for
interview, to assess whether UCLAN could be awarded the IiP mark. (IiP is a
government-supported assessment framework relating to only one of the
AQA categories, People.) Since UCLAN has an annual development
assessment program which, in the LLRS, is combined with the use of a
competencies profile, the LLRS is well equipped in relation to personnel
practices. The results of the annual assessments, in terms of training needs,
are aggregated by unit heads after the process has been completed, to
identify training which should be provided internally in the LLRS. UCLAN has
an excellent staff development program. The University is also a recipient of
the Charter Mark, which again is a restricted framework relating to service
industries and their customers. A range of clients were interviewed, as well as
staff who deliver service. The work for the ISO framework made preparation
for other certification visits easier.

Conclusions
The process has worked well in this setting because:
• Key senior staff were knowledgeable and supported the process;
• UCLAN as a whole is small, flexible and aggressive in pursuing service as a
marketing advantage;
• A structured quality management system such as ISO has proved useful,
given the high proportion of part-time and session-only staff, in ensuring
staff learn the correct procedures and carry them out;
• Simplicity and common sense are key values within the organisation, and
their quality implementation reflects this;

• LLRS prioritised how they would go about implementation by starting with
core processes; and
• Participation in research projects helped develop an understanding of
the process.

Future
The Quality Coordinator would like to see progress in using the European
Quality Award criteria to broaden the LLRS approach to quality.

5.6.2 TQM/CQI program adapted to a large US university: University of Michigan, the M-Quality program and the Libraries’ involvement

General
M-Quality began with a 1991 report to University of Michigan management
on Enhancing Quality in an Era of Resource Constraints: Report of the Task
Force on Costs in Higher Education. Michigan is a very large institution with
25 000 employees on multiple campuses. There is a strong culture based on
control at the faculty or equivalent level. M-Quality is an adaptation of TQM
principles and its ideology and tools are presented to each area head for
discretionary use, although, initially, there was an element of persuasion.
So much controversy ensued over the use of the term
‘customers’ that HRD later introduced the euphemism ‘those we serve’, to
continued scepticism. The driver for this massive program was the then
radical notion of using the same or reduced money to deliver a better service
through reduced costs.

Some academic areas rejected the notions and some, like the business
areas, nursing, etc., have embraced M-Quality because it fits with their
culture and other initiatives. It is at the faculty/unit level where the
continuous improvement is happening. At this level there are also Lead
teams, usually the executive/management group, who look strategically
at areas for improvement.

Team leader training of forty hours includes two hours of introductory
concepts and eight hours of refining tools and processes for selecting
appropriate issues to work on. This is to ensure that leaders select the critical
business practices of their unit, and choose something significant enough to
make an impact that is within their power to change. All projects must meet
certain criteria before
time is spent on them, and all leaders must be trained prior to team
formation. If a unit wishes to refine a topic area, or needs to learn skills to

use in ‘Managing by Fact’, HRD can be called on for a consulting fee. Course
attendance is also charged out to customer units, as HRD is required to
generate its own income.

A noticeable strategy at the University of Michigan is to call on very high-level
expertise within the university to give intellectual weight to M-Quality, and
thus convince staff of the validity and applicability of concepts. The four
basic principles of M-Quality are:
• Pursuing continuous improvement;
• Managing by fact;
• Respecting people and their ideas; and
• Satisfying those we serve.

For ordinary non-academic staff this means a range of things. It can mean
being on a team, or trying to implement ‘Quality in everyday activities’ (if not
officially on a team).

There have been M-Quality Expos, in which the Libraries’ staff have
participated, since 1994. Expos have been two day events, coordinated by a
team, with top officials of the university interacting with the staff
improvement teams, and information in a special lift-out of the university
newspaper. Team members keep their handouts and their materials and use
them to talk to their customers over the next year. The Expos have almost
evolved into a university expo, with an accompanying mini conference on
developing issues in running the university, which are open to all. In the
most recent Expo, new digital initiatives were demonstrated in the Millennium
Room collection of showcases for new technology.

Early general success stories focused on areas such as custodial (painters,
washers, cleaners etc), under pressure from outside contractors for
outsourcing. The first ever client surveys were done, and staff were radically
reformed into fast response teams.

In the academic area there was an ongoing team effort between academic
and support staff to improve the grant application process, and relations
between the two groups of staff. The first overall staff perception survey was
held in 1998.

Quality journey
The Libraries have fielded the following teams over time:
• Library Student Hiring Team;
• M-RUSH Team (materials processing);
• Serials processing;
• Shelf reliability;
• Digital Library Production Services; and
• Some other smaller groups included ILL, MERLYN records group
and Labelling.

Almost all staff attended a brief general orientation to M-Quality and its
concepts. About forty staff went through team leader training, and about a
dozen of these attended the forty hour facilitator training. Some staff
facilitated more than one team, others were short term leaders/facilitators
when others left the organisation. The Facilitator group had a management
literature awareness raising and experience sharing group going for a while.
Initially there was no university-provided management training at Michigan,
and staff in the Library received some de facto leadership training only
through M-Quality courses.

The first Libraries teams tried conscientiously to follow the recommended
seven step process and use the tools. Staff found this process too rigid; in
practice, only a small subset of the tools proved useful.
These were:
• flowcharts;
• staff interviews;
• cause and effect diagrams;
• time studies; and
• simple surveys.
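
Several of the quantitative tools in this list are simple enough to automate. As a purely illustrative sketch, with invented step names and timings rather than data from the Michigan teams, a basic time study reduces to grouping observed durations by process step and summarising them:

```python
from statistics import mean, median

# Hypothetical time-study observations: (process step, minutes observed).
observations = [
    ("check-in", 4.0), ("check-in", 6.5), ("check-in", 5.0),
    ("claiming", 12.0), ("claiming", 15.5),
    ("labelling", 2.0), ("labelling", 2.5), ("labelling", 3.0),
]

def summarise(obs):
    """Group durations by step and report count, mean and median minutes."""
    by_step = {}
    for step, minutes in obs:
        by_step.setdefault(step, []).append(minutes)
    return {step: {"n": len(times), "mean": mean(times), "median": median(times)}
            for step, times in by_step.items()}

for step, stats in summarise(observations).items():
    print(f"{step}: n={stats['n']}, mean={stats['mean']:.1f}, median={stats['median']:.1f}")
```

The same grouping pattern extends to tallying cause categories from a cause and effect exercise, or to coding responses from a simple survey.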

Some investigations stretched over eighteen months to two years, with teams
meeting only about an hour a week with occasional breaks: a very expensive learning
process. In the case of serials, it took the team 3–4 months to understand the
process initially, and fully flowchart it (15 feet long). Most of the teams did
not in the end come up with radical suggested changes. There was more of
an incremental improvement, hinging on defining categories of use and user,
and designing better forms. All groups made several presentations to the
senior management group with some material going on an internal web site.

Conclusions
• Cross functional teams provided opportunities to get to know and work
with people outside the immediate unit;
• Developmental opportunities were offered to staff at lower levels, who
would not otherwise have been involved in this type of improvement
process work;
• Enhanced meeting skills and more acceptance of the need for facts
(i.e. managing by fact);
• Some of the data has been re-used for other purposes;
• Sound incremental improvements made by defining categories of use
and user, and designing better forms; and
• Staff gained recognition in the email and the staff newsletter.

Future
The M-Quality process has suffered a hiatus at the institutional level recently,
and also at the library level, through changing university staff and agendas.

6 Staff competencies

6.1 Introduction
Staff competencies, including training, whilst not an integral component of
this report, have surfaced as an issue worth exploring. Survey responses have
indicated a diversity of approaches to training which range from ad hoc
informal awareness sessions to fully documented workshops. It is obvious
that the introduction of benchmarking, performance management and best
practice/quality improvement initiatives carries with it some need for a
corresponding focus on training for all staff, not just for those immediately
involved in the process. All staff need to be skilled in the use of the tools
and techniques which form part of these quality improvement processes.

Both the literature and the survey responses indicate that, whilst many
libraries have embraced quality improvement initiatives with some ardour, the
training and competencies required for staff to work effectively with the tools
and techniques have not always been adequately addressed. Some libraries
have recognised the need to ensure that staff have the appropriate blend of
skills to perform effectively in this new, more accountable, and therefore
more challenging environment; others have noted the need but have done
less of a concrete nature in terms of training to date. On the one hand, there
is an expectation that staff will apply new ‘business related’ principles and
concepts in their work, while on the other there is evidence that they are not
being given enough of the requisite knowledge and skills to interpret the
language, and apply the principles in practice.

When should this training and competency development occur? In the course of
discussions at one of the site visits, it was suggested that it would be worth
addressing these training issues through the professional courses/educational
process. Courses in library and information management need regular review
due to the rapidly changing nature of the industry in which professionals are
required to operate. Skilling graduates in quality management tools and
techniques before they commence employment, would help to ensure that
the profession is equipping its workforce with appropriate skills and
knowledge to work more effectively and efficiently in the rapidly changing
environment which characterises the library and information industry.

There are also sound arguments for quality principles to be integrated into
the routine staff development program of a Library, especially in the areas
of communication with clients (client service skills), communication between
library staff (team skills) and management training (leadership, strategic
planning, managing with information), just as the aim of quality management
frameworks is to become part of the routine way of operating. In many
situations, just in time, focussed training rather than a ’sheep dip’ approach is
far more effective, even when new approaches are being implemented. Large
scale training, particularly if not appropriate for the organisation, can lead to
a higher level of staff cynicism because the workshops or formal training did
not occur at the appropriate time, and in appropriate amounts,
to be either motivating or cost effective.

6.2 Competency standards


During the last decade there have been major economic and technological
changes which have affected Australia and made it necessary to examine the
way Australians work. In the library and information services industry alone,
increasing use of electronic means to store and access information has meant
that many library and information services professionals have had to
reorganise the way in which they work. To cope with this change, and
compete in both the domestic and international market, Australian industry
needs a skilled, educated and flexible workforce. It is in this environment that
the National Training Agenda was developed. Implementing the Agenda has
involved development of:
• A nationally consistent training system based on national curricula and
focused on industry needs;
• An effective training market;
• A national system for accreditation of trainers and assessors; and
• National competency standards (ALIA workshop booklet 1, 1997, p.7).

The Australian Library Industry Competency Standards have been developed
under the umbrella of the National Training Agenda. They were first
published in 1995. The introduction of competency standards provides
industry and enterprise with benchmarks against which it is possible to:
• Identify the skills and knowledge of an organisation;
• Ensure that workers are able to acquire necessary skills and
knowledge; and
• Measure performance levels within an organisation. (ALIA Workshop
booklet 2, 1997, p. 5).

Once identified, industry competency standards form the basis for a
nationally consistent framework by which workers can:
• have their existing skills assessed against the framework and formally
recognised (competency based assessment and skills recognition);
• have improvement in those skills organised about the competencies
(competency based training), and
• have access to skills based higher levels of remuneration
(skills based pay).

In the process, some industries with far from outstanding records in the area
of skills recognition, staff development and pay levels have been able to
improve their performance. Some large public organisations have
implemented the National Training Agenda thoroughly throughout the
organisation. There has been a lower level of adoption within universities in
Australia. Within university libraries, whilst the standards have assisted with
the re-definition of jobs and the skills required, there has been little formal
adoption of the assessment framework to date. Whilst it is an attractive
proposition to be able to recognise the depth of expertise built up over time,
particularly by support staff who may not have formal educational
qualifications, significant resources are required for formal workplace
assessment frameworks to be fully implemented.
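
The assessment logic behind such a framework can be sketched briefly. The competency names and numeric levels below are hypothetical, not drawn from the Australian Library Industry Competency Standards, which define units and elements of competency rather than scores; the point is only the shape of a gap analysis between a role profile and an individual's assessed levels:

```python
# Hypothetical role profile and assessed levels on a simple 0-5 scale.
# Real competency standards define units and elements rather than scores.
role_profile = {"client service": 4, "cataloguing": 3, "data analysis": 2}
assessed = {"client service": 4, "cataloguing": 2, "data analysis": 0}

def training_gaps(profile, levels):
    """Return competencies where the assessed level falls short of the role profile."""
    return {c: required - levels.get(c, 0)
            for c, required in profile.items()
            if levels.get(c, 0) < required}

print(training_gaps(role_profile, assessed))
```

In this invented example the gap analysis flags cataloguing and data analysis as development needs, which is essentially how a formal workplace assessment feeds an individual development review.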

The library and information industry has now received its first set of national
competency standards. Whether they will be widely adopted by libraries, and
how they will be applied and assessed, remains to be seen. To date, the only
evidence of acceptance has been their preliminary use in two public library
systems in New South Wales, four academic libraries in Western Australia,
New South Wales, Victoria and the Northern Territory respectively and one
State Library (Tasmania). The majority of libraries appear to be adopting a
‘wait and see’ approach (Bridgland, 1998, p. 174).

The Australian Quality Council has also produced Quality Management
Competencies which are nationally endorsed standards, as are the Library
standards. These are written for staff at all levels, and can be used to assist
with defining training needs in relation to quality awareness.

6.3 Training for benchmarking


Survey responses to the question of what type of training/awareness was
provided indicated a range of approaches from:
• awareness sessions and seminars for senior management and those closely
involved in the projects;
• briefing and data collection requirements for staff immediately involved in
the process; and
• comprehensive workshops covering all aspects of the benchmarking
process—conducted by either internal or external facilitators (AQC).

Libraries engaged in externally managed benchmarking projects, eg CHEMS
and AQC, tended to invest more time and effort into the training of staff. In
these projects, training tended to be initiated by the University, CHEMS or the
AQC, rather than the library itself.

It is evident from the responses and feedback from site visits, that a more
rigorous approach to benchmarking training would help to demystify a
process which is still largely viewed as being more ‘business oriented’ than
library oriented.

An important point which relates to the value of increasing the awareness,
knowledge and skills of those involved in benchmarking was raised in the
published outcomes of the QUT/UNSW benchmarking project. The positive
impact of the project on those being benchmarked (UNSW) was less than it
could have been. ‘Virtually all the advice available is directed at those
wishing to benchmark. It is virtually impossible to find words of wisdom for
those who are being benchmarked. A fully cooperative model linking two
equal partners is not the model portrayed in much of the advice on
benchmarking’ (Trahn, 1997, p. 135). In order to gain the maximum benefit
from the project, both benchmarking partners needed to have a mutual
conceptual understanding on which to build the partnership. Trahn rightly
states that ‘any library finding themselves part of a benchmarking process
initiated from elsewhere, needs to put in some structured effort of their own,
in order to raise the level of awareness about benchmarking for all staff from
all areas which will become part of the study’ (1997, p. 136). Familiarity with
the terminology and processes of quality assurance/quality management
would have enabled staff to view benchmarking with another organisation as
just another quality management tool to assist with further real process
improvement in their own organisation.

Garrod and Kinnell state that ‘benchmarking can be perceived as an
empowering tool as it focuses on process owners. It thus enables
paraprofessional and junior professional staff to play a more proactive role in
the identification of problems and the implementation of change.
Benchmarking also promotes teamworking and can lead to better
communication between various levels of staff. Benchmarking is a learning
experience; the process of analysing procedures, and identifying gaps in
performance, encourages staff to interact both with internal members of staff
where there is a common interface between tasks, and with external
organisations when establishing partnerships’ (1996, p. 147).

If the use of benchmarking in the library and information services sector is to
be promoted, and its relevance fully evaluated, then certain information and
training needs should be addressed. The development needs in relation to
benchmarking are of key importance to its implementation. A study
conducted by Garrod and Kinnell in 1995 highlighted training as a key area
requiring attention. In 1997 the training imperative was examined again. The
authors identified a number of key issues necessary for benchmarking to
succeed, including training:
• a definition of benchmarking is required, describing what benchmarking is
and what it involves in the LIS sector;
• a model or approach to benchmarking should be identified for use in the
LIS sector;
• processes are the most suitable subjects for LIS benchmarking exercises;
• training and skills in benchmarking, and other quality methods, are
required for all staff if programmes are to be successful;
• timescales involved: these should not be underestimated, as benchmarking
exercises can take as long as a year, per process, to complete; and
• communication: effective communication at all levels is essential in all
quality programmes and for the successful implementation of quality tools
(p. 113).

6.4 Training for performance measurement


Most of the training undertaken in relation to performance measurement was
either specific to the application of one of the CAUL indicators prior to its
use, or limited to staff involved in performance measurement activities. There
was a lack of evidence of a broader approach to training in this area, use of
performance indicators in the context of strategic planning for example,
although some mention was made of this in the responses to the quality/best
practice survey. Once again, only the University of Wollongong Library had
developed a comprehensive approach to training in performance
measurement techniques, through ‘an in-house workshop on developing key
performance indicators for all library teams. Two sessions included identifying
key processes and factors influencing performance in team areas, theoretical
background and the application of key performance indicators, linking
performance indicators to team key processes, Library key performance
indicators and Critical Success Factors. The Workshop is now offered as a self
paced workshop for all new staff.’

Responses indicate that training is addressed in the following ways:
• staff meetings;
• teleconferences;
• briefings for senior and middle managers on the methodology and
outcomes of the instruments;
• training in survey design and implementation techniques;
• training in data collection, data entry and analysis for staff immediately
responsible for this activity eg using CAUL Performance Indicator B;
• training in survey form distribution and coding prior to each survey period;
• communication of survey results via Library Management Information
website;
• incorporated into planning workshops. Strategic planning sessions for
senior and middle managers include a component on performance
indicators. Basic training undertaken as outlined in CAUL manuals for staff
immediately responsible for the data collection and analysis;
• awareness sessions and training for staff undertaking performance
measurement activities;
• planned for when the entire measurement framework is finalised. Part of a
staged process of strategic planning, linking to processes and training staff
in quality management approaches;
• staff in all core areas are familiar with the in-house indicators. In 1999
aspects of training related to measures, indicators and statistical
understanding may form part of staff development sessions which focus
on improving quality management. CAUL Performance Indicators A
and C require only on the job instruction as needed rather than formal
training sessions;
• staff are trained in the collection of data, both quantitative and qualitative,
that is part of standard operating procedures, and are kept informed of the
results of the quality management program through library publications
and information sessions;
• all staff are informed of the purpose and procedure before conducting
surveys; and
• training occurs within the context of the Library’s Quality Programme and
the University’s and the Library’s performance review Scheme.
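
Much of the data-handling training listed above amounts to straightforward survey arithmetic. As a hedged sketch, with an assumed 1–5 rating scale and an invented satisfaction cut-off rather than the actual CAUL indicator methodology, a satisfaction-style measure combines a response rate with a summary of the coded ratings:

```python
from statistics import mean

def survey_summary(ratings, forms_distributed):
    """Summarise coded 1-5 ratings: response rate, mean, and share rating >= 4."""
    n = len(ratings)
    return {
        "response_rate": n / forms_distributed,
        "mean_rating": mean(ratings),
        "pct_satisfied": sum(1 for r in ratings if r >= 4) / n,
    }

# Hypothetical survey period: 200 forms distributed, 80 usable responses.
ratings = [5, 4, 4, 3, 5, 2, 4, 4] * 10   # stand-in for 80 coded forms
result = survey_summary(ratings, forms_distributed=200)
print(result)
```

Even a sketch this small shows why the survey responses stress training in coding and data entry: the indicator is only as sound as the consistency of the coded ratings that feed it.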

6.5 Training for quality/best practice


Sundry responses and interviews indicate a range of training sessions have
been conducted which covered customer/client service, TQM, continuous
improvement, team building, key performance indicators, strategic planning,
benchmarking, facilitation, feedback skills.

Once again, approaches to training varied from the formal to the informal.
Levels of formality also varied depending on the participants. Training was
more likely to be formal if aimed at senior management, or at staff
immediately involved in the project. A less formal, broader based awareness
type session for other staff was in wider use. In most responses, training was
a ‘one-off’ session, program or workshop, conducted at the time the decision
to introduce an initiative was made. Institutions with ongoing training
programs which address quality management techniques (including
benchmarking and performance measurement) and skills are not common.
It is recognised that a significant investment of resources is required for the
successful implementation of this approach. It may be more likely to be in
place where the institution has a position dedicated to quality, staff
development and training. Training also tended to be more formal if the
program was a university initiative (CHEMS), and if it was conducted by an
external consultant/facilitator. For example, due to the rigorous requirements
of the Australian Business Excellence framework, the University of
Wollongong Library invested a large amount of time and effort into staff
training, and presented the most coordinated approach.

When training was limited to senior management and/or those staff
immediately involved in the project, effective use of communication became
a major contributing factor to the success of the project. Progress reports,
items in internal newsletters, reports on the Intranet, presentations at staff
meetings had the effect of keeping staff not immediately involved in the
initiative informed on progress, and contributed to a wider sense of
project ownership.

The following aspects of training have particular significance for quality
programmes in general:
• Ownership—in benchmarking where individual processes are subject to
analysis, comparison and improvement, it is vital that staff accept that they
have responsibility for, and authority over, their sphere of work. Staff need
to receive training which will help them to become aware of the
importance of their individual contribution to the organisation, and the
impact this has on overall effectiveness. It should include an understanding
of the relationships and interfaces with other processes and departments;
• Cross-sectoral work—provides opportunities for networking and training;
• Library & Information Services courses—education and training for quality
needs to be included in initial qualifications for undergraduates and
postgraduate LIS courses. There should also be mid-career updates in skills
and knowledge;
• Training in tools and techniques—with particular focus on the use of
quality management tools and techniques eg flowcharting, process
mapping, cause and effect analysis, understanding of statistics and other
measurement techniques; and
• Business and management training—there is a need for overall business
training, including finance, human resource management and organisational
development, so as to set the process within the larger framework. (Garrod
& Kinnell, 1997, p. 116).

6.6 Future directions


Training is essential if quality improvement programmes are to be successful.
There is a need for more emphasis on all aspects of management in initial
professional education. (Garrod and Kinnell, 1996, p. 147). If quality
initiatives are to succeed, all LIS staff need to understand the rationale behind
these activities, as well as to acquire and apply the necessary knowledge and
skills. (ibid, 1997, p. 111). It is recommended that training in quality
management tools and techniques be incorporated into the curriculum of
Schools of Library and Information Studies, perhaps under the heading of
leadership and management. A commitment to skilling staff in the appropriate
skills and techniques carries with it certain resource implications. Libraries
need to allocate both time and money to the issue of training. Training for
quality means:
• Ensuring that staff have a chance to participate fully, and are encouraged
to feel that their contribution to the overall success of the organisation
is valued;
• Ensuring that all staff are kept fully informed of events and changes, and
are also provided with opportunities to voice their opinions and make
suggestions; and
• Equipping staff with the knowledge and tools to carry out their jobs
effectively and, above all, to develop an awareness of the importance of
customer satisfaction within the overall objectives of the LIS. (Garrod &
Kinnell, 1997, p. 117).

Bridgland states ‘the future role of library workers very much depends on
how well those within the sector ensure that their changing knowledge and
skills service changing client needs and information applications. Job growth
projections by DETYA consistently place librarians in the top percentile for
demand of skills over the next 5–10 years. Continuing education and training
therefore has a major role to play in securing the future for those within the
sector’ (1999, n.p.).

6.7 International comparisons


In introducing this section it is important to note, as with the corresponding
sections in the other core chapters, that the information contained here is
subject to the following limiting factors. Since this EIP project
contained no allowance to parallel the approach to information gathering that
was taken within Australia (ie. there were no surveys of international activity
or visits to interesting sites identified by those surveys), this information is
drawn from observations made in 1998 during a UNSW Library sponsored
visit by one project team member to look at interesting overseas sites with
quality related systems and projects of interest.

Whilst the Australian National Training Reform Agenda and its structure and
language closely follow its British counterpart, British university libraries
appear to have left work within this framework to the vocational education
sector. Consequently, the most extensive international training programs in
the quality area are related to United States TQM/CQI training programs,
usually implemented as part of an institution wide framework or program. All
of these include leadership and team training, customer skills, and a little on
creative problem solving and tools, making meetings effective and perhaps,
more recently, using the Baldrige criteria for organisational self assessment.
Some examples are given below.

The Danish work is also very interesting but not yet available in English.

6.7.1 Training for Quality (CQI) in a US university:
Purdue: a case study
Purdue University uses an external training consultancy to provide
train-the-trainer materials and courses. The same organisation also delivers training
directly to specific units. In the very decentralised framework of Purdue,
where the power resides in the faculties, the central human resources unit
arranges access to, but does not deliver, training. All trained trainers use
workshops, and teaching materials and work books developed and supplied
by the training organisation. The Continuous Quality Improvement based
Excellence 21 quality program is predicated on teamwork and involvement,
and this is the focus of training. Specific quality related programs available
from the trainer’s catalogue include:
• Quality through the eyes of the customer;
• Quality: the individual’s role;
• Quality: the leadership role;
• Clarifying customer expectations;
• Resolving customer dissatisfaction;
• Solving quality problems;
• Tools and techniques for solving quality problems;
• Participating in quality problem solving sessions;
• Leading quality problem solving sessions;
• Analysing work flows;
• Focusing your team on quality;
• Building individual commitment to quality;
• Sustaining momentum for continuous improvement; and
• Making team meetings work.

The training area is planning to incorporate sessions beyond the basics, such
as Tapping Staff Potential, which aims to:
• Foster the service oriented culture in a seamless way (ie without
unit barriers);
• Identify client needs;
• Provide exceptionally responsive service; and
• Present continuous process improvement at a more sophisticated level.

The introductory awareness raising program is called Mindset for Continuous
Improvement.

Key industry involvement in the establishment and continuation of the Purdue
Excellence 21 Program came from the Motorola Corporation and Motorola
University, which wanted to influence the way the university bureaucracy
worked, and also to try to ensure that graduates with the technical skills had
also been exposed to the appropriate skills to function effectively in the
workplace. The Motorola influence has been powerful and continuing for a
number of years. Motorola was a key U.S. Baldrige Prize winner with a world
reputation for the implementation of TQM throughout their global company.

Over the years the emphasis has changed towards performance measurement
and process improvement, with more rigour in data collection and a
requirement to build data collection into the workflow against new targets, so
that subsequent performance deviations can be tracked easily. Another area
supported is the formation of cross campus teams to break down the usual
barriers (eg. executive assistants).
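
Building data collection into the workflow so that performance deviations can be tracked reduces, in its simplest form, to comparing each new measurement against a target and a tolerance. A minimal sketch, with invented target and turnaround figures rather than anything reported by Purdue:

```python
def flag_deviations(measurements, target, tolerance):
    """Return the (index, value) of measurements outside target +/- tolerance."""
    return [(i, m) for i, m in enumerate(measurements)
            if abs(m - target) > tolerance]

# Hypothetical weekly turnaround times (days) against a 5-day target +/- 2.
weekly_turnaround = [4.5, 5.2, 6.8, 9.1, 5.0, 2.4]
print(flag_deviations(weekly_turnaround, target=5.0, tolerance=2.0))
# flags the weeks whose turnaround drifted outside the acceptable band
```

Because the data is captured as part of the workflow, a check like this can run continuously rather than waiting for an annual review.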

6.7.2 Using library competency profiles: the Danish Business
School library
One of the rare international examples of using a competency based
framework exists at DBS, where the staff intranet displays competency listings
developed for about eighteen different library staff profiles. With these
eighteen profiles all positions in the DBS library are covered. DBS do not use
the profiles for assessment in relation to progression or salary but they are
used for awareness raising and for identifying staff development needs at the
individual yearly developmental review.

Staff development is provided through the business school, through the Royal
Danish Library School continuing education and the newly established Danish
Library Centre, which focuses on programs in newer skills for library staff
across all of Denmark.

6.7.3 Large scale training using M-Quality program to
introduce leadership training: University of Michigan
Michigan used the intellectual approach of delivering a large number of
lectures from business gurus to try to win middle level hearts and minds. This
was appropriate considering that the whole program came about because the
Head of the Business School became head of the university.

At the lower level, training was extensive and consisted of team leader and
facilitator training of forty hours for facilitators, which included two hours of
introductory concepts, eight hours refining tools and processes, and how to
select appropriate improvement projects to work on. These had to be critical
business practices in individual units, something significant enough to make
an impact on, yet also within their power to change. No team was allowed to
go off ‘doing’ something until the leader(s) had been through this training. All
projects had to meet certain criteria before time was spent on them. All teams
had to have a leader and a separate facilitator, who monitored the process
rather than the content.

Almost all University of Michigan Libraries staff attended a brief general
orientation to M-Quality and its concepts. About forty staff went through team
leader training and about a dozen of these attended the forty-hour facilitator
training. Some people facilitated more than one team, others were short-term
leaders/facilitators when others left the organisation. The Facilitator group had
a management literature awareness raising and experience-sharing group
going for a while with a regular core of facilitators.

At the time M-Quality was implemented, there was no management training
provided within Michigan from the centre, so staff in the Libraries received
some de facto leadership training through M-Quality courses. HRD have since
developed the initial management training and recently put in place an
excellent program. For a number of library staff M-Quality was the first and
only source of supervisor training and they are grateful for the extensive
nature of the training.

If a unit needs further help refining its choice of improvement processes, or needs to learn skills to use in Managing by Fact, HRD can be called on for a consulting fee. Course attendance is also charged out to customer units, because HRD has to generate its own income and be essentially self-sufficient.

For ordinary non-academic staff, M-Quality means a range of things. It could mean being on a team, or it could mean trying to implement 'Quality in everyday activities' (if not officially on a team). A publication, 'Becoming Involved in M-Quality', was written by a team of enthusiastic staff to interpret how M-Quality could influence each worker, even those not on an improvement team.

The multi-day annual M-Quality Expos, which were showcases for improvement teams, have evolved into a type of internal trade exhibition with an accompanying mini-conference on developing issues in running the University. This is seen as a consciousness-raising mechanism for staff who attend those sessions as well as for those who visit the Expo.


7 Conclusions and recommendations

7.1 Conclusions
There is considerable activity occurring in Australian university libraries
in the areas of benchmarking, performance measurement and best practice.
However, to this point, very little has been published which would enable
libraries to learn from the experiences of other institutions across the country.

This investigation has identified a number of exemplars of 'best practice' in each of these areas. Libraries can learn much of value by reference to these applications. In addition, the project has identified a need to review the adequacy of the current CAUL Performance Indicators. Many Australian university libraries have also expressed the need for the development of additional indicators in several related segments of library performance.

In all areas of best practice, overseas experience and activity should also
be monitored. This study has drawn on these and has highlighted those
initiatives of particular note for the Australian environment.

The information provided in this report should assist Australian university libraries in identifying models and techniques which may help them initiate programs appropriate to their individual missions and organisational cultures. The completion of the 'Handbook of Best Practice', an initiative which will be further refined and published as an outcome of this project, will further facilitate such initiatives.

7.2 Recommendations

7.2.1 Benchmarking
Recommendation 1: That the Australian academic library community further
investigate membership of organisations such as CHEMS and AQC as a means
to benchmark performance.

To date only a small number of Australian academic institutions are members of CHEMS. The libraries of those institutions which participated in the 1998 benchmarking of library and information resources found the exercise to be
both positive and useful. Increasing the membership of CHEMS in both the Australian and New Zealand academic communities could facilitate greater use of benchmarking as a tool to improve performance. The requirement to rate performance against statements of good practice would thus be shared amongst a greater number of libraries.

To date, three Australian academic libraries have successfully demonstrated the applicability of the Australian Quality Awards Framework for Business Excellence to the academic library sector. The Australian Quality Council's Benchmarking Network provides an additional avenue for academic libraries to engage in benchmarking with organisations which have been recognised for business excellence and which have successfully demonstrated the application of benchmarking.

7.2.2 Performance measurement

Recommendation 2: That CAUL investigate the potential inclusion in the annual AARL compilation of statistics relating to research/reference services, including the provision of advice, education and information.

At the present time, academic library performance in the above areas is not
included in the annual AARL Statistics. Given that performance measurement
in these areas has been problematic due to the qualitative rather than
quantitative nature of the measurement, more widespread documentation of
the results may assist in informing the academic library community on levels
of activity and performance within these areas.

To date, the QUT/UNSW benchmarking project has been one of the few
recorded attempts to benchmark reference/research support services. It
highlighted the lack of publicly available performance data and the lack of a
common methodology for measuring performance in specific research
support services, acquisitions and cataloguing. This view was also highlighted
by survey respondents in the current project study.

The final report of the CRIG Working Party on Performance Measures for Reference Services states that the addition of reference service statistics to the AARL statistics may lend impetus to the development and adoption of the recommendations made by CRIG in its report. The report recommended that:
• the proposed [ASK] model for evaluation of reference service effectiveness
be taken up by the profession;
• the twelve performance indicators for reference services be adopted by
Australian university libraries and additional performance measures be
developed where necessary; and


• prioritisation of the twelve performance indicators and attendant performance measures be addressed at a national level by Australian university libraries.

It is important that CAUL address these considerations.

Recommendation 3: That CAUL establish a formal timeframe/program for the review and subsequent updating of existing performance indicators (ie biannually).

It is evident, both from the priority areas identified for updating and modification and from the level of in-house adaptation of indicators reflected in the survey responses, that if the CAUL Indicators are to remain a valid and useful tool for libraries, consideration must be given to ensuring their relevance in relation to:
• changes within the profession that may impact on service delivery (particularly in the areas of Document Delivery and Customer Satisfaction); and
• external factors that may impact on their continuing usefulness (eg technological advances, in particular the growing area of electronic information resources and services, and Y2K).

Specific responses to the project survey recommend a number of potential changes to existing indicators which could be considered within a program of formal and regular review.

Recommendation 4: That CAUL investigate the feasibility of creating a database for registering and sharing indicators currently in use by Australian university libraries (both external and in-house), and that resources be allocated for the regular (annual) review and updating of this database. This could include possible CAUL sponsorship of a part-time position to monitor and report on developments in this field.

Survey responses illustrate the large amount of effort that libraries have invested in the development of in-house indicators. There has not been across-the-board adoption of the CAUL indicators, and it is quite likely that there has been some duplication of effort in the development of in-house indicators within various institutions. Exploring the possibility of sharing this effort, and the indicators developed to date, was noted by a number of senior library managers in the course of the EIP site visits and is thus worthy of further investigation. Given that published literature on performance measurement within the Australian academic library community is not prolific, a database could fill a significant gap in the availability of performance information and facilitate wider participation in library benchmarking.


Recommendation 5: That CAUL investigate the development of indicators to add to its existing suite, and that this development include the possible review and adaptation of overseas work. Specifically, indicators which should be considered include:
considered include:
• Indicators for the electronic library;
• Indicators for evaluation of information literacy training and reference
services; and
• Additional work on the client satisfaction indicator.

The work being undertaken by the European Commission funded EQUINOX (Library Performance Measurement and Quality Management System) project team indicates that it is fairly well advanced in the development of indicators for the electronic library. EQUINOX builds on the recommendations of earlier EC projects, including EQLIPSE, MINSTREL, DECIMAL and DECIDE, and on the CAMILE Concerted Action, by taking a world lead in developing electronic library performance indicators. An initial list of electronic performance indicators was posted on the EQUINOX website in June 1999. It would seem to make good sense for the Australian academic library community to maintain a watching brief on this activity, participating in any trials/tests that may become available rather than duplicating the work that has been done to date.

There has been an increase in the level of involvement of librarians in the teaching of information literacy/skills. This activity is now more likely to form part of the curriculum and usually includes completion of a formal assessment activity. To date, most measurement activity has been limited to counting the numbers attending sessions. With this increase in activity has come the need to look beyond a quantitative measure towards a measure that will provide information on the achievement of outcomes/level of competency attained.

Although CAUL has already developed an indicator for client satisfaction, many of the respondents expressed some dissatisfaction with its current form. It would be useful to examine the alternative approaches adopted by some survey respondents, in particular those who have utilised the work undertaken by Calvert and Cullen in New Zealand and by Parasuraman, Hernon and Altman in the United States, where the focus has been on stakeholder evaluation of library effectiveness through the use of a gap analysis/constituency model.
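In broad terms, a gap analysis model of the kind just mentioned scores each service dimension as the difference between users' mean perception and mean expectation ratings, with negative gaps flagging dimensions that fall short of expectations. A minimal sketch of that calculation follows; the dimension names, rating scale and responses are hypothetical illustrations, not data from any of the studies cited:

```python
# Illustrative gap analysis (hypothetical data, SERVQUAL-style survey).
# For each service dimension: gap = mean perception - mean expectation.
# A negative gap indicates the service falls short of user expectations.

def gap_scores(expectations, perceptions):
    """Return {dimension: gap} from per-dimension lists of survey ratings."""
    scores = {}
    for dim in expectations:
        exp_mean = sum(expectations[dim]) / len(expectations[dim])
        per_mean = sum(perceptions[dim]) / len(perceptions[dim])
        scores[dim] = round(per_mean - exp_mean, 2)
    return scores

# Hypothetical ratings on a 1-7 scale from three respondents:
expectations = {"reliability": [7, 6, 7], "responsiveness": [6, 6, 5]}
perceptions = {"reliability": [5, 6, 5], "responsiveness": [6, 6, 6]}

print(gap_scores(expectations, perceptions))
# -> {'reliability': -1.33, 'responsiveness': 0.33}
```

Comparing gaps across dimensions, rather than raw satisfaction scores, is the distinguishing feature of the gap/constituency approach: each stakeholder group's satisfaction is weighed against what that group expected.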


7.2.3 Quality/best practice

Recommendation 6: That CAUL investigate the establishment of a ‘sponsor/mentor’ program that utilises the experiences of ‘best practice’ libraries in the development and implementation of quality management frameworks and tools for the Australian academic library community. Such a program could take the form of:
• Best practice libraries ‘hosting’ and supporting over a specified time (eg.
twelve months) libraries interested in advancing and learning about quality
management and/or specific frameworks that have been successfully
integrated into library management;
• Staff exchanges/visits between libraries involved in the program; and
• Sharing of experiences and documentation, expertise and follow-up
support/advice.

Survey responses indicated that, whilst the number of libraries adopting ‘formal’ quality frameworks was low, a majority were interested in using quality management to inform planning and decision making, particularly in the light of decreasing resources and budgets. A program such as that recommended above would need to be structured to ensure adequate support to both the host and the learning library, include guidelines for the initial development of specific aims, objectives and outcomes, and be closely monitored (perhaps via a pilot program) to ensure that best practice libraries were not unfairly disadvantaged in terms of their ability to continue ‘normal’ business.

Given that for a number of libraries the implementation of ‘formal’ quality management is linked to the imperative to provide a basis for arguing for the maintenance or increase of funding levels, it is ironic that for others the lack of available funding is the very reason formal programs cannot be developed. A program such as that recommended may give libraries without sufficient funds the opportunity to improve or adapt existing management strategies, whilst in turn contributing to the sharing of experiences both within the sector and internally. This can only be advantageous to the Australian university library sector as a whole, and provide the opportunity for CAUL and participating libraries to exhibit worldwide leadership in the area of best practice development and information exchange.


7.2.4 Staff competencies and training

Recommendation 7: That training in quality management tools and techniques be incorporated into the curriculum of Schools of Library and Information Studies, perhaps under the heading of leadership and management.

Quality/best practice, benchmarking and performance measurement are becoming more widely adopted as frameworks, and as tools within frameworks, for improving performance within the academic library sector. Given the challenges they present in terms of language, principles and applications, and the inevitable changes that occur within the workplace as a result of their implementation, it is important that both current and future staff are given appropriate education and training in applying these principles in the workplace.

Considering the increase in, and interest in, the implementation of quality management principles and frameworks within Australian university libraries, it is apparent that graduates equipped with such knowledge and skills, perhaps in relation to specific tools (eg benchmarking), will be of value to organisations seeking to improve both the level of awareness and the acceptance of ‘quality’.

To date, the majority of academic libraries have not implemented formal information, training and development sessions for all their staff. Training has varied, depending on both the level and degree of involvement of staff within the institutions. Institutions which have exposed all their staff to the principles and practices associated with quality management are still in the minority. For staff already in the workplace, it would be useful to provide mid-career updates in skills and knowledge through university library staff development programmes. Staff also need to receive training which will help them become aware of the importance of their individual contribution to the organisation, and the impact that this has on overall effectiveness.

Recommendation 8: That CAUL facilitate the sharing of currently available staff training programs across the sector, utilising the experiences of libraries already advanced in this area.

To date, only the University of Wollongong Library has developed a comprehensive approach to training in quality management, benchmarking and performance measurement techniques, through the use of both in-house workshops and self-paced packages. Wollongong provides an example of the sort of in-house development that may well be duplicated across the sector for varying uses.


Useful training initiatives were also observed in a number of other CAUL libraries, including Deakin, the University of Melbourne and Queensland University of Technology. A project to document and publicise such programs would contribute to the knowledge and skills base of the academic library community. It would reduce the likelihood of duplication of effort and would ensure that the programs offered and available were of relevance and practical value to the institutions using them.

7.2.5 Overall
Recommendation 9: That, given the interest in Australian developments in the area of best practice, every opportunity be taken by Australasian practitioners to link Australian efforts into those underway internationally.

The 3rd International Conference on Performance Measures in Library and Information Services, in Newcastle, UK, in August will provide the ideal forum at which to explore these issues with those from around the library world most involved in recent developments. A new international journal on the topic area will also be launched at the conference, which may provide an ongoing mechanism to monitor developments and publish to the profession. Handbooks/source books to assist library staff are under development in Britain for benchmarking in university libraries and for the use of quality management frameworks in public libraries, and in Germany for costing library processes. No one project is provided with large resources. The possibility of international co-operation in this and other areas should be explored.


Appendix A:
Benchmarking survey

1. Has your library engaged in or planned any formal benchmarking exercises?
2. If so, was this part of a larger institutional exercise? Please give brief details.
3. For each benchmarking exercise (if any) please give the following details:

Name of partner/s
Type of organisation/s (ie library/other)
Duration of benchmarking exercise/dates
Scope of benchmarking (eg acquisitions process)
Key variables/indicators (eg supply time, materials availability)
Your contact for the exercise (name, title, phone, email)
Report available? (Y/N)

4. How did your library become involved in or identify other potential partners?
5. What was the role and level of involvement of each partner in each exercise?
6. What was the overall purpose of each exercise? Was the purpose achieved? If not, why not, and if so, to what extent?
7. If you have not yet concluded a benchmarking exercise, what concrete outcomes would you like to see?
8. Did staff engaged in each benchmarking exercise undergo any form of training/awareness prior to the project? Please give brief details.
9. What lessons have you learned from the exercise/s? What would you do differently next time?


10. Have you engaged in any informal benchmarking? If so, please give details of:

Name of partner/s
Type of organisation/s (ie library/other)
Duration of benchmarking exercise/dates
Scope of benchmarking (eg acquisitions process)
Key variables/indicators (eg supply time, materials availability)
Your contact for the exercise (name, title, phone, email)
Report available? (Y/N)

11. What was/is the relationship between benchmarking activities and other
quality management/improvement processes within your library?


Appendix B:
Performance indicators survey

1. Does your library use the CAUL performance indicators? Please indicate
those used, frequency of administration and month/year of last use
[ ] Client Satisfaction Indicator A
Frequency: monthly/quarterly/annually/other: ____________
Last Use (MM/YY) ______________

[ ] Document Delivery Indicator B
Frequency: monthly/quarterly/annually/other: ____________
Last Use (MM/YY) ______________

[ ] Materials Availability Indicator C
Frequency: monthly/quarterly/annually/other: ____________
Last Use (MM/YY) ______________
2. Is your library using other performance indicators either in addition to or
instead of the CAUL indicators? If so, please give brief details of indicators
used and whether they were developed by your institution or by an
external agency (eg IFLA, SCONUL, ISO 11620) and whether they are
available in kit form or were adapted in house from a standard or manual.
3. Colin Taylor’s 1992 survey identified a range of indicators that covered all
the major service areas. In late 1995, a review of the priority areas was
undertaken. In view of the developments in information access and
delivery and declining budgets, there is value in revisiting and reevaluating
the list. Please indicate which you feel are still important:


Indicators (rate each as Very Important / Important / Not Important):

Market penetration
Opening hours compared to demand
Collection quality review by expert checklists (viz. Conspectus)
Collection use (viz. IFLA)
Catalogue quality - known item search
Catalogue quality - subject search
Extend to electronic information services?
Acquisition speed
Book processing speed
Document delivery
ILL speed
Reference fill rate
User satisfaction
Cost efficiency
Costing methodologies
Electronic resources - quality
Electronic resources - availability
Adequacy of retrieval software and/or data

4. Are there any other indicators that you feel are important or very
important?
5. If CAUL were to undertake further developmental work on performance
indicators to which area/s would you allocate the highest priority?
6. Are there any changes you would like to see made to the existing CAUL
indicators? Please give brief details.
7. Have you used the information gathered from performance indicators as a
basis for benchmarking, process improvement or other quality
improvement initiatives? If so please give brief details.
8. Are any types of training or awareness sessions provided for staff involved in the use of performance indicators? Please give brief details.


Appendix C:
Quality/best practice/performance
measurement survey

As you are aware, the last five years have seen considerable emphasis on
quality, best practice and performance measurement. CAUL and its members
have responded positively through sponsorship of performance indicators,
publications, seminars, awards and other activities. This Evaluation and
Investigations Project (EIP) provides an opportunity to review our position on
these matters. In responding to this survey, please consider the following
definitions of terms as a guide:
• Quality: ‘The totality of features and characteristics of a product or service
that bear on the library’s ability to satisfy stated or implied needs’ (ISO
11620).
• Best Practice: ‘The pursuit of superior performance. The way sites of excellence manage and organise their operations. It is a moving target. As these organisations continue to improve, the “best practice” goalposts are constantly moving. Thus the concept of continuous improvement is integral to the achievement of best practice.’ (Australian Best Practice Demonstration Program, 1991). Such sites of excellence would be expected to provide lessons on how and why they work so effectively in the specified areas, which other organisations, through benchmarking, could adapt to their own operations.
• Performance Measurement: ‘Comparing what a library is doing (performance) with what it is meant to do (mission) and wants to achieve (goals). Performance is the degree to which a library is achieving its objectives, particularly in terms of users’ needs’ (IFLA, 1996). ‘Performance measurement involves the evaluation of an activity, program or service in relation to its appropriateness, effectiveness and efficiency. Performance indicators are developed to measure these criteria.’ (Schmidt, 1990)


A. General views
1. In your view, what place should a formal quality management program
have in the management of contemporary academic and research libraries?
2. If such a program has value, what conditions are necessary for its
successful implementation?

B. Views from experience
Please respond to the following questions if you have implemented or are implementing any formal programs for quality, best practice or performance measurement.
1. Please provide a brief description of the programs.
2. Were the programs one-off exercises or were they built into a quality
improvement cycle?
3. How have the outcomes of the programs informed subsequent practice?
4. How have quality management, best practice, performance measurement
been integrated into the way your library does things?
5. What effects has the introduction of these programs had on the staff
or library?
6. What awareness/training has been provided for staff in relation to
these programs? Was this done as a ‘one-off’ or is there an ongoing
training program?
7. Which features of your organisational structure/management have
facilitated the successful implementation of such programs (eg self
managed work teams, Quality Coordinator/Committee)?
8. Does your library use/plan to use any recognised quality assessment/audit frameworks (eg ISO, Australian Quality Awards, Australian Business Excellence Framework)? Please describe.
9. Has your library been successfully audited or received recognition through
any of the recognised quality frameworks? Please give details.
10. Do you have any further comments relating to quality, best practice,
benchmarking and performance measurement, which have not been
addressed in the EIP surveys?


Appendix D:
Institutions visited

Curtin University of Technology
Acting University Librarian: Ed Willis

Deakin University
University Librarian: Sue McKnight

Queensland University of Technology
Director, Library Services: Gaynor Austen

State Library of Victoria
Manager, State Library Services: Prue Mercer

Swinburne University of Technology
University Librarian: Fran Hegarty

University of Melbourne
Deputy University Librarian: Angela Brigland

University of New South Wales
University Librarian: Marian Bate

University of Queensland
University Librarian: Janine Schmidt

University of Western Australia
University Librarian: John Arfield

University of Wollongong
University Librarian: Felicity McGregor


Appendix E:
Useful sources

Benchmarking

Australasian sources
Benchmarking Communications Limited 1999, Benchmark: Learning From the
Best (http://www.best-practice.co.nz/)

‘A management information business’. Although a commercial site, it contains some interesting information, including profiles of best practice organisations, a bi-monthly newsletter, practical advice and case studies. Links to related sites.

Benchmarking Library Best-Practice for Performance Improvement (1998: Sydney), Benchmarking Library Best-Practice, 23rd–24th November 1998, the Gazebo Hotel, Sydney: Key Performance Measures and Best-Practices for Superior Library Service & Efficient Internal Work Practices, IES Conferences Australia, Chatswood, N.S.W.

Conference papers include:
• ‘Common library benchmarking problems and how to overcome them’, Isabella Trahn. Outlines current approaches to benchmarking both within Australia and overseas. UNSW has engaged in a number of benchmarking projects; this paper uses these experiences to explore issues including training/participation of staff, potential deficiencies and partner relations.
• ‘Using benchmarking to take your library towards best practice—Where to start? What’s really involved?’, Stephen Robertson.

Benchmarking Self-Help Manual: Your Organisation’s Guide to Achieving Best Practice 1995, 2nd edn, Australian Government Publishing Service, Canberra.

This manual was commissioned as part of the Australian Best Practice Demonstration Program and is designed to explain ‘what benchmarking is and how to do it’. The Best Practice Program aimed to accelerate the adoption of a best practice culture within Australian organisations. Provides an introduction to the various approaches to benchmarking and a step-by-step guide to the process. Project guidelines provide a good source of useful planning material and help identify potential
pitfalls. Whilst the case studies focus in the main on manufacturing industries, this manual is a good starting point for the novice.

Evans, A. 1994, Benchmarking: Taking Your Organisation Towards Best Practice!, Business Library, Melbourne.

A useful and readable guide to the fundamentals of benchmarking. Outlines the basics and provides a five-step model: plan the project; form the teams; collect the data; analyse the data; take action.

Evans, A. & Coronel, P. 1999, Benchmarking in Australia, (http://www.ozemail.com.au/~benchmrk/)

Homepage of Benchmarking PLUS. Provides links to a bulletin board, a list of benchmarking books and other resources (courtesy of Amazon.com), and a best practice database providing access to information about best practice and benchmarking with a focus on Australia and New Zealand. Several articles are available for downloading, including ‘The nuts and bolts of benchmarking’, which provides a comprehensive overview of benchmarking practice. In need of some updating, but nevertheless a useful source of information and hints.

Neumann, L. 1996, ‘Benchmarking and the Library: report on current infrastructure and project work for improvement and comparative standards’, Ex Libris: Newsletter of the University of Melbourne Library, Issue 45, pp. 5–7.

In 1996, the University of Melbourne, as part of its quality improvement program, formed a ‘Benchmarking Project Team’ to identify, inform and provide training to staff involved in benchmarking activity. The aim was to draw together the strands currently in place, with the key outcome being staff recognition that benchmarking as a methodology ‘provides a way or method for ongoing improvement’. This article describes the process developed to encourage the adoption of benchmarking as a continuous improvement tool across the University of Melbourne libraries, and provides a useful case study that illustrates how the methodology is being adopted for supporting ongoing development.

Robertson, M. & Trahn, I. 1997, ‘Benchmarking academic libraries: An Australian case study’, Australian Academic and Research Libraries, vol. 28, no. 2, pp. 126–141.

Outlines a benchmarking project comparing acquisitions and cataloguing, document delivery and research support services between Queensland University of Technology and the University of New South Wales. Provides a useful insight into the development, methodology, implementation and outcomes of a major benchmarking exercise from
the perspective of both the benchmarker (QUT) and the partner library (UNSW).

Wilson, A. 1999, Benchmarking, (http://www.ntu.edu.au/admin/isd/qsdc/)

In 1994, the Northern Territory University was successful in obtaining funding as part of the Australian Best Practice Demonstration Program. As part of this project, NTU Quality and Staff Development Coordinator Anne Wilson initiated two benchmarking projects framed within the University Library’s Best Practice in Research Information project. The first involved an examination of internal processes in the Library Purchasing, Cataloguing and Processing Branch. The second, conducted at a more informal level, involved a series of site visits to research/reference departments in academic libraries in the United States. These web pages describe each project in detail, outline results and provide basic information about benchmarking, including links to several other key sites.

International sources
Association of Commonwealth Universities University Management
Benchmarking Club, http://www.acu.ac.uk/chems/benchmark.html

The CHEMS approach is to ask participants to respond to a specially prepared framework of open questions on a process to indicate strengths and weaknesses, and to illustrate these with current documentation. The participants also provide contextual data in order to assess ‘fitness for purpose’ in what they are doing. The approach to scoring (as in 1996) was based on the EFQM approach, judging:

• approach (policy or technique and how ‘fit’ this was);


• application (extent to which it is applied); and
• outcome (how successful it was, how it is monitored and updated).
An interim composite model of ‘good practice’ and reports formed the basis for a workshop, at which a ‘best in group’ band of institutions is identified. A final report to members after the workshop includes:

• a summary of workshop discussions including main issues; and

• key features of what members and assessors agree to be best


practice. This is to be used as a self-assessment model using a
simple 1–5 scale against each best practice element.
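By way of illustration only (the CHEMS report does not prescribe a calculation), such a 1–5 self-assessment might be tallied as follows; the element names and ratings below are invented for the example.

```python
# Hypothetical self-assessment against best-practice elements (1 = weak, 5 = best practice).
ratings = {
    "approach": 4,      # fitness of the policy or technique
    "application": 3,   # extent to which it is applied
    "outcome": 2,       # success, monitoring and updating
}

# Overall score, and the gap to a notional "best in group" score of 5 per element.
overall = sum(ratings.values()) / len(ratings)
gaps = {element: 5 - score for element, score in ratings.items()}

print(f"overall: {overall:.2f}")  # overall: 3.00
for element, gap in sorted(gaps.items(), key=lambda item: -item[1]):
    print(f"{element}: gap {gap}")
```

Ranking the elements by gap highlights where self-assessed practice falls furthest short of best in group.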

The management of library and information services was benchmarked in 1998.


Brockman, J. (ed.) 1997, Quality Management and Benchmarking in the Information Sector, New Providence, Bowker-Saur.

Although benchmarking features extensively in the headings of this book, there is very little specifically about benchmarking in it. The investigations of Garrod and Kinnell Evans reported here indicated that benchmarking was not understood or practised in the UK higher education sector at that time. The as yet incomplete activities of the SCONUL Benchmarking Group, reported under the Town entry below, would seem to be at the forefront in Britain. Academic libraries continue to rely on ‘informal’ benchmarking-related activities.

Canadian Health Libraries Association/Association des bibliotheques de la sante du Canada, CHLA/ABSC Benchmarking Task Force 1998, Benchmarking Tool Kit, CHLA/ABSC, Toronto.

Structured around a six-step process:

1. Administering a sample library services questionnaire (instructions are given, as well as the survey itself);

2. A library profile questionnaire, which can be filled in by ticking boxes;

3. Gathering variables—a fill-in section, based as much as possible on statistical returns to published sources, with some material drawn from data collected internally over the past year;

4. Indicator formulas—the variables recorded in the workbook are re-entered into the required formulas, with space to record each indicator value. The qualitative elements of each indicator are all obtained from the common services survey and entered;

5. Indicator values—a summary of the above (Step 4); and

6. Comparisons with benchmarking partners. Indicative non-identifying data is given by broad type of library to give some perspective to individual results.

NACUBO 1998–99 Benchmark Program, National Association of College and University Business Officers site at http://www.nacubo.org/website/benchmarking/program.html

NACUBO claims to provide indicators of efficiency and best practice. While the overall focus of the National Association of College and University Business Officers is necessarily on administrative functions, its List of Benchmarks and Processes includes information under Library.


Standing Conference of National and University Libraries (SCONUL) Benchmarking Group, http://www.sconul.ac.uk/

Beginning from a SCONUL-sponsored Benchmarking Seminar at the end of 1997, a range of British university libraries undertook to pilot benchmarking activities in the areas of Advice Services (reference including IT inquiries), Inter Library Loans, Information Skills, Counter Services and Integration with Teaching and Learning. As yet the results of their activities have not been published. As part of this project a handbook on benchmarking was to be compiled.

Town, S. 1996, ‘Benchmarking as an approach to quality’ in BUOPOLIS 1: Routes to Quality, Bournemouth, Bournemouth University Library and Information Services, pp. 71–79.

This definitive article is based on Town’s own benchmarking experiences at Cranfield University’s Shrivenham campus, starting in 1993. Since its publication Stephen Town has become the coordinator of the SCONUL benchmarking projects, in his role as the benchmarking expert on the SCONUL Advisory Committee on Performance Indicators.

Performance indicators and measures

Australasian sources
Byrne, A. 1997, ‘CAUL’s interest in performance measurement’, Council of Australian University Librarians paper presented at a pre-conference to the 1996 ALIA conference, Australian Academic & Research Libraries, vol. 28, no. 4, pp. 252–258.

Outlines the development of a suite of three performance indicators—the Library/Clientele Congruence Indicator, the Document Delivery Quality Indicator and the Materials Availability Indicator—by the Council of Australian University Librarians (CAUL). Published in 1995, these indicators have been applied in a number of Australian university libraries. A 1996 survey identified the level of usage of the indicators, suggested improvements to existing indicators and identified potential areas for development of additional indicators. The survey was updated in 1998–99 as part of the DETYA/EIP ‘Best practice in Australian university libraries’ project. Survey results are available at http://www.anu.edu.au/caul/


Calvert, P. J. 1994, ‘Library effectiveness: The search for a social context’, Journal of Library and Information Science, vol. 26, pp. 15–24.

Calvert reports on a survey of public library effectiveness in New Zealand. Libraries are viewed as social agencies which must be responsive to the needs and wishes of various constituencies. The author suggests that, in analysing library effectiveness, it is necessary to determine not only what was done, but also which of these tasks the library was supposed to do. Calvert enumerates and discusses a large number of service indicators, grouped into broader service categories, which were used in measuring library effectiveness.

Cargnelutti, T., D’Avigdor, R. & Ury, J. 1996, ‘KIN key indicators: Decision making tools for managing library databases’, in Electronic dream? Virtual nightmare, the reality for libraries, 1996 VALA Biennial Conference Proceedings, Victorian Association for Library Automation, Melbourne, pp. 331–359.

—— 1999, ‘Finding one’s web feet. Revisiting KIN: Key indicators of electronic resource usage in the Web environment’, in Robots to Knowbots: The Wider Automation Agenda, 1999 VALA Biennial Conference Proceedings, Victorian Association for Library Automation, Melbourne, pp. 279–296. Paper also available at: http://www.library.unsw.edu.au/~eirg/vala98.html

The authors of this paper initially reported and described a range of key
indicators of database and electronic resource usage at the 1996 VALA
conference. This paper focuses on the difficulties associated with
measuring electronic resource use and delivery, in particular web based
resources, and outlines additional key indicators based on changes to
the type and delivery of information now available.

CAVAL Reference Interest Group Working Party on Performance Measures for Reference Services 1995, First Report, CAVAL, Melbourne.

—— 1997, Second Report, CAVAL, Melbourne.

—— 1998, Final Report, CAVAL, Melbourne.

Consolidates the previous work done by this group to identify performance measures and indicators currently used to evaluate reference services in Victorian academic libraries. Outlines a model of reference service effectiveness and identifies 12 Key Performance Indicators used by library staff and customers. Three dimensions of service evaluation are described—Attributes, Support, Knowledge. The identification of potential appropriate measures forms a part of the report. A comprehensive and well-timed report that makes a valuable contribution to an area of library service regarded by many as difficult to measure.

CAUL performance indicators:


• Gorman, G. & Cornish, B. 1995, Library/Clientele Congruence, CAUL
Performance Indicator A, Council of Australian University Librarians,
Canberra.
• Robertson, M. 1995, Document Delivery Performance, CAUL Performance
Indicator B, Council of Australian University Librarians, Canberra.
• Taylor, C. 1995, Materials Availability, CAUL Performance Indicator C,
Council of Australian University Librarians, Canberra.
Ellis-Newman, J. & Robinson, P. 1998, ‘The cost of library services: Activity-based costing in an Australian academic library’, Journal of Academic Librarianship, vol. 24, no. 5, pp. 373–380.

Activity-based costing is normally associated with budget planning and allocation. More recently a number of libraries have costed activities, largely within technical services operations, as an adjunct to benchmarking. This article discusses the value of activity-based costing to libraries, its relationship to the collection of library statistics and its potential use as a measure of service quality, with reference to operations at Edith Cowan University and the University of Western Australia.

Exon, F.C.A. & Williamson, V. 1996, Performance Indicators Database: Selecting the Best Performance Indicators for Your Library—Operating Manual, Curtin University of Technology Library and Information Service, Perth, W.A.

This database provides an index to the performance indicators described in literature published up to the end of 1995. It includes indicators suitable for measurement of the performance of libraries and incorporates a spreadsheet describing each indicator, a reference to the source(s) of the indicator, a statement of how it is calculated and an index of applications that it might be used for.


Hernon, P. & Calvert, P. J. 1996, ‘Methods for measuring service quality in university libraries in New Zealand’, Journal of Academic Librarianship, vol. 22, no. 5, pp. 387–391.

Outlines the development and subsequent testing of a questionnaire (based on Hernon and Altman’s conceptual framework Service Quality in Academic Libraries) designed to measure service quality at seven academic libraries in New Zealand. Includes the survey instrument and the recommendation that the instrument should be tailored to fit specific library environments—’service quality is a local issue’.

International sources
Association of Research Libraries, Access and Technology Program, ILL/DD Related Resources: Measuring the Performance of Interlibrary Loan and Document Delivery Services, http://www.arl.org/access/illdd/illss.shtml

This valuable site includes the March 1998 Executive Summary of the
study results, an article which also appeared in the ARL Bimonthly
Newsletter of Research Library Issues and Actions, and information from
two subsequent symposiums and workshops on strategies to redesign
ILL/DD services using the identified characteristics of low cost high
performing ILL operations. Four performance measures were covered:

• direct cost;
• fill-rate;
• turn around time; and
• user satisfaction.

Site visits to the ‘best practice’ organisations to interview staff about their
workflows will form part of the final report.
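The quantitative measures listed above are simple ratios and averages. A minimal sketch, using an invented request log rather than ARL data, might look like this:

```python
# Hypothetical interlibrary loan request log: (filled?, turnaround in days, direct cost).
requests = [
    (True, 5, 12.50),
    (True, 9, 14.00),
    (False, None, 6.00),   # an unfilled request still incurs handling cost
    (True, 3, 11.00),
]

filled = [r for r in requests if r[0]]

fill_rate = len(filled) / len(requests)                    # proportion of requests filled
mean_turnaround = sum(r[1] for r in filled) / len(filled)  # days, filled requests only
direct_cost = sum(r[2] for r in requests) / len(requests)  # average cost per request

print(f"fill rate: {fill_rate:.0%}")  # fill rate: 75%
print(f"mean turnaround: {mean_turnaround:.1f} days")
print(f"direct cost per request: ${direct_cost:.2f}")
```

User satisfaction, the fourth measure, is survey-based rather than derived from the request log.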

Barton, J. 1998, ‘Recommendations of the Cranfield project on performance indicators for academic libraries’, Standing Conference of National and University Libraries SCONUL Newsletter, Summer/Autumn, pp. 13–17.

The set of management statistics recommended to the HEMS group was:

• total library expenditure per user;


• expenditure on information provision per user;
• expenditure on staffing per user;
• seat hours per week per user;


• seat hours occupied per week per user or average number of visitors
in library at any given time;
• loans per user;
• stock on loan per user;
• inter library loans as a percentage of all loans; and
• staff hours spent on user education and information services per
week or percentage of staff time spent on user education and
information services.

Contextual data at library and at institutional level was also recommended for collection:

Library

• number of libraries;
• space occupied;
• size of collection; and
• total library expenditure.

Institution

• percentage of post graduate students;


• number of FTE students;
• percentage of part-time students; and
• number of FTE academic and research staff.

Barton, J. & Blagden, J. 1998, Academic library effectiveness: a comparative approach, British Library Research & Investigations Report 120, London, British Library.

The remit of this investigation was to develop a small set of performance indicators which would enable funding bodies, vice-chancellors and other senior university managers to compare library
effectiveness across the UK higher education sector. The report
recommends a small set of management statistics (as opposed to
performance indicators) covering per capita expenditures, seat hours per
week per user, lending and user education data. The report also
recommends the provision of ‘contextual’ data largely on the size of the
institution to facilitate interpretation of the management statistics.
Recommendations for further work on the electronic library,
benchmarking, user satisfaction, document availability, information
services, user education, impact, in-house use and access vs holding
are also included.


Bertot, J. & McClure, C. have recently released a number of working papers—Developing National Public Library and Statewide Network Electronic Performance Measures and Statistics at http://www.albany.edu/~imlsstat/

Elements are to be grouped according to the following:

WHAT DOES THE LIBRARY HAVE?

• collections (CD-ROMs, electronic subscription services,


software packages);
• equipment and access (computers available for users, internet
access available for users, library home page services, printers
available for users);

HOW MUCH DOES IT COST?

• expenditure on hardware;
• electronic access expenditure;
• electronic format expenditures (CD-ROMs, disks and tapes, software);
• expenditure on maintenance of hardware available to users; and
• telecommunication expenditures.

HOW ARE THE LIBRARY’S ELECTRONIC RESOURCES BEING USED?

• access to Library’s web pages;


• use by users of electronic subscription services; and
• OPAC use by users.

Brophy, P. 1998, ‘It May Be Electronic but is it any good?’ in Electronic dream? Virtual nightmare, the reality for libraries, 1996 VALA Biennial Conference Proceedings, Victorian Association for Library Automation, Melbourne, pp. 216–229; also available on the VALA web site, http://avoca.vicnet.net.au/~vala/valaweb/vala.htm

This is an important summary of the state of research in this area as it was early in 1998, outlining European and US work and carrying it forward another step. Brophy asks what quality is in the electronic context, suggests a model of comprehensive service and specific ways to measure the quality of products and services and to assure quality, and summarises US/European and ongoing work. He also includes an eight-part functional map for electronic library services covering:

• access negotiation (selection, contract review);
• resource capture, storage & access (local storage, universal accessibility, metadata provision);


• advisory services (Help desk, subject advice, information skills);


• resource discovery (resource identification, location identification);
• resource delivery (request, acquire, deliver to user);
• resource utilisation (exploitation, tools);
• infrastructure provision (space, equipment, software, networks, support services); and
• resource preservation (identification, selection,
conservation, renewal).

This is mapped against a product quality criteria framework to give a possible map of what quality electronic services might look like. Some of the indicators proposed were:

• PC hours pa/FTE student;


• FTE student per networked PC;
• queuing time;
• down time (as % of total time);
• availability (as % of attempted access);
• proportion of a notional number of data-sets available; and
• user satisfaction services/infrastructure.

CAMILE project at http://www.staff.dmu.ac.uk/~camile/

CAMILE is an integrated source set up to promulgate results from four important EU projects now completed. There are links on the CAMILE pages for all four projects.

Cotta-Shoenberg, M. 1995, ‘Performance measurement in the context of quality management’ in 1st Northumbria Conference on Performance Measurement (see below).

The value of this contribution to professional practice is twofold: first, the linkage between performance measurement and the development of a quality management framework; second, the practical, common-sense approach, including the philosophy of accepting and adapting only what is useful for continuous improvement.

DECIDE (decision support system for academic and public libraries), coordinated by Carpenter Davies, at http://www.pro-net.co.uk/efc/DECIDE/

The very useful Matrix of performance measures and indicators from recent major studies by John Sumsion forms part of this report (see below for another location).


DECIMAL (decision research for the development of integrated library systems), coordinated by Manchester Metropolitan University, at: http://www.mmu.ac.uk/h-ss/dic/research/decimal.htm

There are two deliverables available on the web, the Integrative Research Report and the User Requirements Specifications. The performance measurement chapter in the final report summarises results of use of, and interest in, the Toolbox indicators mentioned below. Unsurprisingly, many libraries express interest in a range of measures, but the range of institutions and influences leads to the overall conclusion that almost all suggested measures may be useful for some of the libraries some of the time, so that a fixed ‘box’ content is not the best way to proceed.

EQLIPSE—Evaluation and Quality in Library Performance: System for Europe, coordinated by the University of Central Lancashire, at http://www.sub.su.se/henrik/eqlipsehome.htm

SUPERSEDED BY EQUINOX

The EQLIPSE project aimed to:

• produce international agreement on standard performance measures for the electronic library environment; and
• develop and test an integrated quality management and performance measurement software tool.

The EQLIPSE project massaged IFLA, ISO, De Montfort and HEFCE performance indicators into a list of 52. These indicators were tested in participating libraries and a manual written for the software specifications to support collection and manipulation. Concrete outcomes of the project included prototype library performance measurement support software with an implementation manual. As a result of testing in eight European and UK academic libraries, survey data was collected on the most valuable, least valuable and most problematic data sets to collect, and the resulting suggested listing was included in the final report.

EQUINOX Library Performance Measurement and Quality Management System site at http://equinox.dcu.ie/

This brings together unfinished aspects of earlier European Union projects, including CAMILE, which was to publicise the outcomes of EQLIPSE, MINSTREL, DECIMAL and DECIDE. There are direct links. In May 1999 EQUINOX posted draft electronic performance indicators. There are 14:


1. Percentage of target population reached by electronic library services
2. Number of log-ins to electronic library services per capita per month
3. Number of remote log-ins to electronic library services per capita
per month
4. Cost per log in per electronic library service
5. Electronic documents delivered, per electronic library service, per
capita per month
6. Cost per electronic document delivered per electronic library service
7. Reference enquiries submitted electronically per capita per month
8. Facilities use rate
9. Number of library computer workstations per capita
10. Library workstation hours used per capita per month
11. Rejected log-ins as a percentage of total log-ins
12. Systems availability
13. Queuing time for access to PCs
14. IT expenditure as a percentage of total library expenditure
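Most of these draft indicators are simple ratios of counts to population or totals. A minimal sketch, using invented monthly figures (not EQUINOX data), of indicators 2, 3, 11 and 14:

```python
# Hypothetical monthly figures for one electronic library service.
population = 12000          # FTE target population
logins = 30000              # total log-ins this month
remote_logins = 9000
rejected_logins = 1200
it_expenditure = 150000.0
total_expenditure = 1000000.0

logins_per_capita = logins / population         # indicator 2
remote_per_capita = remote_logins / population  # indicator 3
rejection_rate = rejected_logins / logins       # indicator 11
it_share = it_expenditure / total_expenditure   # indicator 14

print(f"log-ins per capita: {logins_per_capita:.2f}")     # 2.50
print(f"remote log-ins per capita: {remote_per_capita:.2f}")
print(f"rejected log-ins: {rejection_rate:.1%}")
print(f"IT share of expenditure: {it_share:.1%}")
```

Indicators such as queuing time and systems availability would instead require logged time-series data rather than simple monthly counts.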

International Standards Organisation 1996, ISO/DIS 11620 Information and documentation—Library performance indicators, Geneva, ISO.

The indicators listed have to meet the criteria of being already tested, in
common use, and applicable to almost any type of library. Hence the
29 indicators are conservative and cover only traditional services.
Coverage includes:
• User satisfaction;
• General (4 indicators on use/cost);
• Providing documents (6 indicators on availability/use);
• Retrieving documents (2 indicators on retrieval times);
• Lending documents (and document delivery) (6 indicators on
use/cost);
• Enquiry & reference services (1 indicator on ‘correct answer’ fill rate);
• Information searching (2 indicators on cataloguing searching success);
• Facilities (4 indicators on availability/use);
• Acquiring and processing documents (2 indicators on median
times); and
• Cataloguing (1 indicator on cost per title).
The inclusion of definitions, scope and methods of producing and
interpreting each indicator is useful.


Issues in Research Library Measurement: a special issue of ARL: a bimonthly newsletter of research library issues and actions, no. 197, April 1998, available in printed form and at http://www.arl.org/stats/perfmeas/

This contains the following:

Kyrillidou, M., ‘An overview of performance measures in higher education and libraries’. Although considering the usefulness of the International Standards Organisation and International Federation of Library Associations measures (see below) to be limited to internal library time-series comparisons, rather than tools which can be used to compare across institutions, Kyrillidou recommended that US research libraries begin some cross-institutional comparisons using three of the IFLA/ISO 11620 indicators which seem important and not difficult to collect:

• market penetration of circulation;


• collection use (loans plus in house uses/documents held or just
loans/documents held); and
• satisfaction/importance for users overall, with specific services, locally
and remotely.
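The first two of these indicators reduce to simple ratios, and the collection-use formula is given in the text above. A brief sketch with invented annual figures (not drawn from the article):

```python
# Hypothetical annual figures for one library.
loans = 180000
in_house_uses = 60000
documents_held = 1200000
registered_users = 15000
active_borrowers = 9000

# Collection use: loans plus in-house uses over documents held (or just loans over documents held).
collection_use = (loans + in_house_uses) / documents_held
collection_use_loans_only = loans / documents_held

# Market penetration of circulation: proportion of the population actually borrowing.
market_penetration = active_borrowers / registered_users

print(f"collection use: {collection_use:.2f}")            # 0.20
print(f"collection use (loans only): {collection_use_loans_only:.2f}")
print(f"market penetration: {market_penetration:.0%}")    # 60%
```

The third indicator, satisfaction/importance, is gathered by user survey rather than computed from holdings and circulation data.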

Jewell, T., Recent Trends in ARL Electronic and Access Services Data, at http://www.arl.org/stats/specproj/etrends.htm

This paper is most useful for indicating the direction in which the ARL
statistical database is moving. Although the existing database is drawn
from fairly traditional sources it does have the very useful facility that
the user can select, search and customise data according to
requirements. This flexibility, in addition to the coverage and number of
years the ARL data has been collected, makes the existing database a
useful tool. This paper is a look at the future. It summarises an
Association of Research Libraries project on ‘The character and nature of
research library investments in electronic resources’. The project examined US research library responses to supplemental questions on electronic resources held as part of collections or accessible through library system terminals, and their impact on library materials expenditures. Document delivery and ILL expenditures, and expenditures on bibliographic utilities and consortia, are included and discussed. US
libraries can compare their own figures with median and average figures
for the group.


Joint Funding Council Ad-hoc Group on Performance Indicators for Libraries 1995, The Effective Academic Library (EAL), Bristol, HEFCE.

This document is summarised in the Sumsion work and its indicators appear in the indicator chart. Brophy’s work takes the EAL indicators as the starting point for his translation to equivalent measures in an electronic context. Produced by senior academic librarians in 1995 as a response to the Follett report, it includes 21 indicators with debts to the CEC Toolbox, the ISO draft standard and existing SCONUL statistics. Jane Barton and John Blagden have just reported to SCONUL the findings of a project to test and refine these (see above).

McClure, C. & Lopata, C. 1996, Assessing the Academic Networked Environment, Syracuse, N.Y., School of Information Studies, Syracuse University.

This paper forms the basis for a current Coalition for Networked Information project of the same name. Both the original paper (http://istweb.syr.edu/~mcclure/network/toc.html) and the original and progress reports on the project are available at http://www.cni.org/projects/assessing/reports/. Key elements in the original paper include sections on collecting and using qualitative data, and measures under the headings of:

• users;
• costs;
• network traffic;
• use;
• network services; and
• support services.

Model user surveys are included, together with links to two self-assessment frameworks.

Measuring the impacts and value of networking and networked information has emerged as a major issue. In 1997–98 the Coalition conducted a coordinated field test of the assessment measures outlined in McClure and Lopata’s Assessing the Academic Networked Environment: Strategies and Options. The field test was intended to facilitate institutional collaboration on assessment issues, to develop a compendium of assessment measures, and to inform the community of approaches and best practices in assessing networked resources and services. The Coalition intends to complete this effort in 1998–99 by reporting results to the broader community.


There are also participating institutions’ views on the CNI project, by the staff responsible at a range of institutions, including Virginia Polytechnic Institute and State University (student access to the library web; use of resources on the library web, with cost and processing implications), Brown University (library electronic resources, help desk, training, instructional support, applications support) and the University of Washington (use of networked library and other information resources, teaching and learning, impact on research/information-seeking behaviour) at http://www.cni.org/projects/assessing/reports. At Virginia the range of investigations included:

• communication network services;


• university libraries;
• user services;
• media services/educational technologies;
• involvement of each specific area (discussed under the headings);
• narrative;
• questions to be answered;
• methodology;
• performance indicators; and
• value.

In its 1998 program on the Library Web and electronic resources, the library is participating in surveys of students in the early years of specific disciplines which have a high component of electronic resources. Sample indicators encompass student usage and information pathways, relationships to specific assignments, frequency of use of electronic versus paper resources, trends over the past five years in catalogue use and paper usage, and preferred pathways.
http://www.cni.org/projects/assessing/reports/virginia.html

MIEL Management in the Electronic Library. MIEL 2 final report at http://www.ukoln.ac.uk/dlis/models/studies/mis/mis.rtf

The MIEL 2 final report ranges over possible sources of appropriate management information for the electronic library which can be used for planning, evaluation and day-to-day management. Possible sources and types of information suggested include commercial suppliers (documents delivered rather than time spent on-line), transaction log analysis, monitoring of Centre for Information Quality Management user/producer communications (see http://www.la-hq.org.uk/liaison/ciqm/ciqm.html), reports from electronic data service providers, and ELINOR software development (De Montfort University, at http://www.iielr.dmu.ac.uk/Research/index.html) recording usage to page level. Further development is suggested in close liaison with the implementation of the Effective Academic Library (see above).

The areas where work is suggested in MIEL 2 include

• testing the indicators in a range of university libraries—old/new, converged/non-converged, etc.;
• clarifying the concept of electronic library; and
• doing more work on user satisfaction using SERVQUAL/stakeholder
approaches.

At the end of 1998 the next stages of MIEL, MIEL 3–5, were just beginning. Outcomes are not due until 2000. They cover the following areas:

• MIEL 3—part of EQUINOX, focussed on international standards activity;
• MIEL 4—management information in a co-operative setting for clumps and hybrids; and
• MIEL 5—management information needs for libraries delivering to remote or dispersed populations.

MINSTREL (management information software tool), coordinated by De Montfort University. Final report at http://www.dmu.ac.uk/~camile/Minstrel.htm
The objective of the project was to ‘create and test a transformer which
will allow for the development of simple ratios suitable for producing
performance indicators, and bridging the gap between data sets
collected for library management information purposes and tools used
by librarians for decision support and library operation modelling’.
Prototype software was developed and tested.


Northumbria International Conference on Performance Measurement in Libraries and Information Services (2nd: 1997: Longhirst Hall, England), Proceedings of the 2nd Northumbria International Conference on Performance Measurement in Libraries and Information Services held at Longhirst Management and Training Centre, Longhirst Hall, Northumberland, England, 7 to 11 September 1997, Newcastle, Information North, 1998.

Valuable because of the fairly recent date, wide source coverage (including major contributions from Australasia, the US, the UK and Europe) and the exploration of the themes of:

• performance measures and indicators;


• benchmarking;
• qualitative measurement;
• electronic/digital library measurement; and
• managing information services (role for PM in changing styles,
structures, procedures).

Pickering, H. & Crawford, J. C. 1996, The Stakeholder Approach to the Construction of Performance Measures: a report to the British Library Research and Development Department, Glasgow, Glasgow Caledonian University.

Fifteen British academic libraries used a Calvert and Cullen inspired method, in the tradition of Van House, to design and administer a survey to 10 stakeholder groups. Identifies priorities across overall and specific groupings of stakeholders based on their 1–7 rating of importance. A few overriding considerations were identified. The results compare pre- and post-1992 UK universities and make comparisons with Follett, EAC, ISO and IFLA sources. The stakeholder approach, based on interviews and focus groups, should be used in relation to measurement of electronic services and of future information planning strategy.

Includes brief comments about the manuals sourced for the indicators
in the matrices.

Poll, R. & Boekhorst, P. te 1996, Measuring Quality: international guidelines for performance measurement in academic libraries, IFLA Publication 76, Munich, Saur.

Concentrates on user oriented and effectiveness measures for academic libraries of all types. Describes in some detail definitions, methods of data gathering and interpretation. Limits itself to around 16 indicators and is strong on catalogue information effectiveness. Covers the following:

• user satisfaction (including services for remote use);
• general (market penetration, opening hours compared with demand);
• providing & retrieving documents (expert checklists, collection use, subject collection use, documents not used);
• enquiry and reference services (correct answer fill rate);
• information searching (known-item search, subject search);
• acquiring and processing documents (acquisition, processing);
• lending and document delivery (time); and
• availability (proportion of documents available almost immediately).

Sprehe, J. & McClure, C. Performance Measures for Federal Agency Websites at http://istweb.syr.edu/~mcclure/PARS.Pro.April20.html

The measures to be developed fall under the following headings:

• completeness (how complete and comprehensive?);
• timeliness (how up to date is the information?);
• customer satisfaction (how satisfied are the site users?);
• efficiency (what is it costing?);
• effectiveness (how well is the site perceived to meet the needs of the agency?); and
• policy compliance (to what degree does it operate within set guidelines?).

Sumsion, J. et al. 1995, Library performance indicators and management models, Office of the European Communities, Final report of the PROLIB/PI project, EUR16448EN.

This includes a long list of data elements and core measures and indicators.
(The ToolBox mentioned above). A brief summary also appears in the
First Northumbria conference under Sumsion, John and Suzanne Ward,
‘EC Toolbox project: general findings and some particular proposals’, pp.
135–145.

Sumsion, J. Matrices of Performance Indicators at http://www.staff.dmu.ac.uk/~camile/matrices/intro.htm (commissioned as part of the DECIDE project)

This useful summary for busy practitioners provides a brief overview sufficient to show what indicators recent (1995) major studies or publications encompass. Sumsion, apart from producing a very useful grid of indicators, also reviewed the major literature and included links where appropriate. The actual items need be consulted only if detailed knowledge is required.

Willemse, J. 1998, 'Performance Assessment in IFLA and United Kingdom Academic Libraries', South African Journal of Library and Information Science, vol. 66, no. 4, pp. 161–165.

This study found that less than one third of surveyed libraries in 1996 and 1991 had policies in place relating to performance measures or indicators, and less than a quarter were using indicators to evaluate services on a regular basis in relation to user/document delivery services. Less than 40% evaluated enquiry services, and less than 50% evaluated user education. Of those with results, less than half disseminated these outside the library. The IFLA study was based on a 1993 survey by Morgan of British higher education libraries.

Best practice/quality management

Australasian sources
Australian Quality Council 1998, (http://www.aqc.org.au/)

Australian Business Excellence Awards framework, awards, publications, Management Competency Standard, organisational self-assessment, Australian Benchmarking Edge. The starting point for libraries looking to learn more about a quality improvement program that fits within recognised Australian guidelines, and that has been successfully adopted within the sector.

Chestnut, B. 1997, Quality Assurance: An Australian Guide to ISO 9000 Certification, Longman, Melbourne.

Textbook-type material, but one of the few guides to implementing ISO 9000 within an Australian context. Clear and concise; useful for those wanting an understanding of basic concepts, procedures, documentation and the auditing process. One of the few to offer some guidelines for service-focussed organisations on the need for, and interpretation of, the standard.

Harman, G. 1998, 'Quality assurance mechanisms and their use as policy instruments: Major international approaches', European Journal of Education, vol. 33, no. 3, pp. 331–339.

Focuses on quality assurance mechanisms for higher education and their use as policy instruments by the Australian government over the last decade, approaches adopted by individual institutions, the national Australian quality assurance program adopted by the Commonwealth
government over the period 1993–1995, and subsequent changes to this program by the current Coalition Government. A useful article that provides context for the development of quality programs within the library sector over the last 5 years.

The QMS in VET Project—How to develop Best Practice Quality Management Systems in VET 1997, Australian National Training Authority, n.p.

Recent mergers between Higher Education and TAFE institutions have led to increased interest in the level of adoption of quality frameworks within TAFE and how they might be extended across the new larger institutions. The QMS in VET project involved a consortium (Central Gippsland Institute of TAFE, Swinburne University of Technology (TAFE Division), TAFE Queensland and the Western Australian Department of Training) identifying the 'features of best practice in their different Quality Management Systems'. Outlines approaches to the management of quality within VET organisations, provides case studies, and identifies eight Critical Success Factors for Best Practice Quality Management Systems in VET. Swinburne University was one of ten libraries visited as part of the 'Best practice in Australian University Libraries' project. A summary version of this document is available at http://www.swin.edu.au/qed/qmsvet.htm. The complete text can be downloaded from the same URL.

The relationship between ISO series of Standards and Australian Business Excellence Framework 1999 [document included in TAFE Institute Organisational Self-Assessment Manual Resource Book, Office of Training and Further Education, Victoria, 1999].

Examines the relationship between the two most common frameworks used within the State Training System: the ISO 9000 Series of Standards and the Australian Business Excellence Framework. Describes the main similarities and differences between the two frameworks, and ways in which organisations can move from 'quality assurance to business excellence'. Has particular relevance for university libraries operating in cross-sector environments.

Williamson, V. & Exon, F. 1996, 'The quality movement in Australian university libraries (findings of a quality audit survey)', Library Trends, vol. 44, pp. 526–544.

This definitive article outlines the progress of quality management activity in Australian academic libraries within the framework of the 1993–1995 DEET quality audit of Australian universities. The authors conducted a survey of university librarians' 'perceptions of, and
participation in, the quality audit process’. The results of this survey
indicated that a significant number of university libraries were involved
in the development of quality assurance processes and management,
and have set a base upon which subsequent progress can be measured.

International sources
Assessment Tools for Quality Management in Public Libraries: a project funded
by the British Library Research and Innovation Centre and managed by
Loughborough University (DILS) Department of Information Studies (Professor
Margaret Evans and Kathryn Jones) and University of Sheffield (Dr Bob
Usherwood), http://info.lboro.ac.uk/departments/dils/research/qualman.htm

This project, which reports to BLRIC in January 1999, considers three self-assessment approaches appropriate for public libraries:

• The Democratic Approach (from the UK Institute of Public Policy Research, 1991);
• The Quality Framework (from the UK Local Government Management Board, 1989); and
• The Business Excellence Model (the European Framework for Quality Management, 1997).

Criteria have been mapped and customised to make them appropriate for British public libraries. A tool-kit to assist local public libraries to implement a self-assessment process is also being developed as part of the project, and may be published in 1999 by the Library Association. Articles on the findings will appear on the above web site in 1999.

Brophy, P. & Coulling, K. 1996, Quality management for information and library managers, London, ASLIB/GOWER.

An authoritative work by two practitioners who initiated the ISO accreditation process for the Library at the University of Central Lancashire and contributed to early research and publication in the area. Covers an introduction to quality frameworks, with short chapters on ISO 9000, performance measurement from the customer's perspective, national quality awards and TQM, as well as the British Citizen's Charter approach. Benchmarking and performance measurement are dealt with as tools which contribute towards TQM. Brophy and Coulling discuss a wide range of approaches under these headings:

• Clear purpose;
• Vision, commitment and leadership;
• Teamwork and involvement;
• Customer oriented design;
• Systematic processes;
• Supplier awareness; and
• Training and education.

The last part of the book focuses on the application of quality frameworks to libraries. There is an interesting checklist in the final chapter, but little to grasp amongst the non-library examples quoted.

Ellis, D. & Norton, B. 1993, Implementing BS5750/ISO9000 in Libraries, London, ASLIB.

Now an older text but still very useful for its simplicity, brevity, and
common sense. Still the only real ‘how to’ text available. The most
valuable segments include a library-oriented checklist of quality system
requirements according to the standard, and a simple framework for a
library quality plan which covers inquiring, processing and delivering
information. There are also words of wisdom on keeping
documentation simple. Since it is aimed at the small special library market, this manual may be insufficient for a university library, but it is still worth consulting and more practical than the Scandinavian guidelines publication (see below).

European Foundation for Quality Management (EFQM) 1996, Self Assessment Guidelines for Public Sector, Brussels, EFQM, http://www.efqm.org/

The sectors specifically addressed include healthcare, education and local and central government. Part 1 of the booklet outlines the EFQM and the nine criteria:

• leadership;
• policy and strategy;
• people management;
• resources;
• processes;
• customer satisfaction;
• people satisfaction;
• impact on society; and
• business results.

The balance of the publication covers six different approaches to self-assessment using the criteria and outlines their relative benefits. A number of university libraries in Europe are currently working on adapting this framework for quality management purposes in their own libraries, including the Copenhagen Business School and libraries in the Netherlands. This model has also been included in the Loughborough University report on the application of quality frameworks mentioned above.

Excellence 21 at Purdue University, http://thorplus.lib.purdue.edu/ex21/

An interesting example of a site, based at the library, describing a long-standing institution-wide program of Continuous Quality Improvement adapted from TQM. Has many useful links to other North American sites in higher education. Another example is the M-Quality program at the University of Michigan, Ann Arbor, which does not have such a good web site. Staff from the UMICH library have presented at some North American conferences on TQM and academic libraries.

Investors in People in Higher Education, http://www.lboro.ac.uk/service/sd/iipihe/iipinh11.htm

This paper examines the application of this British human resources focused framework. A number of British university libraries, including the Universities of Central Lancashire and Wolverhampton, have already measured library human resources practices against this framework. The paper outlines the requirements, including examples of policy statements from British universities, and covers:

• commitment;
• planning;
• action;
• evaluation; and
• references.

A link at http://www.lboro.ac.uk/service/sd/iipinhe/matrix.html leads to documentation used by the Pilkington Library at Loughborough University in June 1998 to support an internal assessment using the IIP framework. This gives an excellent outline of the application of this framework in a university library setting. There are also useful links to the IIP home page.

ISO 9000 for Libraries and Information Centres: A Guide, 1996, report of a
project supported by NORDINFO, The Hague, FID, FID Occasional Paper,
no. 13.

The principal interest of this publication is the provision of specific examples of how periodical services, database searching, establishing an internal database, and information management in two small specialist Scandinavian libraries stepped through the processes which could lead to application for certification against the ISO 9000 series of standards. The results of a survey on attitudes to quality management frameworks are included. The limited scope of the work makes it of use only as a specialised guide.

St Clair, G. 1997, Total Quality Management in Information Services, London, Bowker-Saur.

St Clair outlines a series of critical questions to be examined around the following headings:

• desire for quality;
• support of senior management;
• customer service;
• continuous improvement;
• measurement;
• trust and teamwork; and
• follow up review and ongoing quality.

Staff competencies

Australasian sources

Competency standards
Australian Library and Information Association 1996–7, Competency Standards
and the Library Industry: A Workshop Series, ALIA, Canberra.

The ALIA Board of Education has produced four workshops on competency standards and the library industry. Designed as self-paced packages, the workshop topics are: An insight into competencies and competency standards; An introduction to levels of competency and the
Library Industry Competency Standards; Linking the industry competency standards to your organisation; Workplace assessment and the Library Industry Competency Standards. Further information and an overview of related publications is available at http://www.alia.org.au/competencies.html.

Australian Library and Information Association 1999, Work Level Guidelines for Librarians and Library Technicians 1998, http://www.alia.org.au/publications/wlg/

Published both online and in print, these guidelines form the basis of defining roles and tasks for staff working in different positions within libraries. Effective performance within the roles outlined is linked to the Library Industry Competency Standards and Australian Qualifications Framework levels.

Workplace experiences
Bridgland, A. 1996, ‘Potential applications of the library industry competency
standards at the University of Melbourne Library’, Education for Library and
Information Services: Australia, vol. 13, no. 1, pp. 83–89.

Bridgland, A. & Dott, A. (unpub.), How Competent Is Competent When it Comes to it? An Implementation Strategy for the University of Melbourne Library.

Cuthbert, S. 1997, 'Library industry competency standards: State of the art, State Library of Victoria', Australian Library Journal, vol. 46, no. 3, pp. 322–329.

Nicholson, J. & Bridgland, A. (unpub.), Library Industry National Training Package: Friend or Foe for CPE?

Describes the development of the Library Industry National Training Package and the adoption and use by the University of Melbourne Library of the industry standards in the development of IT competencies aligned to position levels. Discusses related issues, including training needs, assessment, staff resistance and motivation, and concludes that there is a place for competency standards in providing a means of both improving and ensuring quality assurance of CPE programs across sectors.

Williamson, V. 1996, 'Competency standards in an industrial context: The experiences of Curtin University of Technology', Education for Library and Information Services: Australia, vol. 13, no. 1, pp. 63–72.

Other perspectives
Australian Quality Council 1999, Management Competency Standard,
(http://www.aqc.org.au/mcsfbe.htm).

The AQC has developed a range of competencies designed to help organisations identify individual management skills across a number of levels of individual responsibility and authority. Aligned to the Business Excellence Framework, these competencies are attracting increasing interest from libraries keen to define management position responsibilities in terms of industry-accepted standards.

Hase, S., Cairns, L. & Malloch, M. 1998, Capable Organisations: The Implications for Vocational Education and Training, Australian National Training Authority, n.p.

Capability has been described as the ability to move competencies beyond the familiar and into new environments and situations. One of the key criticisms of, and barriers to, the wide-scale use of competencies to define work within university and other libraries has been the perception that competencies cannot be used to describe or measure the often complex problem-solving skills, and the application of prior knowledge to changing situations, needed by library staff working in a constantly changing environment. This report defines and describes the characteristics of a 'capable' organisation, and raises questions about competencies and their role within a broader 'capability' context.

Please note that only Australian sources on staff competencies were reviewed for inclusion in the useful sources list.

Appendix F:
Suggested outline for a ‘Best
Practice Handbook for Australian
University Libraries’

Introductory note
As commented elsewhere in this report, many libraries are deterred from implementing initiatives in regard to benchmarking, performance indicators and quality frameworks by the perception that it is too costly to investigate the appropriate applications for their circumstances. Since a wealth of experience is currently available, and continues to come on stream, within Australasia and internationally, making up-to-date information and advice easily available and updateable would be a practical outcome of this investigation and an effective investment in the better management of Australian university libraries. The information would be relevant both for libraries that have done little in this area and for more experienced institutions wishing to keep up to date or to promote an application which may be new to them.

Whilst a pared-down version of the handbook would make a useful one-off paper publication, because of the dynamic nature of the data it would be of most use if made available as part of a web site, the most obvious choice being that of the Council of Australian University Librarians (CAUL) site. Somewhat in the manner of the Association of Research Libraries' statistics site in the United States, the current AARL statistics site could be expanded to become a gateway for all libraries looking for information and sources of experience.

The outline below gives an indication of the possible scope of the “how to”
component of the handbook.

Best Practice Handbook


• HOW TO USE THIS HANDBOOK/SITE
• CONTENTS
• GLOSSARY
Already available in this report. A quick guide for those lost in the
acronym maze.
• CONTEXT
Material from this report can be massaged to provide the content for this
section.
• CURRENT OPINION ON QUALITY ISSUES
Material from this report can be massaged to provide the content for this
section. Since this area needs annual updating, if the recommended
part-time project person is employed to keep the CAUL indicator issues
up to date, updating this area would be a necessary part of making sure
work in Australasia is up with world best practice. The required time
would be minimal.
• OVERSEAS EXEMPLARS
Comment as for the Current Opinion section above.
• LOCAL EXEMPLARS AND USEFUL PUBLICATIONS AND CONTACTS
Again the basic material is here in the report. The same comment on
updating applies to this section. The exemplar contacts would be available
to foster the recommended mentor relationships suggested in a
recommendation of this report. Useful publications could provide links to
whatever is published in the Australasian literature, thereby encouraging
practitioners to publish information about developments within their
libraries.
• BENCHMARKING
This section would outline the theoretical models and approaches and provide checklists of dos and don'ts. It would be a concise, compact, library-oriented benchmarking handbook, without the extraneous material in commercial equivalents. The emphasis would be on checklists and visual approaches. Much of this material is already available.
• PERFORMANCE MEASUREMENT
This part of the handbook/site would fulfil a two-fold purpose:
– To keep the community informed of the status of the CAUL indicators.
– To provide up to date matrices by function/area of all published indicators (with the permission of the authors) so that librarians searching for available indicators for specific areas can find what is
available in a matter of minutes. The matrices are already referred to in this report and would simply need to be added to through updating.
• QUALITY FRAMEWORKS
This section would provide diagrams and visual summaries of the main
frameworks together with information on published criteria used for award
applications and self-evaluation, etc. There would also be links to
framework sponsors so further information could be readily obtained. The
material is easily sourced from material referred to in this report.
• TRAINING AND RELATED TOPICS
This section could include sources of information on quality management
competencies, outlines of relevant training workshops and materials in use
in Australasia and their sources so that those needing to develop
appropriate training could adapt material or commission exemplar libraries
to deliver already developed training programs for them. Most of this
information is already contained in this report.

Bibliography

Benchmarking

Australasian sources
Benchmarking in Australia, http://www.ozemail.com.au/~benchmrk/
– 1998, Benchmarking Library Best-Practice for Performance Improvement:
Key Performance Measures and Best-Practices for Superior Library Service &
Efficient Internal Work Practices, IES Conferences Australia, Chatswood,
N.S.W.
• ‘Common library benchmarking problems and how to overcome them’,
Isabella Trahn.
• ‘Using benchmarking to take your library towards best practice—Where
to start? What's really involved?', Stephen Robertson.
– 1995, Benchmarking Self-Help Manual: Your Organisation’s Guide to
Achieving Best Practice, 2nd edn, AGPS, Canberra.
Byrne, A. 1995, ‘Best practice at the Northern Territory University,’ Australian
Academic and Research Libraries, vol. 26, no. 1, pp. 17–24.

Evans, A. 1994, Benchmarking: Taking Your Organisation Towards Best Practice, Business Library, Melbourne.

Evans, A. (ed.) 1997, International Benchmarking Sourcebook: A Complete Bibliography and List of Resources Needed for Benchmarking, Melbourne, Alpha.

Evans, A. & Coronel, P. 1999, 'The nuts and bolts of benchmarking', http://www.ozemail.com.au/~benchmrk/

Jantti, M. et al. 1998, 'Benchmarking—Industrial tourism or a tool for continuous improvement' in Pathways to Knowledge, Australian Library and Information Association 5th Biennial Conference and Exhibition; conference paper available at http://www.alia.org.au/conferences/adelaide98/

Macneil, J. et al 1994, Benchmarking Australia, Longman, Melbourne.

McKinnon Walker/IDP Education Australia, Project to Trial and Develop Benchmarking Criteria.

Neumann, L. 1996, 'Benchmarking and the library: Report on current infrastructure and project work for improvement and comparative standards', Ex Libris: Newsletter of the University of Melbourne Library, issue 45, pp. 5–7.

Robertson, M. & Trahn, I. 1997, 'Benchmarking academic libraries: An Australian case study', Australian Academic and Research Libraries, vol. 28, no. 2, pp. 126–141.

Universitas 21, http://www.universitas.edu.au/members.html

Wilson, A. 1998, Benchmarking, http://www.ntu.edu.au/admin/isd/qsdc/bench.html

International sources
Allen, F. 1993, 'Benchmarking: Practical aspects for information professionals', Special Libraries, vol. 84, no. 3, pp. 123–130.

Association of Commonwealth Universities, University Benchmarking Club, http://www.acu.ac.uk/chems/benchmark.html

Association of Research Libraries, Access and Technology Program, ILL/DD Related Resources: Measuring the Performance of Interlibrary Loan and Document Delivery Services, http://www.arl.org/access/illdd/illdd-measperf9712.shtml

Association of Research Libraries, Access and Technology Program, From Data to Action: an ARL workshop on strategies to redesign ILL/DD Services, http://www.arl.org/access/performance/illddwork.shhtml

Benchmarking Centre, http://www.benchmarking.co.uk/

Benchmarking Exchange, http://www.benchmarking.org/

Brockman, J. (ed.) 1998, Quality management and benchmarking in the information sector, Munchen, Bowker Saur.

Camp, R. 1989, Benchmarking: the search for industry best practices that lead
to superior performance, Milwaukee, Wisconsin, ASQC Quality Press

Canadian Health Libraries Association/Association des Bibliothèques de la Santé du Canada (CHLA/ABSC) 1998, Taskforce on Benchmarking for Health Libraries, Benchmarking Tool Kit, Toronto, CHLA/ABSC.

Fielden, J. 1997, 'Benchmarking university performance', a.c.u. bulletin of current documentation, no. 131, pp. 20–25.

Garrod, P. & Kinnell, M. 1997, 'Benchmarking development needs in the LIS sector', Journal of Information Science, vol. 23, no. 2, pp. 111–118.

Garrod, P. & Kinnell, M. 1996, 'Performance measurement, benchmarking and the UK library and information sector', Libri, vol. 46, pp. 141–148.

Garrod, P. & Kinnell, M. 1995, Quality Management Issues: a select bibliography for library and information services managers, The Hague, FID (FID Publication no. 710; Occasional Paper no. 10; BLRD Report 6220).

Garrod, P. & Kinnell, M. 1995, Towards library excellence: Best practice benchmarking in the library and information sector, London, British Library Research & Development Division.

Gohlke, A. 1997, 'Benchmarking for strategic performance management', Information Outlook, vol. 1, no. 8, pp. 22–24.

Jackson, M. 1997, 'Measuring the Performance of Interlibrary Loan and Document Delivery Services', ARL Bimonthly Newsletter of Research Library Issues and Actions, issue 195 (ARL Access and Technology Program, ILL/DD Related Resources).

Library Benchmarking International, http://www.world-net/users/lbi/. Includes:
• Collecting and analyzing benchmarking data: a librarian's guide, Texas, LBI
• Conducting a preliminary benchmarking analysis: a librarian's guide, Texas, LBI
• Presenting benchmarking results: a librarian's guide, Texas, LBI

Library Trends, 1994 Special Issue

Loughborough University, Department of Information and Library Studies, Library and Information Statistics Unit, Seminar on academic library statistics and benchmarking, 2–3 June 1998.

Lund, H., 'Benchmarking in UK Universities', Commonwealth Higher Education Management Services Paper no. 22, http://www.acu.ac.uk/chems/publications

National Association of College and University Business Officers (NACUBO), http://www.nacubo.org/website/benchmarking/program.html

Northumbria International Conference on Performance Measurement in Libraries and Information Services (1st: 1995: Longhirst Hall, Northumberland, England), Proceedings of the 1st Northumbria International Conference on Performance Measurement in Libraries and Information Services held at Longhirst Management and Training Centre, Longhirst Hall, Northumberland, England, 31 August–4 September 1995, Information North, 1995. In particular see:
• Town, S., 'Benchmarking and Performance Measurement', pp. 83–88.

Northumbria International Conference on Performance Measurement in Libraries and Information Services (2nd: 1997: Longhirst Hall, England), Proceedings of the 2nd Northumbria International Conference on Performance Measurement in Libraries and Information Services held at Longhirst Management and Training Centre, Longhirst Hall, Northumberland, England, 7 to 11 September 1997, Newcastle, Information North, 1998.

Town, S. 1996, 'Benchmarking as an approach to quality' in BUOPOLIS 1: routes to Quality, Bournemouth, Bournemouth University Library and Information Services, pp. 71–79.

Tri-University Library Project, http://www.lib.uwaterloo.ca/TUG/index.html

University Management Benchmarking Club: see Association of Commonwealth Universities, http://www.acu.ac.uk/chems/benchmark.html

Universitas 21 http://www.universitas.edu.au

Performance indicators and measures

Australasian sources
Byrne, A. 1997, ‘CAUL’s interest in performance measurement, Council of
Australian University Librarians; paper presented at a pre-conference to the
1996 ALIA conference’, Australian Academic & Research Libraries, vol. 28,
no. 4, pp. 252–258.

Calvert, P. 1994, ‘Library effectiveness: The search for a social context’,


Journal of Library and Information Science, vol. 26, pp. 15–24.

Calvert, P. 1997, ‘Measuring service quality: From theory into practice’,


Australian Academic & Research Libraries, vol. 28, no. 3, pp. 198–204.

Calvert, P. 1998, ‘Service quality in academic libraries: Research in New


Zealand and Singapore’, in Pathways to Knowledge, Australian Library and
Information Association 5th Biennial Conference and Exhibition, Conference
paper available http://www.alia.org.au/conferences/adelaide98/

Calvert, P. & Cullen, R. 1995, ‘The New Zealand public libraries effectiveness
study and the New Zealand university libraries effectiveness study’, Australian
Academic and Research Libraries, vol. 26, no. 2, pp. 97–106.

Cargnelutti, T., D’Avigdor, R. & Ury, J. 1996, ‘KIN key indicators: Decision
making tools for managing library databases’, in Electronic dream? Virtual
nightmare, the reality for libraries, 1996 VALA Biennial Conference
Proceedings, Victorian Association for Library Automation, Melbourne,
pp. 331–359.


Cargnelutti, T. 1999, ‘Finding one’s web feet. Revisiting KIN: Key indicators of
electronic resource usage in the Web environment’, in Robots to Knowbots:
The Wider Automation Agenda, 1999 VALA Biennial Conference. Proceedings.
Victorian Association for Library Automation, Melbourne, pp. 279–296.
Paper also available at http://www.library.unsw.edu.au/~eirg/vala98.html

CAVAL Reference Interest Group Working Party on Performance Measures for


Reference Services 1995, First Report, CAVAL, Melbourne.

—— 1997, Second Report, CAVAL, Melbourne.

—— 1998, Final Report, CAVAL, Melbourne.

CAUL performance indicators:


• Gorman, G. & Cornish, B. 1995, Library/Clientele Congruence,
CAUL Performance Indicator A, Council of Australian University
Librarians, Canberra.
• Robertson, M. 1995, Document Delivery Performance, CAUL Performance
Indicator B, Council of Australian University Librarians, Canberra.
• Taylor, C. 1995, Materials Availability, CAUL Performance Indicator C,
Council of Australian University Librarians, Canberra.

Cram, J. 1996, ‘Performance management, measurement and reporting in a


time of information-centred change’, Australian Library Journal, vol. 45,
no. 3, pp. 225–239.

Cullen, R. J. 1997, ‘Does performance measurement improve organisational


effectiveness? A postmodern analysis’, see Proceedings of the 2nd
Northumbria International Conference on Performance Measurement in
Libraries and Information Services, Information North for the Department of
Information and Library Management, University of Northumbria at
Newcastle, Northumberland, England, pp. 3–20.

Cullen, R. J. 1996, ‘New Zealand Libraries Effectiveness Project: Dimensions


and concepts of organisational effectiveness’, Library and Information Science
Research, vol. 18, no. 2, pp. 99–119.

Cullen, R. J. & Calvert, P. J. 1995, ‘Stakeholder perceptions of university library


effectiveness,’ Journal of Academic Librarianship, vol. 21, no. 6,
pp. 438–448.

D’Avigdor, R. 1998, ‘Indispensable or indifferent? The reality of information


service performance measurement at the University of NSW library’,
Australian Academic & Research Libraries, vol. 28, no. 4, pp. 264–280.


Ellis-Newman, J. & Robinson, P. 1998, ‘The cost of library services: Activity-


based costing in an Australian academic library’, in Journal of Academic
Librarianship, vol. 24, no. 5, pp. 373–380.

Exon, F.C.A. & Williamson, V. 1996, Performance Indicators Database:


Selecting the Best Performance Indicators for your Library—Operating
Manual, Curtin University of Technology Library and Information Service,
Perth, W.A.

Hernon, P. & Calvert, P. J. 1996, ‘Methods for measuring service quality in


university libraries in New Zealand’, Journal of Academic Librarianship,
vol. 22 no. 5, pp. 387–391.

Hoffman, H. 1998, ‘Performance indicators for technical services in academic


libraries’, Australian Academic & Research Libraries, vol. 28, no. 4,
pp. 177–189.

Kena, J. 1998, Performance Indicators for the Electronic Library,


http://oxemail.com.au/~jkena/perf.html

–1995, Key Performance Indicators: A Practical Guide for the Best Practice
Development, Implementation and Use of KPIs, South Melbourne, Pitman

Novak, J. 1992, ‘Performance indicators: why do we use them?’ in Libraries:


the heart of the matter: proceedings of the Australian Library & Information
Association 2nd Biennial Conference, Albury, 27 September–2 October 1992,
Melbourne, Thorpe

Paton, B. 1996, ‘Performance indicators for reference and information


services’, in Reading the Future: Proceedings of the Biennial Conference of the
Australian Library and Information Association, Australian Library and
Information Association, Melbourne, pp. 101–108.

–1994, Performance measurement in library and information services,


Adelaide, AUSLIB.

International sources
Abbott, C. 1994, ‘Performance indicators in a quality context,’
The Law Librarian, vol. 25, pp. 205–208

Abbott, C. 1994, Performance measurement in library and information


services, London, Aslib

Abbott, C. 1990, ‘What does good look like? The adoption of performance
indicators at Aston University Library and Information Services,’ British
Journal of Academic Librarianship, vol. 5, no. 2, pp. 79–84


Adams, R. et al 1993, Decision support systems and performance assessment


in academic libraries, New Providence, Bowker-Saur

Altman, E. & Pratt, A. 1998, ‘The JAL guide to professional literature:


Performance measurement,’ Journal of Academic Librarianship, vol. 24,
no. 4, p. 346

Armstrong, C. 1996, ‘The quality of publicly available databases: Wysiwyg or


what?’ BUOPOLIS 1: Routes to quality Proceedings of a conference held at
Bournemouth University, 29–31 August 1995, edited by B. Knowles,
Bournemouth, Bournemouth University Library and Information Services

Association of Research Libraries. Access and Technology Program, ILL/DD Related


Resources: Measuring the Performance of Interlibrary Loan and Document
Delivery Services, http://www.arl.org/access/illdd/illss.shtml

Association of Research Libraries. SPEC Kit, No. 196

Bancroft, A. et al 1998, ‘A forward looking library use survey: WSU libraries in
the 21st century,’ Journal of Academic Librarianship, vol. 24, no. 3,
pp. 216–223

Barton, J. 1997, ‘Performance indicators for university libraries’ SCONUL


Newsletter vol. 1, Summer/Autumn, pp. 8–9

Barton, J. 1998, ‘Recommendations of the Cranfield project on performance


indicators for academic libraries’ SCONUL Newsletter, Summer/Autumn,
pp. 13–17

Bertot, J. & McClure, C. 1999, Analysis of FSCS Coordinator suggested data


elements for networked information resources and services: working paper,
http://www.albany.edu/~imlsstat

Bertot, J. & McClure, C. 1999, Analysis of state library data elements for
networked information resources and services: working paper,
http://www.albany.edu/~imlsstat

Bertot, J. & McClure, C. 1999, Developing National Public Library and


Statewide Network Electronic Performance Measures and Statistics,
http://www.albany.edu/~imlsstat/

Blagden, J. 1990, How good is your library? London, Aslib

Bloor, I. 1991, Performance indicators and decision support systems for


Libraries: a practical application of keys to success, London, British Library
Research & Development Division


Boekhorst, P. te 1995, ‘Measuring quality: the IFLA guidelines for


performance measurement in academic libraries,’ IFLA Journal, vol. 21, no. 4,
pp. 278–281

Brophy, P. 1998, ‘It may be electronic but is it any good?’ Victorian


Association for Library Automation Proceedings, pp. 216–229 available on the
VALA web site http://avoca.vicnet.net.au/~vala/valaweb/vala.htm

Brophy, P. & Wynne, P. 1997, ‘Performance measurement and management


information for the electronic library (MIEL)’ see 2nd Northumbria Conference

Brophy, P. & Wynne, P. 1997, Management Information Systems and


Performance Measurement for the Electronic Library: an eLib Supporting Study
(MIEL2 Project) Final report, Preston, Centre for Research in Library and
Information Management, University of Central Lancashire

CAMILE see EQUINOX

Carbone, P. 1995, ‘The committee draft of international standard ISO CD


11620 on library performance indicators,’ IFLA Journal, vol. 21, no. 4,
pp. 274–277

Carbone, P. 1993, ‘Survey of the development of library performance


measures in France’ INSPEL, vol. 27, no. 3, pp. 196–198

CAUSE, Self assessment for Campus Information Technology Services,


Professional Paper, no.12, http://cause-www.colorado.edu/information-
resources/ir-library/abstracts/pub3012.html

Centre for Information Quality Management, Recommended Readings for


Database Evaluation and Quality issues, http://www.i-a-l.co.uk/
ciqm/Readings.htm

Chacha, R. & Irving, A. 1991, ‘An experiment in academic library performance


measurement,’ British Journal of Academic Librarianship, vol. 6, no. 1,
pp. 13–26

Ciliberti, M. et al 1998, ‘Empty handed? A materials availability study and


transaction log analysis verification,’ Journal of Academic Librarianship
vol. 24, no. 4, pp. 282–289

Cotta-Shonberg, M. 1995, ‘Performance measurement in the context of quality


management,’ see 1st Northumbria Conference on Performance Measurement

Crawford, J. 1997, ‘Report on the general satisfaction survey conducted at


Glasgow Caledonian University Library Feb/March 1997 and a linked focus
group investigation,’ SCONUL Newsletter, no. 11, Summer/Autumn, pp. 11–16


Crawford, J. et al 1998, ‘The stakeholder approach to the construction of


performance measures,’ Journal of Librarianship and Information Science,
vol. 30, no. 2, pp. 87–112

DECIDE see EQUINOX

DECIMAL see EQUINOX

Developing indicators for academic library performance: ratios from ARL


statistics. Washington, ARL, 1992 –

–1995, The effective academic library: a framework for evaluating


performance of UK academic libraries, Joint Funding Council Ad-hoc Group
on Performance Indicators for Libraries, Bristol, HEFCE.

EQLIPSE see EQUINOX

EQUINOX site, http://equinox.dcu.ie/

Evans, M., Jones, K. & Usherwood, B. Assessment Tools for Quality


Management in Public Libraries: a project funded by the British Library
Research and Innovation Centre and managed by Loughborough University
Department of Information and Library Studies (DILS)

Ford, G. 1989, ‘Approaches to performance measurement: some observations


on principles and practice,’ British Journal of Academic Librarianship, vol. 4,
no. 2, p. 79

Franklin, B. 1994, ‘The cost of quality: its application to libraries,’ Journal of


Library Administration, vol. 20, no. 2, pp. 67–79

Gale, M. 1996, ‘Performance indicators and time analysis in the bibliographic


records section of the Aston University Library and Information Services,’ Library
Review, vol. 45, no. 2, pp. 58–67

Garrod, P. & Kinnell, M. 1996, ‘Performance measurement, benchmarking and


the UK library and information sector’, Libri, vol. 46, pp. 141–148

Garrod, P & Kinnell, M. 1995, Quality Management Issues: a select


bibliography for library and information services managers, The Hague, FID
(FID Publication no. 710: Occasional Paper no. 10: BLRD Report 6220)

Gedeon, J. & Rubin, R. 1999, ‘Attribution theory and academic library


performance evaluation,’ Journal of Academic Librarianship, vol. 25,
no. 1, pp. 18–25

Goodall, D. 1988, ‘Performance measurement: a historical perspective,’


Journal of Librarianship, vol. 20, no. 2, pp. 128–144


Harris, M. 1991, ‘The user survey in performance measurement,’ British


Journal of Academic Librarianship, vol. 6, no. 1, pp. 1–12

HEIRA Alliance Evaluation Guidelines for Institutional Information Resources

HEIRA Alliance, An Example of the Information Technology Environment at


an information-resource-intensive institution,
http://www.educause.edu/collab/heirapapers/hei1061.html

Henty, M. 1989, ‘Performance indicators in higher education libraries,’ British


Journal of Academic Librarianship, vol. 4, no. 3, pp. 177–191

Hernon, P. & Altman, E. 1996, Service Quality in Academic Libraries,


Norwood, Ablex

Hernon, P., Nitecki, D. & Altman, E. 1999, ‘Service quality and customer
satisfaction,’ Journal of Academic Librarianship, vol. 25, no. 1, pp. 9–17

IFLA, 1996, Measuring quality: international guidelines for performance


measurement in academic libraries, Munchen, Bowker-Saur

IFLA, 1996 Section for University Libraries and General Research Libraries.
Discussion group on performance indicators, IFLA Conference, Beijing,
25 August 1996

Indiana University, University Information Technology Services, IUITS/IUPUI


annual survey summary 1996, http://www.indiana.edu/~uitssur/survey/
iupui_summary98.html

International Standards Organisation 1996, ISO/DIS 11620 Information and


documentation — Library performance indicators, Geneva, ISO

Issues in Research Library Measurement, Association of Research Libraries,


no. 197, April 1998 special issue, and at http://www.arl.org/stats/perfmeas/

Jackson, M. 1997, ‘Measuring the Performance of Interlibrary Loan and


Document Delivery Services,’ ARL Bimonthly Newsletter of Research Library
Issues and Actions, Issue 195, (ARL Access and Technology Program/ILL/DD
Related Resources)

Jewell, T. Recent Trends in ARL Electronic and Access Services Data,


http://www.arl.org/stats/specproj/etrends.htm

Joint Funding Council. Ad-hoc group on Performance Indicators for Libraries,


1995, The Effective Academic Library (EAL)

John Minter & Associates, Academic Library Statistical Norms, Boulder,


Colorado, John Minter & Associates


Kania, A. 1988, ‘Academic library standards and performance measures,’


College and Research Libraries, vol. 49, no. 1, pp. 16–23

Kelley, P. 1991, ‘Performance measures: a tool for planning resource


allocations,’ Journal of Library Administration, vol. 14, no. 2, pp. 21–36

Kyrillidou, M. 1998, ‘An overview of performance measures in higher


education and libraries,’ Issues in Research Library Measurement: a special
issue of Association of Research Libraries, no. 197, April 1998

Lancaster, F. W. 1993, If you want to evaluate your library, London,


Library Association

Lancaster, F. W. ‘Evaluating the digital library,’ see 2nd Northumbria


International Conference on Performance Measurement in Libraries and
Information Services

Leach, K. & Smallen, D. 1998, ‘What do information technology support


services really cost?’ Cause/Effect, vol. 21, no. 2, pp. 38–45

Library Trends, 1994, Special Issue

Loughborough University. Department of Information and Library Studies.


Library and Information Statistics Unit, Seminar on academic library statistics
and benchmarking, 2–3 June 1998

Loughborough University. Department of Information and Library Studies.


Library and Information Statistics Unit, Library and Information Statistics
Tables, LIST annual—

Lynch, M. 1983, ‘Measurement in public library activity: the search for


practical methods,’ Wilson Library Bulletin, vol. 57, p. 388

Mackenzie Owen, J. S. & Wiercx, A. Knowledge Models for Networked Library


Services: Final report to the European Commission (in press in early 1998),
also at http://www.nbbi.nl/kms/kmsshort.html

McClure, C. & Lopata, C. 1996, Assessing the Academic Networked


Environment: Strategies and Options, Washington. D.C, Coalition for
Networked Information

MIEL2 final report at http://www.ukoln.ac.uk/dlis/models/studies/mis/mis.rtf


see Brophy, P.

MINSTREL see EQUINOX

Morgan, S. 1993, ‘Performance assessment in higher education libraries,’


Library Management, vol. 14, no. 5, pp. 35–42


Morrow, J. 1996, ‘The development of technical services performance


measures in Newcastle University Library,’ Library Review, vol. 45, no. 2,
pp. 15–19

Nitecki, D. 1996, ‘Assessment of service quality in academic libraries: focus


on the applicability of SERVQUAL,’ see 2nd Northumbria Conference on
Performance Measurement in Libraries

Northumbria International Conference on Performance Measurement in


Libraries and Information Services (1st: 1995: Longhirst Hall, Northumberland,
England) Proceedings of the 1st Northumbria International Conference on
Performance Measurement in Libraries and Information Services held at
Longhirst Management and Training Centre, Longhirst Hall, Northumberland,
England, 31st August–4th September 1995. Information North 1995

Especially articles by:


• Sumsion, J. & Ward, S. ‘EC Toolbox project: general findings and some
particular proposals,’ pp. 135–145
• Town, S. ‘Benchmarking and Performance Measurement,’ pp. 83–88

Northumbria International Conference on Performance Measurement in


Libraries and Information Services ( 2nd : 1997 : Longhirst Hall, England)
Proceedings of the 2nd Northumbria International Conference on
Performance Measurement in Libraries and Information Services held at
Longhirst Management and Training Centre, Longhirst Hall, Northumberland,
England, 7 to 11 September 1997. Newcastle, Information North 1998.
Especially articles by:
• Cullen, R ‘Does performance measurement improve organisational
effectiveness? A postmodern analysis,’ pp. 3–20
• Hart, E. et al ‘The use of focus groups in the evaluation of services,’ pp.
133–138
• Zwart, R. ‘DocUtrans and ISO 9002 quality assurance,’ pp. 267–274

Parasuraman, A. et al 1988, ‘SERVQUAL: a multiple-item scale for measuring


consumer perceptions of service quality,’ Journal of Retailing, vol. 64, no. 1,
pp. 12–40

Poll, R. & Boekhorst, P. te 1996, Measuring Quality: international guidelines


for performance measurement in academic libraries, IFLA Publication 76,
Munich, Saur

Poll, R. 1991, ‘Problems of performance evaluation in academic libraries,’


INSPEL, vol. 25, no. 1, pp. 24–36


Porter, L. 1992, ‘Quality assurance: going round in circles,’ Aslib Information,


vol. 20, no. 6, pp. 240–241

Redfern, M. 1990, ‘Giving account: performance indicators for libraries,’


Library Review, vol. 39, no. 5, pp. 7–9

Richard, S. 1992, ‘Library use of performance indicators,’ Library Review,


vol. 41, no. 6, pp. 22–36

Richard, S. (service agreements) see 1st Northumbria International Conference


on Performance Measurement in Libraries and
Information Services

Rowley, J. 1996, ‘Implementing TQM for library services: the issues,’


Aslib Proceedings, vol. 48, no. 1, pp. 17–21

– 1995, Routes to Quality: practical approaches to quality in library and


information services, Bournemouth, BUOPOLIS

Shapiro, B. 1991, ‘Access and performance measures in research libraries in


the 1990s,’ Journal of Library Administration, vol. 15, nos. 3–4, pp. 49–66

Sprehe, J. & McClure, C. 1999, ‘Performance measures for federal agency web
sites,’ http://istweb.syr.edu/~mcclure/PARS.Pro.April20.html

Standing Conference of National and University Libraries, 1992, Performance


Indicators for university libraries: a practical guide, London, SCONUL

Stein, J. 1995, ‘Feedback from a captive audience: reflections on the results of


a SERVQUAL survey of interlibrary loan services at Carnegie Mellon University
Libraries,’ see 2nd Northumbria International Conference on Performance
Measurement in Libraries and Information Services

Summerfield, M. et al, Columbia’s Online Books Evaluation Project at


http://www.columbia.edu/cu/libraries/digital/olbdocs/protocol/

Sumsion, J. & Ward, S. 1995, see 1st Northumbria International Conference on


Performance Measurement in Libraries and Information Services

Sumsion, J. et al, 1995, Library performance indicators and management


models, Office of the European Communities, Final report of the PROLIB/PI
project. EUR16448EN

Sumsion, J. 1998, Matrices of Performance Indicators,


http://www.staff.dmu.ac.uk/~camile/matrices/intro.htm (Commissioned as
part of the DECIDE project)

Sumsion, J. 1997, Matrix of Performance Indicator Standards and Toolkits,


http://www.efc.co.uk/DECIDE/REPORT/zzannexes.htm


Talbot, D., Lowell, C. & Martin, K. 1998, ‘From the User’s Perspective—the
UCSD Libraries User Survey Project,’ Journal of Academic Librarianship,
vol. 24, no. 5, pp. 357–364

Tiefel, V. 1989, ‘Output or performance measures: the making of a manual,’


College and Research Libraries News, no. 6, pp. 475–478

Tri-University Library Project, http://www.lib.uwaterloo.ca/TUG/index.html

Van House, N., Weil, T. & McClure, C. 1990, Measuring academic library
performance: a practical approach, Chicago, ALA

Ward, S., Sumsion, J. & Bloor, I. 1995, Library performance indicators and
library management tools, Luxembourg, EC DG-XIII-E3

Willemse, J. 1998, ‘Performance assessment in IFLA and United Kingdom


academic libraries,’ South African Journal of Library and Information Science,
vol. 66, no. 4, pp. 161–165

Winkworth, I. 1993, ‘Into the house of mirrors: performance measurement in


academic libraries,’ British Journal of Academic Librarianship, vol. 8, no. 1,
pp. 17–33

Winkworth, I. 1991, Performance measurement and performance indicators,


Aldershot, Gower

Young, P. 1998, ‘Measurement of electronic services in libraries: statistics for


the digital age,’ IFLA Journal, vol. 24, no. 3, pp. 157–160

Zwart, R. 1997, ‘DocUtrans and ISO 9002 quality assurance,’ see 2nd
Northumbria International Conference on Performance Measurement

Best practice/quality management

Australasian sources
Bundy, A. 1997, ‘Investing for a future: Client focussed Australian academic
libraries in the 1990s’, Australian Library Journal, vol. 46, no. 4, pp. 354–369.

Calvert, P. 1998, ‘Service quality in academic libraries: Research in New


Zealand and Singapore,’ in Pathways to Knowledge, Australian Library and
Information Association 5th Biennial Conference and Exhibition, Conference
paper available http://www.alia.org.au/conferences/adelaide98/

Calvert, P. 1997, ‘Measuring service quality: From theory into practice’,


Australian Academic & Research Libraries, vol. 28, no. 3, pp. 198–204.


Cook, B. 1995, ‘Managing for effective delivery of information services in the


1990s: The integration of computing, educational and library services’, in
Information Online and On Disc 95: The virtual information experience:
Proceedings of the seventh Australasian Information Online and On Disc
Conference and Exhibition, Proceedings, no. 7, pp. 1–19.

Cooper, M. 1996, ‘The use of total quality management (TQM) in libraries and
information services in Australia and overseas’, (1995 Metcalfe Medallion
essay), Australian Library Journal, vol. 45, no. 2, pp. 92–101.

Exon, F.C.A., Exon, M. J., Calvert, P. J. 1995, Review of Library and


Information Services in Australia and New Zealand, (British Library R&D
Report 6194), British Library, London.

Groenewegen, H. & Lim, E. 1995, ‘TQM and quality assurance at Monash


University Library’, Australian Academic & Research Libraries, vol. 26, no. 1,
March, pp. 6–16.

Harman, G. 1998, ‘Quality assurance mechanisms and their use as policy


instruments: Major international approaches,’ European Journal of Education,
vol. 33, no. 3, pp. 331–339.

Hayes, H. 1994, ‘The management of change in libraries for achieving quality


services’, Australian and New Zealand Theological Library Association
Newsletter, no. 23, pp. 6–17.

Kruithof, J. & Ryall, J. 1994, The Quality Standards Handbook: How to


Understand and Implement Quality Systems and the ISO9000 Standards in a
Context of Total Quality and Continuous Improvement, Business Library,
Melbourne.

McCarthy, J. & Marrie, J. 1995, ‘An evaluation of the reference service at the
Educational Resource Centre, University of Melbourne,’ Australian Academic
& Research Libraries, vol. 26, no. 1, pp. 33–42.

McGregor, F. 1997, ‘Quality assessment: Combating complacency’, Australian


Library Journal, vol. 46, no. 1, pp. 82–92.

Mikol, M. 1996, Quality Assurance in Australian Higher Education: A Case


Study of the University of Western Sydney Nepean,
http://www.nepean.uws.edu.au/dimps/dist/parisrpt.html

Moore, H. 1998, ‘Quality accreditation as a tool to improve client services


in a TAFE library’ in Pathways to Knowledge, Australian Library and
Information Association 5th Biennial Conference and Exhibition,
http://www.alia.org.au/conferences/adelaide98/


Nicholson, F. & Rochester, M. (eds) 1996, Best Practice: Challenges in Library


Management Education, AUSLIB, Adelaide.

Nicholson, F. & Rochester, M. 1996, ‘Reading the management future for


libraries: Implications for library management education’, in Reading the
Future: Proceedings of the Biennial Conference of the Australian Library and
Information Association, ALIA, Melbourne, pp. 75–83.

Parker, D. 1995, ‘TQS at the Victoria University of Technology’, Australian


Academic & Research Libraries, vol. 26, no. 1, pp. 25–30.

Phillips, A. 1998, ‘Quality matters’, Ex Libris, Issue 52, pp. 10–12.

Presser, P. & Garner, J. 1999, ‘Quality self assessment project’, Ex Libris,


Issue 57–58, pp. 9–10.

– 1994, The Quest for Quality: Implementing Quality Principles in Libraries,


North Queensland Division, the Australian Library and Information
Association, n. p.

–1997, The QMS in VET Project—How to develop Best Practice Quality


Management Systems in VET, Australian National Training Authority, n. p.
Summary document available at http://www.swin.edu.au/qed/qmsvet.htm

– 1999, The Relationship Between ISO series of Standards and Australian


Business Excellence Framework, [Document included in TAFE Institute
Organisational Self-Assessment Manual Resource Book, Office of Training and
Further Education, Victoria]

Rice, T. 1997, ‘Total Quality Management: Curse or cure? The University of


Wollongong Library’s quality and service excellence program’ in Interaction:
The Client, the Profession, the Technology, Proceedings of the 9th National
Library Technicians Conference, Interaction, Canberra, pp. 53–64.

Shearer, G. 1995, ‘Solving reshelving backlogs in a university library: A case


study using an interactive problem solving technique with a TQM
application’, Australian Library Journal, vol. 44, no. 1, pp. 27–46.

Stevens, J. 1995, ‘Adopting best practice’, Incite, August, vol. 16, no. 8, p. 12.

Swinburne University. Office for Quality Education, The QMS in VET


Project—How to develop Best Practice Quality Management Systems in VET,
http://www.swin.edu.au/qed/qmsvet.htm

Watty, K. 1994, Issues to Consider in the Application of Total Quality


Management to Education at Universities, Royal Melbourne Institute of
Technology, Melbourne


Williamson, V. & Exon, F. 1996, ‘The quality movement in Australian


university libraries (findings of a quality audit survey)’, Library Trends, vol.
44, pp. 526–544.

Wilson, A. 1996, Best Practice in Research Information Services: Final report


May 1996, http://www.ntu.edu.au/admin/isd/qsdc/bppage.htm

Wilson, A. & Byrne, A. 1996, ‘Implementing best practice in the Northern


Territory University library: A case study’, in Nicholson, F. & Rochester, M.
(eds), Best Practice: Challenges in Library Management Education, AUSLIB,
Adelaide.

International sources
Abbott, C. 1994, ‘Performance indicators in a quality context,’ The Law
Librarian, vol. 25, pp. 205–208

Adams, R. et al 1993, Decision support systems and performance assessment


in academic libraries, New Providence, Bowker-Saur

Barnard, S. 1993, ‘Implementing Total Quality Management: a model for


research libraries,’ Journal of Library Administration, vol. 18, nos. 1–2,
pp. 57–70

Barton, J. & Blagden, J. 1998, Academic library effectiveness: a comparative


approach, British Library Research & Investigations Report 120, London,
British Library

Brockman, J. 1992, ‘Just another management fad? The implications of TQM


for library and information services,’ Aslib Proceedings, vol. 44, nos. 7/8,
pp. 283–288

Brockman, J. 1998, Quality management and benchmarking in the


information sector, Munchen, Bowker-Saur

Brophy, P., Coulling, K., & Melling, M. 1993, ‘Quality management: a


university approach,’ Aslib Information, vol. 21, no. 6, pp. 246–248

Brophy, P. & Coulling, K. 1996, Quality management for information and


library managers, London, Aslib/Gower

CAUSE. Self assessment for Campus Information Technology Services,


Professional Paper no.12, http://cause-www.colorado.edu/
information-resources/ir-library/abstracts/pub3012.html

Citizen’s Charter, 1994, Charter Mark Scheme: Guide for Applicants,


London, HMSO


Cotta-Shonberg, M. 1995, ‘Performance measurement in the context of quality


management,’ see 1st Northumbria International Conference on Performance
Measurement in Libraries and Information Services

Dow, P. 1998, ‘Using assessment criteria to determine library quality,’ Journal


of Academic Librarianship, vol. 24, no. 4, pp. 277–281

– 1995, The effective academic library: a framework for evaluating


performance of UK academic libraries, Joint Funding Council Ad-hoc Group
on Performance Indicators for Libraries, Bristol, HEFCE.

Ellis, D. & Norton, B. 1993, Implementing BS5750/ISO9000 in Libraries,


London, Aslib

European Foundation for Quality Management, EFQM Excellence Model,


http://www.efqm.org

Evans, M., Jones, K. & Usherwood, B. Assessment Tools for Quality


Management in Public Libraries: a project funded by the British Library
Research and Innovation Centre and managed by Loughborough University
Department of Information and Library Studies (DILS)

Fisher, D. 1995, Baldridge on Campus: the assessment workbook for higher


education, New York, Quality Resources

Franklin, B. 1994 , ‘The cost of quality: its application to libraries,’ Journal of


Library Administration, vol. 20, no. 2, pp. 67–79

Garrod, P. & Kinnell Evans, M. 1995, Towards library excellence: Best practice
benchmarking in the library and information sector, BLRD Report, London,
British Library Research & Development Division

Garrod, P. & Kinnell, M. 1995, Quality Management Issues: a select


bibliography for library and information services managers, The Hague,
FID (FID Publication no. 710: Occasional Paper no. 10: BLRD Report 6220)

Great Britain, 1991, The Citizen’s Charter: raising the standard, London,
HMSO

HEIRA Alliance, An Example of the Information Technology Environment at


an information-resource-intensive institution, http://www.educause.edu/
collab/heirapapers/hei1061.html

HEIRA Alliance Evaluation Guidelines for Institutional Information Resources

INSPEL, 1994, Quality and quality management in libraries, INSPEL Special


Issue vol. 28, no. 2


International Standards Organisation. ISO 9000 explained at


http://www.iso.ch/9000e/magical.htm

Irving, A. 1992, ‘Quality in academic libraries: how shall we know it?’ Aslib
Information, vol. 20, no. 6, pp. 244–246

Issues in Research Library Measurement, American Research Libraries,


no. 197, April 1998 special issue and at http://www.arl.org/stats/perfmeas/

Jurow, S. & Barnard, S. (eds) 1993, Integrating Total Quality Management in a library setting, New York, Haworth Press

Kerslake, E. & Evans, M. 1997, Quality management for library and information services: policy forum 12 December 1996, Report to British Library Research and Innovation Centre, Loughborough, Loughborough University Department of Information and Library Studies

Kinnell, M. 1995, ‘Quality management and library and information services,’ IFLA Journal, vol. 21, no. 4, pp. 265–273

Leach, K. & Smallen, D. 1998, ‘What do information technology support services really cost?’ Cause/Effect, vol. 21, no. 2, pp. 38–45

Library Trends, 1994 Special Issue

Loughborough University of Technology. Department of Information Science, http://info.lut.ac.uk/departments/dils/Research/Majpro.html

Mackenzie Owen, J. S. & Wiercx, A. Knowledge Models for Networked Library Services: Final report to the European Commission (in press in early 1998), also at http://www.nbbi.nl/kms/kmsshort.html

Manchester Metropolitan University. Centre for Research in Library and Information Management (CERLIM), http://www.mmu.ac.uk/h-ss/cerlim/

National Institute for Standards and Technology (US). The Malcolm Baldrige National Quality Award Criteria for Education, http://www.quality.nist.gov/docs/99_crit/99crit.htm#education

National Quality Institute of Canada. Canada Awards for Excellence (CAE) criteria, http://www.nqi.ca/new_web/english/html/nqi.html

NORDINFO, 1996, ISO 9000 for Libraries and Information Centres: A Guide, Report of a project supported by NORDINFO, The Hague, FID (FID Occasional Paper no. 13)

Northumbria International Conference on Performance Measurement in Libraries and Information Services (1st: 1995: Longhirst Hall, Northumberland, England) Proceedings of the 1st Northumbria International Conference on Performance Measurement in Libraries and Information Services held at Longhirst Management and Training Centre, Longhirst Hall, Northumberland, England, 31 August–4 September 1995, Newcastle, Information North. Especially the article by:
• Brophy, P. ‘Quality management in libraries,’ pp. 77–81

Northumbria International Conference on Performance Measurement in Libraries and Information Services (2nd: 1997: Longhirst Hall, Northumberland, England) Proceedings of the 2nd Northumbria International Conference on Performance Measurement in Libraries and Information Services held at Longhirst Management and Training Centre, Longhirst Hall, Northumberland, England, 7–11 September 1997, Newcastle, Information North, 1998.

Orenstein, D. 1999, ‘Developing quality managers and quality management: the challenge to leadership in library organizations,’ Library Administration and Management, vol. 13, no. 1, pp. 44–51

Rowley, J. 1996, ‘Implementing TQM for library services: the issues,’ Aslib Proceedings, vol. 48, no. 1, pp. 17–21

— 1995, Routes to Quality: practical approaches to quality in library and information services, Bournemouth, BUOPOLIS

Shaughnessy, T. 1993, ‘Benchmarking, Total Quality Management and libraries,’ Library Administration and Management, vol. 7, no. 1, pp. 7–12

Stone, J. 1997, Increasing effectiveness: a guide to quality management, London, Palmer Press

St Clair, G. 1997, Total Quality Management in Information Services, London, Bowker-Saur

Tann, J. 1993, ‘Dimensions of quality in a library setting’ in Quality management: towards BS5750. Proceedings of a seminar held in Stafford on 21 April 1993, edited by M. Ashcroft and D. Barton, Stamford, Lincs, CPI

Whitehall, T. 1992, ‘Quality in library and information service: a review,’ Library Management, vol. 13, no. 5, pp. 23–35

Winkworth, I. 1993, ‘Performance indicators and quality assurance,’ Aslib Information, vol. 21, no. 6, pp.

Staff competencies
Arts Training Australia 1995, Library Industry Competency Standards, Arts
Training Australia, Woolloomooloo, N.S.W.

Australian Library and Information Association 1996–7, Competency Standards and the Library Industry: A Workshop Series, ALIA, Canberra.

Australian Library and Information Association 1999, Work Level Guidelines for Librarians and Library Technicians 1998, http://www.alia.org.au/publications/wlg/

Australian Quality Council 1999, Management Competency Standard, http://www.aqc.org.au/mcsfbe.htm

Bridgland, A. & Dott, A. 1999, ‘How Competent Is Competent When it Comes to it? An Implementation Strategy for the University of Melbourne Library’, in People and Technology Doing IT Right: EDUCAUSE in Australasia, Sydney, EDUCAUSE.

Bridgland, A. 1998, The Impact of the National Training Reform Agenda and
Workplace Rearrangement on Staff Development in Australian Academic and
State Libraries, Doctoral Thesis, University of Melbourne, Melbourne.

Bridgland, A. 1996, ‘Potential applications of the library industry competency standards at the University of Melbourne Library’, Education for Library and Information Services: Australia, vol. 13, no. 1, pp. 83–89

CREATE Australia (n.d.), Libraries and Museums Industry Training Package, in preparation.

Nicholson, F. & Rochester, M. (eds) 1996, Best Practice: Challenges in Library Management Education, AUSLIB, Adelaide. Especially the articles by:
• Trahn, I. ‘HRM Competencies for Library Staff: An Industry View from the University Sector’.
• Williamson, V. ‘Competency Standards: Implications for Library Managers’.

Nicholson, J. & Bridgland, A. (unpub.), Library Industry National Training Package: Friend or Foe for CPE

Williamson, V. & White, S. 1996, Competency Standards in the Library Workplace, AUSLIB, Adelaide.
