(2000) Guidelines For The Application of Best Practice in Australian University Libraries
Evaluations and Investigations Programme
00/11
Anne Wilson
Leeanne Pitman
Isabella Trahn
Department of Education,
Training and Youth Affairs
DETYA No. 6478.HERC 00A
ABN: 51 452 193 160
This work is copyright. Apart from any use as permitted under the Copyright
Act 1968, no part may be reproduced by any process without permission
from Ausinfo. Requests and inquiries concerning reproduction and rights
should be addressed to the Manager, Legislative Services, Ausinfo, GPO Box 84,
Canberra ACT 2601.
The views expressed in this report do not necessarily reflect the views of the
Department of Education, Training and Youth Affairs.
Contents
Acknowledgments . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .vii
1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .1
1.1 Origin and background of the investigation . . . . . . . . . . . . . . . . . . . .1
1.2 Project description . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .1
1.3 Terminology . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .2
1.4 Research methodology . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .2
1.4.1 Literature review . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .2
1.4.2 Surveys . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .3
1.4.3 Site visits . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .3
1.5 Results . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .4
2 Literature review . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .5
2.1 Overview . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .5
2.2 Benchmarking . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .6
2.2.1 Australasian sources . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .6
2.2.2 International sources . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .8
2.3 Performance indicators and measurement . . . . . . . . . . . . . . . . . . . .11
2.3.1 Australasian sources . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .11
2.3.2 International sources . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .13
2.4 Quality/best practice . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .16
2.4.1 Australasian sources . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .16
2.4.2 International sources . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .17
3 Benchmarking . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .21
3.1 Terminology . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .21
3.2 Methodologies for library benchmarking . . . . . . . . . . . . . . . . . . . . .21
3.2.1 CHEMS . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .23
3.2.2 Universitas 21 methodology . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .24
3.2.3 AQC benchmarking methodology . . . . . . . . . . . . . . . . . . . . . . . . . .24
Bibliography . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .167
Tables
Table 4.1 Use of CAUL and other performance indicators . . . . . . . . . . . . . . . . . . . .50
Table 4.2 Breakdown of ratings . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .53
Table 4.3 Additional and important performance indicators . . . . . . . . . . . . . . . . . . .54
Table 4.4 Further developmental work . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .55
Table 4.5 Use of performance indicators in benchmarking and
quality management . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .57
Acknowledgments
The project team thanks the representatives of the following institutions who
gave up their time to host the team in a series of site visits as part of the
information gathering process.
Executive summary
The project ‘Best Practice for Australian University Libraries’ has investigated
current ‘best practice’ activities within Australian academic libraries and
compared these with those in selected overseas countries. Within the context
of this report, ‘best practice activities’ are considered to encompass the
implementation of quality frameworks, and the use of benchmarking and
performance measurement as tools for the continuous improvement of
products, processes and services. Staff competencies required for the effective
application of these frameworks and tools were also investigated.
The survey population included all Australian (38) and New Zealand (6)
university libraries, and the major Australian non-academic research
libraries (the State Libraries, the National Library of Australia and CSIRO).
To enhance the practical value of the project, guidelines for the application of
best practice will be developed in the form of a practical manual which
Australian academic libraries can use to assist them in implementing ‘best
practice' initiatives. A number of recommendations, formulated as a result of
this investigation, are referred to the Council of Australian University
Librarians (CAUL), DETYA and the Australian higher education community.
1 Introduction
1.3 Terminology
One of the challenges of the project investigation has been achieving
consensus on terminology. There are no universally accepted definitions for
best practice, benchmarking and performance measurement. However, it was
considered important that terminology be agreed at the outset to ensure that
survey respondents were all working with the same definitions. The literature
also attests to the difficulty of reaching agreement on terms, but stresses the
importance of adopting and using an accepted, endorsed definition. The
problem surrounding the identification of a definition of benchmarking
highlights the whole problematic area of ‘quality’ terminology. Garrod and
Kinnell state ‘Many LIS managers balk at the use of jargon and view it as
cant. It is often regarded as the way in which professed ‘experts’ describe
what is basic common sense and good practice. Jargon tends to alienate the
uninitiated by combining the elements of exclusion and exclusivity. There is
therefore a need to find vocabulary to describe ‘quality’ methods and
practices that is acceptable to the majority of the LIS profession’ (1997,
p. 115). Experience has shown, both within Australia and internationally, that
once terms become more widely used and applied in a practical sense, they
become more acceptable.
The references included in the literature review were gathered from national
and international databases, the Internet, contacts with colleagues, visits both
within Australia and overseas, email discussion lists and other appropriate
sources. (Please note: URLs cited in the Literature Review, Useful Sources List
and Bibliography may change over time. They were last verified in June 1999.)
In reviewing this material, emphasis has been on providing some insight into
the current level of activity of academic and research libraries involved in the
development and implementation of quality frameworks and tools, both
within Australia and internationally. For this reason, the focus for the
material reviewed and included in both the full Bibliography and Useful
Source List has been, with the exception of some key resources, on material
published within the literature over the last five years.
1.4.2 Surveys
Three surveys were developed and distributed to all Australian and
New Zealand university libraries and the major Australian non-academic
research libraries. The surveys aimed to identify best practice activities being
undertaken within this sector, and focused on the identification of quality,
benchmarking, and performance measurement implementation, and
self-identified areas of best practice.
1.5 Results
From the investigations described above, the report has been able to:
• provide a detailed evaluation of currently available methodologies for
library benchmarking;
• analyse the effectiveness of the current CAUL performance indicators with
recommendations for the amendment and extension of these indicators.
(A report on the matter will be submitted to CAUL later in 1999); and
• demonstrate the applicability of TQM, quality assurance processes and
quality management frameworks to academic library management.
From the overseas case studies and the survey findings, it is evident that
Australian and New Zealand university libraries are pursuing similar initiatives
to overseas libraries, and implementing in many cases similar frameworks and
programs. Both sides of the globe could benefit from a greater sharing of
these experiences. The project team will present a paper on the project and its
outcomes at the 3rd International Conference on Performance Measurement in
Libraries and Information Services, in England in August 1999.
2 Literature review
2.1 Overview
In undertaking a review of the literature related to the key focus areas of this
project, the project team has concentrated on locating material to provide an
insight into the current level of activity in the field.
In addition, the team also aimed to identify key material of benefit to those
requiring a concise introduction to the practical frameworks, resources and
the experiences of others in implementing best practice. The focus for the
material reviewed and included in both the full Bibliography and Useful
Source List (Appendix E) has therefore been, with the exception of some key
resources, on material published within the literature over the last five years.
In reviewing this material, emphasis has been placed on providing some
insight into the current level of activity. As well, material provided by those
involved in the development and implementation of quality frameworks and
tools, both within Australia and internationally, has been included.
Even such a selective review of the literature may omit useful sources. This
review, and the chapters which follow, provides additional references to key
theoretical papers, standards and frameworks. References cited within the
report are included in the full bibliography. The annotated material in the
Useful Source List, (Appendix E) is included to provide those interested with
additional content from a small number of sources of particular interest.
A very real difficulty with evaluating the literature in this field is the
interchangeable and, to the novice, potentially confusing use of the
terminology. Terms ‘are sometimes used almost as synonyms’ (Calvert
1995, p.438). Each source needs to be approached with caution to clarify
exactly what the author defines as ‘statistic’, ‘metric’, ‘benchmark’,
‘performance indicator’, ‘performance measure’, ‘performance management’,
‘best practice’, etc.
2.2 Benchmarking
The other key Australian web site is that of the Australian Quality Council
(http://benchnet.com/aqc/), which contains access to information on
Australian benchmarking networks and also restricted access to the
benchmarking data repository Benchmarking Edge. University libraries are,
as yet, only minor participants in these activities.
Some of these activities have been reported in both the print literature
(Robertson & Trahn 1997; Neumann 1996) and electronically via the World
Wide Web, (http://www.unilinc.edu.au/curtin.html, Curtin Cataloguing project,
http://www.ntu.edu.au/library/bench1.htm, NTU Acquisitions, cataloguing
and processing project).
As in other countries, it seems to be the special libraries, with their small size
and frequent organisational links with large commercial organisations, that
have undertaken useful work in the area. The Canadian Health Libraries
Association/Association des bibliothèques de la santé du Canada (CHLA/ABSC)
1998 Benchmarking Task Force Benchmarking Tool Kit is a model of
practicality, a kit which will help these mainly small libraries
benefit from benchmarking. The U.S. Special Libraries Association also
has an active interest in benchmarking and useful publications on its
web site.
Whilst the overall focus of the National Association of College and University
Business Officers Benchmark Program is necessarily on the full range of
university administrative functions, amongst their List of Benchmarks and
Processes, located at http://www.nacubo.org/website/benchmarking/
program.html, is information under the heading Library. This is one of the
rare sources of publication on benchmarking activities in North American
university libraries.
For snapshots of the state of the art in 1995 and 1997 in all the areas
covered by this report, including benchmarking, albeit with a European focus,
the best sources are the two published proceedings of the Northumbria
International Conferences on Performance Measurement in Libraries and
Information Services. Articles by Stephen Town in both proceedings to date
cover British developments in benchmarking. Town undertook, at the
Shrivenham campus of Cranfield University, the first major benchmarking
exercise within the UK university library scene in 1993, and his
publications since refer to that and subsequent exercises. The 1999
conference will include the first English language report of a noteworthy
benchmarking exercise in the Netherlands. This will be useful, as English
language reports on European activity are rare.
University libraries and information services around the world have recently
(1998) participated in an Association of Commonwealth Universities
University Management Benchmarking Club exercise, which examined
libraries using a specifically developed framework. Detailed results are not
publicly available, but some information exists about the CHEMS approach on
the ACU web site. The EFQM framework mentioned in the quality framework
sections of this report forms a part of the CHEMS approach. See
http://www.acu.ac.uk/chems/benchmark.html and Fielden (1995).
Taking the AARL statistics as a starting point, an index and description of the
performance indicators described in the literature published up until the end
of 1995 was compiled into ‘Performance Indicators Database: Selecting the
Best Performance Indicators for your Library’ (Exon & Williamson 1996). This
database includes indicators suitable for measurement of the performance of
libraries, and incorporates a spreadsheet describing each indicator, a
reference to the source(s) of the indicator, a statement of how it is calculated,
and an index of applications for which it might be used. Whilst not
specifically focussing on indicators developed within Australia, it does
provide a useful insight into the level of development in this area up until
this time. The Exon paper in the first Northumbria Conference proceedings in
1995, Developing Performance Indicators for an Australian University Library,
covered the same territory.
across the sector. The 1996 survey results are available on the CAUL website
and the 1999 survey results are due to be posted there mid 1999
(http://www.anu.edu.au/caul/).
The search for adequate measures of library effectiveness has been carried
out extensively in New Zealand by Calvert and Cullen, and will be referred to
in the following chapters. Of the many articles and reports published
describing their work, ‘Stakeholder perceptions of University Library
Effectiveness’ (1995), provides a good overview of the research methodology,
and outlines subsequent indicators developed from responses to the study
questionnaire (based on a literature search of previously identified indicators).
Within this study, libraries are viewed as social agencies that must be
responsive to the needs and wishes of various constituencies. The authors
suggest that in the process of analysing library effectiveness, it is necessary to
determine not only what was done but also which of these tasks the library
was supposed to do. Calvert has also been active in the area of service
quality measurement and has applied the service quality framework
developed by Hernon and Altman (1996) to a study focussed on ‘moving
theory into practice’, with the adoption and testing of the framework in seven
university libraries in New Zealand (Calvert 1997; 1999).
Whilst the impressive ARL statistics site provides longitudinal data and a
sophisticated ability to manipulate the data for customised reports, Jewell
(1998) at http://www.arl.org/stats/specproj/etrends.htm summarises the results
of an ARL exploration of possible financial indicators for electronic resources
and services. This was a pilot project attempting to move the statistics
database into the electronic age. In the tradition of the ARL statistics, the
initial focus is on resource allocation in the new arena.
For electronic services indicators, the seminal paper remains Assessing the
Academic Networked Environment by Charles McClure and Cynthia Lopata
(1996), http://istweb.syr.edu/~mcclure/network/toc.html. The Coalition for
Networked Information project of the same name is also useful for an outline
of reports on a series of activities in a number of institutions attempting to
put into use some of the indicators suggested by McClure. Virginia State
University, Brown University and University of Washington are among the
participants. (http://www.cni.org/projects/assessing/reports/)
The single most useful site for up-to-date information on current European
work is the new EQUINOX Library Performance Measurement and Quality
Management System site at http://equinox.dcu.ie/ which brings together
unfinished aspects of earlier European Union projects, including CAMILE,
which was established to publicise the outcomes of EQLIPSE, MINSTREL,
DECIMAL and DECIDE. There are direct links to each project.
The electronic age in libraries has brought new areas into consideration.
Measuring the quality of databases, electronic sources and services, web
sites and so on has emerged as new roles and mechanisms become
important. The Centre for Information Quality Management, originally seeded
by the British Library Association and On-Line Users Group, has developed a
useful role over the past few years
(http://www.la-hq.org.uk/liaison/ciqm/ciqm.html). An article by Armstrong on
Metadata, PICS and Quality, which gives an indication of current work in this
area, can be found at http://www.ariadne.ac.uk/issue9/pics.
There are a number of key works which serve as a starting point for much
current work. In Britain, Brophy and Sumsion draw on the Joint Funding
Councils' Ad-hoc Group on Performance Indicators for Libraries publication,
The Effective Academic Library (EAL, 1995). Another refinement of this work
appeared in 1998 in the Cranfield project, reported as Academic library
effectiveness: a comparative approach by Jane Barton and John Blagden. Their
listing of key management statistics appropriate for university administrators is
also worth examination.
University library staff also need to refer to Poll and te Boekhorst’s 1996 IFLA
publication Measuring Quality: international guidelines for performance
measurement in academic libraries. With their international authority and
thorough approach to definition and methods of compilation, both
publications are key references.
the Effective Academic Library, ISO and IFLA publications are included. The
report recommends that the stakeholder approach, based on interview and
focus groups, should be used in relation to measurement of electronic
services and in future information planning strategy.
As with each of the areas discussed in this project, there are papers in both
proceedings of the Northumbria International Conference on Performance
Measurement in Libraries and Information Services (1995 and 1997) in relation
to:
• Performance measures and indicators;
• Qualitative measurement;
• Electronic/digital library measurement; and
• Managing information services (the role of performance measurement in
changing styles, structures and procedures).
of quality assurance processes and management, both during, and prior to,
this period, and this activity has provided a base upon which subsequent
progress can now be measured.
The Australian Best Practice Demonstration Program, which operated prior to,
and during, the quality audit period referred to above, was successfully used
by the Northern Territory University Library to ‘improve research information
services from a client perspective’ (Byrne 1995, p. 17). A number of specific
projects were developed within the broader program. NTU library was one
of only a few service organisations to adopt and work with guidelines
largely aimed at commercial operations, and their experience remains a
leading case study in the adoption of best practice and quality improvement
processes and programs within the Australian academic library sector.
A comprehensive report of their experience (Wilson and Byrne 1996) outlines
the program background, guidelines, principles and methodology, and
offers some lessons for others. The final report of this program is available
at http://www.ntu.edu.au/admin/isd/qsdc/bppage.htm.
3 Benchmarking
3.1 Terminology
Library benchmarking has been described as ‘a friendly competitive
intelligence activity’ (Gohlke, 1997, p. 22). In trying to provide a universally
acceptable definition of benchmarking it is useful to describe the
characteristics of benchmarking. Although there are different types of
benchmarking and various models or approaches have been tried and tested,
a general consensus as to what benchmarking is, and what it involves, has
gradually emerged. The language which is used may vary but the principles
are the same:
• A structured or systematic approach to finding improvements and
implementing best practice;
• A continuous process of measuring products, services and practices
against leaders;
• A focus on processes (individual processes, which are deemed vital to
customer satisfaction, are suitable choices for benchmarking programmes);
• An emphasis on learning. Benchmarking should not be regarded simply as
a comparative exercise, or be totally results oriented (Garrod & Kinnell,
1996, pp. 142–143); and
• A foundation of sound measurement and comparison.
• recognise the need for change, gain commitment and set the scope;
• identify process to be benchmarked (subject) and how the process will
be performed (approach);
• select team and train members;
• analyse own processes within the broad area already defined;
• define and understand the process to be benchmarked;
– identify measures and collect process data;
• establish (call for) benchmarking partner(s);
• seek background information and process data;
• analyse and compare data against own internal process;
• finalise partner(s);
• conduct visits;
• analyse results;
– calculate measures and define the practices in use within partner
organisations and in own organisation;
– compare values and identify differences;
– quantify the effect of differences in practices and measures between
own organisation and partners;
– relate quantifiable differences to the practices employed and
determine which are significant to the goal of improving the
benchmarked process;
• develop action plans;
– determine cost effective means of achieving desired improvement
in benchmarked process and produce plan to be used to implement
the improvement;
• implement and monitor;
– put action plan to work and improve process;
– measure the improvement and identify causes, if any, for differences
between expected level of improvement and level attained; and
• benchmark again if necessary.
3.2.1 CHEMS
CHEMS is the management consultancy service of the Association of
Commonwealth Universities based in London. In 1996 CHEMS launched an
international ‘University Management Benchmarking Club’ for universities
primarily, but not exclusively, from the Commonwealth. This inheritance
explains the considerable level of Australian involvement in their activities.
The Club offers participating institutions the opportunity to compare their key
management processes with those of a range of comparable institutions.
Unlike most benchmarking initiatives, this Club focuses on the effectiveness
of university wide processes and not narrow departmental functions. It is
developmental in the sense that each year project members review the
methodology and refine it further (CHEMS, 1999).
3.3 Background
An examination of the literature prior to 1994 reveals that little has been
published on benchmarking. However, benchmarking as a term has been
used for a long time in the context of establishing ‘benchmarks’. It is the
concept of a standard to measure against that has been retained in
applications in quality management. ‘Benchmarking techniques were applied
in the early eighties but were not well documented until the first fully
descriptive book about benchmarking was published by Camp in 1989.
Camp's book was the first attempt to provide focus and solidify the concept’
(Allen 1993, p. 123).
To summarise, benchmarking in the early part of the decade was less likely
to focus on a particular process or subprocess and lacked the systematic
approach which characterises formal benchmarking. The similarity between
the formal and informal approach has been in the philosophy behind the
activity—the desire to improve products, processes and services by
comparing performance with others, usually but not always in the
same industry.
Until the mid 1990s formal benchmarking activity was largely confined to the
manufacturing industry, and was associated with the measurement of
operations used in the manufacture of tangible products. ‘In both the United
States of America and the United Kingdom, the health sector was probably
the first service based industry to utilise benchmarking to improve
performance and thus demonstrated for the first time that benchmarking
techniques could be adapted to suit the needs of the service environment’
(Garrod, 1996, p. 143). As a result ‘benchmarking, as a proven practical
management tool, is now spilling into the not-for-profit arena and is being
embraced by local, state, and federal governments and by academic and
other types of institutions’ (Gohlke, 1997, p. 24).
(The fifth respondent (SCU) gave no details as to what had initiated the
activity.) The remaining respondents indicated that they were either intending
to investigate benchmarking in the near future or were interested in finding
out more about the process. A few stated that they were engaged in what
could be termed informal benchmarking. From the limited information
available from this survey it appears that the purpose and the outcomes of
these early projects were markedly similar to the later EIP survey findings,
for example:
What has changed since 1995 is the substantial increase in the number of
academic libraries using benchmarking successfully in the formal sense as a
tool for continuous improvement. Although this has not been accompanied
by a corresponding increase in the literature on library benchmarking it does
indicate an increase in the focus on quality improvement generally, and a
desire on the part of librarians to undertake a more systematic and rigorous
approach to the improvement of products, processes and services.
The above statement from a British practitioner has been echoed in the
current project survey responses, and in discussions with senior library
managers where the following additional reasons emerged:
• Benchmarking challenges and opportunities include the enhanced role
of the information professional; impact of technology—multiple
mediums; direct access to information sources by clients; operating as
a business unit; budgets stretched—multiple formats; budgets decreasing;
corporate downsizing; contestability in government sector; outsourcing
(Robertson, S., 1998, p. 120);
In 1997 Annette Gohlke wrote ‘Librarians in all types of libraries are finding
themselves in the position where they must build a solid and effective case
on how their library adds significant value to the organisation or institution
The results indicated that between 1995 and 1998 eighteen Australian
university libraries, one New Zealand academic library, two state libraries and
the National Library of Australia participated in benchmarking either as
invited partners or project initiators. Benchmarking activities were undertaken
at library, institutional and external level (eg through library participation in
CHEMS and Universitas 21 and through the AQC benchmarking network). To
date, benchmarking has not been utilised by academic libraries in New
Zealand to the same extent as in Australia, although the University of Otago
intends to benchmark information skills with the University of Queensland in
1999. (The University of Auckland is a member of U21; no New Zealand
universities are, as yet, members of CHEMS.) Useful data was also received
from the non-academic research library respondents, and demonstrated the
value of comparing approaches between academic and other
research libraries. A visit to the State Library of Victoria
provided useful insights into the challenges of responding to a different set of
priorities, client groups etc.
3.4.1 Partners
Benchmarking partners included:
• other university libraries, primarily Australian with some activity about to
be initiated by a New Zealand partner (University of Otago);
• Universitas 21 (participating libraries from Australia, New Zealand,
Singapore, China, Canada, USA, UK);
• CHEMS (participating libraries from Australia, Canada, Hong Kong,
United Kingdom, Africa);
• state, national and TAFE libraries;
• non-library partners such as enquiry services eg Student Union,
Student Services, Personnel, Buildings and Grounds; and
• other industries eg law firms, pharmaceutical companies, Telstra,
Australian Consumer Association, and a hospital through the Australian
Quality Council Benchmarking Network.
CHEMS benchmarking was initiated by CHEMS; the libraries did not make
direct contact. In the case of U21, partners are still working out potential
projects so the roles have not yet been defined. Because partners come from
around the world with different political, economic, educational and social
cultures, it is anticipated that relationships will be complex. In Australia, U21
activities are currently informal but it is anticipated that some will progress to
the formal benchmarking stage. To date, Australian partners have reached
agreement on the scope and methodology of projects, followed by agreement
to undertake selected activities. The level of activity between the Australasian
partners will probably continue to be more intensive than with the overseas
partners for the immediate future. The AQC benchmarking project required
participants to deliver presentations, participate in forums, and collect and
provide data on activities.
3.4.3 Timeframe
Project timeframes varied from one week to eighteen months, and some (eg
U21) are still ongoing. Responses indicated that the length of a project does
impact on its outcomes (see lessons/outcomes below).
CHEMS
Feedback from CHEMS participants on the 1998 benchmarking exercise varied
from ‘limited value’ to ‘confirmation of best practice approach’, and ‘rigour in
completing questionnaire and accompanying benefits’.
Universitas 21
In the case of U21 benchmarking activities, the focus is on comparable
activities and is externally imposed. Activities between Australasian partners
to date have included both quantitative (materials availability) and qualitative
(client satisfaction) projects, as well as comparative profiling and activity
based costing.
In the area of client satisfaction surveys, results are being input into the
Rodski database, to create comparative profiles and provide useful partner
information for future benchmarking against other institutions and non-library
sector companies. U21 benchmarking is not a formal, process based
benchmarking model.
However, benchmarking in the formal sense was not always viewed as the
most appropriate method for improving performance:
It can also be seen as an exercise that has the potential to fix an
organisation at a particular point in time. It has the potential to fix
sights in one direction while ignoring customer interest/need in another.
It is preferable to focus on continuous improvement through informal
internal benchmarking rather than institutional/library comparisons
(Deakin).
The Library and Information Services Working Party of this project identified
eight benchmarks which sought to answer the general question: ‘How would a
Vice-Chancellor know that the library/information technology service was
performing relative to good practice?’
• Effectiveness of planning process;
• Contribution to the quality of research;
• Contribution to the quality of teaching and learning;
• Contribution to the quality of corporate information systems;
• Effectiveness of staff resources;
• Effectiveness of collaborative alliances;
• Effectiveness of networks and communications infrastructure and
services; and
• Efficiency of help desk services.
The draft benchmarks are being trialled in August and September 1999. At
this stage it is difficult to know whether the final outcome of the project will
be a set of benchmarks which will be used by an institution to monitor its
own progress or be used more systematically to compare institutions. The
project is in collaborative dialogue with CHEMS Benchmarking Club but
differs in concentrating on outcomes rather than processes. The academic
library community will monitor progress with interest.
One member of the EIP team had, however, in June/July 1998 undertaken a
series of international visits to libraries, financed by her home library, UNSW,
to examine a select few interesting ‘quality’ sites and to investigate how
quality frameworks had been implemented and what related activities had
been undertaken. This was done in order to better position the UNSW Library in
terms of improving its own quality management processes. Small case studies
have been outlined here, drawn directly from those face to face meetings and
observations, to indicate some international practices in these areas. In no
way do they give a comprehensive outlook but they do illuminate
international opinions and work situations. The literature review (Chapter 2)
and useful source materials (Appendix E) provide a better overall picture of
international activity.
Case studies
On the European scene there is a great divide between the Western
European countries and the countries of the former Soviet bloc. The latter
are seen as the new ‘underdeveloped’ regions, and priority is given to library
‘aid’ activities to the immediate east, which absorb all spare resources and
educational assistance. The attitude of former communist areas to concepts and
movements such as quality and benchmarking reflects a deep suspicion
derived from their political background, and the considerable differences in
resourcing make the former borders also borders for co-operative activities
between like institutions. Much of the current activity in the EU is aimed at
simply raising awareness in the eastern European countries of current notions
such as quality principles, accepted in Western Europe but still ‘foreign’ to the
East. Eastern bloc countries may be unable to participate in co-operative
activities on an even footing for some time.
For many European libraries the real leadership and focus remains at the
state and not the national level, with the exception of the small European
nations. The German National Library, for example, focuses solely on
producing records for books published in Germany. Librarians seldom meet
on a national basis, and energies are fed into regional groupings, which are
the natural focus for benchmarking activities. Westphalia/Rheinland (the
region in which Muenster lies) has a group of thirteen university librarians
who meet monthly and carry out co-operative initiatives, including some
benchmarking under the dynamic leadership of Roswitha Poll, the Muenster
University Librarian and European expert on performance indicators,
benchmarking and activity costing. The group has a reputation for energy and
new initiatives within Germany. Its activities also include running a regional
co-operative cataloguing centre and a staff development centre where most
staff development activities for the librarians of the group are held.
There is a perception that the more conservative European libraries focus too
much on statistics per se and cannot see the role of useful indicators in
performance improvement.
One of the key areas of interest for potential benchmarking for DBS has been
materials availability: DBS has 15 000 students (half full time, half part time),
had until recently a very small library materials budget, and teaches and
researches in an area where most material quickly becomes out of date.
Views were expressed that the university libraries really needed some type of
defined quality management process/framework in place in order to translate
the results of the gap analysis identified by benchmarking into tangible
Both the SCONUL group and the Cranfield benchmarking exercises had
extended time frames, because of the complexity of local demands and the
scarcity of resources which could be dedicated to the work. Since their first
experiences in 1993, all of which have been documented in the literature in
articles by Town, Shrivenham has undertaken a range of less usual
benchmarking activities including:
• a Strategic Benchmarking study which encompassed:
– buildings (housing integrated services/electronic libraries);
– electronic libraries (a matrix of who had, and had done, what across the
whole spectrum); and
– flexible teaching and learning;
• IT support (10 partners/3 in libraries). This was done by a consultant and
may have been more superficial than some participants would have liked.
Some non-library organisations fared well because of their Help Desk
systems;
• Document delivery/help desk with Surrey Institute of Art & Design, plus
three academic and one non-library partner; and
• ILL/Inquiries (with Loughborough and a couple of other participants).
Berkeley, plus the CIC institutions, etc. The group discussed benchmarking
but the consensus was similar to the Purdue attitude that, at this point in time,
detailed benchmarking or comparative performance measures are politically
unacceptable. Less formal exchange of views and co-operation on new
initiatives or joint lobbying for a greater good were considered acceptable.
Within the Libraries, some small scale benchmarking has occurred as part of
the team approach to process improvement which is encouraged by Excellence
21. For example, one of the Purdue Excellence 21 TQM/CQI Library
improvement teams (the mail room/loading dock) had used benchmarking as
part of their improvement process. They used a team approach, and
benchmarked against comparable public service and commercial equivalents,
and had also devised client surveys adapting SERVQUAL TQM methodology to
their requirements, flow-charting all processes.
4 Performance indicators
4.1 Terminology
Unlike the topic of benchmarking, the literature abounds with articles on
performance measurement across the whole library sector. However, like
benchmarking, there has been little agreement to date on a standard
definition of what is meant by the term performance measurement. ‘In the
ever-growing literature on library performance measurement, no
standardization of terminology has been established’ (Cullen, 1995, p. 438).
In selecting a useful definition, Cullen and Calvert state a preference for
Lynch’s definition which makes careful distinctions between terms which are
sometimes used almost as synonyms by writers in the field. ‘The results of
measurement can be used to evaluate the performance of a library, and
thereby determine whether or not it is effective’ (Lynch, 1983, p. 388).
Another simple but useful definition is provided by te Boekhurst:
‘performance measurement is comparing what a library is doing
(performance), with what it is meant to do (mission), and wants to achieve
(goals). The extent to which goals are reached can be determined by using
performance indicators’ (1996, p. 279).
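Te Boekhurst’s definition lends itself to a concrete illustration: a single indicator compares what the library did against a goal it set. The sketch below, with invented survey figures and an assumed 85 per cent target (neither taken from the report), computes the materials availability ‘fillrate’ referred to later in this chapter:

```python
# Hypothetical illustration of one performance indicator: a materials
# availability fill rate. The sample counts and the 85% target are
# invented for illustration, not drawn from any surveyed library.

def fill_rate(items_sought: int, items_found: int) -> float:
    """Proportion of user searches that located the wanted item."""
    if items_sought == 0:
        raise ValueError("no searches recorded")
    return items_found / items_sought

# An availability survey records how many wanted items users found.
sought, found = 400, 356
rate = fill_rate(sought, found)
target = 0.85  # a goal the library has set for itself

print(f"fill rate: {rate:.1%}")          # performance
print(f"target met: {rate >= target}")   # extent to which the goal is reached
```

The indicator only becomes meaningful against the stated goal, which is exactly the mission/goals framing of the definition above.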
4.2 Background
Performance measurement has become a major issue within the Australian
library community. As the discussion progresses, there is an accelerating
emphasis upon the need to use such measures. Not infrequently, embedded
within this approach is the hope that resulting performance indicators will be
comparable, and this in turn rests upon the assumption that one can look at
a ‘performance indicator’ and understand what it means. ‘Performance
measures should not be viewed as ends in themselves; rather they are part of
the process of evaluation, and only within this context do they have meaning’
(Novak, 1992, p. 263).
There have been a number of challenges associated with the CRIG project:
‘there are serious difficulties in developing a reliable and valid methodology
as has been extensively documented in the literature’ (Byrne, 1997, p. 255).
This was supported by Robertson and Trahn in their report on the
QUT/UNSW benchmarking project: ‘in the area of research support, processes
are more diffuse than in the more procedural areas of library activity, and
there is very little agreement on appropriate output and performance
measures. A symptom of this is the absence of any AARL statistics for
reference transactions, or education, advice and information activities’ (1997,
p. 134). The QUT/UNSW project has been one of the few recorded attempts
to benchmark reference/research support services, and highlighted the lack of
publicly available performance data and the lack of a common methodology
for measuring performance in specific research support services, acquisitions
and cataloguing (ibid., p. 140).
At the time of writing, the CRIG Report had been distributed to all
CAUL members for consideration of the recommendations made by the
Working Party.
As the survey responses indicate, there has been a shift and consequent
reprioritising of indicators deemed to be important by the academic library
community. The need for indicators for the electronic library has emerged as
a key issue, and impacts to some extent on the work done to date by CRIG.
Fortunately, some excellent work has already been undertaken in this area
internationally by McClure, Brophy and the European Commission sponsored
EQUINOX project, all of which are described in some detail in the Useful
Sources list (Appendix E). Survey responses also illustrate the large amount of effort
that libraries have invested in the development of in-house indicators. There
has not been an across the board adoption of the CAUL indicators, and it is
quite likely that there has been some duplication of effort in the development
of in-house indicators within various institutions. Exploration of the possibility
of sharing this effort, and the indicators developed to date, is worthy of
further investigation.
Griffith: CAUL A, B, C. Adapted: facilities use rate, service point use rate
surveys. In-house: activity costing (unit and total cost of activities and services).
JCU: CAUL A, B (irregular), C.
Macquarie: CAUL A.
Monash: CAUL B. Adapted: staff perception (U/Melbourne survey); client
satisfaction (CAUL A & Van House). In-house: distance education students survey.
Murdoch: CAUL A. In-house: for selected core services (acquisitions,
cataloguing, serials); ABN ILL management statistics; ILL statistics for WAGUL.
NTU: CAUL A, B, C; ISO 11620 under consideration. In-house: off campus
requests (fillrate & turnaround time), info skills, service points.
QUT: CAUL B, C. In-house: reference, lending, technical services; others
listed in Service Charter.
RMIT: CAUL A, C (infrequent).
SCU: no CAUL indicators.
Swinburne: no CAUL indicators.
U/Ballarat: CAUL C (once). In-house: information desk satisfaction; ILL
(Exon & Williamson database).
U/Canberra: no CAUL indicators. In-house: Accessibility, Availability,
Adaptability, Assistance from professional staff (ie all staff performing
professionally).
U/Melbourne: CAUL C. Adapted: client satisfaction (CAUL A). In-house:
aligned with Strategic Plan.
UNE: no CAUL indicators.
4.3.2 Ratings
Client satisfaction, and the need to extend performance indicators into the
electronic information environment, were viewed as the most important.
Document delivery was also viewed as very important. Not all institutions
provided ratings for all the indicators.
Market penetration: 11, 12, 5
Opening hours compared to demand: 14, 12, 4
Collection quality review by expert checklists (viz. conspectus): 5, 11, 12
Collection use (viz. IFLA): 14, 12, 2
Catalogue quality—known item search: 10, 15, 4
Catalogue quality—subject search: 7, 17, 4
Extend to electronic information services?: 23, 4, 2
Acquisition speed: 12, 15, 2
Book processing speed: 15, 13, 2
Document delivery: 22, 7, –
ILL speed: 20, 10, –
Reference fillrate: 5, 19, 2
User satisfaction: 26, 4, –
Cost efficiency: 18, 11, –
Costing methodologies: 13, 13, 3
Electronic resources—quality: 20, 7, 1
Electronic resources—availability: 24, 5, –
Adequacy of retrieval software and/or data: 16, 11, 1
General
• Should be aligned with ISO 11620;
• Need to be adapted for the high volume of electronic resources available;
• Improve efficiency of data analysis techniques, eg through use of
scannable survey forms; and
• More coordination in sharing results of use.
One member of the EIP team had, however, in June/July 1998 undertaken a
series of international visits to libraries, financed by her home library, UNSW,
to examine a select few interesting ‘quality’ sites and to investigate how
quality frameworks had been implemented and what related activities had
been undertaken, in order to better position the UNSW Library in terms of
improving its own quality management processes.
Small case studies have been outlined here, drawn directly from those face to
face meetings and observations, to indicate some international practices in
these areas. In no way do they give a comprehensive outlook, but they do
illuminate international opinions and work situations. The literature review
and useful source materials provide an additional overall picture.
The Muenster Library has put theory into practice with a comprehensive
suite of thirty performance indicators. Its use of indicators is an integral part
of planning structures, including the regular articulation of mission, goals,
objectives, performance indicators and performance targets for the Libraries.
These processes were regarded as quite radical when initiated within the
Library, but other parts of the University have since been influenced by
this thinking.
Muenster has also regularly devised and run surveys such as materials
availability and staff satisfaction. The staff satisfaction survey was internally
developed from a project run by one of the junior librarians. It was
developed with full union consultation. Overall results were published, and
the results for each department discussed internally within that department.
The DBS was also a partner in the EQLIPSE project mentioned above, and
outlined in the Useful Sources list (Appendix E). Difficulties encountered in
this project included:
• the level of bureaucracy required for an international project;
• the instability of the trial software;
• an indicators list that was conservative and perhaps not the most useful
listing; and
• the original project was based on the premise that library systems could
produce data for management purposes with a little work. This proved not
to be the case.
The successor EQUINOX project will take these outcomes and add electronic
performance indicators, together with a requirement to produce software
that is more attuned to user needs and much closer to commercial reality.
The EQUINOX group will also consider the use of broader quality
frameworks such as the relevant Quality Awards. Initially the focus was on
ISO 9000, but project surveys produced a surprising strength of feeling in the
European library world that individual institutions want to pick and choose
what suits them, and not be tied to inflexible programs which cost a
great deal in time and/or money, as is the current perception in the case of
ISO 9000.
The aim of the new slim-line PIs was to allow individual institutions to see
how they compare with other institutions, and to check for themselves
whether this is what they want in each instance. It would also make it easier
to identify sector leaders in certain areas, and allow a clearer picture of each
institution, so that individual institutions which share a common trait may
follow up with certain others for benchmarking purposes.
The recommendations in the report which arose from the project, included
some matters of interest:
• expenditure on information provision includes both acquisition and access
(i.e. cost to access via ILL or electronically);
• seat hours and seat hours occupied per week are used to measure access;
• user education and information services are dealt with as a combined
activity expressed in terms of staff hours (very similar to a recent
Australian Universitas 21 approach);
• the recommendations are set alongside contextual information (library
context/institutional context); this is also similar to a U21 approach;
• impact was mentioned, but with the rider that the British Library is about
to select a number of tenders for research in this area, and that this should
be included further down the track;
• measurement of in-house use compared to borrowings each time there is a
significant change in library service is recommended, but these data are
not comparable so this would be for operational use only;
• there are recommendations on the liberalisation of access to holdings
nationally, and the consideration by libraries of this type of access over
holdings strategy for marginal areas;
• availability studies should include consideration of electronic publications;
• specific availability studies should be conducted for specific categories of
users only initially; and
• data sets for the electronic library should be developed in concert
with UCISA.
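The seat-hours recommendation above reduces to simple arithmetic: hours of opening multiplied by seats gives availability, and recorded occupied seat hours give a usage rate. A minimal sketch, with all figures invented for illustration:

```python
# Hypothetical sketch of the seat-hours access measure recommended above.
# All figures are invented: a reading room with 200 seats open 70 hours
# a week, and a (simulated) weekly total of occupied seat hours.

seats = 200
opening_hours_per_week = 70
seat_hours_available = seats * opening_hours_per_week  # 14 000 per week

# Summing hourly headcounts over the week would yield this total in practice.
seat_hours_occupied = 9800

occupancy = seat_hours_occupied / seat_hours_available
print(f"seat hours available: {seat_hours_available}")
print(f"occupancy rate: {occupancy:.0%}")
```

Because both inputs are counted the same way each week, the occupancy rate can be tracked over time even where absolute figures are not comparable between libraries.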
Europe itself, for all the collective efforts of the European Union, is still very
much preoccupied with regional concerns. European approaches remain
highly localised and vary widely according to region, and any move towards
cooperative initiatives like benchmarking is unlikely to arise from
national initiatives.
5 Quality/best practice
5.1 Terminology
One of the difficulties that becomes evident when discussing quality and best
practice activity in academic and research libraries is that of determining a
shared understanding of the terminology. A common theme of discussions
during site visits to specific libraries throughout the course of this project has
been that words such as ‘customer’ and ‘quality’ (perhaps because of their
more general association with the business environment) may elicit a degree
of cynicism amongst staff, and create a barrier to quality program
implementation. This is particularly evident for those further removed from
the actual implementation of ‘quality’ management programs and frameworks.
At both the University of Melbourne and the University of Wollongong, the
adoption of quality language and terminology is seen as integral to the
success of programs. ‘The use of some of the terminology such as ‘client’ or
‘customer’ may be unfamiliar at first, but it is vital in sharpening the service
focus and inculcating a professional view of the relationship between those
providing and those receiving the service’ (McGregor 1997).
Best practice and quality are often used synonymously, and, whilst there are
similarities, best practice has engendered its own definitions. The EIP ‘Best
practice for Australian university libraries’ project team has adopted the
Australian Best Practice Demonstration Program (ABPDP) definition that
defines best practice as:
The pursuit of world class performance. It is the way in which the most
successful organisations manage and organise their operations. It is a
moving target. As the leading organisations continue to improve, the
‘best practice’ goalposts are constantly moving. The concept of
continuous improvement is integral to the achievement of best practice.
(ABPDP 1994).
5.2 Background
The ‘quality’ movement in university libraries in Australia developed out of
the climate surrounding the then Commonwealth Labour Government’s
Quality Audit of the higher education system during the period 1993–1995.
This process, rather than focussing on educational outcomes, assessed the
level of quality assurance policies and practices in place, resulting in a
‘ranking’ of universities that was subject to intense criticism (Williamson &
Exon 1996). Nevertheless, the audit period created an impetus for the review
and adoption of quality management programs both broadly across the
universities, and also within individual university libraries.
The perceived need to ‘demonstrate to the funding authority that services are
of high quality’ (Monash), and to provide ‘effective management of programs
in a climate of decreasing resources’ (Ballarat), were cited by a number of
libraries implementing quality management programs. Perhaps these
arguments are best summed up by Felicity McGregor from the University of
Wollongong Library:
In contrast:
Although at the time the survey was conducted, more than 50 per cent of
the respondents had assigned responsibility for quality to one person, the
intention was not always to maintain the position indefinitely. Many
respondents also indicated that a team based culture and a relatively flat
organisational structure were crucial to the successful implementation of the
The ‘7 Up’ group consists of all library staff of HEW7 and above.
All members of ‘7 Up’ are also members of a Priority Area taskforce
and are responsible for promoting and propelling the Priority Area quality
initiatives throughout the library and involving other staff in the
initiatives. (Queensland).
The frameworks formally adopted (Balanced Scorecard, AQA, SQMS) have a
number of features in common. For example, all are multi-faceted and utilise
a number of similar techniques and tools. Both AQA and SQMS are heavily
reliant on self-assessment, and all have been adopted for their perceived fit to
the particular library’s/institution’s needs/strategic directions. Both the BSC
and AQA frameworks have their roots in the business environment, and have
undergone some adjustment or reinterpretation to encompass aspects of
library service delivery. A thorough and sustained implementation of a quality
framework, whether the result of a library only initiative or a university wide
initiative is not an easy process, as the relatively small number of fully
implemented frameworks indicates. How many of the institutions that
indicate they have ‘partially implemented’, are ‘in progress’, or are
‘considering initiatives’ will achieve full and successful implementation of a
quality framework could, in part, be influenced by the ease with which
libraries can make informed decisions about the most appropriate path to
follow, and the steps to take.
A number of reports and articles in the literature outlining the adoption and
implementation of these frameworks, in particular the AQA model (McGregor
1997, Presser & Garner 1999), have helped to increase awareness of their
potential usefulness and applicability across the sector. It is significant that
whilst only six of the twenty-nine CAUL member respondents to the survey
indicated that they had formally adopted these programs, a further eleven
libraries indicated that they were either making use of AQA criteria, or were
interested in working towards or gaining ISO certification (in total, eighteen
libraries are considering or pursuing such initiatives, as below).
Setting up the BSC has caused us to totally revise the way we present
our Strategic Plan, our Annual Report, what statistics are collected etc.
It has had profound influence at the management level and, hopefully,
once all bedded down, to all levels within the organisation. (Deakin).
Framework description
The Balanced Scorecard (BSC) is basically a way of grouping performance
indicators with the additional advantage of providing a strategic management
system. Developed at Harvard Business School by Robert S. Kaplan and
David P. Norton, it has been primarily designed for businesses as a means of
focussing beyond financial measures, to incorporate criteria that measure
performance from three additional perspectives—customer satisfaction
(Clients), internal business processes and the organisation’s innovation and
improvement (learning and growth) activities.
From our client focus groups we ascertained the hierarchy of values (or value
models) of our clients, and these have been used to define the objectives
within the five perspectives. For each Objective there are a number of high
level performance indicators that are relevant in our environment. The high
level performance indicators cascade down to Unit level indicators, and by
the end of 1999, we hope, into individual performance indicators in the
Performance & Planning review process.
Why BSC?
BSC provides a framework that the Library believes can be easily explained
and understood by staff and others. It has been given increased relevance
within the library environment through the addition of a fifth perspective—
Information Resources (‘satisfying demand for information from Library and
other resources’) to the existing categories, specified by the Library as Clients
(‘providing value to clients to help them achieve their goals’), Financial
Resources (‘building financial strength to develop Library services and assets’),
Internal Processes (‘excelling at processes for fast, effective delivery of services
and resources’) and Learning and Growth (‘enabling staff to lead and
innovate’). ‘The BSC is a tool for monitoring all facets of our work and service
delivery. It is really a “quality tool” not just performance measurement’.
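The five-perspective structure described above can be sketched as a simple data model. The perspective names and quoted aims come from the text; the example indicators are invented placeholders, not Deakin's actual measures:

```python
# A minimal sketch of the five-perspective Balanced Scorecard described
# above. Perspective names and aims are quoted from the text; every
# indicator listed here is an invented placeholder for illustration.

scorecard = {
    "Clients": {
        "aim": "providing value to clients to help them achieve their goals",
        "high_level_indicators": ["client satisfaction rating"],
    },
    "Information Resources": {
        "aim": "satisfying demand for information from Library and other resources",
        "high_level_indicators": ["materials availability fill rate"],
    },
    "Financial Resources": {
        "aim": "building financial strength to develop Library services and assets",
        "high_level_indicators": ["cost per client served"],
    },
    "Internal Processes": {
        "aim": "excelling at processes for fast, effective delivery of services",
        "high_level_indicators": ["document delivery turnaround time"],
    },
    "Learning and Growth": {
        "aim": "enabling staff to lead and innovate",
        "high_level_indicators": ["staff development hours per year"],
    },
}

# As the text describes, each high-level indicator would cascade down to
# unit-level (and eventually individual) indicators.
for perspective, detail in scorecard.items():
    print(perspective, "->", detail["high_level_indicators"])
```

The design point is the grouping itself: every objective and measure hangs off exactly one perspective, which keeps financial measures from crowding out the client, process and learning dimensions.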
Background
Approximately three years ago the Library began to focus on the library
strategic plan as a means of developing future direction. Involving (and
engaging) stakeholders in the process was seen as a key issue—this was
achieved through the use of focus groups targeting key customer
groups/levels of students. Groups were externally facilitated, and from the
results the strategic plan was developed, and arranged to reflect what the
stakeholders identified as key or important. There was some difficulty
identifying measures for strategic directions; the Balanced Scorecard (BSC)
was recommended by the consultants engaged to drive the process and,
after review, was adopted.
Training commitment
The management team has had many days of training about the BSC,
performance measurement, strategic planning etc. All professional staff have
had an overview of the BSC and have contributed to setting high level
performance indicators. All staff were given a presentation of the BSC and
what it means for the Library. Unit managers have discussed with their own
staff the BSC, strategic directions and have collectively set up operational
plans and targets.
Organisational features
Three teams have been formed (Information Skills, Information Resources,
Access and Delivery) to advise library management on various issues, and to
facilitate implementation of the Strategic Plan. All professional level staff are
involved in various working groups related to implementation. Non-
professional staff are not generally expected to participate but are given both
encouragement and the opportunity if interested.
In addition, the Library has a dynamic internal intranet that aims to provide
staff with access to the tools and information needed, together with internal
communication and information exchange. It is arranged under four areas:
• Units—governance (annual reports, working copies of strategic and
operational plans, policies and procedures, BSC).
• Staff—(teams, training and development).
• Tools—(online library resources, project management resources,
templates).
• Help.
groups will remain the main method of identifying client needs. Internal
customer feedback processes are in place via online and written feedback.
Indicators to measure electronic access and use are seen as a priority area for
development, but it is extremely difficult to find effective measures. The BSC
includes general measures and targets related to ‘Using physical and
electronic information resources’; however, these measures do not indicate
what is used or the quality of the resource. Some work may be done in
linking electronic measures to client satisfaction measures.
Challenges
A major challenge for Deakin may be the linking of BSC to other quality
frameworks. We are investigating quality frameworks such as ISO, AQC and
other models. We are revising all our policies and procedures using a
template that will stand us in good stead with these accrediting bodies.
However, the prime motivation is to get consistent policies and procedures
rather than to go for an award.
Achievements
Staff have a sense of where we are going (strategic directions); we know if
we are achieving what we set out to achieve (operational plans, BSC); we
know the priorities (strategic plan). Because we can see where we are at any
given moment in time we can easily communicate our achievements etc.
As a result staff morale is high and we get many accolades from the
University, which can see that we are doing what it wants and doing it well.
The University can also see how our strategic plan aligns with the University’s
strategic plan, and therefore how the Library is contributing to the
achievement of the University’s strategic objectives.
Summary
Like Wollongong (see below), Deakin University Library has successfully
adopted and applied a program primarily aimed at commercial business
operations to a service environment. The extent to which BSC will continue
to influence the management planning process will be dependent on staff
support, and continuing tangible improvement of planning and operational
processes. Indications are however, that BSC has provided Deakin with the
means to focus activity without losing sight of customer and client values.
An important aspect of the adoption of BSC has been its ability to provide a
quality management and continuous improvement tool that can be
‘incorporated into all aspects of library practice’, allowing ‘quality issues’ to be
‘mainstreamed, rather than set up "quality management" separately and run
the risk of staff seeing quality and continuous improvement as something
extra’.
Framework description
The Framework provides a roadmap for business improvement and long term
success. It is both an evaluation tool for the Australian Quality Awards for
Business Excellence, and a business improvement tool. It can be used for
internal self assessment by any organisation wishing to improve its business
results and ensure long term viability. It is also a useful tool to take stock of
where your organisation is now, and to involve your staff in getting to where
you want to be in the future. It focuses on key elements underpinning
effective management practices. The 1998 framework required organisations
to provide evidence using the ADRI (Approach, Deployment, Results,
Improvement) model against the following categories:
• Leadership
• Strategy and planning
• Information and analysis
• People
• Customer focus
• Processes, products and services
• Organisational performance
Why quality?
‘The changing nature of universities is one of accountability for the
management of resources. Quality management will help ensure a systemic
approach is taken in assessing the effectiveness and efficiency of resource
management’ (McGregor 1997, p. 83).
Choice of framework
‘A major factor in selecting a program emphasising performance measurement
was strategic; to ensure that the library would be equipped to meet future
challenges…Quality management was adopted…as a comprehensive and
integrating framework which was applicable to the library’s particular stage of
development and to the successful management of current and perceived
future environments’ (McGregor 1997, p. 83). The adoption of a formal
quality management model has proven to be beneficial for us. We have
adopted the Australian Quality Council’s Australian Business Excellence
Framework for monitoring and measuring organisational performance. This
model was chosen over other management models for its organisational fit
and congruence with existing principles and practices.
Background
A formal quality management program was implemented in 1994, Quality
and Service Excellence (QSE). The objectives of QSE include:
• Development of excellent Library services through the implementation of a
total quality management program: Quality and Service Excellence;
• Development of a systematic approach to documenting the improvements
in client service which have been achieved to date, as well as providing a
basis for measuring future improvements;
• Library-wide commitment and priority to the application of quality
management principles to all processes and services;
• Establishment of a framework for regular self-assessment of the Library’s
activities and results;
• Library-wide focus on delivering increasing value to clients; and
• Staff who are empowered to use their individual and combined skills and
experience to improve processes and their outputs through needs-based
training and development.
processes and services within the CSF/KPI framework. KPI reports are
produced annually; however, teams monitor their processes and services
regularly throughout the year. Benchmarking commenced in 1997 and we
have just embarked on our third project.
Training commitment
(1994) QSE program. A consultant provided training in basic TQM tools and
principles; some staff received training in facilitation skills. In-house
workshops on TQM Awareness were developed, and refresher training was
also provided for staff who had not had the opportunity to participate in a
quality improvement team, and to introduce new staff to QSE. These
workshops are repeated on a needs basis. A number of these programs
would form part of a staff development program in any organisation, but the
focus of the Wollongong sessions is totally informed by the QSE approach
and its requirements.
Organisational features
• Team structures were introduced in the early 1990s and training was
provided to facilitate the process. The Library now has a relatively flat
structure and is team-based;
• Multiskilling and job enrichment opportunities have improved overall
operations knowledge for many staff;
• A Management Advisory Team, made up of all team coordinators and level
7 staff and above, meets monthly to review and develop policies and
communicate broader university issues to all staff;
• Staff empowerment is encouraged by supporting judgement and decision
making throughout the Library;
• Supervisors are now called coordinators;
• A Quality Steering Committee was formed in 1995; and
• A Quality Coordinator was appointed in 1996. The key aims of this
dedicated position were to: drive QSE and facilitate the integration of QSE
principles throughout the Library; provide a resource to staff; take
responsibility for QSE documentation; and evaluate quality programs.
nurtured through all levels of the organisation. The organisation must develop
an infrastructure that encourages and reinforces the values and behaviours
that are congruent with how the organisation wants to operate.
Challenges
• Initially, financial planning aspects were difficult to relate to work done
within the Library. With the cultural change that has occurred, there is a
feeling that this may no longer pose a problem as the Library has become
more business focused, and it is easier to see links and relationships
within this context. There has been an uptake of statistical measurement
and skills.
• Staff resistance was overcome as team skills were built up using tools such
as Myers-Briggs to identify individual strengths within teams. Cross-
functional teams support the framework, and enable the creation of links
between good ideas and subsequent change or improvement.
• Articulating high-level concepts (leadership etc.) and placing these into a
process flowchart.
• Process thinking applies less readily to some areas, eg. information literacy
as opposed to serials data entry.
• AQC project—benchmarking outside LIS—cost and inability to locate
appropriate internal or external funds might hinder participation.
Challenges also included the establishment of credibility of the library
within an external network of organisations.
Achievements
• Innovative solutions to problems with systems and resource limitations;
• Improved client focus from ALL staff—not just Information Services;
• Improved alignment of strategic objectives and team actions;
• Leadership demonstrated at all levels within the Library and commitment
and participation by staff in the development of improvement goals
and strategies;
• Benchmarking visits to partner businesses helped to place the role of the
library into a financial/profit-oriented framework. Within the library context,
profit was identified as maximising the value of the investment made by
students for their education, and looking at the full range of what you do
and drawing more out of it. There is value in visiting other libraries to
place your own work in context. Improving internal processes can
lead to more cost-effective practices and negate impetus for outsourcing
(for example).
Advice/summary
We have to get away from the concept that ‘quality’ is something extra we
have to do. We expect, and often demand, quality service and products in
our day-to-day interactions, and if we are stakeholders in other businesses or
groups, we expect that the resources will be managed well and provide value
to others. These expectations are also valid for libraries, and will become
increasingly important in a climate of economic uncertainty, technological
innovation and changing supplier services/products, and relationships
(McGregor, 1999).
• awareness of the need to look beyond the University and beyond the
profession for ideas and opportunities;
• reflection on the nature of the organisation and its business; and
• use of information for decision-making and recognition of the need to
question the status quo, and constantly seek to improve.
Use of the Australian Quality Council’s framework has also informed the
approach of the Australian Universitas 21 libraries to information sharing and
improving performance.
Framework description
The SQMS model was originally based on the Scottish Quality Management
System, which is built around the needs of educational institutions
(see http://www.sconto.demon.co.uk/sqms.htm for details on the system).
It is defined within Swinburne documentation as ‘A documented management
system that meets the requirements of a defined standard, and designed to
ensure that the quality of the programs and services provided by the
University meet the goals and objectives. This management system also
includes the documented process for continuous improvement’ (Glossary—
SQMS Formal Review Information Pack, quoted in Swinburne University of
Technology ‘Overview of SQMS’).
Why SQMS?
From the Library perspective, SQMS is seen largely as providing a framework
that has, unlike others, been developed specifically for educational
institutions; is flexible; lines up with ISO and AQC frameworks, and yet is a
quality management system in its own right. It has enabled the Library to
assess and improve a number of services formally by identifying
gaps/needs/areas for improvement, establishing the action needed to
improve the service, and then setting targets for improvement. Overall
quality management is seen to be ‘high priority’ and senior staff are now
committed to seeing the adopted model work, and appear to be active
’sellers’ to other staff.
Training commitment
• Ongoing ‘quality training’ programs, both in relation to the overview of
SQMS and Australian Quality Council programs, and in areas identified as
deficient, eg. customer service training;
• A number of university staff, including members of library staff, have
received internal auditor training;
• Quality facilitators also have ISO 9000 training;
• Benchmarking training has been conducted by an external trainer; and
• Self-assessment teams are cross-functional and have been trained in best
practice for teams.
Organisational features
SQMS is seen as a process model for organisational change, working up from
unit level to corporate level. Self-assessment scoring occurs at unit level. It is
a top down and bottom up approach that encourages change from below,
linked to organisational and institutional strategic priorities. Recently, the
focus has changed from validation review to validation through internal
benchmarking, on the basis that potential outcomes are likely to be better,
particularly in relation to continuous improvement and exchange of ideas. It
is useful therefore when more than one area undertakes assessment of the
same criteria in the same year.
ISO 9000
The TAFE division at Swinburne will be seeking ISO 9000 certification in
early 1999. The Library (which serves both TAFE and higher education
sectors), is considering the possibility of being party to this.
SLAs include KPIs/targets and costs for some components, with some
utilisation of the SCONUL performance indicators. It is intended that during
the period of the SLA, an internal review group will review performance
against targets within the SLA. This self-assessment will also cover program
delivery within the Library, and may be a better way of assessing reference
services. Participation in self-assessment activity is seen as a priority in normal
work, and staff involvement has helped communication and enthusiasm for
the process.
Challenges
Implementing any quality management system implies a number of
challenges. Those encountered by Swinburne include:
• Implementation within a multi-campus/TAFE/Higher Education
organisational environment;
• Communication—Effort has been put into communication through staff
meetings, in order to encourage all library staff to see how and where the
self-assessment process (for example) sits within both the overall program,
Achievements
• Identified areas for corrective action; and
• Heightened awareness of the continuous improvement cycle.
Framework description
Priority areas
Priority Areas for the year are determined at the end of the previous year
during an Annual Review undertaken at a two-day session involving all levels
of staff, some of whom are present by virtue of seniority, and others in a
representative capacity. At the Review an ‘appreciative inquiry process’ is
used to consider the progress achieved by the Library during the last year,
and to position the Library for the future. User needs for integration into
service and information access goals are identified, and taskforces formed to
address issues and prepare possible implementation strategies. Then,
throughout the next year, the taskforces address the Priority Areas,
implementing new services and improving existing ones. TQM processes are
being implemented in key areas, for example document delivery.
Benchmarking
Areas identified for improvement are integrated into the Library’s Priority
Areas.
Performance measures
Both quantitative and qualitative performance measures are in place.
Additional measures are being developed. With regard to process measures,
many statistical details are kept, and measures developed from these as
required. The perspective of client satisfaction is the basis for measurement,
activity based costing is also being carried out.
Training commitment
• The ‘appreciative inquiry process’ of the Annual Review is explained to
staff at the beginning of the Review and a consultant facilitates the process.
• With regard to specific Priority Areas, taskforces are led by team leaders
who arrange awareness/training for team members as required.
• All staff are informed of the results of quality management
initiatives/benchmarking exercises through library publications, and in
information sessions.
• Staff are trained in the collection of data, both quantitative and qualitative,
that is part of standard operating procedures.
Organisational features
The ‘7 Up’ group
This group consists of all library staff of HEW 7 and above. All members of
‘7 Up’ are members of a Priority Area taskforce. ‘7 Up’ is responsible for
promoting and propelling the Priority Area quality initiatives throughout the
Library, and involving other staff members in initiatives. Through this group
quality initiatives reach all sections of the Library. A system of communication
is in place whereby all staff are informed of initiatives and the results of
programs. Also, the Library has a Projects Coordinator who:
• coordinates library projects related to benchmarking and best practice;
• coordinates quality assurance projects, including the development of
quality assurance policies and procedures, and the development of
performance measures; and
• maintains a database to facilitate the collection, analysis and reporting of
library performance data and provides reports on library performance to
internal and external bodies as required.
Results of quality management initiatives are fed back into the quality
management program.
Challenges
• Would have benefited from more staff involvement and time (CHEMS);
• Difficulties in establishing the priority, change in focus (Benchmarking
document delivery project);
• Balance between keenness and knowledge, and the need to involve those
whose job responsibility it is (Planning and Appreciative Enquiry
processes); and
• Effective use of time: are managers spending too much time on the less
important, operational aspects of their jobs, and not enough on the
visionary, forward-focused aspects?
Achievements
Staff
• increased participation in strategic planning activities;
• opportunities for involvement in cross-sectional initiatives, and to work
with colleagues from other sections of the Library; and
• increased awareness of comparative data.
Library
• improved information on whether or not services are performing
as expected;
• improved delivery of existing services and implementation of
new services; and
• more efficient and effective use of resources.
For example, amongst other items over the last few years, the quality
management program has resulted in/aided:
• extension of the virtual library;
• improvement in customer service, particularly with regard to information
skills programs and service at ‘non-service’ points;
• refurbishment of library buildings and development of a
new library (Ipswich);
• implementation of a new integrated Library Management System;
• development of the Library’s Home Page as an interface to all
library services;
• improved shelving practices;
• improved document delivery services; and
• development of plans on flexible delivery and service options.
External frameworks
University of Queensland is a member of the Commonwealth University
Management Benchmarking Club managed by the Commonwealth Higher
Education Management Service (CHEMS). Each year the Benchmarking Club
reviews different areas of member universities. In 1998, one of the areas was
Library and Information Services. The Library was surveyed about many
activities in which it participates, and then compared with other
Commonwealth Universities. The Library scored the maximum rating in all
categories, and was the only library to do so. The Library is also considering
using the AQC and ISO frameworks.
One member of the EIP team had, however, in June/July 1998 undertaken a
series of international visits, financed by her home library, UNSW, to examine
a select few interesting ‘quality’ sites, in order to investigate how the quality
frameworks were implemented, and how related activities had been
undertaken. The intention was to better position the UNSW Library in terms
of improving its quality management processes.
Two of those 1998 observations are recorded here to demonstrate two very
different approaches, by two very different university libraries, in two
continents, using different quality frameworks. The team does not claim this
is a balanced representation of all aspects of the international scene, but
simply a set of snapshot case studies which illuminate extremes and
represent implementation scenarios not paralleled within Australia. The
literature review in Chapter 2 reveals a fuller picture.
General
University of Central Lancashire (UCLAN) is an example of a small ‘new’ UK
university, implementing innovative ways to support its non-traditional
student body, which tends to consist of part-time, mature-age, first-generation
university, international, and minority students. At the bottom of UK
university ‘league tables’ in terms of resources, it rates highly in terms of
student satisfaction with their experience.
Quality journey
UCLAN is a central city campus which is now one of the main employers in
the Preston area in this post-industrial era. The LLRS workforce is
overwhelmingly female in composition (even for a library), and has a high
part-time staff component. Library and computing arms of the university were
‘converged’ in 1995, with a significant impact on services and their quality
approach. As in Australia, converging technology and reduced resources have
ensured a continuing interest in convergence throughout Britain. Two well-
published librarians, Peter Brophy and Kate Coulling, were the two key staff
initially involved at the LLRS. The LLRS Quality Coordinator oversaw the ISO
9002 certification process from 1992, and made a significant personal
contribution towards its success.
UCLAN LLRS has been ISO certified for almost five years, and has a mature
and simplified system. The perceived benefits of auditors’ feedback have
diminished as fewer improvements to the quality management system are
recommended. At the time implementation began, there were compelling
reasons why a quantum leap in the quality and consistency of services was
crucial to the survival and positioning of UCLAN.
UCLAN as a whole has also been certified against the IiP program
(Investors in People). LLRS staff were part of the random group selected for
interview, to assess whether UCLAN could be awarded the IiP mark. (IiP is a
government-supported assessment framework relating to only one area of the
AQA categories (PEOPLE).) Since UCLAN has an annual development
assessment program which, in the LLRS, is combined with the use of a
competencies profile, LLRS is well equipped in relation to personnel
practices. The results of the annual assessments in terms of training needs,
are aggregated by unit heads after the process has been completed, to
identify training which should be provided internally in the LLRS. UCLAN has
an excellent staff development program. The University is also a recipient of
the Charter Mark, which again is a restricted framework relating to service
industries and their customers. A range of clients were interviewed, as well as
staff who deliver service. The work for the ISO framework made preparation
for other certification visits easier.
Conclusions
The process has worked well in this setting because:
• Key senior staff were knowledgeable and supported the process;
• UCLAN as a whole is small, flexible and aggressive in pursuing service as a
marketing advantage;
• A structured quality management system such as ISO has proved useful,
given the high proportion of part-time and session-only staff, in ensuring
that staff learn the correct procedures and carry them out;
• Simplicity and common sense are key values within the organisation, and
their quality implementation reflects this;
Future
The Quality Coordinator would like to see progress in using the European
Quality Award criteria to broaden the LLRS approach to quality.
General
M-Quality began with a 1991 report to University of Michigan management
on Enhancing Quality in an Era of Resource Constraints: Report of the Task
Force on Costs in Higher Education. Michigan is a very large institution with
25 000 employees on multiple campuses. There is a strong culture based on
control at the faculty or equivalent level. M-Quality is an adaptation of TQM
principles, and its ideology and tools are presented to each area head for
discretionary use, although initially there was some element of persuasion.
So much controversy ensued initially over the use of the term ‘customers’
that HRD later introduced the euphemism ‘those we serve’, to continued
scepticism. The driver for this massive program was the then
radical notion of using the same or reduced money to deliver a better service
through reduced costs.
Some academic areas rejected the notions and some, like the business
areas, nursing, etc. have embraced M-Quality because it fits with their
culture and other initiatives. It is at the faculty/unit level where the
continuous improvement is happening. At this level there are also Lead
teams, usually the executive/management group, who look strategically
at areas for improvement.
Forty hours of team leader training includes two hours of introductory
concepts and eight hours on refining tools and processes for selecting
appropriate issues to work on. This is to ensure that they select the critical
business practices of their unit, and choose something significant enough to
make an impact and within their power to change. All projects must meet
certain criteria before
time is spent on them, and all leaders must be trained prior to team
formation. If a unit wishes to refine a topic area, or needs to learn skills to
use in ‘Managing by Fact’, HRD can be called on for a consulting fee. Course
attendance is also charged out to customer units, as HRD is required to
generate its own income.
For ordinary non-academic staff this means a range of things. It can mean
being on a team, or trying to implement ‘Quality in everyday activities’ (if not
officially on a team).
There have been M-Quality Expos, in which the Libraries’ staff have
participated, since 1994. Expos have been two day events, coordinated by a
team, with top officials of the university interacting with the staff
improvement teams, and information in a special lift-out of the university
newspaper. Team members keep their handouts and their materials and use
them to talk to their customers over the next year. The Expos have almost
evolved into a university expo, with an accompanying mini-conference on
developing issues in running the university, which is open to all. In the
most recent Expo, new digital initiatives were demonstrated in the Millennium
Room collection of showcases for new technology.
In the academic area there was an ongoing team effort between academic
and support staff to improve the grant application process, and relations
between the two groups of staff. The first overall staff perception survey was
held in 1998.
Quality journey
The Libraries have fielded the following teams over time:
• Library Student Hiring Team;
• M-RUSH Team (materials processing);
• Serials processing;
• Shelf reliability;
• Digital Library Production Services; and
• Some other smaller groups included ILL, MERLYN records group
and Labelling.
Almost all staff attended a brief general orientation to M-Quality and its
concepts. About forty staff went through team leader training, and about a
dozen of these attended the forty hour facilitator training. Some staff
facilitated more than one team, others were short term leaders/facilitators
when others left the organisation. The Facilitator group had a management
literature awareness raising and experience sharing group going for a while.
Initially there was no university-provided management training at Michigan,
and staff in the Library received some de facto leadership training only
through M-Quality courses.
Some investigations stretched over eighteen months to two years, meeting for
only about an hour a week with occasional breaks: a very expensive learning
process. In the case of serials, it took the team 3–4 months to understand the
process initially, and fully flowchart it (15 feet long). Most of the teams did
not in the end come up with radical suggested changes. There was more of
an incremental improvement, hinging on defining categories of use and user,
and designing better forms. All groups made several presentations to the
senior management group with some material going on an internal web site.
Conclusions
• Cross functional teams provided opportunities to get to know and work
with people outside the immediate unit;
• Developmental opportunities were offered to staff at lower levels, who
would not otherwise have been involved in this type of improvement
process work;
• Enhanced meeting skills and more acceptance of the need for facts
(i.e. managing by fact);
• Some of the data has been re-used for other purposes;
• Sound incremental improvements made by defining categories of use
and user, and designing better forms; and
• Staff gained recognition via email and the staff newsletter.
Future
The M-Quality process has suffered a hiatus at the institutional level recently,
and also at the library level, through changing university staff and agendas.
6 Staff competencies
6.1 Introduction
Staff competencies, including training, whilst not an integral component of
this report, have surfaced as an issue worth exploring. Survey responses have
indicated a diversity of approaches to training which range from ad hoc
informal awareness sessions to fully documented workshops. It is obvious
that the introduction of benchmarking, performance management and best
practice/quality improvement initiatives carries with it some need for a
corresponding focus on training for all staff, not just for those immediately
involved in the process. All staff need to be skilled in the use of the tools
and techniques which form part of these quality improvement processes.
Both the literature and the survey responses indicate that, whilst many
libraries have embraced quality improvement initiatives with some ardour, the
training and competencies required for staff to work effectively with the tools
and techniques have not always been adequately addressed. Some libraries
have recognised the need to ensure that staff have the appropriate blend of
skills to perform effectively in this new, more accountable, and therefore
more challenging environment; others have noted the need but have done
less of a concrete nature in terms of training to date. On the one hand, there
is an expectation that staff will apply new ‘business related’ principles and
concepts in their work, while on the other there is evidence that they are not
being given enough of the requisite knowledge and skills to interpret the
language, and apply the principles in practice.
There are also sound arguments for quality principles to be integrated into
the routine staff development program of a Library, especially in the areas
of communication with clients (client service skills), communication between
library staff (team skills) and management training (leadership, strategic
planning, managing with information), just as the aim of quality management
frameworks is to become part of the routine way of operating. In many
situations, just-in-time, focussed training rather than a ’sheep dip’ approach is
far more effective, even when new approaches are being implemented. Large
scale training, particularly if not appropriate for the organisation, can lead to
a higher level of staff cynicism because the workshops or formal training did
not occur at the appropriate time, and in appropriate amounts,
to be either motivating or cost effective.
In the process, some industries with far from outstanding records in the area
of skills recognition, staff development and pay levels have been able to
improve their performance. Some large public organisations have
implemented the National Training Agenda thoroughly throughout the
organisation. There has been a lower level of adoption within universities in
Australia. Within university libraries, whilst the standards have assisted with
the redefinition of jobs and the skills required, little of the formal assessment
framework has been adopted to date. Whilst it is an attractive
proposition to be able to recognise the depth of expertise built up over time,
particularly by support staff who may not have formal educational
qualifications, significant resources are required for formal workplace
assessment frameworks to be fully implemented.
The library and information industry has now received its first set of national
competency standards. Whether they will be widely adopted by libraries, and
how they will be applied and assessed, remains to be seen. To date, the only
evidence of acceptance has been their preliminary use in two public library
systems in New South Wales, four academic libraries in Western Australia,
New South Wales, Victoria and the Northern Territory respectively and one
State Library (Tasmania). The majority of libraries appear to be adopting a
‘wait and see approach’ (Bridgland, 1998, p. 174).
It is evident from the responses and feedback from site visits that a more
rigorous approach to benchmarking training would help to demystify a
process which is still largely viewed as being more ‘business oriented’ than
library oriented.
Once again, approaches to training varied from the formal to the informal.
Levels of formality also varied depending on the participants: training was
more likely to be formally focused if aimed at senior management or at staff
immediately involved in the project, while less formal, broader-based
awareness sessions were in wider use for other staff. In most responses,
training was a ‘one-off’ session, program or workshop, conducted at the time
the decision to introduce an initiative was made. Institutions with ongoing
training programs which address quality management techniques (including
benchmarking and performance measurement) and skills are not common.
It is recognised that a significant investment of resources is required for the
successful implementation of this approach. It may be more likely to be in
place where the institution has a position dedicated to quality, staff
development and training. Training also tended to be more formal if the
program was a university initiative (CHEMS), and if it was conducted by an
external consultant/facilitator. For example, due to the rigorous requirements
of the Australian Business Excellence framework, the University of
Wollongong Library invested a large amount of time and effort into staff
training, and presented the most coordinated approach.
Bridgland states: ‘the future role of library workers very much depends on
how well those within the sector ensure that their changing knowledge and
skills service changing client needs and information applications. Job growth
projections by DETYA consistently place librarians in the top percentile for
demand of skills over the next 5–10 years. Continuing education and training
therefore has a major role to play in securing the future for those within the
sector’ (1999, n.p.).
Whilst the Australian National Training Reform Agenda closely follows its
British counterpart in structure and language, British university libraries
appear to have left work within this framework to the vocational education
sector. Consequently, the most extensive international training programs in
the quality area are related to United States TQM/CQI training programs,
usually implemented as part of an institution-wide framework or program. All
of these include leadership and team training and customer skills, with some
coverage of creative problem solving and tools, making meetings effective
and, more recently, using the Baldrige criteria for organisational
self-assessment.
Some examples are given below.
The Danish work is also very interesting but not yet available in English.
The training area is planning to incorporate sessions beyond the basics such
as Tapping Staff Potential which aims to:
• Foster the service oriented culture in a seamless way (ie. without
unit barriers);
• Identify client needs;
• Provide exceptionally responsive service; and
• Present continuous process improvement at a more sophisticated level.
Over the years the emphasis has changed towards performance measurement
and process improvement, with more rigour in data collection and a
requirement to build data collection into the workflow as new targets are set,
so that subsequent performance deviations can be tracked easily. Another area
supported is the formation of cross campus teams to break down the usual
barriers (eg. executive assistants).
Staff development is provided through the business school, through Royal
Danish Library School continuing education, and through the newly established
Danish Library Centre, which focuses on programs in the newer skills for
library staff across all of Denmark.
At the lower level, training was extensive, consisting of team leader and
facilitator training, with forty hours for facilitators, including two hours of
introductory concepts, eight hours refining tools and processes, and guidance
on how to select appropriate improvement projects to work on. These had to
be critical business practices in individual units, something significant
enough to make an impact on yet also within their power to change. No team
was allowed to go off ‘doing’ something until the leader(s) had been through
this training. All projects had to meet certain criteria before time was spent
on them. All teams had to have a leader and a separate facilitator, who
monitored the process rather than the content.
7 Conclusions and
recommendations
7.1 Conclusions
There is considerable activity occurring in Australian university libraries
in the areas of benchmarking, performance measurement and best practice.
However, to this point, very little has been published which would enable
libraries to learn from the experiences of other institutions across the country.
In all areas of best practice, overseas experience and activity should also
be monitored. This study has drawn on these and has highlighted those
initiatives of particular note for the Australian environment.
7.2 Recommendations
7.2.1 Benchmarking
Recommendation 1: That the Australian academic library community further
investigate membership of organisations such as CHEMS and AQC as a means
to benchmark performance.
both positive and useful. Increasing the membership of CHEMS in both the
Australian and New Zealand academic community could facilitate greater use
of benchmarking as a tool to improve performance. The requirement to rate
performance against statements of good practice would thus be shared
amongst a greater number of libraries.
At the present time, academic library performance in the above areas is not
included in the annual AARL Statistics. Given that performance measurement
in these areas has been problematic due to the qualitative rather than
quantitative nature of the measurement, more widespread documentation of
the results may assist in informing the academic library community on levels
of activity and performance within these areas.
To date, the QUT/UNSW benchmarking project has been one of the few
recorded attempts to benchmark reference/research support services. It
highlighted the lack of publicly available performance data and the lack of a
common methodology for measuring performance in specific research
support services, acquisitions and cataloguing. These gaps were also
highlighted by survey respondents in the current study.
The Final Report of the CRIG Working Party on Performance Measures for
Reference Services states that the addition of statistics for reference services
to the AARL statistics may lend impetus to the development and adoption of
the recommendations made by CRIG in their report. This report recommended:
• the proposed [ASK] model for evaluation of reference service effectiveness
be taken up by the profession;
• the twelve performance indicators for reference services be adopted by
Australian university libraries and additional performance measures be
developed where necessary; and
It is evident, both from the priority areas identified for updating and
modification, and from the level of in-house adaptation of indicators reflected
in the survey responses, that if the CAUL Indicators are to remain a valid and
useful tool for libraries there must be consideration given to the need to
ensure their relevance in relation to:
• changes within the profession that may impact on service delivery
(particularly in the areas of Document Delivery and Customer
Satisfaction); and
• external factors that may impact on their continuing usefulness
(eg. technological advances, in particular the growing area of electronic
information, resources and services, Y2K).
Survey responses illustrate the large amount of effort that libraries have
invested in the development of in-house indicators. There has not been an
across-the-board adoption of the CAUL indicators, and it is quite likely that
there has been some duplication of effort in the development of in-house
indicators within various institutions. Exploring the possibility of sharing this
effort, and the indicators developed to date, was noted by a number of senior
library managers in the course of the EIP site visits and is thus worthy of
further investigation. Given that published literature on performance
measurement within the Australian academic library community is not prolific,
a database may fill a significant gap in the availability of performance
information, and could facilitate wider participation in library benchmarking.
7.2.5 Overall
Recommendation 9: That given interest in Australian developments
in the area of best practice, every opportunity be taken by
Australasian practitioners to link Australian efforts into those underway
internationally.
Appendix A:
Benchmarking survey
Name of partner/s
Type of organisation/s (ie library/other)
Duration of benchmarking exercise/dates
Scope of benchmarking (eg acquisitions process)
Key variables/indicators (eg supply time, materials availability)
Your contact for the exercise (name, title, phone, email)
Report available? (Y/N)
11. What was/is the relationship between benchmarking activities and other
quality management/improvement processes within your library?
Appendix B:
Performance indicators survey
1. Does your library use the CAUL performance indicators? Please indicate
those used, frequency of administration and month/year of last use
[ ] Client Satisfaction Indicator A
Frequency: monthly/quarterly/annually/other: ____________
Last Use (MM/YY) ______________
4. Are there any other indicators that you feel are important or very
important?
5. If CAUL were to undertake further developmental work on performance
indicators to which area/s would you allocate the highest priority?
6. Are there any changes you would like to see made to the existing CAUL
indicators? Please give brief details.
7. Have you used the information gathered from performance indicators as a
basis for benchmarking, process improvement or other quality
improvement initiatives? If so please give brief details.
8. Are any types of training or awareness sessions provided for staff involved
in the use of performance indicators? Please give brief details.
Appendix C:
Quality/best practice/performance
measurement survey
As you are aware, the last five years have seen considerable emphasis on
quality, best practice and performance measurement. CAUL and its members
have responded positively through sponsorship of performance indicators,
publications, seminars, awards and other activities. This Evaluation and
Investigations Project (EIP) provides an opportunity to review our position on
these matters. In responding to this survey, please consider the following
definitions of terms as a guide:
• Quality: ‘The totality of features and characteristics of a product or service
that bear on the library’s ability to satisfy stated or implied needs’ (ISO
11620).
• Best Practice: ‘The pursuit of superior performance. The way sites of
excellence manage and organise their operations. It is a moving target. As
these organisations continue to improve, the “best practice” goalposts are
constantly moving. Thus the concept of continuous improvement is integral
to the achievement of best practice’ (Australian Best Practice
Demonstration Program, 1991). Such sites of excellence would be expected
to provide lessons on how and why they work so effectively in the specified
areas which other organisations, through benchmarking, could adapt to
their own operations.
• Performance Measurement: ‘Comparing what a library is doing
(performance) with what it is meant to do (mission) and wants to achieve
(goals). Performance is the degree to which a library is achieving its
objectives, particularly in terms of users’ needs’ (IFLA, 1996). ‘Performance
measurement involves the evaluation of an activity, program or service in
relation to its appropriateness, effectiveness and efficiency. Performance
indicators are developed to measure these criteria’ (Schmidt, 1990).
A. General views
1. In your view, what place should a formal quality management program
have in the management of contemporary academic and research libraries?
2. If such a program has value, what conditions are necessary for its
successful implementation?
Appendix D:
Institutions visited
Deakin University
University Librarian: Sue McKnight
University of Melbourne
Deputy University Librarian: Angela Bridgland
University of Queensland
University Librarian: Janine Schmidt
University of Wollongong
University Librarian: Felicity McGregor
Appendix E:
Useful sources
Benchmarking
Australasian sources
Benchmarking Communications Limited 1999, Benchmark: Learning From the
Best (http://www.best-practice.co.nz/)
International sources
Association of Commonwealth Universities University Management
Benchmarking Club, http://www.acu.ac.uk/chems/benchmark.html
Australasian sources
Byrne, A. 1997, ‘CAUL’s interest in performance measurement, Council of
Australian University Librarians; paper presented at a pre-conference to the
1996 ALIA conference’, Australian Academic & Research Libraries, vol. 28, no.
4, pp. 252–258.
Cargnelutti, T., D’Avigdor, R. & Ury, J. 1996, ‘KIN key indicators: Decision
making tools for managing library databases’, in Electronic dream? Virtual
nightmare, the reality for libraries, 1996 VALA Biennial Conference
Proceedings, Victorian Association for Library Automation, Melbourne,
pp. 331–359.
The authors of this paper initially reported and described a range of key
indicators of database and electronic resource usage at the 1996 VALA
conference. This paper focuses on the difficulties associated with
measuring electronic resource use and delivery, in particular web based
resources, and outlines additional key indicators based on changes to
the type and delivery of information now available.
International sources
Association of Research Libraries. Access and Technology Program/ILL/DD
Related Resources Measuring the Performance of Interlibrary Loan and
Document Delivery Services http://www.arl.org/access/illdd/illss.shtml
This valuable site includes the March 1998 Executive Summary of the
study results, an article which also appeared in the ARL Bimonthly
Newsletter of Research Library Issues and Actions, and information from
two subsequent symposiums and workshops on strategies to redesign
ILL/DD services using the identified characteristics of low cost high
performing ILL operations. Four performance measures were covered:
• direct cost;
• fill-rate;
• turn around time; and
• user satisfaction
Site visits to the ‘best practice’ organisations to interview staff about their
workflows will form part of the final report.
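The four ARL measures above lend themselves to straightforward calculation once transaction-level data is captured. The sketch below (a hypothetical illustration in Python; the record layout and function names are this report’s own invention, not drawn from the ARL study) shows how three of the measures, fill rate, median turnaround time and direct cost per filled request, might be derived from a set of ILL/DD transaction records:

```python
from dataclasses import dataclass
from datetime import date
from statistics import median
from typing import List, Optional


@dataclass
class ILLRequest:
    """One interlibrary loan / document delivery transaction (illustrative)."""
    requested: date            # date the request was placed
    delivered: Optional[date]  # None if the request was never filled
    cost: float                # direct cost attributed to this request


def ill_indicators(requests: List[ILLRequest]) -> dict:
    """Compute fill rate, median turnaround time (in days) and
    direct cost per filled request from transaction records."""
    filled = [r for r in requests if r.delivered is not None]
    return {
        "fill_rate": len(filled) / len(requests),
        "median_turnaround_days": median(
            (r.delivered - r.requested).days for r in filled),
        "direct_cost_per_filled": sum(r.cost for r in requests) / len(filled),
    }
```

User satisfaction, the fourth measure, cannot be derived from transaction logs and would need a survey instrument such as those discussed elsewhere in this report.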
• seat hours occupied per week per user or average number of visitors
in library at any given time;
• loans per user;
• stock on loan per user;
• inter library loans as a percentage of all loans; and
• staff hours spent on user education and information services per
week or percentage of staff time spent on user education and
information services.
Library
• number of libraries;
• space occupied;
• size of collection; and
• total library expenditure.
Institution
• expenditure on hardware;
• electronic access expenditure;
• electronic format expenditures (CD-ROMs, disks and tapes, software);
• expenditure on maintenance of hardware available to users; and
• telecommunication expenditures.
SUPERSEDED BY EQUINOX
The indicators listed have to meet the criteria of being already tested, in
common use, and applicable to almost any type of library. Hence the
29 indicators are conservative and cover only traditional services.
Coverage includes:
• User satisfaction;
• General (4 indicators on use/cost);
• Providing documents (6 indicators on availability/use);
• Retrieving documents (2 indicators on retrieval times);
• Lending documents (and document delivery) (6 indicators on
use/cost);
• Enquiry & reference services (1 indicator on ‘correct answer’ fill rate);
• Information searching (2 indicators on cataloguing searching success);
• Facilities (4 indicators on availability/use);
• Acquiring and processing documents (2 indicators on median
times); and
• Cataloguing (1 indicator on cost per title).
The inclusion of definitions, scope and methods of producing and
interpreting each indicator is useful.
This paper is most useful for indicating the direction in which the ARL
statistical database is moving. Although the existing database is drawn
from fairly traditional sources it does have the very useful facility that
the user can select, search and customise data according to
requirements. This flexibility, in addition to the coverage and number of
years the ARL data has been collected, makes the existing database a
useful tool. This paper is a look at the future. It summarises an
Association of Research Libraries project on ‘The character and nature of
research library investments in electronic resources’. The project looked
at US research library responses to supplemental questions on electronic
resources held as part of collections or accessible through library system
terminals, and on their impact on library materials expenditures. Document
delivery and ILL expenditures, and expenditures on
bibliographic utilities and consortia are included and discussed. US
libraries can compare their own figures with median and average figures
for the group.
This paper forms the basis for a current Coalition for Networked
Information project of the same name. The original paper is available at
http://istweb.syr.edu/~mcclure/network/toc.html, and the original and
progress reports on the project are available at
http://www.cni.org/projects/assessing/reports/. Key elements in the
original paper include sections on collecting and using qualitative data,
and measures under the headings of:
• users;
• costs;
• network traffic;
• use;
• network services; and
• support services.
Model user surveys are included, together with links to two self
assessment frameworks.
At the end of 1998 the next stages of MIEL were just beginning. MIEL 3–5
cover the following areas, with outcomes not due until 2000:
Includes brief comments about the manuals sourced for the indicators
in the matrices.
This includes a long list of data elements and core measures and indicators
(the ToolBox mentioned above). A brief summary also appears in the
First Northumbria conference proceedings under Sumsion, John and Suzanne
Ward, ‘EC Toolbox project: general findings and some particular proposals’,
pp. 135–145.
This study found that less than one third of surveyed libraries in 1996
and 1991 had policies in place relating to performance measures or
indicators, and less than a quarter were using indicators on a regular
basis to evaluate user/document delivery services. Less than 40% evaluated
enquiry services, and less than fifty per cent evaluated user education.
Of those with results, less than half disseminated these outside the
library. The IFLA study was based on a 1993 survey by Morgan of British
higher education libraries.
Australasian sources
Australian Quality Council 1998, (http://www.aqc.org.au/)
participation in, the quality audit process’. The results of this survey
indicated that a significant number of university libraries were involved
in the development of quality assurance processes and management,
and have set a base upon which subsequent progress can be measured.
International sources
Assessment Tools for Quality Management in Public Libraries: a project funded
by the British Library Research and Innovation Centre and managed by
Loughborough University (DILS) Department of Information Studies (Professor
Margaret Evans and Kathryn Jones) and University of Sheffield (Dr Bob
Usherwood), http://info.lboro.ac.uk/departments/dils/research/qualman.htm
• Clear purpose;
• Vision, commitment and leadership;
Now an older text but still very useful for its simplicity, brevity, and
common sense, and still the only real ‘how to’ text available. The most
valuable segments include a library-oriented checklist of quality system
requirements according to the standard, and a simple framework for a
library quality plan which covers inquiring, processing and delivering
information. There are also words of wisdom on keeping documentation
simple. Since it is aimed at the small special library market this manual
may be insufficient for a university library, but it is still worth
consulting and more practical than the Scandinavian guidelines publication
(see below).
• leadership;
• policy and strategy;
• people management;
• resources;
• processes;
• customer satisfaction;
• people satisfaction;
• impact on society; and
• business results.
• commitment;
• planning;
• action;
• evaluation; and
• references.
ISO 9000 for Libraries and Information Centres: A Guide, 1996, report of a
project supported by NORDINFO, The Hague, FID, FID Occasional Paper,
no. 13.
• customer service;
• continuous improvement;
• measurement;
Staff competencies
Australasian sources
Competency standards
Australian Library and Information Association 1996–7, Competency Standards
and the Library Industry: A Workshop Series, ALIA, Canberra.
Published both online and in print, these guidelines form the basis of
defining roles and tasks for staff working in different positions within
libraries. Effective performance within the roles outlined is linked to the
Library Industry Competency Standards and Australian Qualifications
Framework levels.
Workplace experiences
Bridgland, A. 1996, ‘Potential applications of the library industry competency
standards at the University of Melbourne Library’, Education for Library and
Information Services: Australia, vol. 13, no. 1, pp. 83–89.
Other perspectives
Australian Quality Council 1999, Management Competency Standard,
(http://www.aqc.org.au/mcsfbe.htm).
Please note that only Australian sources on staff competencies were reviewed
for inclusion in the useful sources list.
Appendix F:
Suggested outline for a ‘Best
Practice Handbook for Australian
University Libraries’
Introductory note
As commented elsewhere in this report, many libraries are deterred from
implementing initiatives in regard to benchmarking, performance indicators
and quality frameworks by the perception that it is too costly to investigate
the appropriate applications for their circumstances. Since there is a wealth
of experience currently available, and continuing to come on stream within
Australasia and internationally, it would be a practical outcome of this
investigation, and an effective investment in the better management of
Australian university libraries, to make up-to-date information and advice
easily available and updateable. The information would be relevant both for
libraries which have done little in this area and for more experienced
institutions wishing to keep up to date or to promote an application which
may be new to them.
Whilst a pared-down version of the handbook would make a useful one-off
paper publication, the dynamic nature of the data means it would be of most
use if made available as part of a web site, the most obvious choice being
that of the Council of Australian University Librarians (CAUL). Somewhat in
the manner of the United States Association of Research Libraries’ statistics
site, the current AARL statistics site could be expanded to become a gateway
for all libraries looking for information and sources of experience. The
outline below gives an indication of the possible scope of the ‘how to’
component of the handbook.
Bibliography
Benchmarking
Australasian sources
Benchmarking in Australia, http://www.ozemail.com.au/~benchmrk/
– 1998, Benchmarking Library Best-Practice for Performance Improvement:
Key Performance Measures and Best-Practices for Superior Library Service &
Efficient Internal Work Practices, IES Conferences Australia, Chatswood,
N.S.W.
• ‘Common library benchmarking problems and how to overcome them’,
Isabella Trahn.
• ‘Using benchmarking to take your library towards best practice—Where
to start? What’s really involved?’, Stephen Robertson.
– 1995, Benchmarking Self-Help Manual: Your Organisation’s Guide to
Achieving Best Practice, 2nd edn, AGPS, Canberra.
Byrne, A. 1995, ‘Best practice at the Northern Territory University,’ Australian
Academic and Research Libraries, vol. 26, no. 1, pp. 17–24.
International sources
Allen, F. 1993, ‘Benchmarking: Practical aspects for information professionals,’
Special Libraries, vol. 84, no. 3, pp. 123–130
Camp, R. 1989, Benchmarking: the search for industry best practices that lead
to superior performance, Milwaukee, Wisconsin, ASQC Quality Press
Universitas 21 http://www.universitas.edu.au
Australasian sources
Byrne, A. 1997, ‘CAUL’s interest in performance measurement, Council of
Australian University Librarians; paper presented at a pre-conference to the
1996 ALIA conference’, Australian Academic & Research Libraries, vol. 28,
no. 4, pp. 252–258.
Calvert, P. & Cullen, R. 1995, ‘The New Zealand public libraries effectiveness
study and the New Zealand university libraries effectiveness study’, Australian
Academic and Research Libraries, vol. 26, no. 2, pp. 97–106.
Cargnelutti, T., D’Avigdor, R. & Ury, J. 1996, ‘KIN key indicators: Decision
making tools for managing library databases’, in Electronic dream? Virtual
nightmare, the reality for libraries, 1996 VALA Biennial Conference
Proceedings, Victorian Association for Library Automation, Melbourne,
pp. 331–359.
Cargnelutti, T. 1999, ‘Finding one’s web feet. Revisiting KIN: Key indicators of
electronic resource usage in the Web environment’, in Robots to Knowbots:
The Wider Automation Agenda, 1999 VALA Biennial Conference. Proceedings.
Victorian Association for Library Automation, Melbourne, pp. 279–296.
Paper also available at http://www.library.unsw.edu.au/~eirg/vala98.html
– 1995, Key Performance Indicators: A Practical Guide for the Best Practice
Development, Implementation and Use of KPIs, South Melbourne, Pitman
International sources
Abbott, C. 1994, ‘Performance indicators in a quality context,’
The Law Librarian, vol. 25, pp. 205–208
Abbott, C. 1990, ‘What does good look like? The adoption of performance
indicators at Aston University Library and Information Services,’ British
Journal of Academic Librarianship, vol. 5, no. 2, pp. 79–84
Bancroft, A. et al. 1998, ‘A forward looking library use survey: WSU libraries
in the 21st century’, Journal of Academic Librarianship, vol. 24, no. 3,
pp. 216–223
Bertot, J. & McClure, C. 1999, Analysis of state library data elements for
networked information resources and services: working paper,
http://www.albany.edu/~imlsstat
Hernon, P., Nitecki, D. & Altman, E. 1999, ‘Service quality and customer
satisfaction,’ Journal of Academic Librarianship, vol. 25, no. 1, pp. 9–17
IFLA, 1996 Section for University Libraries and General Research Libraries.
Discussion group on performance indicators, IFLA Conference, Beijing,
25 August 1996
Sprehe, J. & McClure, C. 1999, ‘Performance measures for federal agency web
sites,’ http://istweb.syr.edu/~mcclure/PARS.Pro.April20.html
Talbot, D., Lowell, C. & Martin, K. 1998, ‘From the User’s Perspective—the
UCSD Libraries User Survey Project,’ Journal of Academic Librarianship,
vol. 24, no. 5, pp. 357–364
Van House, N., Weil T. & McClure, C. 1990, Measuring academic library
performance: a practical approach, Chicago, ALA
Ward, S., Sumsion, J. & Bloor, I. 1995, Library performance indicators and
library management tools, Luxembourg, EC DG-XIII-E3,
Zwart, R. 1997, ‘DocUtrans and ISO 9002 quality assurance,’ see 2nd
Northumbria International Conference on Performance Measurement
Australasian sources
Bundy, A. 1997, ‘Investing for a future: Client focussed Australian academic
libraries in the 1990’s’, Australian Library Journal, vol. 46, no. 4, pp. 354–369.
Cooper, M. 1996, ‘The use of total quality management (TQM) in libraries and
information services in Australia and overseas’, (1995 Metcalfe Medallion
essay), Australian Library Journal, vol. 45, no. 2, pp. 92–101.
McCarthy, J. & Marrie, J. 1995, ‘An evaluation of the reference service at the
Educational Resource Centre, University of Melbourne,’ Australian Academic
& Research Libraries, vol. 26, no. 1, pp.33–42.
Stevens, J. 1995, ‘Adopting best practice’, Incite, August, vol. 16, no. 8, p. 12.
International sources
Abbott, C. 1994, ‘Performance indicators in a quality context,’ The Law
Librarian, vol. 25, pp. 205–208
Garrod, P. & Kinnell Evans, M. 1995, Towards library excellence: Best practice
benchmarking in the library and information sector, BLRD Report, London,
British Library Research & Development Division
Great Britain, 1991, The Citizen’s Charter: raising the standard, London,
HMSO
Irving, A. 1992, ‘Quality in academic libraries: how shall we know it?’ Aslib
Information, vol. 20, no. 6, pp. 244–246
National Institute for Standards and Technology (US). The Malcolm Baldrige
National Quality Award Criteria for Education,
http://www.quality.nist.gov/docs/99_crit/99crit.htm#education
NORDINFO, ISO 9000 for Libraries and Information Centres: A Guide, Report
of a project supported by NORDINFO, The Hague, FID,1996. FID Occasional
Paper no. 13.
Rowley, J. 1996, ‘Implementing TQM for library services: the issues,’ Aslib
Proceedings, vol. 48, no. 1, pp.17–21
Staff competencies
Arts Training Australia 1995, Library Industry Competency Standards, Arts
Training Australia, Woolloomooloo, N.S.W.
Bridgland, A. 1998, The Impact of the National Training Reform Agenda and
Workplace Rearrangement on Staff Development in Australian Academic and
State Libraries, Doctoral Thesis, University of Melbourne, Melbourne.