
Advanced LIMS Technology

Case Studies and Business Opportunities



Edited by

J.E.H. STAFFORD
Consultant
Fisons Instruments
Cheshire

SPRINGER-SCIENCE+BUSINESS MEDIA, B.V.


First edition 1995
© 1995 Springer Science+Business Media Dordrecht
Originally published by Chapman & Hall in 1995
Softcover reprint of the hardcover 1st edition 1995

Typeset in 10/12pt Times by Cambrian Typesetters, Frimley, Surrey


ISBN 978-94-010-4270-3 ISBN 978-94-011-0615-3 (eBook)
DOI 10.1007/978-94-011-0615-3
Apart from any fair dealing for the purposes of research or private study, or
criticism or review, as permitted under the UK Copyright Designs and
Patents Act, 1988, this publication may not be reproduced, stored, or
transmitted, in any form or by any means, without the prior permission in
writing of the publishers, or in the case of reprographic reproduction only in
accordance with the terms of the licences issued by the Copyright Licensing
Agency in the UK, or in accordance with the terms of licences issued by the
appropriate Reproduction Rights Organization outside the UK. Enquiries
concerning reproduction outside the terms stated here should be sent to the
publishers at the Glasgow address printed on this page.
The publisher makes no representation, express or implied, with regard to
the accuracy of the information contained in this book and cannot accept
any legal responsibility or liability for any errors or omissions that may be
made.
A catalogue record for this book is available from the British Library
Library of Congress Catalog Card Number: 95-00000

Printed on acid-free text paper, manufactured in accordance with
ANSI/NISO Z39.48-1992 (Permanence of Paper).
Preface

The use of Information Technology (IT) to automate business processes
has had a tremendous impact upon the lives of people throughout the
world. IT has transformed the role of laboratory managers, scientists and
technicians through the application of point solutions to business challenges.
The inappropriate context of these solutions has often led to unexpected
problems with laboratory management, giving rise to the need for a holistic
approach to the use of IT in laboratory environments.
One vehicle for the introduction of IT into laboratories is a Laboratory
Information Management System (LIMS). A LIMS implies a holistic
approach, but in practice this has not happened because the vision for
implementation has been clouded by the need to solve only local sample
management issues. Part of the problem may relate to understanding the
nature of a LIMS and the impact of context on the relationship between
the laboratory and the business world. If the purpose of the laboratory is to
provide information, then context must be part of the laboratory input. As
an illustration of this requirement, the application of LIMS in forensic,
environmental, pharmaceutical and clinical research as well as clinical
pathology laboratories is discussed. The analytical process must be viewed
as an integral part of the LIMS and not merely an addition. The use of
computerised equipment and robots to automate the laboratory has
important implications for the relationship between the technical and
managerial staff, and new technology facilitating information generation
will have further impact upon the supervisory and managerial staff.
However, freedom to manage and manipulate data in order to generate
information will be severely restricted unless standards are in place to
facilitate the sharing of data.
This book should enable laboratory managers, scientists and information
management personnel to understand the IT requirements of their
colleagues in other disciplines. They will discover that, to varying degrees,
they share many requirements.
Contributors

Dr K.E. Blick Department of Pathology, The University of
Oklahoma Health Sciences Center, PO Box
26307, Oklahoma Memorial Hospital, 800 NE
13th, Rm EB-400, Oklahoma City, Oklahoma
73126, USA
Dr D.L. Clay Office of Information Resources Management,
EPA (MD-34), Research Triangle Park, North
Carolina 27711, USA
Dr D.M. Cline Office of Information Resources Management,
EPA (MD-34), Research Triangle Park, North
Carolina 27711, USA
Dr M.J. Crook Process Analysis & Automation Ltd, Falcon
House, Fernhill Road, Farnborough, Hants
GU14 9RX, UK
Dr D.J.M. Graham Quintiles Scotland Ltd, Research Avenue South,
Heriot-Watt University Research Park,
Riccarton, Edinburgh EH14 4AP, UK
Dr T.V. Iorns Andersen Consulting L.L.P., 100 Campus Drive,
PO Box 765, Florham Park, NJ 07932, USA
Dr R.S. Lysakowski Optimize Technologies, 8 Pheasant Avenue,
Sudbury, MA 01776, USA
Dr R.D. McDowall McDowall Consulting, 73 Murray Avenue,
Bromley, Kent BR1 3DJ, UK
Dr F.D. Morrow Quintiles Laboratories Ltd., 5500 Highlands
Parkway, Suite 600, Smyrna, GA 30082, USA
Dr W.M. Shackelford Office of Information Resources Management,
EPA (MD-34), Research Triangle Park, North
Carolina 27711, USA
Mr T. De Silva Horseracing Forensic Laboratory Ltd, PO Box
15, Snailwell Road, Newmarket, Suffolk CB8
7DT, UK
Dr J.E.H. Stafford Fisons Instruments, LabSystems, Number 1, St
George's Court, Hanover Business Park,
Altrincham, Cheshire WA14 5TP, UK
Dr I.R. Storr PA Consulting Group, 123 Buckingham Palace
Road, London SW1W 9SR, UK
Mr R. Yuille North West Water Ltd., Thornton Road, Great
Sankey, Warrington WA5 2SL, UK
Contents

1 LIMS: An automating or informating technology? 1


J.E.H. STAFFORD
1.1 Introduction 1
1.2 Current LIMS fail to meet business requirements 1
1.2.1 New customers 1
1.2.2 Deriving information from data 2
1.2.3 Increased data visibility 5
1.2.4 Data fit for purpose 5
1.3 Current LIMS automate data management functions 5
1.3.1 Control of data 6
1.3.2 Data quality 7
1.3.3 Consolidation of executive control 8
1.4 New LIMS will informate, not automate 8
1.4.1 The human element in the laboratory process 9
1.4.2 Improving the quality of data capture and analysis 9
1.4.3 Increased control over sample gathering 11
1.5 Architecture of an informating system 11
1.5.1 Application layer 11
1.5.2 Human interface 12
1.6 Making IT happen 13
References 13

2 A model for a comprehensive LIMS 15


R.D. McDOWALL
2.1 Introduction 15
2.2 Strategic design of a LIMS 16
2.2.1 The Data and Information Domains of a LIMS 16
2.2.2 Laboratory business objectives and the role of a LIMS 17
2.3 What is a LIMS? 18
2.4 An architecture for a comprehensive LIMS 20
2.4.1 Architectural views and communication 20
2.5 A LIMS model 21
2.5.1 LIMS functional areas 22
2.5.2 Functional levels 23
2.5.3 Communication with the database 23
2.5.4 LIMS architecture 24
2.5.5 Model organization 24
2.6 Definition of a LIMS 24
2.6.1 Promoting interdisciplinary communication 25
2.6.2 Technology-independent model 25
2.6.3 Future expansion of the model 25
2.7 Detailed classification of LIMS functions 26
2.7.1 Group One: Data capture 26
2.7.2 Group Two: Analysis of data 28
2.7.3 Group Three: Reporting 29
2.7.4 Group Four: Managing data and the laboratory 29
2.7.5 Comments on functions in Levels 30
2.8 Applying the LIMS model: the selection of a commercial LIMS 31
2.8.1 Constructing a LIMS model for each supplier 31
2.8.2 Evaluating potential suppliers 32
2.8.3 The role of the LIMS model in evaluation 33
2.9 LIMS standards 34
2.9.1 System management 35
2.10 Summary 36
References 36

3 LIMS in a forensic laboratory 37


T. DE SILVA

3.1 Introduction 37
3.1.1 The role of a forensic laboratory 37
3.1.2 Automation 39
3.1.3 Analytical data 40
3.2 Objectives of a LIMS 40
3.2.1 Efficient interactive sample log-in 41
3.2.2 Sample tracking 41
3.2.3 Utilisation of barcodes 41
3.2.4 Automatic analytical data transfer and result entry 41
3.2.5 Less clerical work for scientists 41
3.2.6 Application of GLP principles 42
3.2.7 Laboratory performance statistics 42
3.2.8 No typing delays 42
3.3 The system 42
3.3.1 Sample receipt 45
3.3.2 Sample analysis 47
3.3.3 Analytical result review 48
3.3.4 Reporting 49
3.3.5 General management 50
3.4 The future 51
3.5 Conclusions 52
Acknowledgement 53

4 Application of a LIMS in a pharmaceutical drug metabolism


and pharmacokinetics laboratory 54
D.J.M. GRAHAM
4.1 Introduction 54
4.2 Study objectives in drug metabolism and pharmacokinetics 55
4.2.1 Pharmacokinetic studies 56
4.2.2 Drug metabolism studies 56
4.2.3 Outline study design 58
4.3 Configuration of the database 60
4.3.1 The sample database 60
4.3.2 Sample identifier (SID) (five fields) 61
4.3.3 Time fields (eight fields) 61
4.3.4 Sample status 62
4.3.5 Whodunit fields (six fields) 64
4.3.6 Miscellaneous fields 64
4.3.7 Test-related information 64
4.4 LabManager in use 64
4.4.1 Defining the sample matrix 64
4.4.2 Multisample processing 66
4.4.3 Processing sample time data 66
4.4.4 Sample testing 67
4.4.5 Quality Control 68
4.4.6 Security 68
4.4.7 Information management 69
4.5 The future 69
References 70

5 Use of protocol-synchronous LIMS structures to expand the


role of the centralized clinical trial laboratory in
pharmaceutical research 71
F.D. MORROW
5.1 Introduction 71
5.2 The expanding role of the central laboratory in pharmaceutical
research 71
5.3 Comparing traditional and protocol-synchronous LIMS structures
in the clinical trial laboratory 75
5.3.1 LIMS structures in the traditional clinical laboratory environment 76
5.3.2 LIMS structures in the clinical trial laboratory environment 78
5.4 Defining protocol-driven time and events using a multidimensional
matrix 80
5.4.1 Using the accession number as a discrete address in a
multidimensional matrix 82
5.4.2 Using the time and events matrix to manage the production,
distribution and inventory of clinical trial materials 83
5.4.3 Using the time and events matrix to monitor receipt and validity
checking of incoming specimen collection kits 84
5.5 Managing protocol-driven time and events using matrix-dependent
control structures 86
5.5.1 Control structures used in the accessioning process 88
5.5.2 Managing the data clarification and data revision processes
through control structures 89
5.5.3 Using control structures to support multilevel range-checking
functionalities 90
5.5.4 Using control structures to manage and document phone alert
and reflex messaging functionalities 92
5.6 Managing protocol-driven time and events using matrix-dependent output
structures 93
5.6.1 Constructing protocol-specific laboratory reports and requisitions
using matrix-dependent output structures 93
5.6.2 Using protocol-synchronous output structures to support the
data clarification process 94
5.6.3 Supporting electronic data services using protocol-synchronous
output structures 96
5.7 Summary 96

6 Medical Laboratory Information Systems (LIS) 97


K.E. BLICK
6.1 History of clinical laboratory computerization 97
6.2 Computerization and automation of the 'testing process' 98
6.3 How computers function in the clinical laboratory 99
6.3.1 Admission, discharge and transfer (ADT) 99
6.3.2 Order entry and order communication 99
6.3.3 Phlebotomy service 100
6.3.4 Receipt of specimens and prioritization of testing 100
6.3.5 Analysis of specimens with scheduling and data collection 100
6.3.6 Quality control (QC) and results verification 103
6.3.7 Interpretative functions 104
6.3.8 Results reporting 104
6.3.9 Statistics and quality assurance functions 104
6.3.10 Billing 104
6.4 Acquisition of a LIS 105
6.4.1 Areas of the laboratory computerized 105
6.4.2 Single vendor versus multivendor 105
6.4.3 Computer hardware features 106
6.4.4 Minicomputer versus microcomputer solutions 107
6.4.5 Software and database design 107
6.4.6 Most important issues when selecting a LIS 108
6.5 Future of laboratory information systems 109
References 110

7 EPA's Relational Laboratory Information Management


System: Development and implementation 111
W.M. SHACKELFORD, D.M. CLINE and D.L. CLAY
7.1 Introduction 111
7.2 Development 112
7.2.1 General principles 112
7.2.2 Developmental philosophy/approach 113
7.2.3 Face to face with the user 114
7.2.4 Completion and testing 115
7.3 Implementation 116
7.3.1 The finished product 116
7.3.2 Laboratory installation and implementation 119
7.4 Conclusions 122
References 122

8 LIMS to robotics interface: A practical approach 123


R. YUILLE
8.1 Introduction 123
8.2 The case for automation 124
8.3 Role of a Laboratory Information Management System 126
8.4 Sample planning and scheduling 126
8.4.1 Sample container types 127
8.4.2 Sample shelf-life 128
8.4.3 Transportation of samples to equipment 128
8.4.4 Sample and container recognition systems 129
8.4.5 Control of services and consumable items 130
8.5 Auditability 130
8.5.1 Error logging and reporting 130
8.6 Information transferred via the interfaces 131
8.7 Laboratory Information Management System network 131
8.8 Analytical process automation 133
8.9 Impact on the laboratory working environment 136
Appendix 137
References 140
9 Interfacing the real world to LIMS 141


M.J. CROOK
9.1 Introduction 141
9.2 The analysis procedure 142
9.2.1 Standalone instrumentation 142
9.2.2 Integrated instrumentation 143
9.2.3 Data transfer methods 143
9.3 Beckman 147
9.3.1 Instrument automation components 147
9.3.2 LIMS Link Instrument Coupler 148
9.3.3 Instrument Data Acquisition System (IDAS) 149
9.3.4 Transaction Processing Option (TPO) 150
9.3.5 LIUforms operation 150
9.3.6 Typical instrument automation scenarios 150
9.4 Hewlett Packard 153
9.4.1 HP ChemLMS 153
9.4.2 Automatic data entry 155
9.4.3 C to CP communication 156
9.5 LabSystems (Fisons Instruments) 156
9.5.1 Introduction 156
9.5.2 LabStation architecture 156
9.5.3 Instrument interfacing process 158
9.5.4 Examples of some specific instruments 160
9.5.5 Future developments 162
9.6 The future 162
9.6.1 Handwriting recognition 162
9.6.2 Voice recognition 164
Acknowledgements 164

10 Replacement LIMS: Moving forward or maintaining the


status quo 165
T.V. IORNS

10.1 Introduction 165
10.2 Why change? 165
10.2.1 Current LIMS does not adequately support business strategy
and user requirements 166
10.2.2 Business process re-engineering or organizational change
dictates new requirements 169
10.2.3 LIMS evolution, technology changes or maintenance
considerations dictate a change 171
10.3 Why not change? 175
10.3.1 Investment protection 175
10.3.2 Business processes need changing first 176
10.3.3 Something better to do with the money or not enough money 177
10.3.4 Already meets basic needs 177
10.3.5 Customizations expensive to reproduce 177
10.3.6 Interfaced to other systems 178
10.3.7 Users and computing staff already trained 178
10.3.8 System validated 178
10.3.9 Difficult to justify 179
10.3.10 Fully satisfied with vendor 179
10.4 How long should a LIMS last? 180
10.5 How do you justify a replacement LIMS? 181
10.6 Would a custom system be better? 182
10.7 Support your vendor 184
10.8 How to implement a replacement LIMS 184
10.8.1 Define strategy 185
10.8.2 Re-engineer key business processes 187
10.8.3 Define preliminary requirements, and select system 187
10.8.4 Final requirements plan, develop and implement 189
10.8.5 Change management 189
10.8.6 System integration 189
10.8.7 Training, documentation and validation 190
10.8.8 Maintenance 190
10.9 Keep an eye on the future - is it easier the second time around? 191

11 The promise of client-server LIMS applications 192


I.R. STORR

11.1 Introduction 192
11.2 Review of LIMS development over the last ten years - the story so far 192
11.3 Current trends 193
11.4 Regulatory requirements 195
11.5 Standards for systems analysis and construction of information systems 195
11.6 Understanding the user 196
11.7 Meeting the requirements with appropriate technology: The challenge
facing client-server technology 197
11.8 Discussion of relationships and issues 197
11.8.1 Internal conflict 198
11.8.2 User power 198
11.8.3 Data ownership 199
11.8.4 False economies 199
11.9 Systems analysis, construction of information systems and process
re-engineering 200
11.10 Software development 200
11.11 Communications 201
11.12 Implementing client-server technology 202
11.13 Conclusions 202
11.13.1 Client-server technology is not a panacea 203
11.13.2 Application of client-server technology 204
11.13.3 Software development 204
11.14 The way forward? 204
References 205

12 Standards for analytical laboratory data communications,


storage, and archival 206
R.S. LYSAKOWSKI

12.1 Introduction 206
12.2 Standards investment and payback 207
12.3 The ADISS Program 209
12.3.1 Technology goals must meet the needs of people and businesses 209
12.3.2 Portable laboratory data objects 212
12.3.3 The ADISS Analytical Information Model (ADISS AIM) 212
12.3.4 Software implementation of the ADISS architecture 214
12.3.5 NetCDF - the de facto standard implementation of the
ADISS architecture 215
12.4 Application of the ADISS Information Model to chromatography 218
12.5 Future ADISS extensions for chromatography and other techniques 219
12.5.1 Future extensions for chromatography 219
12.5.2 Other analytical techniques being completed now 223
12.5.3 Formal standardization of the ADISS specifications 224
12.5.4 NetCDF v2.3.3 upgrade 224
12.6 Future influence of ADISS standards on LIMS in R&D 226
12.7 The influence of standards on market dynamics 227
12.8 Summary and recommendations 228
12.8.1 Industry standards status 229
12.8.2 Call to action 230
Note 231
References 231

Index 233
Glossary

The following is a selection of discipline-specific terms useful to the reader.
Associated definitions reflect what the editor believes to be contemporary
interpretations of terminology originally published by various authorities
and authors.

Accession number A number assigned to a sample either before or after
receipt in a laboratory, which is used to positively identify it whilst in
transit through the testing facility.

Active database That part of a LIMS database used for work in progress.

Application Development Life Cycle see System Development Life Cycle.

ASTM American Society for Testing and Materials. An organization
which develops and publishes standards by consensus.

Audit function Systematic capture of selected events within the LIMS
database which are subsequently used to investigate or document actions
within the laboratory. Audit trail functions are critical for LIMS used
within regulated environments such as FDA or EPA.

Audit trail A report generated by the (central) laboratory LIMS that
documents any revisions made to a (sponsor's) database as a result of data
validity checks and clarification procedures. Data revision reports indicate
who requested the change, why the change is being made, the before and
after image of the database and the date on which the change was made.
Also known as a Data Revision Report.

Backus-Naur Form BNF. A metalanguage used to define other
languages.

BPR see Business process re-engineering.

BUN Blood urea nitrogen. A clinical biochemical indicator of kidney
function.

Business process re-engineering A phrase used to describe the process of
understanding how an organization actually carries out its business and
using the information gained to improve and optimize business practices.

CANDA Computer assisted new drug application. An electronic frame-
work for automating and expediting NDA submissions.

CASE Computer aided software engineering. Software applications
which attempt to automate one or several stages of the formal SDLC.

CDER FDA Center for Drug Evaluation and Research.

Central laboratory A term describing a clinical or analytical laboratory
supporting FDA mandated research. The term arose from the paradigm
which suggests that the use of a single laboratory facility will result in
laboratory and data management services that cannot be accomplished by
use of multiple laboratories. Central laboratory provides logistical and
analytical support for the patient event.

cGMP Current GMP.

Chain of custody The complete record of the life cycle of an item under
analytical investigation, from the point of collection and preservation
through to its storage, transfers, analysis and disposal.

Class inheritance A property of object classes wherein one class shares
(inherits) the structure or behaviour defined in one or more other classes.

Client server A software architecture that exploits computer network
technology by dividing the application between the client (the user
interface) and the server (the data storage device).

Clinical trial A controlled and regulated series of procedures or
experiments conducted on humans during the development of new
therapeutic drugs, biologics or medical devices. Qualified as Phase I
through to IV, each of which may constitute multiple trials. Phase III trials
are also called pivotal trials. Phase IV trials are also called postmarketing
surveillance.

CNS Central nervous system.

Committed database That part of a LIMS database used to store data for
samples on which work has been completed. Splitting the LIMS database
into active and committed is a historical device employed to maintain
system response time. Should not be confused with the process of data
archiving.

CRA Clinical Research Associate, see Monitor.

CRO Contract Research Organization. A business entity that provides
contract research support to pharmaceutical and medical device manu-
facturers. Depending on their size and scope of operation, a CRO may
comprise scientists, physicians, biostatisticians, computer scientists and
regulatory experts. CROs operate under similar regulatory constraints as
do their client sponsors.

CTM Clinical trials materials. Any material distributed by either the
sponsor or a contractor (CRO or central laboratory) and used by the
investigator to record, document or conduct patient events during the
course of a clinical trial.

Database construction A phrase used in clinical data management to
describe the process whereby a database is populated with data relevant to
the sponsor's study.

Data Revision Report see Audit trail.

DDE Dynamic Data Exchange. Microsoft Windows protocol that enables
two applications to share data continually and automatically.

Delta check A quality control parameter derived by comparing the
current test result to the previous value for the same subject. Widely used
in clinical chemistry laboratories to monitor changes in patient analyte
levels within the normal range.
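
As an illustration only (in Python; the absolute and relative limits below are invented, and a real laboratory would tune its own), a delta check reduces to a simple comparison:

# Sketch of a delta check; the limits are hypothetical examples,
# not recommended values.
def delta_check_fails(current, previous, abs_limit=2.0, rel_limit=0.5):
    delta = abs(current - previous)
    return delta > abs_limit and delta > rel_limit * abs(previous)

# A potassium result jumping from 4.1 to 7.0 mmol/l would be held for
# review; a move from 4.1 to 4.3 would not.
print(delta_check_fails(7.0, 4.1))   # True
print(delta_check_fails(4.3, 4.1))   # False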

Dynamic data Data which describe samples and results.

Edited database The concept that it is the responsibility of the central
laboratory to systematically validate the sponsor's database in a proactive
manner throughout the course of a clinical trial. As part of the process the
central laboratory will undertake requests for data clarification from the
investigator sites. Requests for data clarifications frequently result in the
need to amend patient or laboratory related information, i.e. edit the
database. These changes give rise to an audit trail.

EPA Environmental Protection Agency. A US government agency
charged with monitoring and enforcing regulations concerning acceptable
levels of toxic materials in the environment.
FDA Food and Drug Administration. An agency of the US government
which regulates and oversees development of ethical drugs, biologics,
medical devices and food additives for marketing in the USA.

Functional requirement Specifies a function that a system or software
module must be capable of performing. Less detailed than a functional
specification.

GALP Good Automated Laboratory Practice. Guidelines developed by
the EPA to ensure a high quality of computer resident data. They address
computer-specific issues outside the original scope of GLP.

GMP Good Manufacturing Practice. A combination of manufacturing
and quality control procedures aimed at ensuring that products are
consistently manufactured to their specifications.

HIS Hospital Information System.

IATA International Air Transport Association. An internationally
recognized advisory group that publishes standards, nomenclature and
procedures used by airline and air cargo industries to transport both
hazardous and non-hazardous materials.

IND Investigational New Drug submission. The FDA review stage used
to regulate the testing of pharmaceuticals in human volunteers.

Informate That aspect of using information technology to automate a
process that simultaneously produces information about the underlying
process.

Investigator A clinician selected by the Sponsor and approved by the
regulatory authority, e.g. FDA to participate in a clinical research study
used to generate data for submission to the FDA in support of an IND or
NDA. The investigator recruits patients and conducts the trial as per the
sponsor's FDA approved protocol and under the auspices of clinical
monitors.

ISO 9000 A series of international standards for quality management
primarily intended to ensure acceptable levels of quality in the fulfilment
of a contract between a supplier and purchaser.

Laboratory Events Schedule A schematic representation of the key
laboratory events, both scheduled and unscheduled, which will occur as the
patient proceeds through the clinical trial.
LAN Local Area Network. Term given to a small grouping of PCs linked
together to enable sharing of peripherals and data. The network is
managed using dedicated application software.

LIMS Laboratory Information Management System. Computer applica-
tion(s) which acquire, analyse, report and manage data and information in
the laboratory.

Log-in The electronic registration of a sample. Depending on the
implementation, this procedure may not be synonymous with the physical
action of receiving a sample in the laboratory.

MDL Minimum Detection Limit. The minimum amount of a substance
that can be measured with an acceptably low level of confidence.

Monitor A person who visits investigator sites and conducts audits of
study records to ensure that the study is conducted as described in the
protocol. The audit also includes the best clinical practice generally
accepted by regulatory authorities within the confines of ethical scientific
conduct.

NAMAS National Measurement Accreditation Service. An executive
branch of the UK Department of Trade and Industry and part of the
National Physical Laboratory.

NDA New drug application. The FDA review stage used to regulate the
introduction of pharmaceutics into the US market place.

Object Class A well-defined data structure, together with a set of
operations.

Object oriented A paradigm for software design and development which
exploits properties of objects.

Patient event A patient event represents one or more of a series of
contacts between the investigator and research subject (patient) partici-
pating in an approved clinical trial. Patient events would typically involve
collection of demographic information and laboratory specimens which are
forwarded to the central laboratory for analysis and reporting.

PCMCIA Personal Computer Memory Card International Association.
An organization which sets standards for a credit-card size system of cards
which can be plugged into mobile PCs to provide additional services such
as extra memory or access to electronic communications.

Pharmacokinetics The kinetics of the absorption, distribution and
elimination processes whereby drugs are absorbed and excreted by the
body.

Phlebotomy Surgical incision into a vein for withdrawal of a blood
specimen.

PK see Pharmacokinetics.

PLA Product Licence Application. The Medicines Control Agency
review stage used to regulate the introduction of pharmaceutics into the
UK market place.

Protocol-synchronous structures Data structures within a LIMS which
allow for the predefinition of key protocol design elements prior to study
commencement. Protocol-synchronous structures are used to facilitate the
production, receipt and processing of clinical trial or study materials, and
in tracking and reporting of patient events and laboratory data during the
conduct of a clinical trial.

Protocol An overall research plan submitted by a sponsor and approved
by the appropriate authority, e.g. the FDA, local hospital ethical
committee or responsible person. The protocol is used to guide the
research on human volunteers and animals during development of a
regulated xenobiotic compound or medical device.

Quality Assurance The activity of providing assurance that the quality
function is being performed adequately.

Quality Control A process whereby actual performance is measured and
compared to specifications (standards). Quality is controlled by acting on
the differences uncovered by the process.

Quality "The totality of Features and Characteristics of a Product or
Service that bear on its ability to satisfy stated or implied needs" (ISO
8402).

RAD see Rapid Application Development.

RAID see Redundant array of inexpensive disks.

Rapid Application Development A paradigm for software design and
development that exploits the evolution of multiple prototype applications
to capture user requirements more efficiently.

RDBMS see Relational Database Management System.

Redundant array of inexpensive disks A data storage device comprising
an array of up to five disks managed jointly by a central processor unit.

Relational Database Management System A database architecture where
data are stored as related entities in tables (relations). These relationships
can be defined mathematically.

SAS A trademark of the SAS Institute Inc. and used as a prefix for
identifying their products and services for data analysis.

Shelf-life The length of time a product or material may be stored under
defined conditions and retain an acceptable, specified level of quality.

Software design specification A specification which documents the design
of the system or software module.

Specimen A sample of material for diagnostic evaluation or examination.

Sponsor A term used to designate the pharmaceutical, biotechnological
or other business entity planning, organizing and funding the development
of an ethical drug or medical device under the auspices of the FDA or
other regulatory authority.

SQC Statistical Quality Control. The use of statistics to quantify and
control factors which influence quality.

SQL Structured Query Language. A standard language for defining,
managing and maintaining RDBMS.

Stability Testing Studies or trials leading to definition of recommended
storage conditions for materials. Also used for ongoing stability studies
required to confirm predicted shelf-life in practice.

STAT A sample with the highest priority for testing. Sample requiring
immediate testing.

Static data Data which defines the laboratory environment, e.g. instru-
ments, personnel, analyses.

Study An instance of a protocol. The mechanism by which research data
are generated for inclusion in submissions to regulatory authorities. A
protocol may comprise or create many studies.

Systems Development Life Cycle A formal method for developing
computer applications. Facilitates validation of the application.

Test schedule A listing of tests or related analytical and administrative
procedures which are assignable to an event or accession number.

Validation The process of ensuring that a process, system or item meets
predefined requirements or specifications.

Xenobiotic Qualifies a material foreign to living organisms.


1 LIMS: An automating or informating technology?
J.E.H. STAFFORD

1.1 Introduction

The concept of Laboratory Information Management Systems (LIMS) is
well established, yet almost half of the implementations have fallen short
of expectation (chapter 2, this volume). Why should this be? In an age
where practically all laboratory instruments have microprocessor-based
controllers, less than half of the laboratories surveyed had 'some'
instruments electronically linked to the LIMS database [1]. Why is this?
The literature is replete with advice on how to specify, select and
implement a LIMS [2-6 for reviews], yet still a successful implementation
apparently eludes many laboratories. Could it be that LIMS specifiers,
implementors and vendors have an unconscious, hidden agenda that
restricts the usefulness of the current generations of LIMS? Or is the view
of what a LIMS is too narrow to be of practical use? What new approach is
necessary to enable LIMS technology? These issues will be addressed
further in this and subsequent chapters.

1.2 Current LIMS fail to meet business requirements

One reason offered for the failure of LIMS implementations is that the
users' expectations were set too high. This explanation is a little difficult to
understand, since it is the users who should have defined their requirements.
However, it is probable that the users have specified what they think they
require, but have been unable to vocalise what they really require and
tacitly assumed that the system will do it anyway. LIMS specifiers may
even have wrongly identified their major users!
The value of analytical data in supporting the product-development and
manufacturing life cycle, forensic or environmental investigations has led
to a reappraisal of its ownership. More and more these data are seen as a
Corporate or Company-wide asset. As such it is much more visible.

1.2.1 New customers


The organisational structure of computerised industries can be modelled as
a series of concentric circles around the central Corporate database [7].
Each circle in Figure 1.1 represents a layer of management responsible
for increasing degrees of data conceptualisation as data move from the
centre to the outermost ring. Traditionally management at the interface
transforms the data into information for dissemination within that
management layer. In this model, LIMS implementations are viewed as
the management system responsible for the capture, control, processing
and dissemination of analytical data, i.e. the ring closest to the database.
This aspect is more fully discussed in chapter 2. However, this concentric-
ring view of the Organisation also implies that the database can be accessed
from any ring in the circle, i.e. any level of management.
Historically LIMS database systems were designed to service laboratory
reporting requirements only. Communication with customers is by way of
predefined reports (Figure 1.2).
Data are pushed outwards from the centre. Traditional systems do not fit
neatly into the new world view where data might be extracted from the
LIMS database in a potentially ad hoc manner by customers outside the
immediate laboratory environment.

1.2.2 Deriving information from data


Information is derived by viewing data in its context [8]. Where the
laboratory has no ownership of the samples, insufficient data may be
available to enable the analytical results to be understood. In every case
analytical samples will have context. Traditionally LIMS are poor in

/' "-
" '\
/
/
/ \
/ L1MS \

@
I \
I \
I BASE J
\ I
\ /
\ /
\ /
'\ /

" /'
/'

Figure 1.1 Concentric ring management model for IT-based industries.


Figure 1.2 Top-level laboratory process.

managing data structures that give context to samples, and special


application modules are developed to manage sample context. For
example, in Quality Assurance laboratories a significant analytical load is
derived from studies designed to evaluate and monitor the stability of
products under simulated storage conditions. The analytical laboratory
only deals with samples. Therefore, a conventional sample management
system should meet most business requirements. However, the formulation
specialists or Regulatory Affairs Departments sponsoring the stability
trials deal with product formulations, study design and planning and
estimating product shelf-lives. In this scenario the formulation specialist is
dealing with a higher level of abstraction, viz the product under test, the
test or experimental system (Figure 1.3).
The estimated shelf-life associated with the product formulation is
derived from the analysis of many samples. The formulation scientist
requires the LIMS to manage these studies and their outcome. This
dichotomy in requirements is further discussed in chapters 4 and 5.
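
As a sketch of this higher level of abstraction (the data, the simple least-squares fit and the 90% specification limit are all illustrative assumptions, not a prescribed stability analysis), the shelf-life is a quantity computed over many sample-level results rather than a property of any single sample:

# Sketch: estimate a shelf-life from many sample-level assay results by
# fitting potency against time and intersecting the fitted line with the
# specification limit. A real analysis would also use confidence limits.
def shelf_life_months(times, potencies, spec_limit=90.0):
    n = len(times)
    mean_t = sum(times) / n
    mean_p = sum(potencies) / n
    slope = (sum((t - mean_t) * (p - mean_p) for t, p in zip(times, potencies))
             / sum((t - mean_t) ** 2 for t in times))
    intercept = mean_p - slope * mean_t
    return (spec_limit - intercept) / slope  # time at which the fit hits spec

months = [0, 3, 6, 9, 12, 18]
assay = [100.1, 99.2, 98.6, 97.9, 97.1, 95.4]   # % of label claim
print("estimated shelf-life: %.1f months" % shelf_life_months(months, assay))
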
At the next level of abstraction the variation between production
campaigns is evaluated in order to place a specification on the product
itself. At the highest level comes the business decision concerning the
ability of the product formulation to make money in the market place.
The ability of each layer to organise Corporate analytical data depends
on a filter which puts the data into context. This is summarised
diagrammatically in Figure 1.4.
The data are formatted and analysed as required by that level of
management using additional data about the system under evaluation. In
this Figure, the data to information pyramid has been superimposed on to
the concentric-circle model of corporate management. Each pyramid
represents an organisational function of a hypothetical R&D-based
Company, and overlaps its neighbour at the lowest level of data
conceptualisation. The degree of overlap will depend upon how closely
Figure 1.3 Data to information hierarchy (levels, bottom to top: Analysis & Data Capture; Sample Management; Test System Analysis).

Figure 1.4 Hypothetical R&D-based company and overlapping requirements for LIMS (functions shown include Clinical Research, Veterinary, Agrichemical, Toxicology, Regulatory Affairs, Quality Assurance, Inventory and Manufacturing around the central Analytical Laboratory).

these functions interrelate. The management of these overlapping views of
the database falls within the LIMS layer and helps to visualise why each
functional user group has a different view of LIMS. This view will not
necessarily be shared by laboratory personnel, who will have their own set
of requirements derived from their job functions designed to maintain the
analytical database.
Invariably when data are transformed a new reduced dataset is created.
Often this dataset is qualified by a textual introduction, discussion and
summary. Thus a document is the means of communicating information
(Figure 1.2). A document will have its own life-cycle and require additional
management.

1.2.3 Increased data visibility


Greater data visibility will place laboratory operations increasingly under
the spotlight. Increased timeliness of service will become as important a
factor in R&D operations as it is in commercial laboratories. Data will also
have to be of high or at least known quality. In order to achieve these
objectives all laboratories will need to have a better understanding of their
businesses as a set of analytical processes or systems. Historically LIMS
focused on the business aspects of laboratory management, recording and
reporting parameters reflecting the efficiency of the laboratory process as a
whole, e.g. mean sample turn-around time, numbers of tests reported per
unit time, backlog reports, instrument loadings, etc. The importance of
using LIMS historical data to improve the quality of analytical service has
received little attention in non-clinical laboratories.

1.2.4 Data fit for purpose


Analysts need to be aware that the data they generate will be seen not as
isolated results, but in context. It will not be sufficient for them to make
isolated judgements on the value of their data. The result must be reported
in the context of the analytical system, i.e. with an estimate of its
reliability. It is for the customer to judge the result in the context of the
analytical process and the experimental system yielding the samples. The
practice in some analytical disciplines of reporting a value less than the
MDL (minimum detection limit) as 'less than the MDL' rather than the
measured value itself, for example, results in loss of information [9], and is
not consistent with the global view of access to analytical data for use
throughout the organisation. It is equally true that the user of analytical
data must understand the implications of uncertainty in the analytical
result.
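
A minimal sketch of the reporting point made here (field names and numbers are invented for illustration): carrying the measured value and its estimated uncertainty forward preserves information that the censored '<MDL' form discards.

# Sketch: report a measured value with its uncertainty rather than
# censoring it at the minimum detection limit (MDL).
def report_result(measured, uncertainty, mdl):
    censored = ("<%g" % mdl) if measured < mdl else ("%g" % measured)
    informative = {"value": measured, "uncertainty": uncertainty,
                   "mdl": mdl, "below_mdl": measured < mdl}
    return censored, informative

censored, informative = report_result(measured=0.03, uncertainty=0.02, mdl=0.05)
print(censored)      # '<0.05' -- the information-losing form
print(informative)   # the measured value survives for downstream use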

1.3 Current LIMS automate data management functions

One common justification for a LIMS purchase is the ability to handle
more efficiently the data explosion accompanying the increased demands
for analytical services and automation. These demands stem from business
pressures to provide a more cost-effective and efficient service, to adopt
Quality Assurance programmes and to meet the increased regulatory
control exercised over some industrial sectors, e.g. pharmaceutical
development and manufacturing, environmental monitoring and forensic
laboratory services. However, if the majority of laboratories view
instrument connection to the LIMS database as low priority, by virtue of
the apparently low connectivity rates [1], then the data explosion from
automated instrumentation would not seem to be viewed as a major
impediment to achieving these business objectives. The control of test
result processing, management and reporting appear to be more important
functions warranting automation. Laboratories with a high dependence on
chromatographic analytical systems may be an exception.

1.3.1 Control of data


As a result of deficiencies in the management of data uncovered during
investigations of certain testing laboratories in the USA, a series of
guidelines and regulations was introduced. Good Laboratory Practice
(GLP) [10] was first published in 1978 and current Good Manufacturing
Practice (cGMP) [11] in 1976. Although directed at manual operations,
these regulations focused on the capture, manipulation, management,
storage and reporting of data. The impact of computers on these processes
was addressed in 1988 [12] and 1989 [13] for non-clinical studies and 1990
[14] for environmental laboratories. At the time computerised management
of analytical data was seen as an efficient method for complying with the
regulations. LIMS were seen as an ideal way of holding data in a controlled
environment for automated reporting of sample, test and result informa-
tion.
Use of an Audit function within LIMS enabled changes to the data to be
tracked, thus ensuring data traceability and accountability. Audit reports
could be generated automatically to indicate who had changed the data,
when and why.
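
A minimal sketch of such an audit record (the field set follows the who, when and why above, plus the before and after images described in the Glossary; the structure itself is an illustrative assumption, not any vendor's schema):

# Sketch of an audit-trail entry recording who changed a result, when,
# why, and the before/after images. Field names are illustrative.
import datetime

def audit_entry(user, reason, field, before, after):
    return {"timestamp": datetime.datetime.now(datetime.timezone.utc).isoformat(),
            "user": user, "reason": reason, "field": field,
            "before": before, "after": after}

trail = [audit_entry("jsmith", "transcription error corrected",
                     "result_value", 12.7, 12.1)]
for entry in trail:
    print(entry)
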
Work flow was monitored by automatically updating the status of
samples, tests and results as testing progressed, and an approval process
ensured that only approved data were reported. Canned, or preconfigured,
reports generated interactively or automatically enabled laboratory man-
agers to manage and track the progress of work within the laboratory,
produce certificates of analysis, billing information and other business
statistics. Reporting signalled the end of the sample data life cycle and data
were archived or moved to a separate database. The LIMS database
became a data cemetery.
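
As a sketch of this status-driven work flow (the status names and permitted transitions are invented; commercial systems each define their own life cycle):

# Sketch: advance a sample through its status life cycle, rejecting
# illegal transitions. All names and transitions are illustrative.
TRANSITIONS = {
    "logged": {"in_test"},
    "in_test": {"resulted"},
    "resulted": {"approved", "in_test"},   # rework loops back to testing
    "approved": {"reported"},
    "reported": set(),                     # end of the sample data life cycle
}

def advance(sample, new_status):
    if new_status not in TRANSITIONS[sample["status"]]:
        raise ValueError("illegal transition %s -> %s"
                         % (sample["status"], new_status))
    sample["status"] = new_status

sample = {"id": "S001", "status": "logged"}
for step in ("in_test", "resulted", "approved", "reported"):
    advance(sample, step)
print(sample)
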
Laboratory management, however, had increased confidence in the
integrity of the data delivered automatically to their desks for reporting.
The logical extension of this approach is the total automation or
industrialisation of the analytical laboratory. An approach to total
laboratory automation and its implications for LIMS are discussed in
chapter 8.

1.3.2 Data quality


Maintenance of data integrity and traceability is of little value if the data
entered into the system are of poor or unknown quality. Forensic (chapter
3), clinical chemistry (chapter 6) and environmental analysis contract
laboratories (chapter 7) have extensive and well-established analytical
Quality Control (QC) programs to quantify the accuracy and precision of
the data reported. The pharmaceutical industry appears less advanced in
this area. Indeed, the quality of bioanalytical methods used to support IND
(Investigational New Drug) submissions was of such concern that a
conference was convened to discuss it and issue guidelines [15]. Without
data from method validation studies it is not possible to set realistic QC
criteria to control assay performance in use. Consequently confidence in
the bioanalytical data used to support pharmacokinetic-based studies was
low.
All regulatory authorities now require calibration and QC data to
accompany analytical data. This requirement greatly adds to the data mass
requiring management and reporting by the laboratory to its clients. All
LlMS systems should be able to automate the QC process by managing
calibration standards and QC for inclusion on test worksheets. Canned
reports summarise these data for inclusion in regulatory submissions.
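
As an illustration of deriving QC acceptance criteria from method-validation data (a sketch assuming simple mean plus or minus three standard deviation limits; real acceptance rules vary by discipline):

# Sketch: set control limits from validation replicates, then accept or
# reject a run on its QC samples. Numbers are invented for the example.
import statistics

def control_limits(validation_results):
    mean = statistics.mean(validation_results)
    sd = statistics.stdev(validation_results)
    return mean - 3 * sd, mean + 3 * sd

def qc_accept(qc_results, limits):
    low, high = limits
    return all(low <= r <= high for r in qc_results)

validation = [49.8, 50.3, 50.1, 49.6, 50.4, 49.9, 50.2, 49.7]
limits = control_limits(validation)
print("limits: %.2f to %.2f" % limits)
print("run accepted:", qc_accept([50.0, 49.5, 50.6], limits))
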
Bioanalytical data are used to support inferences about xenobiotic
compound distribution in populations of humans and animals or in the
environment. Together with clinical observations, toxicology and chemical
and manufacturing information they form the basis of a regulatory
submission, e.g. NDA (New Drug Application) or PLA (Product Licence
Application). These data are summarised in the form of expert reports
which have proved adequate for informing European Regulatory Agencies.
However, concern about the quality of the clinical data analysis forming
the basis of these summaries led the FDA (USA Food and Drug
Administration) to adopt a different approach. Here the desire was for the
FDA reviewers to be able to manipulate the data for themselves in order to
evaluate claims for the safety and efficacy of drug treatments.
The ever-increasing size of these submissions (in excess of 500 000 pages
predicted [16] for the 1990s) and the need to be able to navigate and
manipulate the data led to the proposal that submissions should be
electronic, and the concept of a Computer Assisted New Drug Application
(CANDA) was born. Information technology was used in early CANDA
systems to help to automate the review process [17] by building dedicated
systems at great cost to deliver data to the reviewer's desk. This approach
reflected the traditional LlMS implementation where IT was used to
automate the movement of data from the point of capture to a
management level for review.

1.3.3 Consolidation of executive control


In both cases information technology was used in early LIMS and CANDA
systems as a management tool for extending executive control to the data-
gathering stage. Their prime focus was on automating the delivery of data
of known quality to the decision-maker's desk. Laboratory and Regulatory
Affairs staff were employed to feed data into the system. Whilst the
inappropriateness of this approach for preparing NDA submissions was
eventually recognised by Pharmaceutical Companies, the lesson has not
been learnt and applied to LIMS implementations. Indeed, it has been
argued that the next generation of LIMS will extend this process to
monitor, control and enforce laboratory methods, Standard Operating
Procedures (SOPs) and business rules more effectively [18].
The use of information technology to extend managerial control by
means of automating production is not new and its occurrence in
manufacturing industries is extensively discussed by Zuboff [7]. She argues
that a technology that informates creates an ambiguity in the rationale for
authoritarian control, which traditionally has formed the basis for
managers' interaction with subordinates. In attempting to eliminate this
ambiguity executives strive to distance themselves from the means of
(data) production, whilst at the same time consolidating their exclusive
access and control of the organisation's knowledge base.
The result is a polarised work force, with subordinates withdrawing
because they have no control over their work and managers taking more
and more decisions, because they have lost confidence in their staff.
If this analysis is true of LIMS implementation vis-a-vis the 'hidden'
agenda, it is perhaps not surprising that many systems are poorly received
by laboratory personnel.

1.4 New LIMS will informate, not automate

Informate is a word coined by Zuboff [7] to describe the capacity of IT to
supplant the traditional logic of automation because IT also generates
information about the processes it automates. In this sense IT is different
from other automating technologies, e.g. harnessing water to drive the
machines that powered the Industrial Revolution. Automation tends to
preserve what is known whilst an informating strategy focuses on the
opportunity for continual learning as new data, events and contexts create
chances for additional insight, improvement and innovation.
A striking feature of the concentric-circle model of an IT-based
organisation is the nature of the common skills required to transform data
at the data interface. At all levels the ability to apply a conceptual
approach to the definition of a problem for analysis, the selection of
relevant data to facilitate the analysis, an evaluation of the effectiveness of
the analytical approach and application of the result to the improvement of
performance are required.
Enabling the expression of intellective skills of employees working at the
innermost level, viz LIMS, is the key to a successful LIMS implementation
strategy.
Interestingly, Dr C.C. Peck, ex-director of CDER, is quoted [19] as
saying of CANDA systems:
If your computer aided review tools aren't useful in reducing 'in house'
development time, they probably won't be very useful in reducing FDA review
time.
To paraphrase: using computers to automate NDA submissions will be
fruitless unless they also informate.
A subsequent proposal [20] that the FDA form a partnership with
pharmaceutical companies and actively participate in the drug development
review process using IT techniques to access the product (Corporate)
database is consistent with the concentric-circle model of a computerised
Company management structure. The implications for a revolution in the
IT architecture of Pharmaceutical R&D organisations is complementary to
that for traditional LIMS systems.

1.4.1 The human element in the laboratory process


When implementing LIMS, the human element most often considered is
end-user requirements [21], and there is a general agreement that a
correlation exists between meeting user requirements and satisfaction
ratings of the final LIMS implementation. The use of Rapid Application
Development techniques to capture user requirements more accurately at
the interface is further discussed in chapters 7 and 11.
A systems approach to modelling a laboratory identified a human and
technological sub-system [22]. The importance of the human sub-system in
terms of individuals' attitudes to innovation was considered to be a key
factor in achieving a successful LIMS implementation and improving
laboratory quality. However, the contribution to laboratory quality is more
directly related to an interaction with analytical processes than to
technology per se.

1.4.2 Improving the quality of data capture and analysis


The systems model of a laboratory can be expanded to include multiple
analytical process sub-systems (Figure 1.5).
Figure 1.5 The human component in the laboratory process.

Each analytical process, when automated using IT, outputs a great
amount of data characterising the process, i.e. the informating aspect. A
human operator interacts to a greater or lesser extent with the analytical
system. In an automated environment the operator supplies the process
with samples, replaces disposables and maintains the equipment. In an
informating environment the operator will take decisions on the suitability
of the state of the analytical process to do the job. Analytical process
parameters will be used to facilitate these decisions. The data required for
the analysis, automatically captured and stored in the central analytical
database, will nevertheless be processed locally, at the instrument.
Analytical specifications, operating method and instructions are all
retrieved from the LIMS database for local use.
Quality Control data and system suitability data will no longer merely
support analytical data, but will be used to control the analytical process
actively to improve data quality. Incidence of data loss due to 'out of
control' runs will be reduced, as will late results due to trouble-shooting
analytical processes once they are out of control. Historic data in the LIMS
database will be made to work for the benefit of current analytical
processes.
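
A sketch of what actively controlling the analytical process could look like at the instrument (the two run rules are simplified, Westgard-style conventions chosen purely for illustration):

# Sketch: use historic QC data to decide whether the current run is in
# control before any sample results are released.
import statistics

def in_control(historic_qc, run_qc):
    mean = statistics.mean(historic_qc)
    sd = statistics.stdev(historic_qc)
    # Rule 1: any run QC value beyond 3 SD stops the run.
    if any(abs(v - mean) > 3 * sd for v in run_qc):
        return False
    # Rule 2: four successive QC values drifting in one direction.
    diffs = [b - a for a, b in zip(run_qc, run_qc[1:])]
    if len(diffs) >= 3 and (all(d > 0 for d in diffs) or all(d < 0 for d in diffs)):
        return False
    return True

historic = [10.1, 9.9, 10.0, 10.2, 9.8, 10.1, 10.0, 9.9]
print(in_control(historic, [10.0, 10.1, 9.9]))        # True
print(in_control(historic, [9.8, 9.9, 10.1, 10.4]))   # False: upward drift
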
In such an environment the analysts or technicians have control over the
quality of the work they perform. They are more effective in carrying out
their job responsibilities, which in turn results in a sense of achievement,
an important motivation factor [23]. The differences between instrument
interfacing and true automation are examined in chapter 9.
The analytical data review process will also be enhanced because the raw
data, e.g. the processed chromatogram or the mass spectrum supporting
the test result, will be readily accessible from within the review screen (see
chapter 3).

1.4.3 Increased control over sample gathering


In many environments the condition of the sample under investigation has
a significant impact on the quality, i.e. reliability, of the analytical results
obtained. The ability to demonstrate chain of custody of an analytical
sample is a key element where data might be subjected to judicial scrutiny.
However, to enable a full history of the sample to be known the laboratory
system requires foreknowledge of the sample's existence. Thus there is
increasing interest in being able to manage and monitor the collection of
samples, their storage and transfer to the analytical laboratory. LIMS that
informate have the ability to predict sampling requirements. However,
since the context of the sample must also be known, the simple sample log-
in schedulers characteristic of many current LIMS are not adequate. They
are automating functions, not informating functions. Informating schedulers
will supply the information necessary to collect and preserve the sample in
a valid state during transfer to the laboratory based on a knowledge of the
system under evaluation. This important aspect of informating LIMS is
further discussed in chapters 5 and 8.
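
As a sketch of the distinction being drawn (the study definition, field names and preservation instruction are invented for illustration), an informating scheduler derives collection and preservation instructions from knowledge of the system under evaluation, rather than simply booking log-in slots:

# Sketch: emit sampling tasks, with collection and preservation
# instructions, from a study definition. All structures are illustrative.
def sampling_schedule(study):
    for subject in study["subjects"]:
        for t in study["timepoints_h"]:
            yield {"subject": subject, "collect_at_h": t,
                   "matrix": study["matrix"],
                   "preservation": study["preservation"]}

study = {"subjects": ["001", "002"],
         "timepoints_h": [0, 1, 2, 4, 8],
         "matrix": "plasma",
         "preservation": "freeze at -20 C within 30 min of collection"}
for task in sampling_schedule(study):
    print(task)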

1.5 Architecture of an informating system

1.5.1 Application layer


Information is data in context. This is achieved by providing additional
data about the system under investigation. These data may reside outside the
physical analytical database (Figure 1.4) and be managed by a separate
Corporate function. Consequently, software applications required to carry
out the transformation may need to interact with distributed databases in
order to support ad hoc queries. One such architecture is client-server,
which is discussed further in chapter 11. At a superficial level client-server
offers a very attractive solution wherein a data query tool resides on the
client PC and constructs a query which returns data for manipulation and
reporting. However, closer examination reveals that useful data will only
be retrieved if the distributed systems speak the same language and
interpret the query in the same manner. The dream of pulling together
disparate datasets for manipulation, review and reporting can only be
realised if there is standardisation of data definitions. The importance of
standards to facilitate analytical data exchange is discussed in chapter 12,
and these are of fundamental importance to the realisation of an
informating LIMS.
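
A minimal sketch of the client side of such an ad hoc query, with sqlite3 standing in for the LIMS database server (the table and column names are hypothetical): the query is only answerable because both ends share the same data definitions.

# Sketch: an ad hoc query posed by a customer outside the laboratory --
# only approved results, summarised by product. The schema is invented.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE samples (sample_id TEXT, product TEXT, "
             "test TEXT, result REAL, status TEXT)")
conn.executemany("INSERT INTO samples VALUES (?, ?, ?, ?, ?)",
                 [("S001", "A", "assay", 99.2, "approved"),
                  ("S002", "A", "assay", 98.7, "approved"),
                  ("S003", "B", "assay", 97.1, "pending")])

for row in conn.execute("SELECT product, AVG(result) FROM samples "
                        "WHERE status = 'approved' GROUP BY product"):
    print(row)
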
The need for standards may progress beyond the analytical data itself
and address the definition of the laboratory environment. In this area the
clinical laboratories are already leading the way [24], and the experience of
the EPA is discussed in chapter 7. Whilst vendors of sample management
systems (most LIMS systems are sample management systems) may view
this development with some trepidation it offers the users and commercial
software developers potentially a great number of advantages. Software
applications that manipulate laboratory data and create information will be
able to take advantage of standardisation to interact cleanly and more
securely with LIMS from different vendors and enable a more sound
business case to be made for undertaking product development. The
laboratory community would benefit by having more variety and choice of
software applications to help them to analyse their analytical processes and
test systems. Currently the availability of application-specific LIMS
modules reflects the viability of vendors able to develop products for niche
markets, e.g. stability testing, materials and formulation management, lot
disposition, drug metabolism and SQC.
Perhaps the greatest change necessary is in the way instruments interact
with the LIMS database. Instruments can no longer be implemented as
islands of automation, but must form part of the LIMS strategy (chapter
8). In an informating LIMS environment instruments are true clients of the
LIMS database server. The analytical scientist or technician will manage
their activities by way of the instrument interface. The LIMS database will
be invisible. It will become a transparent medium by which analytical data
are shared in a controlled and secure manner between instrument systems.
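A rough sketch of this client relationship, using Python's built-in SQLite
module to stand in for the LIMS database server (the table layout and
function name are invented for illustration), might look as follows:

```python
# Hypothetical illustration: an instrument acting as a true client of the
# LIMS database server. The analyst works at the instrument; the database
# is a shared, transparent medium. SQLite stands in for the server here.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE results (
    sample_id TEXT, test TEXT, value REAL, unit TEXT, instrument TEXT)""")

def instrument_report(conn, sample_id, test, value, unit, instrument):
    """Called by the instrument's data system; the LIMS schema is the contract."""
    conn.execute("INSERT INTO results VALUES (?, ?, ?, ?, ?)",
                 (sample_id, test, value, unit, instrument))
    conn.commit()

instrument_report(conn, "S-1042", "assay", 99.2, "%", "HPLC-3")

# Another instrument system, or a review screen, reads the same shared store:
rows = conn.execute("SELECT * FROM results WHERE sample_id = ?", ("S-1042",))
print(rows.fetchall())
```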

1.5.2 Human interface


If information is created at the data interface and reported in the form of
documents, is the current human interface with LIMS appropriate for an
informating system? Irrespective of whether the current interface is
character-based or graphical, the data-management mode is the same, viz.
by way of a form. Whilst often considered as data, the certificate of
analysis (CoA) is a simple document and often carries annotations that
relate to a batch of product and not necessarily to a single analytical sample.
The CoA is created by a LIMS 'query by form' function and is printed out
for annotation and sign-off before distribution. The annotation creates
information about the data on the CoA, but there is no electronic copy for
review at a later time by a third party. This information is, therefore, not
available to the third party.
An informating environment exploits the document paradigm to
facilitate the creation of the CoA as an electronic document. In this case
the responsible person is working within a familiar environment, the
document, but the data are retrieved from the database on demand for
immediate action. Annotation and approval signature(s) are electronically
linked to the CoA.
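One possible sketch of such an electronic CoA, in Python, binds each
approval to a digest of the document content at the moment of signing, so
that a third party can later verify exactly what was annotated and
approved. The structure and field names are assumptions for illustration,
not a description of any particular product:

```python
# Hypothetical illustration: a CoA held as an electronic document, with
# annotations and an approval 'signature' linked to a hash of the exact
# content signed, so later review can verify what was approved.
import hashlib
import json
from datetime import datetime

coa = {
    "batch": "B-2217",
    "results": {"assay": "99.2%", "water": "0.3%"},
    "annotations": [],
}

def annotate(doc, author, note):
    doc["annotations"].append({"author": author, "note": note,
                               "at": datetime.utcnow().isoformat()})

def approve(doc, approver):
    """Bind the approval to the document content at the moment of signing."""
    content = json.dumps(doc, sort_keys=True).encode()
    return {"approver": approver,
            "digest": hashlib.sha256(content).hexdigest()}

annotate(coa, "QC supervisor", "Assay confirmed on re-test of batch B-2217")
signature = approve(coa, "responsible person")
print(signature["digest"][:16])  # the annotation travels with the CoA
```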
The concept of an electronic document or notebook acting as a medium
for exchange and management of analytical data is not new [25] and has
been exploited by several basic research groups. Prototype implementations
are being evaluated in regulated industries [26]. Although electronic
notebooks manage analytical data they are not LIMS as the term is
generally understood, because they do not, for example, provide sample
tracking and scheduling functions. Nevertheless, they present a useful
paradigm for exploring the human interface of an informating LIMS.

1.6 Making IT happen

LIMS implementations are at a crossroads. In order to achieve their full
potential LIMS must informate, not automate. Sponsors and specifiers of
LIMS, and management, must face the challenge and choose to exploit the
informating potential of IT. Managers must commit themselves to the
necessary changes in managing relationships with coworkers. LIMS must
not maintain the status quo (chapter 10), but move forward. The more
widespread adoption of standards (chapter 12) can help in making IT
happen.

References

1. LIMS: State of the Technology (1994). Editorial in Analytical Consumer, 17-19.
Analytical Consumer Inc., Carlisle, MA, USA.
2. McDowall, R.D. (ed.) (1987) Laboratory Information Management Systems. Sigma
Press, Wilmslow, Cheshire, UK.
3. Mahaffey, R.R. (1990) LlMS Applied Information Technology for the Laboratory. Van
Nostrand Reinhold, New York, USA.
4. Nakagawa, A.S. (1994) LIMS: Implementation and Management. The Royal Society of
Chemistry, Cambridge, UK.
5. Hinton, M.D. (1995) Laboratory Information Management Systems: Development and
Implementation for a Quality Assurance Laboratory. Marcel Dekker Inc., New York,
USA.
6. ASTM E 1578-93 (1994) ASTM LlMS Guide. Annual Book of ASTM Standards,
v14.01. American Society for Testing and Materials, Philadelphia, PA, USA.
7. Zuboff, S. (1988) The Age of the Smart Machine. The Future of Work and Power.
Heinemann Professional Publishing, Oxford, UK.
8. Mattes, D.C. (1991) Data or information management: strategic distinctions. Chemo-
metrics and Intelligent Laboratory Systems: Laboratory Information Management, 13 (1),
3-13.
9. Cressie, N. (1994) Limits of detection. Chemometrics and Intelligent Laboratory Systems:
Laboratory Information Management, 22 (2), 161-163.
10. Good laboratory practice for nonclinical studies (1978) Federal Register, 43 (247), 59986-
60025.
11. Current good manufacturing practices for finished pharmaceuticals (1976) Federal
Register, 41 (31), 6878-6894.
12. Computerised Data Systems for Nonclinical Safety Assessment. Current Concepts and
Quality Assurance (1988). Drug Information Association, Maple Glen, PA 19002, USA.
13. Good Laboratory Practice Advisory Leaflet, No. 1 (1989) The Application of GLP
Principles to Computer Systems. UK GLP Compliance Programme, Department of
Health, London.
14. Good Automated Laboratory Practices (1990) Recommendations for ensuring data
integrity in automated laboratory operations with implementation guidance. Draft.
OIRM, US EPA, Research Triangle Park, NC 27711, USA.
15. Shah, V.P., et al. (1992) Analytical methods validation: bioavailability, bioequivalence
and pharmacokinetic studies. Pharm. Res., 9 (4), 588-592.
16. Rogalski, W. (1994) Global electronic submissions, part I: The DAMOS transfer
interface standard. Applied Clinical Trials, 3 (11), 30-40.
17. Mathieu, M.P. (ed.) (1992) CANDA: A Regulatory, Technology and Strategy Report.
Parexel Int. Corp., Waltham, MA, USA.
18. Rank, M. (1993) Automating procedure management for regulated industries through
LIMS. LC-GC Int., 6 (6), 342-344.
19. Bell, R.A. (1993) FDA Perspective on CANDAs. In CANDAs: Evolution & Revolution,
Conference Proceedings, Amsterdam, pp. 124-139.
20. Kimbrell, J.Y. (1993) CANDA as a strategic initiative for the pharmaceutical industry. In
CANDAs: Evolution & Revolution, Conference Proceedings, Amsterdam, pp. 22-32.
21. Trigg, J.F. and Smith, J.A.P. (1994) Do end-users meet the LIMS requirements?
Chemometrics and Intelligent Laboratory Systems: Laboratory Information Management,
26 (3), 181-187.
22. Long, T.J. (1992) Human issues and impact on quality. Chemometrics and Intelligent
Laboratory Systems: Laboratory Information Management 17 (3), 289-294.
23. Herzberg, F. (1966) Work and the Nature of Man. World Publishing Co., Cleveland, OH, USA.
24. ASTM E 1639 (1995) Standard Guide for Functional Requirements of Clinical Laboratory
Information Management Systems. American Society for Testing and Materials,
Philadelphia, PA, USA, in press.
25. Gorry, G.A., Long, K.B., Burger, A.M., Jung, C.P. and Meyer, B.D. (1991) The virtual
notebook system: an architecture for collaborative work. J. Organisational Computing, 1
(3), 233-250.
26. Rubinson, L., personal communication. Megalon, Novato, CA 94949, USA.
2 A model for a comprehensive LIMS
R.D. McDOWALL

2.1 Introduction

Demands on many laboratory organizations are becoming a driving force
to automate analytical procedures. Automation, which is generally
focused at the bench, allows an analyst to complete more work per unit
time, resulting in higher productivity. Laboratory Information Manage-
ment Systems (LIMS) have been developed to carry out many associated
administrative tasks and procedures required to run a laboratory.
However, many organizations often take a narrow view of both a
laboratory and the functions that can be automated, preventing them from
extending automation to provide real scientific and business benefits.
The large number of LIMS installations that fail to meet initial
expectations, estimated by informal sources at approximately 50%,
coupled with the lack of debate on functional issues in the literature,
implies that this technology is not fully understood.
This chapter presents the issues of strategic design and broad scope of a
LIMS via the development and discussion of a model to meet two
objectives:

1. To answer the question 'What is a LIMS?' and provide a standard
description of the scope of a LIMS that allows a coherent platform for
future discussions.
2. To provide a tool that will aid analytical chemists to define, acquire or
develop a LIMS to meet the needs of their own laboratory
environments and organizations. This will be a conceptual model
representing the functions of a LIMS.

The ultimate goal is the understanding necessary for the efficient
development and application of a LIMS in any organization. The proper
application of information technology will provide enhanced quality,
quantity and value of information that will maximize a laboratory's value
to its customers.

2.2 Strategic design of a LIMS

The key issues when planning or designing a LIMS are:


• What functions within an organization should a LIMS automate?
• Where in the organization are the benefits realized?
The analytical laboratory is always a hub for information gathering and
distribution for a broad spectrum of clients within its community. The
ability of the laboratory to automate effectively the handling of this
information flow improves the productivity of both the laboratory and the
entire organization. The target of a LIMS is the total automation of this
complete process.
Therefore, the challenge for any organization is to implement a LIMS
that supports the effective automation of information production and
distribution from the analytical laboratory.

2.2.1 The Data and Information Domains of a LIMS


Independent of organizational structure, the benefits of a LIMS fall into
two major areas; these are the 'Data' and 'Information' Domains. These
are linked with the answers to the two questions in the last section.
1. The Data Domain: The objective within the data domain of a LIMS
is to transfer the laboratory results to the recipient quickly. To aid
this, a LIMS can be connected to analytical instruments or laboratory
automation systems. The main benefit is a gain in laboratory
productivity such as faster report production.
2. The Information Domain: In this domain the results generated by the
laboratory are interpreted and used to effect decisions. Progression
from the data to the information domain is by the provision of context
and structure for the analytical report. This increases the laboratory's
productivity and overall value to the organization. A key con-
sideration in design is to ensure an efficient interface between the two
domains, thus maximizing the benefit to an organization of a LIMS.
If a LIMS is implemented in the Data Domain, the laboratory becomes
relatively efficient. Internal processes can be improved and the laboratory
benefits from the investment in the system. However, the benefits do not
go much further than the laboratory. If a system were implemented in the
Information Domain, the laboratory would become effective within the
organization: both the laboratory and the organization benefit. Unfortu-
nately many LIMS are implemented in the Data Domain with little thought
of the client base.

2.2.2 Laboratory business objectives and the role of a LIMS


In defining the objectives that a LIMS should fulfil in a laboratory
environment, it is easy to say that 'the LIMS is to automate the laboratory
and improve productivity'. This raises the following questions for careful
consideration:
• What are the objectives of the laboratory?
• How should productivity be measured?
• What increase in productivity justifies the expense of implementing
and maintaining a LIMS?
• What factors, other than productivity gains, could (and sometimes
should) be used to justify a system?
Investigating the various answers to these questions is outside the scope of
this chapter. However, when productivity is discussed in this section it
refers to information production. Therefore, the following sections discuss
different ways of improving productivity dependent on the business
objectives of an individual laboratory.
• Improved information quality: Here the justification for a LIMS may
be focused on improving the control of the information within the
laboratory environment, especially within regulated industries. The
ability to help and document the required control and approval
mechanisms is a key requirement and consistent with the main
business objective of the laboratory.
• Improved information rate: This aim is consistent with many service
laboratories that are charged with the task of providing an analytical
service. Analytical reports are required by the submitters quickly.
Thus the throughput or number of results reported per unit time is a
measure of productivity. Resource management and automated report
generation are key areas through which a LIMS can increase laboratory
throughput.
• Improved information availability: In many research laboratories an
aim is to generate summary reports consisting of consolidated
information from a variety of samples and experiments to support a
particular scientific finding. In this type of laboratory, the ability to
organize individual tests and then analyse and report the findings
effectively may be a major means of improving productivity.
Most laboratories have objectives that merge these aims in various
proportions and priorities, but most LIMS are implemented with only one
objective, which is to automate routine functions such as sample tracking
and reporting. This has the by-product of increasing laboratory productivity
and is achieved by:

• Organizing data that were either difficult to obtain or badly ordered.
• Distributing the data functions, in a controlled manner, to all
laboratory personnel while consolidating the information for use by
the laboratory manager.
This approach to improving laboratory productivity is effective in
providing relatively immediate productivity gains. However, it is limited in
providing strategic advantage, and the system itself has little potential for
expansion. A LIMS which only focuses on these needs does not address
the full scope of a laboratory. Improving laboratory productivity in such a
restricted manner is effective to some extent, depending upon the need and
priority of increased throughput within the organization.
The proper approach for a LIMS implementation is to take a more
encompassing view. Ideally, a system should address the total information-
handling needs of the laboratory and its business environment. This will
consider both the Data and Information Domains. This can be achieved by
considering the business objectives of a laboratory and designing the LIMS
to meet those objectives. A major element in finding out the scope of a
LIMS is to judge whether the system is to be used as an 'aid' to managing
the laboratory or as a key component in improving the achievement of the
strategic aims of the organization.
A LIMS should provide automation at all levels of operation, both inside
and outside the laboratory, rather than focusing on the Data Domain.
This allows a LIMS to become the framework for a laboratory automation
strategy, rather than a 'black box' for automating a laboratory without
integrating the information into the overall operation of an organization.
Thus, the LIMS becomes the integrating tool for a coherent laboratory
automation strategy, providing the key for all information-related activities.
Therefore, a LIMS must provide a facility for the efficient, cost-effective
generation of information for the laboratory environment and the
organization or it is not viable.
On a pragmatic level, unless the LIMS project is well resourced it is
probably not feasible to implement a system in both the Data and
Information Domains simultaneously. The best approach is to design the
total system and implement it in stages, starting with the laboratory
functions that have the highest priority business objectives.

2.3 What is a LIMS?

Returning to a basic question 'What is a LIMS?', it is necessary to have a
structured definition as the term 'Laboratory Information Management
System' is ambiguous. There are three main literature definitions of a
LIMS.

1. 'A computerised system designed to provide on-line information about the
analytical laboratory and the samples assayed within it. The information
provided includes the current location of all the samples in the laboratory,
along with the current status of all analyses (i.e. not begun, in progress,
awaiting approval and completed) [1].'
This definition explains that a computer is at the core of a LIMS that can
provide on-line access to information. This is predominantly focused on
the management of laboratory functions. Therefore, scheduling, tracking,
sample location and status information are key benefits of a system. Issues
related to analytical reports, instrument calibration, quality control of
assays and the more scientific aspects of the laboratory are not specifically
addressed.
2. 'A database tailored to the analytical laboratory intended to integrate sample
information (e.g. source, batch number, etc.) with the results from
instruments which will drastically reduce the administrative tasks and speed
up the production of the final report [2].'
Here a LIMS involves the use of a database to store and manage sample
information for the analytical laboratory. No mention is made of any other
organizational group outside this environment. The definition concentrates
on the analytical aspects of the laboratory function: collecting results from
instruments and arranging them in a database before the production of the
final report. However, there is no mention of functions related to
managing the laboratory or its operations.
3. 'In general, LIMS performs a basic set of functions which greatly facilitate
the operation of analytical laboratories: they provide for work scheduling,
for status checking and sample tracking, for automated entry and processing
of analytical test data, for automated report generation, for laboratory data
quality assurance and for data archiving [3].'
The third definition focuses on a LIMS as a series of tools to manage
laboratory operations, but does not discuss the functions further nor
consider the environment outside the analytical laboratory. There is no
attempt to imply that a computer system or application is involved, but this
definition goes further than the other two in mentioning all stages in the
analytical process.
Although the definitions are complementary, a major weakness of all is
that they focus on the end results of a LIMS rather than considering what
functions are required to build the application. Such an approach gives us a
vision of the accomplishments and the benefits of a LIMS without the
understanding and insight that are necessary to implement a system. This,
combined with the differences in laboratory operation between organiza-
tions, results in the fact that 'LIMS' can mean different things to different
people. Results similar to those outlined in the three definitions listed
above can be obtained with the use of other computer-based applications.
However, in reality a LIMS should have a much broader impact on the
organization, as already shown.
The impressions of a LIMS can range from a single computer to multiple
processors and networks, from manual data entry to sophisticated
instrument control and from report preparation to total management of the
laboratory. Which are right? An answer could be that if the desired result
is achieved then the technology used was correct. However, this assumes
that the problems to be overcome have been correctly identified at the
outset.

2.4 An architecture for a comprehensive LIMS

A framework now exists upon which to build a LIMS model. This model
will eliminate some of the shortcomings of the literature definitions [1-3]
discussed previously. The model is based on the objectives that define a
laboratory environment, which must be addressed prior to a functional
definition of a system. Although conceptual in nature, it will define a LIMS
so that it encompasses effectively the total automation requirements for a
laboratory. Moreover, it will provide a basis for a LIMS in both the Data
and Information Domains, i.e. an integrated information management
environment. As a laboratory is charged with producing information, a
LIMS, using today's computer technology, can provide this integrated
framework to link all aspects of the laboratory environment and thereby
increase the productivity of all functions inside and outside the laboratory
[4, 5].
The idea of an architecture takes these issues further, with a systematic
and unifying organization of LIMS functions and their interactions. This is
important, as it provides the framework for the future development of
LIMS. However, the model can be viewed with different perspectives
depending on the viewpoint of the reader: user, designer or vendor. For
example, consider an architectural analogy.

2.4.1 Architectural views and communication


A function of architecture is to give different interest groups their own
meaningful views of the same structure to ensure that the final building will
meet the initial requirements. Consider architecture as it relates to a
building [6]. An architect has to consider three interested parties: the
owner, the designer and the builder. These roles can be extrapolated to the
owner, the systems analyst and the programmer in a LIMS project.
An architect will start by discussing the future owner's requirements and
constraints on the building. A set of drawings will represent the owner's
perception of the house, which can be reviewed and allow the owner to
decide whether to proceed. In the LIMS model, the identification of the functional
areas, the scope of the areas and the initial constraints on the interactions
between the areas provides the user (owner) with what to expect from the
LIMS.
If the owner proceeds, the architect now produces a set of plans that
address the building from a new perspective, that of the designer. These
plans will provide more explicit specification of the design for the building
without explaining or referencing the ideas that were the source and
purpose of the earlier drawings. In the LIMS model we have minimized the
design constraints to maintain flexibility; however, it still provides input
into a design view of how the LIMS will work because of the constraints
which are part of the model.
The third view is that of the builder, who generates plans that present
the building in a format suitable for construction. These plans focus on
particular functions of the building, describing the needs and requirements
for its construction, without the general concern for the complete
integration and the final product. In the LIMS model, the description of
the functional areas and levels provides this type of segmentation within
the system. This allows the developer to work on specific functions and yet
to integrate them to meet the complete requirements of the LIMS.
If the architecture of a building is complete and it meets the needs of
each group, as described above, then it will be successful. The same is true
of a LIMS.

2.5 A LIMS model

The model consists of two principles that form the basis of an architecture
for a comprehensive LIMS:
• First, a LIMS is divided into a set of functional components. This
means that the boundaries and scope of each of these functional areas
are clearly understood.
• Second, the interaction between these functional areas is defined by a
common method that allows each area to grow and expand independ-
ently of any other, thus preserving a path for incremental growth,
upward compatibility and future expansion of the model itself.
A pictorial representation of the LIMS model is shown in Figure 2.1.
Here the centre represents the LIMS database which is surrounded by the
four functional components of the LIMS. The boundaries between these
functions are well defined and the major connection between the
functional areas is via the database. In this way it can express the following
major elements of a LIMS:
Figure 2.1 A universal LIMS model. From reference 11.

• The functional areas that will define the scope of a LIMS.
• The functional levels that define the capabilities of a LIMS.
• The relationship between the parts of the model, which delineate the
architecture of the system.
• The segmentation of each functional area of the model, which allows a
modular approach to organize and meet requirements.
• The organization of the model, which aims to define the functional
implementation of the LIMS.
Each of these areas is discussed in more detail below.

2.5.1 LIMS functional areas


Any LIMS function can be classified into one of five categories, one of
which is 'system'-related while the remainder are 'user'-related. The
system-related category is the Data Repository: this is where the data and
information are stored by a LIMS and is a database management system
(DBMS).
The four user functional categories are:
• Data Capture: the gathering of data
• Analysis of Data: reduction and manipulation of data
• Reporting: presentation of data
• Management: subdivided into two categories of managing data and
managing the laboratory
In Figure 2.1, each user-related area occupies one quarter and thus has an
equal share of the total functionality of the system. This representation is
for convenience only; in some applications there may be a disproportionate
emphasis on certain functions, depending on the implementation. For
example, a laboratory may have a large emphasis on data capture and its
subsequent analysis with rudimentary reporting and management facilities.
Therefore, the model is sufficiently flexible to cope with different
situations.

2.5.2 Functional levels
To facilitate the understanding of the model and to use it as an effective
tool, each of the four 'user function' areas is divided into three levels,
which represent the scope of implementation. These range from the
fundamentals required for a computer application to be classified as a
LIMS to possibilities for the future.
Within any of the user functional areas described above, the actual
implementation may occur at various levels of complexity. The following
broad descriptions of these levels help to describe the idea of a complete
LIMS framework and to facilitate discussion of LIMS in general.
Level I Minimum functions for a LIMS required to meet the basic
requirements of a functional area. This level of implementa-
tion is usually manual and is only minimally, if at all, helped by
the technology and computer tools used to implement a LIMS.
A system with Level I functions would be aimed at the Data
Domain only.
Level II Functions at this level are intermediate and generally represent
known applications of computer tools in a laboratory environ-
ment. They are generally automated functions and are usually
more complex and hence more costly to develop and imple-
ment into a LIMS, but it is only as higher level functions are
added that the system begins to address the Information
Domain.
Level III At this level, functions are advanced features that are at the
leading edge of technology, which may be high-risk and high-
cost, and are not available on most systems at present.
However, they should generate competitive advantage for the
organization if successful. Implementation of functions at this
level will fuel the development of LIMS in the future.

2.5.3 Communication with the database


The LIMS database is surrounded by the four functional areas; the major
mode of operation of each area is by way of the data repository. However,
as only Level I of each functional area is adjacent to the database, this
basic interaction should provide links to all higher levels within any area to
ensure consistency of operation. Furthermore, implementation of higher-
level functions within an area should not affect communication between it
and the remaining areas: communication is independent of the level of
implementation in each functional area. Using the model, a LIMS can be
considered as a modular system, with each functional area dependent only
on its interaction with the database to work. This means that the actual
implementation of a LIMS can be undertaken in an 'à la carte' manner to
meet the requirements of any analytical environment and organizational
structure.
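A minimal sketch in Python (the class names are invented) illustrates this
modularity: each functional area holds a reference only to the shared
repository, never to another area, so any area can be implemented, replaced
or upgraded independently.

```python
# Hypothetical illustration of the model's modularity: functional areas
# interact only through the data repository, never with one another.

class Repository:
    """Stands in for the LIMS database at the centre of the model."""
    def __init__(self):
        self.records = {}
    def put(self, sample_id, field, value):
        self.records.setdefault(sample_id, {})[field] = value
    def get(self, sample_id):
        return self.records.get(sample_id, {})

class DataCapture:
    def __init__(self, repo):
        self.repo = repo
    def enter_result(self, sample_id, value):
        self.repo.put(sample_id, "raw_result", value)

class Reporting:
    def __init__(self, repo):
        self.repo = repo
    def report(self, sample_id):
        return f"{sample_id}: {self.repo.get(sample_id)}"

repo = Repository()
DataCapture(repo).enter_result("S-001", 42.0)
print(Reporting(repo).report("S-001"))  # no direct capture-to-reporting link
```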

2.5.4 LIMS architecture


The model shows that the levels in any area are developed in a layered and
well-ordered fashion to include all of the lower-level functions as
additional layers are added to the system; new functions do not exclude the
use of basic functions. Therefore, a system with higher-level functions
should automatically incorporate those from a lower level in a specific
group, e.g. a LIMS with Level II data capture will have the facility to enter
results manually into the computer. However, in practice this may not be
the case, depending on the individual implementation of the system. Note
also that a system is tailored to an individual laboratory: one LIMS may
have powerful Level II data capture and reporting facilities with Level I
analysis and management.

2.5.5 Model organization


As the model provides a functional plan for a LIMS, it can be used
effectively to define the major functional components for any implementa-
tion by any laboratory environment. In addition, the major communication
path or interaction between the functional areas is handled via the
database, in order to allow each area to be developed and implemented in
a modular fashion. This provides minimum constraint and maximum
flexibility to expand any functional area to meet a laboratory's require-
ments.

2.6 Definition of a LIMS

Having classified functions into areas and levels, we are ready to define a
LIMS. A LIMS is a computer application that meets the requirements
outlined by the model: the Level I functions in each of the four areas, with
a database. A system that does not meet the minimum requirements of this
model may help automate the laboratory, but is not a LIMS. A LIMS,
therefore, is a computer system that can capture, analyse, report and
manage data and information by way of a database; a system that cannot
do all of these is not a LIMS.

2.6.1 Promoting interdisciplinary communication


A major problem in the implementation of any computer application,
including LIMS, has been the crossing of disciplinary boundaries. Dessy
for many years has been teaching analytical chemists the vocabulary
necessary to discuss computing with the MIS professionals [7]. This model
can be used to help attain this goal; by giving both groups the common
ideas and vision of a LIMS, it helps each group to talk with the other. This
communication will be enhanced by each discipline's own perspectives and
strengths, which will be fused in a project team with the remit to
implement a LIMS successfully. It is also important to realize when
applying this model that the users should fit the model to their
requirements rather than a vendor's idea of a laboratory or organization.
By adopting this approach users can develop a complete picture of the
requirements that will meet their needs, which can then be mapped to
prospective commercial systems.

2.6.2 Technology-independent model


Technology in this context relates to the functions provided by the LIMS
for the user rather than the details of the system implementation. By
basing the model on functional components, a LIMS can be devised with
the appropriate tools and computing platforms that are consistent with
other systems currently implemented within the laboratory.
For instance, the model ignores how data and information are stored.
Database facilities may be effective in helping several functions within the
LIMS, such as reporting and management of data. However, these
facilities may be difficult to capitalize upon. The functionality needs to be
emphasized rather than technology that may have implicit functionality,
but may be unavailable within a LIMS. Therefore, the model is directed
toward the functional design of a LIMS and less toward the means by
which the functions are achieved. However, with this approach, there is a
danger that the means to effect the model are neglected and such systems
could be very cumbersome or user-hostile in practice. It is important, when
designing a LIMS based on this model, that consideration is given at the
system specification stage to user-friendly operation of the system and all
of the software modules.

2.6.3 Future expansion of the model


As technology develops, more functions will be available for LIMS. These
additional functions can be accommodated by the incorporation of
additional Levels to each functional group once sufficient functions are
available. This provides not only a simple platform for defining a LIMS to
meet laboratory IT requirements with current technology, but also lays the
foundation with which to expand the model in a coherent and controlled
manner.

2.7 Detailed classification of LIMS functions

This section contains an assignment of LIMS functions to the four user-
related functional areas and further classification to the three capability
Levels. This is intended to be of practical use for the LIMS user, the
software developer and vendors when applying the model to real
examples. The list is not comprehensive and is intended to stimulate
discussion for further refinement of the model. A key to the model is that
the levels illustrate a migration from basic to more complex functions
within a functional area, and as the more complex operations are overlaid
on to the model they become extensions to a functional area, not new
functional areas.
Functions that are classified into data capture and analysis are generally
involved with the Data Domain whilst the higher levels in the reporting
and management areas are used to integrate the laboratory with the
Information Domain.

2.7.1 Group One: Data capture


Data and information from the samples entering the laboratory and the
associated results must be entered into the LIMS. Table 2.1 presents the
data-capture elements, which are subdivided into the three levels.

Level I. Functions in this group are rudimentary: information about the
sample and results are entered manually. To aid this process and reduce
error, verification is used to compare the format or range of the data input
with expected values held within the database.
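A minimal sketch of such verification, in Python with invented test names
and limits, might be:

```python
# Hypothetical illustration of Level I data capture: manually entered
# values are checked against format and range expectations held in the
# database before they are accepted.
EXPECTED = {
    "pH":    {"type": float, "min": 0.0, "max": 14.0},
    "assay": {"type": float, "min": 90.0, "max": 110.0},  # % of label claim
}

def verify(test, raw_value):
    spec = EXPECTED[test]
    try:
        value = spec["type"](raw_value)  # format check
    except ValueError:
        return False, f"{test}: '{raw_value}' is not a number"
    if not spec["min"] <= value <= spec["max"]:  # range check
        return False, f"{test}: {value} outside {spec['min']}-{spec['max']}"
    return True, value

print(verify("pH", "7.2"))  # accepted: (True, 7.2)
print(verify("pH", "72"))   # rejected: out of range
```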

Level II. A LIMS operates best when results are transferred electronically
and the functions at this level reflect this; however, transfer of data is only
one-way to the LIMS. The ease of interfacing analytical instruments varies
greatly, and therefore at this level instruments with low data-capture rates,
such as chromatographs, atomic absorption and emission spectrometers and
balances can be interfaced. Data can be captured on-line with well proven
peripheral equipment and software. The LIMS model, being technology-
independent, does not stipulate whether the computer running the LIMS
or an ancillary computer acquires data. Therefore, a network of processors
or a single system can still be classified as a LIMS, provided that it still
performs the model functions. For identifying and tracking samples,
barcode technology can be used.

Table 2.1 Elements of the LIMS model

Level I
Data capture: Manual entry of sample information; manual entry of results with format verification
Data analysis: Result verification; calculation of results; automated calculation of results
Reporting: Hard-coded reports; database searches using forms; comparison of results with specifications
Management: Sample log-in; acknowledgement of sample receipt; sample and status tracking; backlog reports; worksheet production; archive and retrieval of data using magnetic media; approval and validation schemes for results; stand-alone error reporting; hierarchical security

Level II
Data capture: One-way communication with instruments; use of barcode readers; data conversion from instrument to DBMS format
Data analysis: Transfer of data files to statistical or interpretative packages; data reduction programs; interpretation of results; graphical presentation of data; chemical structure drawing
Reporting: User-configurable reporting; graphical display of results; ad hoc searches and reporting; electronic distribution of reports; interfacing with desktop publishing packages
Management: Location of samples; work scheduling; workload prediction; audit trail of database; transaction logging or journalling of the database; class security

Level III
Data capture: Bidirectional communication; file transfer of reduced data reports from NMR and MS
Data analysis: Chemical substructure searches; conversion of data to information; three-dimensional graphics
Reporting: Natural-language reporting methods; expert assistance for ad hoc reports
Management: Resource management; scheduling work on-line from external systems; electronic laboratory notebooks; error correction across laboratory network; dynamic balancing of laboratory resources

Level III. Data capture at this level involves bidirectional communication


whereby a LIMS can pass a file to an intelligent instrument containing the
information required to set up and run an assay for a set of samples. After
the analytical run, a file with the results is imported into the LIMS and the
results are stored in the database.
Equipment with faster data-capture rates, such as NMR and mass
spectrometers, presents an entirely different problem. Usually these
instruments have their own specialist computers to acquire, reduce and
interpret data. The problem is: in what form should data be transferred to
a LIMS, a report file or raw data? Ideally only the report file should be
transferred to the LIMS for inclusion in the final report. If the report were
not a simple text file, then additional software is required to interpret the
data.
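The round trip can be sketched as follows (in Python; the worklist and
results file layouts are invented, since real instrument formats vary):

```python
# Hypothetical illustration of Level III data capture: the LIMS writes a
# worklist file for an intelligent instrument and later imports the results
# file produced after the analytical run.
import csv
import io

def export_worklist(samples):
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(["sample_id", "method"])
    for sample_id in samples:
        writer.writerow([sample_id, "ASSAY-01"])
    return buf.getvalue()  # sent to the instrument's data system

def import_results(results_file, database):
    for row in csv.DictReader(io.StringIO(results_file)):
        database[row["sample_id"]] = float(row["result"])

worklist = export_worklist(["S-001", "S-002"])
returned = "sample_id,result\nS-001,99.2\nS-002,98.7\n"  # from the instrument
db = {}
import_results(returned, db)
print(db)  # sample identities survive the round trip
```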

2.7.2 Group Two: Analysis of data

Level I. Functionality here is rather rudimentary, with verification of
data format, e.g. pH range, and calculation of results from data entered
either manually or on-line. Automatic calculation of results from a set of
input data is achieved with user-defined routines specific to an analytical
method.

Level II. Data-analysis functions at this Level bring more power to the
system: either files imported from instruments can be converted into
database files, or the results contained within the original file can be
abstracted by a utility program prior to insertion. Furthermore, this
process can work in the opposite direction and results files can be exported
from the LIMS for graphical or statistical interpretation.
Reduction of data is an important function at this Level of LIMS
operation: an example might be the reduction of chromatographic data to
amount or concentration of analyte. Included in this process is also the
interpretation of results, where a chromatogram is viewed and an analyst
makes a judgement on whether any further interpretation is required.
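As a simple sketch of such data reduction, the following Python fragment
converts peak areas to concentrations using a single-point response factor;
a real method would use a full calibration curve, and all figures here are
invented:

```python
# Hypothetical illustration of Level II data reduction: chromatographic
# peak areas are reduced to analyte concentrations via a response factor
# derived from a standard of known concentration.
def response_factor(standard_area, standard_conc):
    return standard_conc / standard_area

def reduce_run(sample_areas, rf):
    return {sample_id: area * rf for sample_id, area in sample_areas.items()}

rf = response_factor(standard_area=152_000, standard_conc=10.0)  # ug/ml
print(reduce_run({"S-001": 148_500, "S-002": 160_200}, rf))
```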

Level III. Functions here involve the use of more sophisticated graphical
interpretation packages, e.g. three-dimensional plotting, or the use of
chemical substructure searching and interpretative packages used for
converting data to information.

2.7.3 Group Three: Reporting

Level I. Reporting by the LIMS consists of hard-coded report formats
that only require the user to input the variables at the time the report is
run. Searching the database is achieved by using forms, and there is the
ability to compare results by using a predefined specification within the
database. These functions confer basic reporting capability on a LIMS, but
offer little flexibility; there is, for example, no ability to perform ad hoc
searches.
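A sketch of the comparison with specifications, in Python with invented
limits, might be:

```python
# Hypothetical illustration of Level I reporting: stored results are
# compared against a predefined specification to give a pass/fail line.
SPEC = {"assay": (95.0, 105.0), "water": (0.0, 0.5)}  # % limits

def specification_report(results):
    lines = []
    for test, value in results.items():
        low, high = SPEC[test]
        verdict = "PASS" if low <= value <= high else "FAIL"
        lines.append(f"{test:8s} {value:6.2f}  ({low}-{high})  {verdict}")
    return "\n".join(lines)

print(specification_report({"assay": 99.2, "water": 0.3}))
```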

Level II. At this Level there is improved reporting flexibility with the
ability to present results graphically, or to merge the LIMS output with
electronic mail facilities for faster distribution of reports or with desktop
publishing packages for professional reporting. Some laboratories may
require chemical substructure searching and drawing as part of their
operation, and an interface with such a package would be a feature at this
level. At this Level the functions that enable a LIMS to conquer the
Information Domain begin to emerge.

Level III. A LIMS with functionality at this Level has the ability for
natural-language interrogation of the database with on-line expert assistance
for ad hoc searches. The system would be interfacing with expert systems
to help the inexperienced user.
The system would be integrated into the Information Domain, and its
value to the organization would be enhanced by interaction with other
applications, e.g. production information systems or manufacturing systems
to provide the key information to Computer Integrated Manufacture.

2.7.4 Group Four: Managing data and the laboratory


In this group the lower Levels are concerned with managing data, whilst
the higher ones are involved with providing information to manage the
laboratory and also linking with the Information Domain in the organization.
This is the least developed of all the functional groups.

Level I. Functions cover the basic sample management operations such
as logging samples into the system and tracking them; status of the samples
and basic work-scheduling abilities are catered for, as is backlog reporting.
Archiving and retrieval of data using magnetic media fall into this Level.
System security is catered for by using a hierarchical approach, where all
users with a particular security level can use all the system functions at or
below that level.

Level II. Here the ability to track sample location is integrated with the
sample-management functions of Level I and the data-capture functions of
barcoding by way of the database. There is the ability to predict workloads.
In regulated industries there is the need for an audit trail and a transaction
log to ensure the integrity of the stored data. Security of the LIMS, at this
level, is addressed by a class system where the responsibilities of the user
are mirrored by access to the appropriate software modules for each
individual user. Archiving and retrieval of data would use optical storage
technology.
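Two of these Level II functions can be sketched together in Python (the
role names, actions and record layout are invented for illustration): a
class security check, in which access mirrors the user's responsibilities,
and an audit trail recording who did what, and when:

```python
# Hypothetical illustration of Level II management: class security plus
# an audit trail of changes made to the database.
from datetime import datetime

ACCESS = {"analyst": {"enter_result"},
          "supervisor": {"enter_result", "approve"}}
audit_trail = []

def perform(user, role, action, sample_id, detail):
    if action not in ACCESS[role]:  # class security check
        raise PermissionError(f"{role} may not {action}")
    audit_trail.append({"user": user, "action": action,  # audit trail entry
                        "sample": sample_id, "detail": detail,
                        "at": datetime.utcnow().isoformat()})

perform("jsmith", "analyst", "enter_result", "S-001", "assay = 99.2%")
perform("adoe", "supervisor", "approve", "S-001", "result approved")
print(len(audit_trail), "audit entries")
```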

Level III. Functions at this Level are concerned with planning of resources
and running the laboratory on a long-term basis: on-line scheduling of work
from other computer applications and systems, interaction of the LIMS
with project-management software and electronic laboratory notebooks.

2.7.5 Comments on functions in Levels


When the functions in each Level are viewed as a whole, two general
comments can be made.
1. The complexity of each functional Level increases from Level I to
Level III. The advantage is that the higher the Level, the more the
system integrates into the business environment, i.e. the Information
Domain, of the organization. The disadvantage is that the risk in
implementing higher-level functions increases correspondingly. This
risk is manifested in the technology and software available to
undertake the job, and is compounded by the fact that very few LIMS
installations have experience of undertaking any of the functions in
Level III in all four groups.
2. The overall cost of a system rises when a large number of higher-level
functions, in one or more groups, are required for an individual
LIMS.
Therefore, using these two points, the model can be used in a predictive
mode: more complex functions for a LIMS may take a long time to specify
and develop, and will usually be expensive and carry a significant risk of
failure [8]. Knowledge of this information before the project starts enables
feasibility studies to be undertaken and informed judgements to be made,
which will reduce the uncertainty and enhance the likelihood of success for
any LIMS project. It should be remembered, at this point, that building
credibility of a system in users' eyes involves providing simple services that
work well initially, with the more difficult tasks following later [9]. This
advice may not be applicable to all systems, but should certainly be kept in
mind when considering some Level III functions.

2.8 Applying the LIMS model: the selection of a commercial LIMS

During the system-development life cycle [10], the laboratory's require-
ments are usually defined within a document known as an invitation to
tender or a request for a proposal. The functional requirements for a
system that are described within this document can be abstracted and used
to construct a LIMS model for an individual laboratory by comparing them
with the functional list already published [4]. The Department of
Bioanalytical Sciences, Wellcome Research Laboratories, is used as an
example [11]; the key functional LIMS requirements were abstracted from
the invitation to tender document and are outlined in Table 2.2. These are
the highest-level requirements for each functional area and define the
model, as shown in Figure 2.2, for the laboratory.

2.8.1 Constructing a LIMS model for each supplier


The invitation to tender was submitted to potential suppliers and five
tenders were received. Each tender was evaluated and a model constructed
for each system by taking the highest function available for each user area;
the supplier models are shown in Figure 2.3.

Table 2.2 Key functional requirements to define an individual laboratory model

Functional area Level Comments

Data capture III Bidirectional communication between the LIMS and the
data-acquisition systems for immunoassay and
chromatography. This is for downloading of files
containing sample identity and the uploading of files
containing the same identities with the associated
results. This is a key requirement to ensure sample
continuity and efficient use of resource.
Data analysis II Two-dimensional graphics for the visualization of
analytical results. Utilities for the conversion of files
transmitted from the data-acquisition systems for
inclusion in the database.
Reporting II The requirement is the merging of LIMS output with a
document management system for electronic
distribution of reports.
Management III The requirement for software to manage protocol
required for bioanalytical studies means that Level III
facilities are required. This facility was not included in
the original definition of the LIMS model, but
experience has indicated that it is prudent to do so now.
Figure 2.2 The LIMS model based on the requirements for the Department of Bioanalytical
Sciences, Wellcome Research. From reference 11.

Figure 2.3 The LIMS model from (a) vendor A; (b) vendor B; (c) vendor C; (d) vendor D;
and (e) vendor E. From reference 11.

2.8.2 Evaluating potential suppliers


The model provides an easy means to evaluate alternative suppliers;
however, it should be realized from the outset that it is not the only method
that could be used to select a supplier. Looking at each supplier model,
there are often close similarities between the laboratory and some
commercial systems. Interestingly, there are some systems that have very
little in common with the laboratory. Therefore, it is important that the
laboratory knows its minimum requirements before approaching potential
vendors, in case compromises need to be made about specific functions.
Discussing each functional area in turn:
• Data capture: Three suppliers were able to provide a full solution on
paper to the laboratory requirements. However, one vendor did not
have sufficient communications capability for serious consideration as
a contender within this category. Another could meet the laboratory's
requirements, but made it clear that the software did not exist and
would be written if an order were placed.
• Data analysis: The majority of suppliers could meet the laboratory
requirements for integrating files from chromatographic and immuno-
assay data-acquisition systems, but two did not have on-screen
graphics capability to present the results.
• Reporting: All vendors could meet the requirements for reporting.
• Management: Only two suppliers could provide study protocol
management facilities; the others had plans to develop them but with
varying timescales.
Taken overall, the model showed that only two of the five systems were
close to the requirements of the laboratory and hence should be evaluated
further. One system could meet the laboratory requirements through
additional programming, and whether this or any of the other systems
should be evaluated was left to the judgement of the project team.
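Once each tender has been reduced to its highest Level per functional
area, the comparison itself is mechanical; a sketch in Python, with
invented vendor data, might be:

```python
# Hypothetical illustration: comparing each supplier's highest Level per
# functional area against the laboratory's requirement.
REQUIRED = {"capture": 3, "analysis": 2, "reporting": 2, "management": 3}

VENDORS = {
    "A": {"capture": 3, "analysis": 2, "reporting": 2, "management": 3},
    "B": {"capture": 3, "analysis": 1, "reporting": 2, "management": 2},
    "C": {"capture": 2, "analysis": 2, "reporting": 2, "management": 1},
}

def shortfalls(offered):
    return {area: REQUIRED[area] - level
            for area, level in offered.items()
            if level < REQUIRED[area]}

for name, offered in VENDORS.items():
    gaps = shortfalls(offered)
    print(name, "meets requirements" if not gaps else f"gaps: {gaps}")
```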

2.8.3 The role of the LIMS model in evaluation


The advantage of the model is that the laboratory's requirements can be
visualized and compared with the systems on offer from suppliers. The
visualization and presentation by the model is also useful to help senior
management understand why a particular system has been chosen.
However, it would be crass stupidity to use the model as the only means of
selecting a system. The model is only a static means of evaluation, which
must be supplemented by visiting the vendor and user sites and by hands-
on experience. These latter approaches actually test the veracity of the
individual tenders and, therefore, the model is the first stage of a relatively
long selection process. However, the model can be updated in light of
these further evaluations, and at the end of the selection process, including
in-house testing and even interfacing to analytical equipment, it can
accurately reflect the system on offer from an individual company.
Two disadvantages of the model are that some factors such as support
are not classified, and must be evaluated separately, and that it caters only
for the highest function in any particular area. Therefore a checklist [12]
may also be appropriate to reflect the full functionality of a system. As the
model is independent of hardware and software [4], it is also the
responsibility of the project team to ensure that the system conforms to
corporate information technology standards.

2.9 LIMS standards

The American Society for Testing and Materials (ASTM) has a LIMS
committee, E31.40, that has been developing a LIMS Guide. The Guide
was published in 1994 and covers standard definitions, the definition and
scope of a LIMS and the system-development life cycle [13].
When the LIMS model was used in practice, it was static and did not
cover such areas as support [11]. One element of the LIMS Guide is the
further development of the model to address such issues.
The first area was the introduction of global issues that impact the whole
of the LIMS model. Some of the global issues are:
• Change control or configuration management: This covers many
aspects of a LIMS such as the hardware configuration, software
version and revision change control, and the storage and approval of
results within the database. Formal change control is essential for data
integrity.
• Communication infrastructure: The means of transferring data to and
from instruments in the laboratory and to and from the LIMS and the
organization.
• Documentation and training: To operate and maintain the system.
• Security: Both the physical security of the equipment and the logical
security built into the operating and application software.
• Validation: Any LIMS operating under a quality scheme such as
Good Laboratory Practice (GLP), Good Manufacturing Practice
(GMP) or International Standards Organization (ISO) should be
validated to demonstrate and document that the tested functions work
as designed.
• Performance: The responsiveness of the whole system when used in
practice.
The LIMS model was enhanced, as the authors intended [4], by the
addition of another functional segment of system management. This is
aimed at the functions of monitoring and maintaining the LIMS. Table 2.3
summarizes the functions in this new group and Figure 2.4 shows the
updated LIMS model.

Table 2.3 Additional functions for system management in the ASTM LIMS concept model

Level I
Backup and recovery
Level II
Archiving
Manual performance tuning
System fault tolerance
Level III
Dynamic performance tuning
Advanced fault tolerance
Redundant systems
Advanced communications links to external systems

Figure 2.4 Updated LIMS model. Copyright ASTM, Philadelphia, USA.

2.9.1 System management

Level I. The functions here concern backup and recovery of the existing
data and software on the system.

Level II. This covers such items as archiving of data, which many systems
are good at. However, most commercial systems are very poor at retrieving
archived data back into the database. Manual performance tuning and system fault
tolerance are the other two functions at this level.

Level III. At this level of the model there are dynamic performance
tuning and advanced system fault tolerance. These are the tools to improve
the performance of a LIMS in response to an increasing user load and the
use of hardware solutions such as clustering or CPU redundancy to keep a
system operational after hardware problems have emerged. There are also
advanced links to external systems and applications where a user can cross
seamlessly from one application to another.

2.10 Summary

All major functions of a LIMS can be classified into one system- and four
user-related areas and conceptualized in the LIMS model. The system
function is the data repository or database. The four user areas are data
capture, data analysis, reporting and management. The LIMS model
provides the means for introducing the concepts of LIMS, promoting a
common vocabulary for multidisciplinary communication and visualizing
LIMS requirements, and as an aid in the selection and acquisition of a
system. The original model has been enhanced in the ASTM LIMS
standard and improved through incorporation of system management and
global issues.

References

1. Gibbon, G.A. (1984) Trends in laboratory information management systems. Trends in
Analytical Chemistry, 3, 36-38.
2. McDowall, R.D. (1988) An introduction to laboratory information management systems.
In Laboratory Information Management Systems: Concepts, Integration and Implementa-
tion (ed. R.D. McDowall). Sigma Press, Wilmslow, 1-15.
3. Golden, J.H. (1988) Economics of laboratory information management system.
Intelligent Instruments and Computers, July-August, 197-200.
4. McDowall, R.D. and Mattes, D.C. (1990) Architecture for a comprehensive laboratory
information management system. Analytical Chemistry, 62, 1069A-1074A.
5. Mattes, D.C. and McDowall, R.D. (1990) A universal LIMS architecture. In Scientific
Computing and Automation (Europe) 1990, Data Handling in Science and Technology,
vol. 6 (ed. E.J. Karajalainen). Elsevier, Amsterdam, 301-305.
6. Zachman, J.A. (1987) A framework for information systems architecture. IBM Systems
Journal, 26, 276-292.
7. Dessy, R.E. (1987) The Electronic Laboratory. American Chemical Society, Washington,
DC.
8. McDowall, R.D. (1993) The evaluation and management of risk during a laboratory
information management system or laboratory automation project. Chemometrics and
Intelligent Laboratory Systems: Laboratory Information Management, 21, 1-19.
9. Dessy, R.E. (1983) Laboratory information management systems, part I. Analytical
Chemistry, 55, 70A-80A.
10. McDowall, R.D. (1991) The systems development life cycle. Chemometrics and
Intelligent Laboratory Systems: Laboratory Information Management, 13, 121-133.
11. McDowall, R.D. (1992) The use of a laboratory information management system model
to visualise user requirements and aid system selection. Chemometrics and Intelligent
Laboratory Systems: Laboratory Information Management, 17, 181-185.
12. Mattes, D.C. (1988) How to evaluate a LIMS. In Laboratory Information Management
Systems, Concepts, Integration and Implementation (ed. R.D. McDowall). Sigma Press,
Wilmslow, 311-319.
13. ASTM E 1578-93 (1994) Standard guide for laboratory information management
systems. In Annual Book of ASTM Standards, 14.01. American Society for Testing and
Materials, Philadelphia, PA, USA.
3 LIMS in a forensic laboratory
T. DE SILVA

3.1 Introduction

Forensic science laboratories, particularly those involved with horse
racing, are operating within the commercial world. They are expected to
provide more answers from an increasing number of samples, and are
expected to do this with less staff, in a much shortened period, and at lower
unit costs. All of this has to be achieved within the legal environment in
which we at the Horseracing Forensic Laboratory (HFL) and other
forensic laboratories operate. There is a need to be able to prove that all
work is performed to an internationally accepted level of quality.
In the modern laboratory, automation of laboratory procedures can
go a long way towards improving sample handling productivity, but this
often produces an additional amount of analytical data that can become
counterproductive. Indeed, we found that we were becoming increasingly
buried under a paper mountain. All too often the information we were
interested in was lost in the wealth of data produced. The setting up of a
Laboratory Information Management System (LIMS) has helped to
alleviate the problems associated with the resulting explosion of data and
information.
The interaction of the LIMS with forensic constraints is very
important. Two areas to consider are the chain of evidence, and the security of
data with its transfer from the instrument to a sample file. The correct
interpretation of those data and generation of final reports are crucial for
the successful operation of a properly accredited analytical system that can
withstand scrutiny by lay persons in a legal environment.

3.1.1 The role of a forensic laboratory


Mr Neville Dunnett, the Director of the Laboratory, outlines the role of
the Horseracing Forensic Laboratory below. His words apply equally well
to any forensic science laboratory.
The prime function of any racing or forensic laboratory is to provide an effective
analytical drug detection service as efficiently as possible, and at the lowest cost.
This statement contains two very closely related but opposing factors. In
general, effectiveness can be considered in direct proportion to the investment
level, and thus to cost. It is the third factor in the statement, namely efficiency,
which is so crucial in achieving the desired goal.
The total workload of any laboratory is the challenge against which it must pit
its resources in order to succeed. A racing laboratory faces not only inflationary
pressures, but also other factors specific to the current racing scene.
These factors are as follows:
(1) More samples submitted for analysis
(2) A wider range of drugs being tested for
(3) Ever increasing need for tighter forensic audit
(4) Resource implication for QC/QA procedures
(5) Compliance with Accreditation schemes
The first two factors mentioned need to be investigated in more detail.

3.1.1.1 More samples submitted for analysis, to screen for a wider range of
drugs. Forensic laboratories are today faced with an ever increasing
workload, and are screening for a wider range of drugs. The combination
of these two factors has therefore caused an increase in the numbers of
analyses performed. As recently as ten years ago, the analytical techniques
available to the laboratories were technically simpler and less abundant.
For example, each biological sample would have been subjected to no
more than five to eight tests in total. These would have probably involved
liquid-liquid extraction followed by UV spectroscopy and crystal micro-
scopy testing. This testing did not, in general, involve the ubiquitous
computer and so the data produced, although quite possibly incomplete,
were very concise. It was all very labour-intensive, requiring a high level of
personal skill, but ultimately only capable of detecting a relatively small
range of substances.
In contrast, the present-day laboratory will do between ten and fifteen
tests per sample involving a battery of analytical techniques. These may
include solid-phase extraction (automated), followed by chromatographic
procedures, such as gas and liquid chromatography, in addition to
extensive immunologically based analyses, including RIA and ELISA. The
use of GC-MS has also developed into a fundamental method for both
confirmatory work and a wide range of screening analyses. The use of
autosamplers, working day and night, has become essential, as has the use
of robotic sample dispensers that efficiently process large numbers of
samples through a range of different immunoassays. Newer techniques are
also being introduced or are under investigation. Because of these
developments, the number of substances tested for, and therefore
requiring some interpretational action, has increased dramatically in recent
years.
As an example of this, we can look at the increase in HFL's workload.
The number of samples received for analysis at HFL has increased steadily
at 5% per annum since it started in 1963. This increase, however,
represents only part of the workload increase. While it is extremely
difficult to quantify the change in numbers of detectable substances,
examining the increase in number of tests applied per sample reveals a
more significant trend (Figure 3.1).

3.1.2 Automation
Clearly, the application of automated procedures is essential to contain the
enormous increase in sample test numbers. With the development of the
microprocessor, and later the personal computer, control of 'robotised'
instruments has become a reality.
Autosamplers for analytical instruments such as gas and liquid chromato-
graphs were among the first developments; they seemed to appear in the
early 1980s and have significantly improved to give the models that are
available today. Automation for other techniques soon followed; automatic
spotters for TLC plates were produced but received mixed reviews, with
enthusiastic acceptance by some and rejection by other analysts. Signifi-
cantly, the use of automation for sample extractions, arguably the most
labour intensive area, did not become a reality until the introduction of the
modern solid-phase extraction cartridge. Despite their ability to reproduce
the same task reliably, early sample-extraction robots were very demanding
upon laboratory space, very expensive to install and slow in performance.
Versions are now available which are extremely compact, inexpensive and
moderately faster. Their strength lies in the fact that, unlike the human
equivalent, they will work reproducibly around the clock. Immunoassay
equipment manufacturers were quick to adopt the automatic approach,
initially with radioactivity counters and later with robotic sample processors.
The sample dispensers in use today are complete with multiple robotic
arms and on-board barcode readers.

Figure 3.1 Increase in HFL workload, 1967 to 1991 (numbers of samples received and of
tests performed).

It is worth noting the effect of market forces on the
commercial development of fully automated analytical systems. For
example, in the human clinical analysis world the markets are very large. It
is possible to purchase a single instrument that will automatically perform a
complete range of analyses on several biological samples and produce a full
listing of the results. No such product is yet available for the forensic
analyst.

3.1.3 Analytical data


Clearly, while the computer has provided the key to the development of
effective automation, it has also brought with it a set of problems to
accompany the benefits when applied to the data-production aspect of
analysis. Early methods of drug detection in biological fluids did not
involve the ubiquitous computer, and as a result the data produced were
very concise, although most probably also incomplete. As analytical
techniques have become more sophisticated and wide-ranging, more
analytical data are generated, and interpretation of the data has become
increasingly complicated.
The computerised data systems in use today produce a mountain of data,
usually in printed form, but this is not always synonymous with a great deal
of useful information. This is because the worthwhile information that is
produced can be buried amongst the irrelevant, especially in the case of
mass-spectral library-search printouts. The interpretation of such data has
proved difficult to automate, which leads to a real danger of missing the
positive result buried by the insignificant. This is most likely to occur at
times of greatest work pressure.
In a forensic environment, this problem is compounded by the need to
demonstrate legalistic 'chain of custody' audits continually. When this fact is
added to the sample information explosion, and the need to enforce quality
procedures for accreditation purposes, the general situation acquires
nightmarish proportions. It was against this background that HFL took the
decision in 1987 to install a Laboratory Information Management System
(LIMS).

3.2 Objectives of a LIMS

The main objectives behind the decision to purchase a LIMS were:


(a) The improvement in the integrity of data transfer at all stages of the
Laboratory's work.
(b) A reduction of the paper mountain from analytical instruments.
(c) Improved efficiency in all areas.
(d) Better management of laboratory operations.
To achieve these objectives, it was considered necessary for the system to
provide the following benefits.

3.2.1 Efficient interactive sample log-in


As a racing forensic laboratory, we are in the enviable position of being
able to predict our workloads. Details of all UK Jockey Club races are
published in advance, and several of our overseas clients have adopted
policies of notifying us of sample dispatch. In order to make use of this
information, we needed a racing calendar or diary on the system into which
all these data could be entered. In addition, we wanted that diary to
interact with our sample log-in procedures so that it was updated as
samples were received at the laboratory. This would mean that we could
run reports to show us our approximate workloads for the next day, week
or month. The interaction with sample log-in would allow us to check at
the end of the working day that all expected samples had been received.

3.2.2 Sample tracking


Once the samples had been received in the laboratory, we wanted to be
able to track them. For instance, was a particular sample in the GC-MS
laboratory? Was it in the extractions laboratory? Or was it lying in a
broken vial on the floor? Also, we needed to eliminate that black hole in
which samples occasionally became trapped.

3.2.3 Utilisation of barcodes


It seems hard to believe now, but only four years ago there were no
barcoded labels in use at HFL. We had great hopes for the utilisation of
barcodes to eliminate transcription errors. Anyone who has visited HFL
will testify that we have succeeded in their use.

3.2.4 Automatic analytical data transfer and result entry


This was an area where we had strong beliefs, and one that seemed to scare
away most of the LIMS vendors. It seemed a ridiculous scenario that the
data being produced by our laboratory instrumentation would be manually
entered into the LIMS by our users. We specified a system where all
results, such as text reports and chromatographic printouts, in fact any
result that one would normally print out, would automatically be uploaded
to the LIMS for review at any terminal or networked computer.

3.2.5 Less clerical work for scientists


One of our primary aims was to reduce the amount of clerical duties being
carried out by our scientific staff. Time and motion studies showed that, on
average, scientific staff were spending 20% of their time on clerical work.
All instruments were producing some form of hard copy, and all these
printouts had to be collated into the correct files. These files were then
second and third checked by different members of staff to ensure that each
file contained the correct printouts and each printout the correct result.

3.2.6 Application of GLP principles


As if the specification and installation of a LIMS were not enough, we were
also embarking on the long road to NAMAS and GLP compliance. This
was to be an important consideration when we came to consider such
factors as the use of standard reagents, the preparation and expiry of
standards and the control of user authority for result entry and review.

3.2.7 Laboratory performance statistics


As already stated, the forensic laboratory is operating in a commercial
environment, and is governed very heavily by its performance statistics.
We have a rigid timeframe to adhere to when analysing UK Jockey Club
samples, and it seemed reasonable to expect the installed system to
monitor and record our sample turnaround times.

3.2.8 No typing delays


Finally, we needed the system to directly produce the varied Certificates of
Analysis as the cases were completed. With our manual system, we were
dependent on the laboratory's secretaries to type the reports. This
invariably led to an extra day's delay, which was unacceptable for a
laboratory judged so heavily on its performance statistics.

3.3 The system

Although we had a good idea of what we wanted from our system, we were
largely ignorant of what was actually feasible or currently available from
LIMS vendors. To this end, we appointed a computer-systems expert as a
consultant, to develop a specification, compile an invitation for companies
to tender for supply, and evaluate the systems on
offer. This appointment proved to be a prudent decision and the consultant
was involved right through from feasibility study to acceptance of the
system.
A short list of options for finding a suitable system was considered.

(a) Develop an in-house system. This option was a non-starter. In
1987, when these decisions were being made, we did not have any staff
with the necessary computing knowledge to design such a complex system.

(b) Commission a turnkey system. While an attractive idea, we
considered that this option would be prohibitively expensive, and that such
a system could be difficult to maintain and develop further.

(c) Buy a standard package.

(d) Employ external software experts. We chose a combination of
options (c) and (d) which would result in the tailoring of an off-the-shelf
system. We needed access to expert technical knowledge: knowledge
that included not only computer and software expertise but, equally
importantly, laboratory expertise.
It very quickly became apparent during our early discussions that
virtually any forensic laboratory must be considered to operate under five
very general function headings.
The HFL LIMS was considered under these function headings:
(a) Sample receipt
(b) Sample analysis
(c) Analytical result entry/review
(d) Result reporting
(e) General management
Although generally applicable, the above headings will undoubtedly
differ in detail for each type of laboratory. Despite this, there is a
surprisingly high level of regularity of operation, though the language may
vary.
Many systems are available that provide the functions listed, but we
genuinely believed that the greatest efficiency improvements would accrue
from direct interaction between the LIMS and a variety of laboratory
instrumentation. In 1987, only one company was prepared to attempt to
comply with our request for direct linkage to our many instruments. This
fact, plus the flexibility of a software design allowing the experienced user
to tailor the system to the ever-changing needs of a dynamic racing forensic
laboratory, were two of the prime reasons for our ultimate choice.
Before taking a close look at the functionality of the system, there are
four major components of a LIMS that should be considered at the outset.
Hardware and software are two obvious members, with the cost of the
former continuing to fall dramatically for any given level of computing
power. The price of software remains on the high side, although this must
be weighed against the sophistication that continues to increase consistently.
A third factor, training, is often overlooked or becomes a non-budgeted
item. The speed with which operational benefits can be achieved is
dependent upon the effort put into the training of users and managers.
There will always be a learning curve once a new system becomes
operational, but this can be reduced to a minimum with a thoroughly
prepared training program. Once the laboratory is locked into a computer-
controlled environment, the need to keep the system operating effectively
becomes essential. Not only is it prudent to invest in both hardware and
software maintenance support but, more importantly, a competent system
manager must be provided. System housekeeping, database tuning and
refinement add up to a full-time job and are frequently underestimated.
The design, development, installation and system validation elements
generate a complete project that is extremely time-consuming. For the
HFL system, this took three years from inception to going live. This was
longer than anyone envisaged at the outset. There were problems along
the way, as there will be with any new implementation, but due to the
experience gained by both ourselves and the vendors, I am confident that
this timescale would be shortened if we were starting today.
Looking at the current hardware configuration shows how the system
has grown since it went live in 1990 and how dependent the laboratory has
become on it. The dramatic increase in networked personal computers and
on-line disk capacity reflects the growth of the laboratory's PC network
(Table 3.1).
The two major software components of the system are the VG
Laboratory Systems Sample Manager and the VG Data Systems Multichrom
chromatography software. Ethernet communications and terminal emulation
for the PCs are achieved by a combination of DECnet and TCP/IP
protocols, with Digital Pathworks and Reflections 4+ software.

Table 3.1 Hardware configuration of LIMS at HFL

                      1991                      1994

Processing power      1 x MicroVAX II           1 x MicroVAX II
                                                4 x MicroVAX 3100
                                                1 x VAX 4500
Terminals             9 x colour                6 x colour
                      3 x monochrome            4 x monochrome
Networked PCs         5                         30
On-line storage       300 Mb magnetic disk      7 Gb magnetic disk
                                                10 Gb optical
Archival              3 x magnetic tape         1 x optical disk
                                                1 x magnetic disk
Printers              9 x dot-matrix            3 x dot-matrix
                      2 x laser                 5 x laser
                      1 x dedicated barcode     3 x dedicated barcode

3.3.1 Sample receipt


Using a LIMS, sample receipt can be, and has become, much more than simply
entering information about samples into an electronic register.
Sample receipt is a laboratory operation often perceived as mundane,
and is certainly the least popular area of work. In any forensic
environment, however, failure to achieve a satisfactory procedure can
invalidate all of the sophisticated science that follows receipt. Consequently
it is important to ensure that proper audit controls for reception staff and
sample identification are set up. Identifying samples by the use of preprinted,
HFL-supplied barcoded sample labels for each client has completely removed
the problems of transcription errors with customer identity and
sample codes on receipt.
The sample labels are supplied to clients as part of a self-contained
'sample collection kit'. Kit types vary, but they invariably contain the
sample labels, in addition to sample bottles and 'tamper-proof' polythene
bags. Like the labels, these bags are uniquely barcoded, and their identity
is also recorded on to the LIMS at sample receipt to enhance the sample
integrity. The production and management of these kits are currently
independent of the LIMS, but it is our intention to produce a daily log file
containing details of barcoded numbers entered. This file will be processed
by the kit production system, thereby allowing us to keep track of all kits
despatched.
The following are some features provided by sample receipt.

3.3.1.1 Workload prediction. By using the system diary, it is possible to
predict daily, weekly or monthly workloads. This is valuable for calculating
required staff levels and stock levels of consumables, as well as planning
dates for such tasks as instrument maintenance.

3.3.1.2 The audit of log-in personnel. In addition to the standard audit-trail
facilities provided with the system, enhancements have been made in
sample receipt. Before any samples can be logged into the system on a
given day, the 'sample receipt team' must be defined to the system, with
each of the given tasks assigned to a single name or group of names. This
information is recorded into the system diary and archived with each day's
samples, allowing us to show both legalistic 'chain of evidence' audits and
quality procedures for accreditation purposes.

3.3.1.3 Sample registration with automatic identification allocation. All
cases and subsequent samples that are entered are automatically assigned a
unique numeric and textual identification. By making use of the
information on the system diary, these numbers, along with a selection of
the log-in information, are created the night before receipt to speed up the
log-in process.

3.3.1.4 Automatic barcoded sample labels. Additional case and sample
identification barcoded labels are printed at the log-in workstation as each
sample is registered. The labels are used for sample information in the first
stages of analysis and also for sample storage management.

3.3.1.5 Test scheduling (client-, sample- and sex-dependent). It is of
particular benefit for HFL for the computer to assign the correct test
schedule for each sample automatically, as the manual system was at its
limit of flexibility for test assignment. Because the analyses applied to each
sample are dependent upon the type of sample (blood or urine mainly), the
sex of the animal from which the sample is collected (three sexes), and the
rules and specific requests from each client, the permutations are large.
Humans have difficulty in remembering a large number of options and
consistently assigning the correct version. This is a strength of computerised
systems, but we have retained the ability to modify any selection manually
when required.
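
To make the scheduling logic concrete, the following Python sketch shows
how such a rule table might be expressed. The client names, sample types
and tests are invented for illustration and are not the actual HFL
schedules.

SCHEDULE_RULES = {
    # (client, sample type, sex) -> list of tests to assign
    ('JOCKEY_CLUB', 'urine', 'gelding'): ['pH', 'ELISA screen', 'GC-MS screen'],
    ('JOCKEY_CLUB', 'blood', 'mare'):    ['pH', 'RIA screen'],
    ('OVERSEAS_A',  'urine', 'colt'):    ['pH', 'ELISA screen'],
}

DEFAULT_TESTS = ['pH']   # fallback when no specific rule exists

def assign_tests(client, sample_type, sex):
    # Look up the schedule; the laboratory retains the ability to modify
    # the selection manually after the automatic assignment.
    return SCHEDULE_RULES.get((client, sample_type, sex), DEFAULT_TESTS)

print(assign_tests('JOCKEY_CLUB', 'urine', 'gelding'))
# ['pH', 'ELISA screen', 'GC-MS screen']

A table-driven lookup of this kind is what makes the large number of
permutations manageable where human memory is not.
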
3.3.1.6 Deficiency report and sample receipt generation. A detailed
receipt is generated after log-in for dispatch to the client. Any deficiencies
recorded at log-in are automatically appended to the receipt.

3.3.1.7 Sample volume tracking. The volume of sample required for the
analyses is also stored on the system. This allows the system to decrement
the sample volume automatically after the initial volume has been entered
at log-in.
3.3.1.8 Sample storage and location. Control of sample storage was a
major problem without LIMS. We receive about 30 000 samples per year,
nearly all of which are stored in freezers, and most produce negative
results. As we have limited freezer space, it is imperative to dispose of
negative samples as soon as possible after reporting, but it would be a
catastrophe to dispose of samples producing positive results before all legal
options have been completed. Consequently, the LIMS controls all storage
management. All freezers are barcoded, as is each of the many storage
trays within each freezer. Using these barcodes and those on the samples,
all samples are logged into the freezer using a portable computer with a
built-in barcode reader. The completed data are then uploaded into the
LIMS. Each day on request, the LIMS will produce a list of samples ready
for disposal in both barcoded and human-readable form. Samples for
disposal are removed from their location and, using the portable computer
and barcoded labels, their correct identity and numbers of samples
available for disposal are verified, before disposal takes place.
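
The disposal-list logic described above reduces to a simple filter, as in
the following minimal Python sketch; the thirty-day holding period and the
record layout are assumptions for illustration, not the Laboratory's
actual rules.

from dataclasses import dataclass
from datetime import date

@dataclass
class StoredSample:
    barcode: str
    freezer: str
    tray: str
    result: str                 # 'negative' or 'positive'
    reported_on: date = None    # None until the case has been reported

def disposal_list(samples, today, hold_days=30):
    # Only negative, reported samples past the holding period are listed;
    # positive samples are never listed while legal options remain open.
    due = [(s.freezer, s.tray, s.barcode) for s in samples
           if s.result == 'negative'
           and s.reported_on is not None
           and (today - s.reported_on).days >= hold_days]
    return sorted(due)          # grouped by freezer and tray for collection

stock = [StoredSample('S-0001', 'F1', 'T03', 'negative', date(1994, 1, 10)),
         StoredSample('S-0002', 'F1', 'T03', 'positive', date(1994, 1, 12))]
print(disposal_list(stock, date(1994, 3, 1)))   # only S-0001 is listed
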

3.3.2 Sample analysis


Direct interaction with our instruments was a fundamental requirement for
LIMS, and proved to be one of the most difficult objectives to achieve. The
transfer of sample worksheets from LIMS to the instruments was relatively
easy, but the upload of raw and processed analytical data for review at any
LIMS terminal or networked PC took considerable time to set up.
Use of micro barcoded labels (270 per A4 sheet) for items of glassware
and autosampler tubes has completely removed the need to write case
details and has improved accuracy of sample analysis result assignment.
Transcription errors do not occur, because information is automatically
transferred.
The main aspects of the sample analysis function are shown below.

3.3.2.1 Recording and monitoring of sample pH values. A pH test is the
first analysis performed on the newly received samples. The pH meters are
directly interfaced to the LIMS, barcoded labels are checked to ensure that
the sample actually requires a pH test, and both the initial and adjusted
values are recorded. Later in the working day, a background report checks
that all the adjusted pH values are within an acceptable range before the
test results are authorised. No manual interaction is necessary.
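
The background check might look like the sketch below; the acceptable
range shown is invented, since the actual limits are not stated here.

def check_adjusted_ph(readings, low=4.0, high=9.0):
    # readings maps sample barcode -> adjusted pH value.
    # An empty result means every value is in range and the pH tests
    # for the batch can be authorised without manual interaction.
    return {sample: ph for sample, ph in readings.items()
            if not low <= ph <= high}

failures = check_adjusted_ph({'S-0417': 6.8, 'S-0418': 11.2})
print(failures)   # {'S-0418': 11.2}
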

3.3.2.2 Production of micro barcoded sample labels. The plethora of
glassware required for sample analysis, including the vials for the
instrument autosamplers, is labelled in advance. Again, the system diary
is utilised to ensure that enough glassware is prepared.

3.3.2.3 Worksheet generation with automatic downloading to instruments.
The use of predefined templates showing the position of blank and control
samples or standards allows us to generate worksheets containing details of
samples for analysis. These worksheets are created whatever analysis is
being performed. The operator simply enters the name of the template, the
analysis and the instrument name, before entering any sample numbers. If
the instrument in question does not have an on-board barcode reader, the
worksheet is built by the operator. Whatever the route, barcoded labels
are always used to reduce transcription errors. On completion, the
worksheet is downloaded to the appropriate instrument. This involves the
system converting the worksheet into a format that is recognised by the
instrument. No sample tables are entered at the instrument, and the
operators only have to start the run.
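
As a sketch of the template mechanism, the Python fragment below fills
client samples into the non-control positions of a template and renders a
simple download file. The template layout and the output format are
hypothetical; real instrument download formats are vendor-specific.

TEMPLATE = ['BLANK', 'QC', 'S', 'S', 'S', 'QC']   # 'S' marks a client-sample slot

def build_worksheet(template, sample_ids):
    # Fill each 'S' slot with the next sample; other slots keep their role.
    ids = iter(sample_ids)
    return [(pos, next(ids) if slot == 'S' else slot)
            for pos, slot in enumerate(template, start=1)]

def to_instrument_file(rows, analysis, instrument):
    # One header line, then 'position,identifier' records.
    lines = ['#%s,%s' % (instrument, analysis)]
    lines += ['%d,%s' % (pos, ident) for pos, ident in rows]
    return '\n'.join(lines)

rows = build_worksheet(TEMPLATE, ['94/0123', '94/0124', '94/0125'])
print(to_instrument_file(rows, 'GC SCREEN', 'GC01'))
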

3.3.2.4 Uploading of chromatograms and results into LIMS. Gas
chromatograms, liquid chromatograms and associated UV spectra can all
be viewed at any of the LIMS terminals with graphics capability. This has
greatly reduced the quantity of printed results produced in the Laboratory.
Our aim is to remove the need completely, but we have not yet reached
this level of confidence.

3.3.2.5 Transfer of mass spectra and chromatograms into LIMS. The
interfacing of the Laboratory's mass spectrometers, and in particular our
older instruments, has given us significant problems. In the main, this has
been due to the complexity of the three-dimensional data and their non-
standardisation. However, recent steps taken by the instrument vendors to
produce a common data format and the advent of Microsoft Windows
have allowed us to make progress. All but the newest of the mass
spectrometers are now interfaced, and we believe that in the near future it
will be possible to upload result files containing mass chromatograms and
mass spectra from each of our nine mass spectrometers for direct review
and archiving in the LIMS database.

3.3.2.6 Transfer of immunoassay results into LIMS. Immunoassay is an
area where we have had to make numerous changes since the system went
live. While the method of generating worksheets used in other areas is
possible, the increasing battery of immunoassays available in this dynamic
section makes it impractical.
All samples that are received into the immunoassay section are placed
directly on the robotic sample dispenser, and at this stage the staff in this
section do not know which sample requires which assay. The sample
dispenser reads the barcodes of samples present and uploads this
information to the LIMS. The result is that a file instructing the dispenser
which sample requires which assay is downloaded. Consequently, the
dispenser will only remove liquid from those sample tubes known to
require the assay being carried out at any given time.
Using worksheet templates as before, the LIMS also generates the
necessary analysis worksheets that are downloaded direct to the 'plate-
reading' instruments for the analysis.
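
The exchange can be pictured with the following sketch: the dispenser
uploads the barcodes it holds, and the LIMS answers with the subset
requiring the assay in progress. The schedule mapping is hypothetical.

ASSAY_SCHEDULE = {
    'S-1001': {'ELISA A', 'RIA B'},
    'S-1002': {'ELISA A'},
    'S-1003': {'RIA B'},
}

def tubes_for_assay(uploaded_barcodes, assay):
    # Return only those tubes known to require the assay being carried
    # out, so the dispenser leaves every other sample untouched.
    return [bc for bc in uploaded_barcodes
            if assay in ASSAY_SCHEDULE.get(bc, set())]

print(tubes_for_assay(['S-1001', 'S-1002', 'S-1003'], 'RIA B'))
# ['S-1001', 'S-1003']
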

3.3.3 Analytical result review


Because of the quantity and complexity of the analytical data produced,
coupled with the forensic and quality-control aspects, every analytical
result is subject to a minimum of two independent assessments.
Before results relating to client samples can be entered, the data from
the quality control and quality assurance samples are reviewed. All results
for these QC and QA control samples must be typical in order for the
remaining data to be acceptable, and all results and data pertaining to
these samples are later passed on for scrutiny by the QC/QA section. In
addition, when completed client samples are archived, details of control
samples from appropriate worksheets are included.
For each analysis, a first result is entered by one analyst and then a
second by a more experienced analyst. If both agree that an analytical result is
negative, the second action automatically authorises that test as negative.
Any non-negative result requires further review by a more senior person,
before the result becomes authorised for further action.
Where all the results entered for a sample's tests have been negative, the
case is automatically made available for reporting. Those cases where any
non-negative result has been entered require a fourth review. This
interactive 'level 4 review' allows a single experienced member of staff to
review both users' interpretive results and uploaded chromatographic data
from a single screen before deciding if the case is negative or requires
additional work.
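
The decision flow can be summarised in a few lines of Python; the state
names are illustrative, and the sketch ignores the audit trail that
accompanies every action in the real system.

def review_test(first, second):
    # Two independent negative assessments authorise the test
    # automatically; anything else escalates to a more senior review.
    if first == 'negative' and second == 'negative':
        return 'authorised negative'
    return 'senior review required'

def case_disposition(test_states):
    # A case reports automatically only when every test is negative;
    # otherwise it goes to the interactive level 4 review.
    if all(state == 'authorised negative' for state in test_states):
        return 'available for reporting'
    return 'level 4 review'

print(case_disposition([review_test('negative', 'negative'),
                        review_test('negative', 'non-negative')]))
# level 4 review
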
The rationale behind the method is best explained by the Laboratory's
Director, Mr Neville Dunnett.
Our basic philosophy considers it is better to miss a positive result than
incorrectly identify a negative sample as positive. Consequently, the system is
designed to be 'fail safe'. No positive results can be reported until the whole case
has been reviewed up to and including Director level. Thus in any positive case,
there will have been five levels of independent review, all suitably audited by the
LIMS.

3.3.4 Reporting
Case reporting is perhaps the area where the users have observed the most
benefits from the LIMS. Before use of the LIMS, all reports leaving the
laboratory were sent to the secretarial staff for typing. Not only was this a
time-consuming exercise, but it also introduced a problem of induced
errors at times of greatest pressure. Transcription of sample code number
digits, which were typically between six and twenty characters, was one of
the most commonly encountered errors and required many checks and
cross-checks before the reports could leave the laboratory.
Secretarial involvement has been removed completely from the reporting
procedure and the features provided by the reporting function are listed
below.
(a) Automatic generation of negative reports
(b) Production of composite report for Jockey Club
(c) Production of Certificates of Analysis for private clients
(d) Production of positive report via the network to PCs running word-
processing software
(e) Generation of hard copy data for positives
(f) Interaction with Accounts for invoice generation

Negative result reporting is handled completely by the LIMS once cases
have been given authorised status. Samples received from clients com-
mercially are reported in the form of a Certificate of Analysis. Each
Certificate is sequentially numbered, and allocation of numbers is
controlled by the LIMS to facilitate interaction with our Accounts section
for financial audit purposes. Samples received from the UK Jockey Club
are downloaded directly to the Jockey Club's database at Weatherby's.

3.3.5 General management

3.3.5.1 Controlled access to LIMS. Security of the system is a major


concern of any LIMS. This is controlled in the first instance by the security
functionality of Digital's VMS operating system, and by a combination of
password and personnel identity. In addition, the LIMS menu options
presented to each person are restricted. For example, a person with a low
level of authority will be presented with a limited menu of operations
related to that authority level. Inevitably this will be less than the number
of options available to a more senior analyst.

3.3.5.2 Audit trail. The system allows us to audit the reading, modifica-
tion, addition or deletion of any data. The only restriction we have is the
availability of disk space to store all these transactions. In reality the
actions that need to be audited are reduced because the operator identity
and date are recorded for most database entries.

3.3.5.3 GLP elements (standards, controls, etc.). Elements of GLP or
accreditation requirements can be readily incorporated. To date we have
set up only part of this. This will undoubtedly form one of our most
important future developments of the LIMS.

3.3.5.4 Instrument maintenance schedules. As part of an accreditation
requirement, the system monitors the calibration and maintenance of all
instrumentation, from mass spectrometers to pipettes. The frequency of
maintenance and calibration is defined, as is the date that the action was
last performed. Each night a background report checks that all instrument
maintenance and calibrations are up to date; any lapses are written to a log
file that is displayed at log-in to the responsible operator.
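
A nightly check of this kind reduces to a simple date comparison, as in
the sketch below; the instruments, dates and intervals are invented for
illustration.

from datetime import date, timedelta

INSTRUMENTS = [
    # (name, date the action was last performed, interval in days)
    ('GC-MS 1',   date(1994, 5, 1),  90),
    ('Pipette 7', date(1994, 7, 15), 30),
]

def lapses(instruments, today):
    # Flag every instrument whose maintenance interval has elapsed.
    return [(name, last + timedelta(days=every))
            for name, last, every in instruments
            if today > last + timedelta(days=every)]

# Write any lapses to a log file for display at log-in.
with open('maintenance.log', 'w') as log:
    for name, due in lapses(INSTRUMENTS, date(1994, 8, 20)):
        log.write('%s: maintenance was due on %s\n' % (name, due))
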

3.3.5.5 Workload statistics. Monthly workload statistics represent one
feature of LIMS that provides a vastly improved performance over the
manual method. Previously these were manually calculated from a register
and involved many hours of one person's time to accumulate the data,
calculate the results and type the final table. LIMS will now provide this
table within minutes of the request being entered on the first day of each
month.

3.3.5.6 Archive of case data. To keep the 'active' database as small as
possible, reported cases are automatically transferred to an intermediate
'committed' database. This database is accessible for ad hoc enquiries, but
its existence speeds up access to the live data. Eventually, as all work on a
sample is completed and the sample has been disposed of, the case is
archived. This again is an automatic process and involves the creation of a
text file containing all data about the case. If the case was reported
positive, this file will contain the personnel data for all operators who
carried out an action on the case and instrument details showing correct
calibration and maintenance.
The archiving of data that are very important for accreditation and
forensic purposes is achieved by use of magnetic tapes and optical disks.
Although practical, the magnetic tapes suffer from potential deterioration
on storage, and data recovery is very slow. Optical disks offer a far
superior, although more expensive, option and as prices decrease they will
be used more extensively.

3.3.5.7 System maintenance. Obvious areas of maintenance investment
are hardware and software contracts, since once a laboratory is operating
via a LIMS its reliable functioning is paramount. However, the amount of
regular housekeeping required to maintain efficient running, even without
any component failures, is quite substantial.
HFL had underestimated this aspect and was not warned beforehand.
Most advice suggested the need for a system manager, but it was assumed
that it would not be a full-time job. In reality we require a system manager
plus an assistant. The assistant takes care of the necessary daily
housekeeping, allowing the manager to cope with more serious trouble-
shooting and system development.

3.4 The future

Besides the normal upgrading of the system as and when newer versions
are developed by the vendor, enhancement will be achieved by introducing
new features.
Faxes transmitted and received directly by the system will further improve efficiency.
We currently have a direct link to the UK Jockey Club, but ultimately
direct communication with other customers' computers is a possibility.
This would also make the uploading of sample information a reality,
dramatically reducing the sample receipt procedure.
Incorporation into the system of incoming and other externally
generated paperwork is currently a problem. This may be addressed by
using digitising scanners and optical character recognition software.
Accordingly, we have started to evaluate the feasibility of scanning incoming paper
to record these data on the database.
Although automation and LIMS have reduced the burden of analysing
large numbers of samples, screening the enormous quantity of analytical
data is extremely time consuming. It may be possible to improve the
situation by the application of expert systems, particularly in the review of
mass-spectral and diode-array data.

3.5 Conclusions

Purchase of a LIMS is unquestionably a major investment for any forensic
laboratory. For HFL this represented approximately 25% of the total
annual budget and clearly required a sound case to be made for allocation
of funds. It must also be appreciated that the lead-in time from inception to
live operation is significant. For HFL this was spread over a three-year
period from 1987 to 1990, and involved much customisation of software.
Throughout this period the project was a drain upon the Laboratory's
resources and nothing was being gained directly in return. However, it is
important to appreciate that this long development and evaluation process
played a major part in achieving a relatively painless and trouble-free
introduction to live running.
Now that the system has been operating live for over four years, the
question must be asked: Has it fulfilled our objectives?
The answer is a clear 'yes!'. Without doubt there was an immediate
improvement in the integrity of data transfer, although efficiency plummeted
during the first two months of operation as staff began climbing the steep
learning curve. This occurred in spite of intensive training and to a large
extent was psychological in nature, as attitudes of mind developed over
years of manual procedures took time to readjust.
Fewer analytical results are now printed for evaluation, but there is still
some way to go before the printed paper output is reduced to a minimum
level.
Although not one of the original objectives of highest priority,
improvement of efficiency was nevertheless included as a justification for
investment. It was an objective which we had no confidence in achieving,
simply because no other users we contacted could show this benefit. It is
pleasing to note, therefore, that after 18 months of operation, we could
show a minimum saving of 16% in the scientific staff level. This is
quantifiable because we have always used a formula to calculate staff
levels. The formula is based upon caseload and, since the introduction of the
LIMS, it has been modified, giving this staff saving without lowering efficiency, as
demonstrated by our monthly workload statistics report. For example, the
monitoring of the percentage of Jockey Club cases that take more than ten
days to report shows no increase after reducing staff levels (Table 3.2).

Table 3.2 Decrease in HFL case delays

                        Cases delayed (%)
                        Pre-LIMS, 1989      Post-LIMS, 1991

Mean monthly figures    2.5                 1.8

Source: Dunnett, N. (1992) Should a racing laboratory go out on a LIMS? In Proceedings of
the 9th International Conference of Racing Analysts and Veterinarians, New Orleans, 1992,
vol. 1 (ed. C.R. Short), pp. 273-281.
Besides the savings in scientific analytical staff levels, it must be borne in
mind that secretarial staff time has also been saved to enable development
of other laboratory services, for example, research, QC/QA (accreditation)
and scientific information.
The last major objective, that of improved management of laboratory
operations, has also been achieved. The system provides, very quickly and
accurately, details of workload and staff involvement to assess the
distribution of work within the laboratory. It also allows rapid production
of information on workload trends from customers.
The views of the staff are recognised as a significant factor in the ultimate
success of a LIMS. The consensus, from members of laboratory
staff involved in working both with and without the LIMS, is that we could
not have coped so effectively with current demands had HFL not invested
in this technology. None would wish to return to manual procedures.

Acknowledgement

Neville Dunnett is thanked for his contribution to this work.


4 Application of a LIMS in a pharmaceutical drug
metabolism and pharmacokinetics laboratory
D.J.M. GRAHAM

4.1 Introduction

One of the initial major impacts of mainstream commercial Laboratory
Information Management Systems (LIMS) has been on conventional
chemistry laboratories which analyse samples from some form of process
chemistry against a specification, followed by production of a certificate of
analysis based on the results of that analysis and subsequent approval of
that sample for further use [1]. In such cases, the analysis of a given sample
is generally independent of the analysis of other samples being analysed at
the same time. This meant that the LIMS databases were very much
sample- and specification-orientated, and little or no functionality was
provided to allow the manipulation of batches of related samples. In time,
as the acceptance of these applications has become fact, some vendors
have attempted to increase the market appeal of their systems by
promoting a broader applicability without changing the basic underlying
structure, adding functionalities to support this wider use.
An area that was of obvious appeal to vendors due to the size of the
market was exemplified by some of the more biologically orientated
biomedical applications in the pesticides or pharmaceutical industries.
There are many similar areas which differ from mainstream chemistry
analyses in that experiments, or studies, are designed to generate many
samples each of which contributes to the outcome of the study, but when
taken in isolation, a single sample is of no particular significance. The
outcome of any experiment, or study, lies in the integration of the results
of the analysis of samples from the study as a whole.
This chapter describes, as an illustration of such an application, the
implementation of a LIMS in the Drug Metabolism and Pharmacokinetics
Department of Syntex Research Scotland (SRS). The responsibilities of
the department in question are broadly to define the disposition of
candidate human drugs during both research and development. The
information generated by the department contributes to the evaluation of
both the safety and the efficacy of such new drugs and forms a significant
portion of the dossiers submitted as marketing applications to Health
Authorities. At SRS, the Drug Metabolism and Pharmacokinetics Depart-
ment is responsible for the execution of all metabolism and pharmacokinetics
studies from basic research support through to registration studies, and
there is in the department, in addition to metabolism and pharmacokinetics
groups, a bioanalysis group which acts as a service function to the other
groups, carrying out routine sample analysis. Frequently, in the pharma-
ceutical industry, these three components are separated managerially and
the opportunity to integrate the functions closely does not exist. At SRS,
the department comprises some 31 people and is closely integrated, with all
three sections contributing to many of the studies carried out. It would be
impossible in the space available here to describe fully the design and
implementation of the LIMS. Rather, the intent is to provide an overview
which provides an insight into the application of a mainstream laboratory
database in a specialised field common in biomedical sciences.
In the late 1980s, it became clear to departmental management that, in
order to improve both the quality and the efficiency of the output of the
department, the various instrument and computer applications used to
generate and manipulate scientific data should be superseded by a more
standardised approach with a minimum number of carefully chosen
software solutions.
The purpose of this chapter is not to describe the selected application
software. This would not be appropriate as not all of the functionality of
the system is used in the current setting; rather, this chapter is primarily an
example of the use of the application in a specific setting.
Much has been written in the contemporary literature regarding the
specification, development, implementation and testing of such systems
[2-7]. Again, the purpose of this chapter is not to describe these processes,
but to describe the final application and its utility in its installed setting.
Following conventional 'life cycle' processes of specification and selection,
the system installed was the LabManager system supplied by Beckman
Instruments. This system was installed with various options, one of which,
the Bioassay option, was designed to provide functionality to support drug
metabolism studies.
The following sections describe in some greater detail the types of
activities carried out by the Drug Metabolism and Pharmacokinetics
Department and then describe the application of LabManager in that
setting.

4.2 Study objectives in drug metabolism and pharmacokinetics

The studies that yield samples for analysis in the Drug Metabolism and
Pharmacokinetics Department can generally be defined as being of several
distinct types. The main types of studies, the types of samples generated
and the analyses carried out are summarised in Table 4.1. The first four of
these studies are designed to determine the fate of a test drug following
administration to a test subject and can be further broadly classified as
follows.

4.2.1 Pharmacokinetic studies


In these studies, the time course of the residence of the drug, or on
occasion, active metabolites, in the test subject, is measured by the
collection of sequential samples of body fluids at times following
administration of a test dose of the compound. The most common type of
sample collected is plasma or serum, because of the ease of collection, but
on occasion whole blood, saliva, urine or other fluid is collected. The most
common measurements made are simply drug levels, or those of drug and a
major metabolite. In pharmacokinetic studies, as well as some of the
studies listed below, an important parameter related to each sample is the
time when the sample is collected, relative to the time of administration of
the last dose. In the current application, such time data are entered and
stored in the database.

4.2.2 Drug metabolism studies


The objective of these studies is to determine the fate of the test drug when
administered to the test subject. A common approach is to administer the
drug in a radiolabelled form (usually carbon-14 or tritium labelled)
followed by quantitation of radioactivity levels in biological fluids and
excreta in an attempt to account fully for the dose of drug given. Often, in
these studies, analysis of parent compound levels is also carried out, or
further analysis undertaken to characterise the metabolites of the drug
produced by that test subject.
The second group of studies comprises those in which the effects of the
administration of the test compound on levels of hepatic drug-metabolising
enzymes are measured. These enzymes, predominantly the enzymes
containing cytochrome P-450, are responsible for oxidation of drugs and
xenobiotics prior to excretion by the body. Administration of a test drug
can cause induction, or inhibition, of these enzymes and thereby change
the way in which the test drug, or any co-administered drug, is eliminated
by the body.
The studies are generally carried out in two ways. In vivo studies
generally look at the effects of single, or repeat, administration of the test
drug on the pharmacokinetics of a standard compound with known
kinetics, and the study designs are similar to those used in evaluating the
pharmacokinetics of a test drug. In vitro studies generally either measure,
directly, the enzyme activities in liver tissues derived from test subjects
administered with the drug, or determine the effect of adding drug on the
activity of enzymes derived from the liver of control subjects. In either
case, the results of these analyses are generally enzyme activities compared
across various drug treatments.

Table 4.1 Outline description of the studies generating samples and the types of analyses
carried out on those samples

Study description                    Sample types                 Typical number of   Typical analyses
                                                                  analyses/sample

Pharmacokinetics in vivo             Plasma, serum, saliva        1-2                 Drug, metabolite, endogenous
                                                                                      marker (e.g. ACE), radioactivity
Toxicokinetics in vivo               Plasma                       1-2                 Drug/metabolite
Excretion/balance in vivo            Urine, faeces, expired air   1-2                 Volatile and non-volatile radio-
                                                                                      activity, drug, metabolite
Quantitative distribution ex vivo    Tissue                       1-2                 Radioactivity, drug
Enzyme induction ex vivo             Tissue                       3-10                Protein, P-450 enzymes,
                                                                                      conjugative enzymes
Enzyme inhibition in vitro           Tissue                       3-10                Protein, P-450 enzymes,
                                                                                      conjugative enzymes
Drug-drug interaction in vivo        Plasma, serum, saliva        2-4                 Drug(s), metabolites
Drug-drug interaction in vitro       Tissue                       2-4                 Drug, metabolites
Metabolic stability in vitro         Tissue                       1-2                 Drug, metabolites

4.2.3 Outline study design


Early in the development of the specification of the LIMS, it became clear
that, irrespective of the objective of the various studies, the overall design
of the studies had many common features which allowed a single
specification to be developed to cover the operation of the Department as
a whole. The structure defined, for the purposes of the specification, is
shown in Figure 4.1. The components are as follows.

(a) Study. This, exclusively, is the identifier of the Study Protocol and is
the primary key to the stored data. All studies in the Department are
carried out under a unique Study Protocol.

(b) Treatment groups. For the most part, for studies carried out in vivo,
this identifies groups of subjects that receive the same treatment. Some
studies may only have a single treatment group, for example, a simple
excretion/balance study, but in general, most have several treatment
groups, which may have different routes of treatment (intravenous, oral,
topical), different doses, or different formulations (e.g. tablet, capsule). In
the common, but specific, instance of balanced randomised crossover
studies in which all of the subjects receive all of the treatments in a
randomised order, the treatment groups can be defined as the leg of the
crossover (e.g. Phase 1, Phase 2, etc.). For in vitro studies, the treatment
group may simply be used as a means of separating treatments, e.g.
concentration of test drug in incubations.

Study
  1-6 groups per study
    1-4 treatments per group
      1-36 subjects per treatment
        1-24 samples per subject
          1-6 analyses per sample
            Results

Figure 4.1 Outline of a study design. The numbers shown are those commonly encountered.
Although not shown, each Group will have an identical hierarchy of components.

(c) Subjects. This details only the subjects studied in a given treatment
group.

(d) Samples. These are the samples collected from the subjects in a
specific treatment group. Not all subjects necessarily have the same
samples collected from them, but the protocol defines which samples are
collected from which subjects.
Where several samples of the same type are collected from a single
subject, e.g. sequential plasma samples collected from each subject in a
pharmacokinetic study, the samples are differentiated by the time of
collection of the sample in reference to the previous dose of the test
compound.

(e) Tests. The tests, or analyses, to be carried out on the collected
samples are defined in the protocol. For the majority of samples, only a
single analysis is carried out, but for a limited number of samples more
tests (generally no more than four) are carried out.

The matrix shown can be used effectively to represent the sample matrix
in any in vivo or ex vivo experimental protocol in current use. To allow the
extension of the template for application to in vitro studies, not all of the
above parameters are required to define the sample matrix. However, the
samples can be included in such a matrix. How this is done in practice is
covered below.
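
In outline, the hierarchy of Figure 4.1 maps naturally onto nested
records, as in this Python sketch; the class and field names are
illustrative only and are not those of the LabManager schema.

from dataclasses import dataclass, field

@dataclass
class Sample:
    sample_type: str                              # e.g. 'plasma'
    nominal_time: str                             # time relative to the last dose
    tests: list = field(default_factory=list)     # usually one test per sample

@dataclass
class Subject:
    subject_id: str
    samples: list = field(default_factory=list)

@dataclass
class TreatmentGroup:
    treatment: str                                # dose, route or crossover leg
    subjects: list = field(default_factory=list)

@dataclass
class Study:
    protocol_id: str                              # the unique Study Protocol
    groups: list = field(default_factory=list)

study = Study('PK-94-007', groups=[TreatmentGroup('Phase 1')])
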
There are a small number of study protocols which either do not fit the
matrix shown above, or contain such a large number of alternatives in the
way in which the work is executed that they preclude precise specification
of the sample matrix in the protocol study plan. Examples of this type of
study are analytical method development, or metabolite characterisation.
In the former case, there is generally no fixed study plan, or even sample
matrix, and only a general approach is specified at the outset. In the latter
case, although the sample matrix is generally well specified, the tests, or
analyses, which are used first to purify potential metabolites and
subsequently to characterise their structure are not. These types of studies
are thus not generally entered or contained in the LIMS database.

4.3 Configuration of the database

The most important component of the configuration of the LIMS is the
configuration of the sample table. However, flexibility that gives the user
the capability of easily reconfiguring the database allows for a more rapid
implementation of the system, as users will have confidence that, if
necessary, the original configuration can be changed without too much
difficulty. This is more important where the resource available to carry out
preliminary requirements analysis is limited and the users have limited
experience of the operation of related databases.
In the implementation of LabManager for drug metabolism and
pharmacokinetics, as well as configuration of the main database table to
contain sample and test result data, ancillary tables, which, in the case of
LabManager, have a subset of the relational capabilities of the main
database table, were configured to contain other data. These data were
needed either for the full functionality of the LabManager system, or to
provide additional functionality for departmental use (Figure 4.2).

4.3.1 The sample database


The sample database was configured to contain all information relative to a
sample and the analysis of it. There is one record per sample and
effectively one record for each test assigned to the sample. It is not
necessary here to define all the fields of the sample or test records, but
some insight can be gained in respect of the utility of the system if some of
the field descriptors, or contents, are examined.

[Figure 4.2 shows the central sample and test database linked to its ancillary tables:
Protocol, Study, Compound, Home Office, Test, Records, Calculation and Treatment.]

Figure 4.2 The database and its ancillary tables. The Protocol, Test and Calculation tables
are required to support application functionality. The remaining tables are user-defined,
providing application-specific functionality.

4.3.2 Sample identifier (SID) (five fields)


In the LabManager system, only five fields are allowed to identify uniquely
any record. Although this is a limitation, and six or more fields would allow
greater flexibility, the use of these fields in the current application matches
the matrix of samples defined in the study design. The primary interpreta-
tion of the SID fields is as follows.
Study identifier
Treatment group
Subject identifier
Sample type
Nominal time

(a) Study identifier. This is self-explanatory, and is usually the protocol
reference.

(b) Treatment group. In all but crossover-type studies, this is simply the
treatment given. For crossover studies, this entry is generally the leg of the
crossover study (e.g. Phase 1). The treatment given in that leg of the study
is entered to a separate treatment field in the sample record.

(c) Subject identifier. This is self-explanatory.

(d) Sample type. This records plasma, blood, dose solution, etc.

(e) Nominal time. This is a composite field that contains two pieces of
information. Both relate to the time the sample was collected in reference
to any preceding treatment. The Dose Nominal Time specifies the time
from study initiation that the reference dose was given. The Sample
Nominal Time specifies the time after the reference dose that the sample
was collected. The times are both described using a specific notation (see
section 4.3.3 below) and are designated as target, or protocol, times.
The descriptions given above are the most commonly used descriptions
for the Sample ID fields, but the fields are configured as simple
alphanumeric fields and under special circumstances the field contents are
other than those specified above.
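
For illustration, the five SID fields can be thought of as a composite key
for each sample record. The sketch below is expressed in Python purely
for illustration: LabManager itself is configured through its own tools,
and the field names and values shown are invented.

from collections import namedtuple

# A minimal sketch of a five-field Sample ID (SID) acting as a
# composite key. Field names and values are illustrative only.
SID = namedtuple("SID", ["study", "treatment_group", "subject",
                         "sample_type", "nominal_time"])

sample_records = {}

sid = SID(study="PROT-001", treatment_group="PHASE 1",
          subject="AA001", sample_type="PLASMA",
          nominal_time="09HR/1HR")   # dose time/sample time composite

# One record per sample; test records hang off the sample record.
sample_records[sid] = {"status": "Logged", "tests": ["PHENYTOIN"]}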

4.3.3 Time fields (eight fields)


For the purposes of pharmacokinetic data analysis, it is necessary that the
actual time a sample was collected, relative to a previous dose of drug, is
used in calculations, as opposed to target or protocol times. In addition,
several national regulatory authorities require listings of actual sample
collection times with the listings of test results. For this reason, and
specifically because the Department has responsibility both for the analysis
of the samples and for the subsequent pharmacokinetic data analysis,
LabManager has been configured to allow expression of the various times
relevant to a sample in a number of ways (Figure 4.3). In this scheme,
given the study protocol information of Study Start Date (Dose Regime
Start Date) and the Dose and Sample Nominal Times, functionality within
the system calculates the Date/Time of the last dose and the theoretical
(i.e. target) Sample Time (as a Date and Time field). The user is then
provided with the ability to update the theoretical sample time to an actual
sample time, and a user-defined procedure calculates two values for the
Time Post-Dose and enters the data to the sample record.
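
The chain of calculations can be illustrated as follows (a sketch in
Python under assumed dates and times; in LabManager the equivalent
functionality is provided by the Bioassay option and by user-defined
procedures, not by code of this kind).

from datetime import datetime, timedelta

# Illustrative values; the real data come from the study protocol.
study_start = datetime(1994, 3, 1, 8, 0)    # Dose Regime Start Date
dose_nominal = timedelta(hours=9)           # Dose Nominal Time
sample_nominal = timedelta(hours=4)         # Sample Nominal Time

last_dose = study_start + dose_nominal      # Date/Time of last dose
theoretical_sample = last_dose + sample_nominal

# The user may later overwrite the theoretical time with the actual
# collection time, from which the Time Post-Dose values are derived.
actual_sample = theoretical_sample + timedelta(minutes=7)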

4.3.4 Sample status


The sample status field of the sample record is maintained by LabManager
and is used to track samples through the system (Figure 4.4). The sample
statuses and their interpretation within the current application are as
follows.

(a) Logged. The sample record has been created in the database, i.e. the
system 'knows' about the sample, but the sample has not been received.

Figure 4.3 Sample times in the LabManager database. The Dose Nominal
Time and Sample Nominal Time lead, via the Date/Time of Last Dose and
the Theoretical Sample Date/Time, to the Actual Sample Date/Time, from
which the Time Post-Dose is derived in both HH:MM and decimal form.
All time values, except the two Time Post-Dose fields, are component
parts of LabManager with the Bioassay option. The Time Post-Dose fields
are user-defined functionality.

Figure 4.4 The sample status field. As samples progress through the
laboratory, various functions progress the status, as shown. Certain
actions, such as adding tests to a sample, result in retrograde changes to
the status.

(b) Received. When samples are delivered to the laboratory for analysis,
analysts will use a specific 'Sample Receipt' screen to change sample status
to Received from Logged.

(c) Tested. When all tests assigned to a sample are completed, the status
changes to Tested.

(d) Validated. When a user with appropriate authority reviews and
validates the test results for a given sample, the process of validation
changes the status to Validated.

(e) Approved. In an analogous way, a user with appropriate authority
can review results 'in context'. In the current application, the review in
context is taken to be the final approval of the sample and the status is
changed to approved. Within both the Validated and Approved statuses,
there are subcategories which can be used to define further the status for
further processing. These categories can be used to structure access to the
results for further processing, or reporting.
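
The progression through these statuses can be pictured as a simple
ordered sequence, sketched below in Python. The representation, and the
choice of Received as the status to which a sample falls back, are
illustrative assumptions; the text above specifies only that retrograde
changes occur.

# Ordered sample statuses, per Figure 4.4.
STATUSES = ["Logged", "Received", "Tested", "Validated", "Approved"]

def advance(status):
    # Move a sample one step forward through its life cycle.
    i = STATUSES.index(status)
    return STATUSES[min(i + 1, len(STATUSES) - 1)]

def add_test(status):
    # A retrograde change: adding a test to a completed sample drops
    # it back (here, illustratively, to Received) for re-testing.
    return "Logged" if status == "Logged" else "Received"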

4.3.5 Whodunit fields (six fields)


These fields, the 'Whodunit fields', are used to record the identity of the
user carrying out an operation which changes the sample status, along with
the date and time that the operation occurred. For example, the identity of
the analyst who tested a sample is recorded and is available for retrieval
and reporting purposes.

4.3.6 Miscellaneous fields


There are a number of additional fields which have been specially
configured for the current application and which support user-defined
functionality in the current system. Examples of these are as follows.

(a) Priority. This is a value assigned to samples from a given study at log-
in, which allows samples scheduled for analysis to be sorted in order of
priority. Thus, when an analyst carries out a retrieval in preparation for
analysing samples, the high-priority samples are at the head of the hit list.

(b) Sex. This is specified in the protocol (see below) and assigned
automatically to the samples at log-in, to allow sorting of results by sex.

(c) Randomisation sequence. Updated by users, this field contains
information on the randomisation sequence in crossover studies. This has
been configured specifically to allow results files to be configured for direct
input to statistical ANOVA routines.

4.3.7 Test-related information


For each test assigned to a sample, the database has been configured to
contain, in addition to result information, information on the user carrying
out operations on the test. The time and date of such operations are also
recorded. As well as these standard fields, two more have been configured
to store an analysis reference and a sequence position in that analysis.
These fields are used to allow users to recall samples by analysis and create
worksheets based on analysis and analysis position.

4.4 LabManager in use

4.4.1 Defining the sample matrix


At the extremes of size, clinical pharmacokinetic studies can yield up to
several thousand samples. For example, a study which has up to 50
subjects, with up to three treatments, with up to twenty samples collected
after each study treatment, will yield 3000 samples. Given that each sample
requires five fields of information to define its SID, to enter such data
manually would be a significant administrative chore. To facilitate the
entry of such data, LIMS vendors have begun to produce add-on
functionality. In LabManager, this is the Bioassay option. Using this
option, the user defines the sample matrix in the form of a script which is
stored in a conventional table and handled in the same way as other
information tables. The script essentially assigns, in a hierarchical way,
values to keywords which are in essence the SID fields, and one additional
field, the treatment field. The example shown (Figure 4.5) defines a matrix
of 60 samples where there are six subjects in Phase 1 of a crossover study,
three of whom are treated with tablets and three with capsules. Following a dose given
9 h after the start of the study, plasma and saliva samples are collected to
be assayed for phenytoin at 1, 2, 4, 8 and 12 h after drug administration.
The script is easily generated using system editors by copying and
modifying an existing protocol or by entering a new one.
To create the sample records, a protocol parser interprets the script. The
process, which generally runs as a maintenance process each night, creates
only those sample records for samples which will be in existence by the
following day, that is, whose calculated sample time is before the next
day's date.

GR=PHASE 1
TR=TABLETS
ID=AA001,CC003,EE005
SP=PLASMA,PHENYTOIN
SP=SALIVA,PHENYTOIN
DO=09HR
TI=1HR
TI=2HR
TI=4HR
TI=8HR
TI=12HR
TR=CAPSULES
ID=BB002,DD004,FF006
SP=PLASMA,PHENYTOIN
SP=SALIVA,PHENYTOIN
DO=09HR
TI=1HR
TI=2HR
TI=4HR
TI=8HR
TI=12HR

Figure 4.5 Example of a protocol script. Interpretation of the keywords in
terms of Sample Header fields is as follows: GR = Group, TR =
Treatment, ID = Subject Identifiers, SP = Sample Type, DO = Dose
Nominal Time (after study start), TI = Sample Nominal Time (after last
dose).
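
As an illustration of the parsing step, a minimal sketch is given below (in
Python, with a simplified record format; the vendor's parser is part of the
Bioassay option and its internals are not reproduced here). Applied to the
script of Figure 4.5, it yields the expected 60 records; the nightly date
filtering described above is omitted.

def parse_protocol(lines):
    # Expand a Figure 4.5-style script into one tuple per sample:
    # (group, treatment, subject, sample type, dose time, sample
    # time, test). Keyword handling is simplified.
    ctx, sample_types, records = {}, [], []
    for line in lines:
        key, _, value = (part.strip() for part in line.partition("="))
        if key == "SP":
            sample_types.append(value)
        elif key == "TI":            # deepest level of the hierarchy
            for subject in ctx["ID"].split(","):
                for sp in sample_types:
                    sample_type, test = sp.split(",")
                    records.append((ctx["GR"], ctx["TR"], subject,
                                    sample_type, ctx["DO"], value, test))
        else:
            ctx[key] = value
            if key == "TR":
                sample_types = []    # a new treatment block resets sample types
    return records
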
Following sample logging, subsequent sample processing follows an
essentially conventional pattern (Figure 4.4).

4.4.2 Multisample processing


The nature of the analyses undertaken dictates that analysts, in general,
process many samples together.
At each stage, functionality is provided by LabManager, or can be
configured by the user, to ease the processing of samples through the
system. In the current application, batches of samples are grouped
together by analysts using a function called Runsheet. This assigns a
particular value to a field in the test header of all samples in the retrieval
executed by the analyst, and subsequently all of these samples can be
retrieved simply by entering the Runsheet (or analysis) number.
The analysis number is the primary means of retrieving samples for
generation of worksheets and entering and validating results.
In addition to actual test samples, calibration data and QC sample
results are submitted to LabManager, dependent on the analysis in
question. As part of the data upload process, calibration and QC samples
are logged automatically along with the test data for these samples. The
analysis number is retained in the test record of the calibration and QC
samples. Thus, during the validation step, the validator can retrieve the
calibration and QC data along with the test results to be validated.
The functionality provided, which is of utility in multisample processing,
is also used when processing small numbers or individual samples. The
only difference is that, when sample numbers are low, there may be little
benefit gained from linking the samples in an analysis, and samples are
processed by conventional means using retrievals based on sample header
fields and test code.
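
The grouping mechanism might be pictured as below (a hypothetical
sketch in Python; Runsheet itself is a LabManager function and its
implementation is not shown here).

def assign_runsheet(test_records, analysis_no):
    # Stamp every test record in the analyst's retrieval with the
    # same analysis number.
    for record in test_records:
        record["analysis"] = analysis_no

def by_runsheet(all_records, analysis_no):
    # Recall the whole batch later by analysis number alone.
    return [r for r in all_records if r.get("analysis") == analysis_no]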

4.4.3 Processing sample time data


At any stage following logging of the samples, a user can change or
manipulate the dose and sample time information of a sample, or more
usually, a group of samples.
Functionality provided with LabManager allows the user to enter actual
times instead of protocol times for dose and sample times. The functionality
provided by the vendor makes this process easy. If the time of the last dose
is changed, then the theoretical (or target times) for sample collection can
be updated automatically and the new theoretical sample time used to
generate the actual sample time.
Two additional time fields have been configured. These are both defined
as sample times expressed as the differences between the actual sample
time and the last dose time. This time difference is then expressed as
HH:MM (hours and minutes) for reporting purposes and as floating point
decimal for use in graphics packages or pharmacokinetic programs.
The processing of the time information can be carried out totally
independently of sample testing and thus can be assigned to non-laboratory
staff. Furthermore, given appropriate information, the data could be
uploaded to the system electronically from remote devices.
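
The derivation of the two representations is simple, as the following
sketch shows (Python, for illustration only; in the live system this is a
user-defined procedure within LabManager).

from datetime import datetime

def time_post_dose(actual_sample, last_dose):
    # Express the actual sample time relative to the last dose both
    # as HH:MM (for reporting) and as decimal hours (for PK software).
    minutes = int((actual_sample - last_dose).total_seconds() // 60)
    hhmm = "%02d:%02d" % divmod(minutes, 60)
    return hhmm, round(minutes / 60.0, 1)

# time_post_dose(datetime(1994, 3, 1, 21, 7),
#                datetime(1994, 3, 1, 17, 0)) -> ("04:07", 4.1)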

4.4.4 Sample testing


There are several sample testing methodologies in use in the current
application. Where the analysis method is a high-throughput one and many
samples are tested, automatic or semiautomatic data-capture methods
have been employed. Where a small number of samples is to be processed
and data are simple 'read and record' (for example, volumes and weights),
manual methods are still generally in use.
Irrespective of the mode of data collection, the procedures followed by
the analyst are generally the same. The initial step is to retrieve from the
sample database the samples to be analysed. The analyst will then generate
a worklist for the samples to be analysed. Where the data are collected or
recorded manually, this is normally a printed worklist which contains all
the details of samples to be analysed, as well as the data to be collected.
Printed worksheets are automatically annotated with analyst names and
other relevant data to facilitate tracking. The data are then recorded
directly on the worklist, and when complete the data are entered into
LabManager manually.
Where the data are to be collected electronically, the LabManager
system offers several alternatives, not all of which are currently implemented.
LabManager has the capability to interface directly with analytical
instruments and collect data directly. This Instrument Data Acquisition
System is not currently configured for routine use. An alternative approach
which is used routinely is to make use of a batch transaction processing
option in the following way. The initial step of the process is similar to that
used in manual processing, save that, in addition to a paper worksheet, an
electronic file listing of the samples to be analysed is generated and passed
to a remote device. The processing at the remote device depends on the
data to be collected, but the end result of the processing is the generation
of a file which is configured as an input file for the batch transaction
processor option, BTLAB. Submission of the input file to the process
electronically transfers the test data to the LabManager database,
completing the testing process.
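
As a sketch of this pattern, the remote device might write a result file
along the following lines (the field layout shown is a hypothetical
stand-in; the actual BTLAB input format is vendor-defined and is not
reproduced here).

def write_batch_file(path, analysis_no, results):
    # One line per test result; the batch transaction processor then
    # posts each line to the LIMS database. Hypothetical layout.
    with open(path, "w") as f:
        for sample_id, value in results:
            f.write("%s|%s|PHENYTOIN|%s\n" % (analysis_no, sample_id, value))

write_batch_file("btlab_in.dat", "RUN042",
                 [("AA001-PLASMA-1HR", "2.31"),
                  ("AA001-PLASMA-2HR", "4.87")])
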
There are many alternative approaches possible and the most appropriate
approach will be dependent on many factors which are laboratory, or
application, specific. A significant consideration in determining the route
taken will be the resource and/or expertise available to configure new
instrument interfaces, etc. It is possible to configure the connections with
only a rudimentary understanding of instrument control languages or
programming.

4.4.5 Quality Control


Where appropriate, Quality Control samples are analysed along with test
samples, as described above. These samples are automatically logged and
the results submitted to LabManager. The QC samples contain the study
reference in the sample header, and if analysis contains samples from
several studies, replicate QA samples are logged with a sample header
reference to all studies analysed. Thus, on completion of any given study,
retrieval of all QC samples relevant to that study is straightforward, as is
the reporting of the results on the tests of those samples.

4.4.6 Security
The security functions in the system are implemented not so much to
control access as to permit access to functionality and data in an ordered
way. In essence, the two approaches are the same, but the philosophy is
totally different. In the former sense, access control amounts to restricting
data access strictly on a 'need to know' basis, whereas in the latter sense,
data are made available to anyone who may want to know, or may benefit
from knowing, but may not strictly need to know.
In terms of the functionality that controls sample logging through to
approval, access is defined on an operator's experience, training and
management authority to carry out a particular function. The appropriate-
ness of the level of access granted to any group can be monitored through
LabManager's audit functions, and by the fact that the identity of operators
carrying out the main operations on a sample (logging, testing, validation,
etc.) is detailed in the sample record for later review, if necessary. In
addition, the ability to modify or print worksheets, or print reports of
unapproved data, can be strictly controlled.
In terms of the results data in the database and data held in the ancillary
tables, such as study status information, broader access can be granted. It
has been proposed that there is little need to control access to status
information, or approved results. Thus, access can be granted to scientists
and managers outwith the department, for example, in the Clinical
Department, or QA Group, which allows them to look at status
information and even approved study data. Under normal circumstances,
pharmacokinetics staff would have access only to approved test data; under
exceptional circumstances, they may be granted access to unapproved, or
even invalidated, data. This can be done by altering access levels, or
granting senior staff access to the unapproved data so that they can then
permit its dissemination to their staff.
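
The permissive philosophy can be caricatured as a simple level scheme
(a Python sketch with invented levels and operation names; LabManager's
own security model is configured, not coded, and is richer than this).

# Operations are opened to any user at or above a given level,
# rather than being denied by default. Levels are illustrative.
ACCESS_LEVELS = {"view_approved": 0, "test": 1,
                 "validate": 2, "view_unapproved": 2, "approve": 3}

def permitted(user_level, operation):
    return user_level >= ACCESS_LEVELS[operation]

# A scientist outside the department (level 0) may view approved
# data but not unapproved results:
assert permitted(0, "view_approved")
assert not permitted(0, "view_unapproved")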

4.4.7 Information management


LabManager has been implemented as a central repository for study-
related status information. On initiation of a study protocol, a user is
required to enter information relating to the study, and as the study
progresses, status information can be recorded and accessed. For sample-
related analysis, status data are updated automatically by the system.
However, study-related information requires manual updating, and standard
procedures have evolved that ensure that teams of scientists executing
studies update the study database as appropriate.
The one overriding benefit of the integrated approach to information
management in the group is that the flow of study-related information
between the various groups in the department can be optimised. When
data are approved, they are immediately available to staff who require
these data for further analysis. When status information is changed, those
changes can be viewed by management immediately, if need be.
Status information is also made available to staff outside the department,
allowing them to monitor study status without the need to make requests
for status information.
LabManager can be configured to provide as much information as is
likely to be required about the status of samples in the database, by study,
or subject, or by analysis type, for example.

4.5 The future

As with all systems that are not custom designed, there will be
shortcomings. In the current setting, LabManager provides all of the major
functionality required, but some operations are cumbersome and some are
only achieved via a work-around. It would be desirable to review analyte
concentration-time profiles graphically at the validation stage. Such in-
context review is achieved more easily in a graphical setting, but this is not
possible without custom coding on the current system. In-context review is
restricted to review of tabular data.
At present, derived data such as pharmacokinetic parameters are not
submitted to LabManager. To do this would be straightforward, but
entering the data seamlessly without the need for manual transcription
would be problematic, given the various options available for generation of
the derived data. Alternative options for storage of such data are under
consideration.
One feature of the current system that is conspicuous by its absence is
the ability of the system to provide statistics on samples before they are
logged. In the current application, this occurs when studies are initiated
and the samples are collected. There is no functionality to build in the
ability to look prospectively at studies that are scheduled and hence at
samples which are scheduled for collection. This type of functionality
would benefit short-term resource planning, but would be limited by the
fact that detailed information about a study protocol, adequate to define
the sample matrix, is only generally available a relatively short time before
the study is due to commence.
This, however, may point the way for the future of such systems. In
general, the information systems environment within pharmaceutical
companies is becoming highly integrated. Applications that stand alone
will soon become a thing of the past.
Thus, financial systems become integrated with planning and planning
with operations such as clinical development, or pharmaceutical manu-
facture. Document management and production become integrated with
dossier publication.
In the same way, it will become inevitable that in the Drug Metabolism
and Pharmacokinetics area the LIMS will become more closely integrated
with information systems in project planning and management, clinical
development, biostatistics and document production. This will be enhanced
by a planned move to Oracle for underlying system operation, which will
ease integration with these other applications.

References

1. Hunsmann, N. and Picht, W. (1992) Aspects of laboratory information management
systems in the chemical industry. Chemometrics and Intelligent Laboratory Systems:
Laboratory Information Management, 13, 239-245.
2. Maj, S.P. (1991) Analysis and design of Laboratory Information Management Systems.
Chemometrics and Intelligent Laboratory Systems: Laboratory Information Management,
13, 157-162.
3. Jack, W., Smith, J.R. and Svirbely, J.R. (1988) Laboratory information systems. M D
Computing, 5 (1), 38-47.
4. McDowall, R.D. (1993) The valuation and management of risk during a laboratory
information management system or laboratory automation project. Chemometrics and
Intelligent Laboratory Systems: Laboratory Information Management, 21, 1-19.
5. McDowall, R.D. (1993) Laboratory information management systems and database
management issues. Analytical Proceedings, 30, 201-202.
6. Cooper, E.L., Hice, R.C. and Rahn, P.O. (1992) Future trends in the integration and
implementation of a laboratory information management system in an instrumentation
laboratory. Chemometrics and Intelligent Laboratory Systems: Laboratory Information
Management, 13, 215-220.
7. Faulkner, H.C. III, Farmen, R.H., Myer, R.S., Hahn, E.M. and Rouse, J.A. (1992)
LIMS: evolution of second-generation systems. Chemometrics and Intelligent Laboratory
Systems: Laboratory Information Management, 13, 211-214.
5 Use of protocol-synchronous LIMS structures to
expand the role of the centralized clinical trial
laboratory in pharmaceutical research
F.D. MORROW

5.1 Introduction

Clinical trials involving human subjects are a major step in the long process
toward approval of new therapeutic agents by the FDA or equivalent
agencies in other countries. Increasingly, pharmaceutical and biotechnology
firms rely on large central laboratories capable of providing analytical and
data services designed exclusively to meet the exacting requirements of
Phase I through Phase IV clinical trials. Using a variety of sophisticated,
fully integrated information technologies, a state-of-the-art central
laboratory can provide pharmaceutical clients with an expanded array of patient
and data management services. The advantages afforded by use of
advanced information technologies from the central laboratory include
improved patient management, more effective day-to-day management of
the study, and the assurance that an audited and edited database is always
available - from study inception to database lock. This chapter will discuss
the unique services provided by the centralized clinical trial laboratory and
the approach used by one of the larger central laboratories in the USA to
meet the needs of its pharmaceutical clients. A functional analysis is
presented which compares the implementation of a protocol-synchronous
LIMS with traditional sample-oriented systems in meeting the needs of the
clinical trial client. The use of off-the-shelf information technologies is also
discussed as a readily available means to ensure efficient and reliable
laboratory services.

5.2 The expanding role of the central laboratory in pharmaceutical research

During the last several years, a number of factors have supported an
expanded role for the central laboratory in the development of safe and
efficacious therapeutic agents. Among those factors are the development
of whole new families of compounds with uncharacterized benefits and
toxicities, the expansion of safety and efficacy testing in FDA-mandated
medical device research, the use of 'fast track' or 'expanded access'
programs for selected disease states, and a more aggressive stance by FDA
advisory committees in determining the extent and nature of laboratory
testing. Developments within the clinical laboratory itself, e.g. the
availability of new testing modalities with enhanced sensitivity and
specificity, have also improved the clinical utility of data collected from the
laboratory aspects of the trial.
To leverage the laboratory aspects of the clinical development program
better, research protocols during the last several years have become
increasingly sophisticated. Now, protocols often include multiple arms
involving subsets of patients with special laboratory needs, continuation
studies with complex screening and flagging requirements, and extended
follow-up studies for selected patients whose participation in the trial far
exceeds that outlined in the main protocol. In addition to an increasing
complexity in the design of the protocols' major elements, there is also a
clearly established trend toward increasing the frequency of laboratory
visits, toward providing a broader array of analytical procedures at each
visit, and toward the inclusion of a wide range of ancillary services which
would traditionally be regarded as falling outside the purview of the clinical
laboratory. Among the ancillary services that are frequently requested by
sponsors are specimen management, long-term archival services, provision
of collection materials for specimens sent to other participating laboratories,
and results management of both laboratory and non-laboratory parameters.
Frequently provided to pharmaceutical and biotechnology clients as a
comprehensive package of 'value-added' laboratory services, the unique
ability of the centralized clinical trial laboratory to provide an expanded
array of services will ensure that the central laboratory will continue to play
a strategic role in the clinical development program.
Because of the evolution toward sophisticated value-added laboratory
services in the clinical trial laboratory, each protocol can present new
challenges to the central laboratory and its information services group.
Table 5.1 summarizes trends in the central laboratory industry which
support the need to have both flexible LIMS structures and well-integrated
information technologies. Among the most important capabilities which
the central laboratory of the 2000s must be prepared to provide are the
manufacture of protocol-specific specimen collection materials, the pro-
vision of GLP- and GCP-compliant data clarification and data revision
procedures, and the issuance of laboratory reports with formats and
flagging capabilities harmonized to meet FDA regulations. Even at
present, it is evident that patient, study and data management procedures
used in the central laboratory have evolved to become fundamentally
unique to this relatively young industry. The expansion of the role which
the central laboratory will play in the clinical development process will
continue during the coming decade and will largely be driven by the
availability of new information technologies and approaches.

Table 5.1 The expanding role of the central laboratory in the clinical
development program

Production and distribution of sample collection materials. Production and
distribution of sample collection materials must include labeling to meet
domestic and international IATA requirements, to service cohort subset
studies, to service in-hospital critical care patients and to provide
specialized transport to meet esoteric testing requirements. In general,
visit-specific collection materials have become increasingly complex to
produce, distribute and track.

Sample receipt and accessioning. Sponsors expect that the central
laboratory will be able to validate the condition and completeness of kit
contents systematically, that a high level of validity checking of patient and
visit data will occur during the accessioning process, and that questionable
data are clarified as needed.

Data clarification process and concept of edited database. Sponsors expect
that the central laboratory will detect potential errors in patient and visit
data, seek data clarifications from the investigators, and record data
clarification efforts in the protocol audit trail.

Data revision process. Sponsors expect that the central laboratory will
issue system-generated data revision reports which meet applicable
regulatory guidelines.

Analytical services. Testing services have expanded beyond routine safety
testing to include virology, microbiology, flow cytometry, immunology,
immunohistology and others. The central laboratory must be able to
receive transferred technologies from the sponsor or provide de novo
methods development and validation. Support for pivotal studies requires
careful selection and quality control of efficacy testing.

Pharmacokinetic testing services. Sponsors seek central laboratories with
the capability to provide PK testing on a rapid-turnaround basis and to
report these data into the sponsor's central laboratory database (see
Interlaboratory management).

Interlaboratory management. For some analytes, sponsors prefer that the
central laboratory subcontract and manage a secondary performing
laboratory on their behalf.

Specimen management. Sponsors expect that the collection and tracking of
multiple specimen matrices with critical time lines will take place to ensure
specimen and result integrity. The laboratory must co-ordinate shipping of
protocol-managed specimens to multiple recipients and monitor the
overdue status of specimens not shipped from the sites.

Results management. The central laboratory is expected to capture results
from testing performed at the investigator site or by another laboratory
facility.

Laboratory reporting. The central laboratory must be prepared to issue
customized reports oriented toward case report form (CRF) format, which
eliminates the need to transcribe laboratory results at the investigator site.
The customized laboratory report must accommodate clinical significance
and etiology coding per protocol, and provide trigger statements.

Trigger statements. The laboratory's LIMS must be sufficiently flexible to
provide reflex statements on the laboratory report (based on protocol-
specific algorithms) which direct investigator actions.

Protocol-specific flagging. Special protocol-specific flagging capabilities are
required, including delta, exclusion, withdrawal, and telephone/panic alert
flags.

Critical alert transmittal. System-monitored and system-generated alerts
must be called to sites on the same day as specimen receipt, with full
documentation in the sponsor's audit trail.

Project management. Critical project milestones must be monitored and
reported using system-generated reports and ad hoc queries.

Inventory control at the investigator site. The laboratory must monitor
shelf inventories of visit-specific collection materials at all sites,
accommodate special requests for distribution of laboratory and
management reports and other clinical trial materials, and distribute and
inventory bulk foam transport mailers at the sites.

Clinical data services. The laboratory must be able to provide sponsor-
specific format or SAS data sets on a periodic basis throughout the trial,
with automatic tracking of transactional data and retransmittals due to
ongoing data revisions.

Database reconciliation. The laboratory must be able to reconcile the
central laboratory database systematically with the sponsor or their
designated CRO at the time of study close-out.

To gain access to the comprehensive clinical trial services that are
currently available, pharmaceutical and biotechnology companies are
increasingly seeking preferred or sole-vendor relationships with those
laboratories which have developed both the capacity and the functionality
to provide value-added analytical and data services. To meet the
challenges of providing expanded services while at the same time
addressing the unique requirements of each protocol, the well-designed
central laboratory moving into the next century must use highly integrated
information structures to ensure that flexible solutions are provided to the
sponsor which meet the needs of today's clinical trial environment. Among
the approaches that will allow the state-of-the-art central laboratory to
meet these needs efficiently and reliably are the expanded use of keyless
data entry modalities, fault-tolerant hardware components and proprietary
interrogative data management tools. Certainly, one of the most critical
decisions facing the central laboratory is the selection of a Laboratory
Information Management System (LIMS) that will provide adequate
flexibility and expandability to meet the challenges during the next several
years. Properly planned and orchestrated, a LIMS based on protocol-
synchronous information structures will optimally position the central
laboratory to take an even more dominant role in the development of new
therapeutic agents.

5.3 Comparing traditional and protocol-synchronous LIMS structures in the clinical trial laboratory

Whether commercially available or developed by an in-house software
development team, traditional LIMS applications have usually been
designed with a distinct orientation toward laboratory samples rather than
toward research protocols. In such systems, the most discrete database
element (the sample or accession number) is held as an independent entity
at the top of the LIMS hierarchy. Because samples are not predefined
relative to a larger protocol structure, LIMS systems designed with a
sample orientation can be said to have 'protocol-asynchronous structures'
(PAS). PAS-based LIMS characteristically lack information structures that
include a temporal element (i.e. the ability to capture and track time-based
events), and use retrospective accessioning of incoming samples to initiate
and complete the definition of critical links between the accession number
and other database elements. In contrast, the functions of the clinical trial
laboratory revolve around time-based events that are, for the most part,
known in advance of study start-up. Examples of temporal elements in the
clinical trial environment are patient visits or timed serial blood draws, and
examples of critical database links are the identity of the investigator, the
site location, the visit type and the required test schedule. Because of the
prospective and temporal nature of the activities supported by the clinical
trial laboratory, LIMS used to support this type of facility can be designed
to include protocol-synchronous information structures. By taking advant-
age of the fact that many of the events to be managed during the conduct of
a clinical trial are known in advance of study commencement, the clinical
trial laboratory using protocol-synchronous information structures can
operate more effectively than a corresponding facility using a traditional
sample-oriented LIMS environment.

5.3.1 LIMS structures in the traditional clinical laboratory environment


An excellent example of a non-clinical trial environment that benefits from
using a LIMS which is sample-oriented is the large, high-volume reference
testing laboratory. Referred to as 'pathology laboratories' outside the
USA, the major focus of these reference laboratories is to provide both
routine and specialized testing services to other laboratories, to hospitals
and clinics, and to private physicians within their geographic locale.
Samples are generally collected from the region in which the facility
operates and are registered within the LIMS at the time of receipt by ad
hoc assignment of a randomly generated accession number. Links within the
database between the accession number and other descriptive elements
(e.g. doctor, location, etc.) are made retrospectively with little or no
opportunity for error-checking. Depending upon the required testing,
samples are analyzed and reported within 24-96 h of receipt. Collection
materials are provided in bulk and reports are limited to a single, standard
format determined by the laboratory director or administrator. Services
beyond that required to complete this straightforward reporting cycle are
limited to general 'customer service' functions such as result inquiry,
retrospective test ordering, and client billing. The PAS orientation of the
reference laboratory LIMS does not imply that such systems are inferior.
In fact, due to the high volume - but relative simplicity - of the reference
laboratory's operation, the sample-oriented LIMS provides an efficient
and reliable means by which incoming samples can be accessioned and
work listed, tracked and reported in a timely manner.
Although sample-oriented LIMS are not optimally designed to support
clinical research protocols, some reference laboratories seek to augment
their traditional sources of revenue by securing contracts with pharma-
ceutical clients. However, faced with the expanded array of services
required to support the clinical trial environment properly, limitations of
the PAS-oriented LIMS quickly become apparent. In general, the high
level of tracking, reporting and event monitoring which the pharmaceutical
client needs to effectively manage the clinical program is not easily
supported from the PAS-oriented LIMS environment. Table 5.2 lists the
functional limitations most commonly encountered when the traditional
sample-oriented LIMS is used to support clinical trial laboratory services.
Because it is inherently difficult to capture the design elements of the
sponsor's protocol in the protocol-asynchronous LIMS, protocol-driven
services such as specimen management, reflex testing, special reporting
needs, and ongoing validity checking of the sponsor's database either do
not take place at all or, due to the lack of system automation, are
performed in an unreliable manner. In particular, PAS-oriented LIMS lack
the ability to construct FDA-compliant audit trails, to perform ongoing
validity checking of the client's database throughout the course of the trial,
and to provide patient and study management functions on a protocol-
specific basis.

Table 5.2 Functional limitations of the traditional sample-oriented LIMS

Protocol-specific design

Visit sequence. Limitation: visit sequence definition is not inherent to the
traditional LIMS. Impact on sponsor: investigator, visit and test schedule
are not predefined at the time of database construction.

Visit-specific kit design. Limitation: generic kits are used, with no
predefinition. Impact: study co-ordinators would need bulk supply
materials and would add barcoded labels, corresponding to the requisition
accession number, to all samples.

Deferred testing. Limitation: testing is completed on all samples within
24-96 h of receipt. Impact: the LIMS dictates, rather than accommodates,
protocol testing requirements.

Batch testing. Limitation: batch testing occurs at the convenience of the
LIMS, not as defined by the protocol. Impact: the LIMS does not route
and monitor batch testing based on quantity and a predefined schedule.

Reflex testing. Limitation: reflex testing is general rather than protocol-
specific. Impact: there may be additional non-protocol testing or missed
testing.

Result reporting

Sponsor-specified design. Limitation: the report format is specified by the
LIMS. Impact: information must be transcribed from the standard
laboratory report to the CRF.

Trigger statements. Limitation: not applicable to the LIMS. Impact: the
investigator would have to remember particular actions to take based on a
test result, e.g. 'place patient on vitamin supplementation' or 're-order
given test before enrolment'.

Alert flagging. Limitation: standard high/low flags print on the report.
Impact: the investigator would need to decipher the impact of a high or
low flag based on the protocol therapeutic area.

Delta assignment. Limitation: delta flags are not inherent to the traditional
LIMS because the patient visit sequence is not a standard function of the
LIMS. Impact: patient laboratory reports must be monitored manually and
continuously for delta changes between visits.

Protocol management

Specimen management. Limitation: not an inherent part of a traditional
LIMS. Impact: the investigator would need to store samples or ship them
to a variety of destinations.

Result management. Limitation: not an inherent part of a traditional
LIMS. Impact: results captured at the investigator site would need entry
into a common database at the end of the study.

Database validity

Patient information. Limitation: straightforward key entry of patient
information, with no validity checking for consistency between visits.
Impact: patient identification is not guaranteed at the time of database
transfer and lock.

Visit sequence. Limitation: visit sequence definition is not inherent to the
traditional LIMS. Impact: the integrity of the visit sequence is not
guaranteed at the time of database transfer.

5.3.2 LIMS structures in the clinical trial laboratory environment


In contrast to the reference laboratory using a sample-oriented (protocol-
asynchronous) LIMS, the underlying database structures used by the clinical
trial laboratory's LIMS should ideally be synchronous with the fundamental
elements of the sponsor's protocol. This is most readily accomplished by
holding the protocol and related information structures in a multidimensional
matrix. The most discrete database element, the accession number, is held
in a non-hierarchical fashion and is combined with other matrix elements
to define the temporal aspects of the protocol. By using 'protocol-
synchronous structures' (PSS) to incorporate elements of the protocol's
design into the clinical trial database, the laboratory and its administrative
support staff can effectively circumscribe and manage the most basic
element of the protocol - the Patient Event. As shown in Figure 5.1, the
Patient Event can be schematically represented as the least divisible time
factor in the protocol's Laboratory Events Schedule (LES).

Figure 5.1 Laboratory Events Schedule, a component of most clinical trial
protocols which is provided to the laboratory by the sponsor prior to study
commencement, listing the scheduled Patient Events. The schedule pairs
the protocol design elements (patient screening and enrollment objectives,
investigator population and requirements, sample collection and IATA-
compliant shipping materials, the laboratory instruction manual, specimen
and result management, and the laboratory report format) with a grid in
which tests (hematology, chemistry, urinalysis, special chemistry, esoteric
testing, fasting status and weight) are scheduled by visit for the patient
pool of each investigator. The Laboratory Events Schedule represents the
functional specifications upon which the Time and Events Matrix (TEM) is
based. As described in the text, the TEM is one of three protocol-
synchronous structures which are used to provide the specialized
functionalities needed to support the clinical trial process.

The Patient Event, which includes both non-laboratory and laboratory
activities, can
be defined as the sum total of all interactions between patient and
investigator that occur at the time of the scheduled or unscheduled visit. It
is the responsibility of the central laboratory to assist the investigator in the
management of the Patient Event by providing logistical support and
materials (e.g. sample collection materials, prepaid airway bills, IATA-
compliant mailers). By doing so, the central laboratory can help ensure
that specimens are collected and transported in a safe and expeditious
manner and that proper patient demographic and secular data are recorded
for inclusion in the sponsor's database.
To provide protocol-synchronous laboratory services to its pharmaceutical
clients, Quintiles Laboratories Ltd (QLAB, Atlanta, GA, USA and
Edinburgh, Scotland, UK) has developed a LIMS application using a
LIMS application development tool provided by Fisons Inc. One of the few
commercially available LIMS structures that complies with ISO 9000
standards, the Fisons DMPK (Drug Metabolism and Pharmacokinetic)
product is an open system that allows the use of multidimensional phases to
define time-based events during the conduct of a research protocol. Used
as a layered product along with the Fisons SampleManager LIMS product,
QLAB's Information Systems and Support (ISS) group has created a series
of protocol-synchronous structures designed exclusively to meet the needs
of the dedicated clinical trials laboratory. Using only menus and software
toggles found within the QLAB clinical trial application package (QLIMS),
the laboratory's Clinical Data Services and Protocol Services groups can
predefine the sponsor's database to reflect the true design elements of the
protocol. This approach ensures that project start-up is accomplished
quickly and efficiently and that all software functionalities and controls
required to manage the sponsor's protocol are already in place without the
need to write customized code within the LIMS environment. Since
protocol-specific services can usually be provided without the need to write
additional source code, ISS efforts that would otherwise be dedicated
toward the testing and validation of 'single-use' customized code can be
directed instead toward maintenance and development of the main LIMS
application.
Identification of the protocol-synchronous structures to be incorporated
into the architecture of the QLIMS system was accomplished by examining
Laboratory Events Schedules and related information from numerous
clinical trial protocols. In general, three categories of protocol-synchronous
structures were identified which, by merit of their incorporation into the
QLIMS application, allow for the predefinition of the sponsor's database.
The three categories of protocol-synchronous structures are:
• Protocol-synchronous time and event structures
• Protocol-synchronous control structures
• Protocol-synchronous output structures.
As shown in Figure 5.2, the three categories of protocol-synchronous
structures were developed in the QLIMS software as discrete, interactive
modules. At the time of database construction, information derived from
the sponsor's protocol is loaded into each of the three modules from menus
that provide access to a combination of browse lists, text files and software
toggles. After loading is completed, each of the resulting datasets can be
printed and submitted to the quality assurance group for purposes of
testing and validation. The datasets are subject to revalidation whenever
edits occur after the commencement of the study. The use of discrete
interactive modules within the QLIMS environment allows for distributed
and concurrent loading of the sponsor's database by those departments
within the central laboratory facility that have the most immediate access
to critical client information. Overall, distributed database loading using
interactive modules reduces database construction costs and allows for a
more efficient and timely launch of the sponsor's clinical trial.

5.4 Defining protocol-driven time and events using a multidimensional matrix

A clinical trial protocol is comprised of a series of events that are carried
out at particular times and in a particular sequence. In QLIMS, major time
and event elements are captured in a multidimensional matrix in which
discrete addresses are represented by the sample's accession number.
Within the multidimensional matrix, a singular accession number defines a
particular Patient Event and an array of accession numbers defines a series
of visits for a particular patient participating in the research protocol.
The major elements of this multidimensional matrix are illustrated in
Figure 5.3 and include: (1) the protocol template, (2) the study, (3) the
investigator identity, (4) the visit name, (5) the visit type, and (6) the test
schedule. Although these basic time and event elements are adequate to
meet the needs of most Phase I through Phase IV clinical trial protocols,
the flexibility of the product used at QLAB allows additional phases to be
created as needed. Thus, a complicated hospital-based protocol can
include additional time and event elements such as may be needed to
accommodate serial blood draws or multiple types of early termination
visits. After creation of the time and event matrix (TEM), QLIMS uses a
function to capture the design elements as a protocol template. The
protocol template is, in turn, used to create and propagate 'miniprotocols'
for each of the investigators who will participate in the clinical trial.
Referred to as 'studies' within the QLIMS system, these 'miniprotocols'
include all elements of the sponsor's main protocol that are to be
conducted at a particular investigator's site. Thus, studies can be described
as the least divisible unit of the sponsor's database which captures all of the
major time- and event-driven elements of the main protocol.
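
Schematically, one address in such a matrix might be pictured as follows
(a Python sketch with invented identifiers; QLIMS holds these structures
in its own tables rather than in code of this kind).

# One discrete address in a Time and Events Matrix: the accession
# number links the six major protocol elements. Values are invented.
tem = {
    "A0100297": {
        "protocol": "SPONSOR-123",        # protocol template
        "study": "SPONSOR-123/BLACK",     # investigator 'miniprotocol'
        "investigator": "Dr. Black",
        "visit_name": "Week 3",
        "visit_type": "Scheduled",
        "test_schedule": ["Hematology", "Chemistry", "Urinalysis"],
        "patient": None,   # the one link completed at accessioning
    },
}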

Figure 5.2 Three major types of protocol-synchronous structures upon
which the Quintiles LIMS is based. Time and Event Structures (protocol,
study, investigator, visit name, visit type and test schedule) are derived
from the sponsor's Laboratory Events Schedule (see Figure 5.1). Control
Structures (min/max allowable age, pattern matching, assign/check patient
number blocks, visit sequence verification, optional and non-protocol test
restriction, and reflex and demographic-based test ordering) provide
real-time validation and error checking during entry of extemporaneous
data. Output Structures (record a data clarification, delay the medical
report until resolved, activate the VAX fax gateway and send the
clarification to the investigator, receive the clarification, update QLIMS,
and release the medical report) interact with both the Time and Events
Matrix and the Control Structures to provide the specialized reports and
other output to clinical trial clients.

Figure 5.3 Multidimensional matrix. The Time and Events Matrix (TEM)
is a multidimensional matrix which holds the six major elements of the
sponsor's protocol (protocol, study, investigator, visit name, visit type and
test schedule). The accession number represents a discrete address in the
matrix which defines a particular Patient Event. Because all protocol
design elements are linked in the TEM, numerous protocol-synchronous
Control and Output Structures can be devised to support the special
services of the clinical trial laboratory.

5.4.1 Using the accession number as a discrete address in a multidimensional matrix

As discussed previously, discrete addresses in the protocol's multi-
dimensional matrix (time and event matrix or TEM) are defined by a
unique linchpin identifier, the accession number. Because the accession
number functions as a discrete address in the matrix, it can be used to link all
six major time and event elements of the sponsor's protocol. As shown in
Figure 5.4, the time and event elements are prospectively linked in the
sponsor's database to the accession number address at the time of database
construction, i.e. before receipt of the first laboratory specimen. Since the
identity of the patient cannot be foreseen at the time of database
construction, this one remaining link in the time and events matrix is
completed at the time of accessioning by engaging a protocol-synchronous
control structure which defines the sponsor-specified format for the patient
identification schema. Thus, the input of these final ad hoc links by the
QLIMS operator can be subjected to high-level validity and pattern
checking. The use of protocol-synchronous control structures to error-
check those elements of the sponsor's database which must be entered by
the QLIMS operator after database construction has been completed is
an important concept which is explored in greater detail later in this
chapter.

Figure 5.4 Linking protocol Time and Events through the accession
number. All elements of the protocol's Time and Events Matrix are
prospectively linked at the time of database construction via a unique
eight-character accession number. As shown, the accession number's
address defines the visit and test schedule, patient demographics,
collection dates and times, and other information related to a patient's
visit to an investigator site.

Because the accession number is used to link all six major time and event
elements of the protocol prospectively, a number of important controls and
functions can take place within QLIMS which are not feasible in the
sample-oriented, protocol-asynchronous LIMS environment. One of the
most important functions of the central laboratory is to manufacture,
distribute and control protocol-specific clinical trial materials (CTM) used
to support the laboratory component of the sponsor's clinical trial. To
expedite sample receipt and processing and to allow for enhanced tracking
capabilities, the accession number is reflected on all collection materials
which are distributed in advance to each investigator participating in the
protocol. Collection materials (typically in a visit-specific 'kit' format) are
used to record the demographic and secular data and to return the required
specimens to the central laboratory. As each visit is assigned a protocol-
specific visit name (screening, baseline, week 3, etc.) and a visit type
(scheduled, unscheduled, early termination), these descriptors (as well as
the protocol identification, investigator name, and testing requirements)
can be captured and preprinted on the collection materials provided to
each investigator. In this manner, the time and event elements derived
from the protocol's Laboratory Events Schedule and the sample's
accession number are already reflected on the CTMs prior to the actual
Patient Event. Upon receipt of the collection materials at the central
laboratory, the only action needed to complete the matrix address
assignment is to input the patient identification schema for the patient for
whom the collection kit was used.
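
A control structure of this kind might be sketched as follows (Python,
with an invented patient identification schema; the real schema is
specified by the sponsor and enforced by QLIMS itself).

import re

# Invented sponsor-specified schema: two letters plus three digits.
PATIENT_ID_SCHEMA = re.compile(r"^[A-Z]{2}\d{3}$")

def accession(tem, accession_no, patient_id):
    # Complete the one remaining link in the Time and Events Matrix,
    # subjecting the operator's entry to pattern checking first.
    if not PATIENT_ID_SCHEMA.match(patient_id):
        raise ValueError("patient ID fails protocol schema: " + patient_id)
    tem[accession_no]["patient"] = patient_id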

5.4.2 Using the time and events matrix to manage the production,
distribution and inventory of clinical trial materials
Clinical trial materials (CTMs) which are produced by the full-service
central laboratory include protocol-specific investigator instruction manuals,
visit-specific collection materials, mailers, and express courier airway bills.
Functioning as a linchpin for the time and event elements in the protocol,
QLAB's eight-character alphanumeric accession number is produced in
both human-readable and barcoded format on all CTMs that are produced
at the facility. This process is facilitated by high-resolution thermal graphic
printers (Zebra Technologies Corporation, Vernon Hills, IL, USA) which
can use specialized labels able to withstand extremes of temperatures and
humidities for both frozen and ambient specimens. Interfaced directly with
the Fisons LIMS, the thermal graphic printers and conventional graphical-
mode dot-matrix printers (Okidata 3410s, OKI America Inc., Japan) are
used to produce kit building work orders, laboratory requisitions,
barcoded tube and exterior kit labels, and preprinted airway bills for each
of the investigators.
As one of only three central laboratories supporting global clinical
trials, QLAB serves in excess of 2600 physicians throughout the USA,
Canada and Europe. Thus, systematic tracking and inventory control of
clinical trial materials is an important undertaking. Due to the protocol-
synchronous LIMS structures used in QLIMS, shelf inventories of all
investigators for a particular protocol can be automatically tracked via a kit
inventory module (KIM). The kit inventory module is directly interfaced
with the specimen receipt area of the facility where, using hand-held laser
barcode scanning guns (Symbol Technologies Inc.), QLAB personnel can
register receipt of a particular kit type by scanning the accession number
which appears on the exterior of the kit (Figure 5.5). This process registers
receipt of the kit into the facility, records the incoming airway bill number
and decrements the shelf inventory at the investigator's site. At night, a
background process monitors shelf inventories of kits and, based on
protocol- and investigator-specific thresholds for each kit type, work
orders are produced and printed in the kit building facility. The kit building
work orders are used by the laboratory operations personnel to produce
replacement kits for shipment on the following day. As with all specialized
functions within QLIMS, the kit inventory module was written using
Fisons' proprietary programming language, VGL. The open structure of
the Fisons LIMS allows the efficient development of new functionalities as
the needs of the central laboratory expand. Quality checking and
distribution of outgoing CTMs are also monitored within QLIMS using the
laser scan gun approach. As kits are constructed for each investigator, each
kit is quality checked by scanning the kit contents prior to packing into
large shipping cartons. As shown in Figure 5.6, this process ensures that
the kits contain the appropriate materials (e.g. laboratory requisition,
blood and urine collection tubes), that all accession numbers on the various
items in the kit match the laboratory requisition, that the shelf inventory at
the investigator site is automatically incremented, and that records of the
outgoing airway bill number are made for the purposes of audit-trail
documentation. It is the use of protocol-synchronous data structures
during the construction of the client's database that allows a broad array of
quality checks to be carried out with a single scan of the accession number.
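
The nightly inventory sweep can be pictured with a short sketch. This is an
illustration only, with invented names and thresholds, not the VGL code used
in the kit inventory module:

    def nightly_kit_reorder(shelf_inventory, thresholds):
        """Emit a kit-building work order for every (site, kit type)
        whose shelf count has fallen below its protocol- and
        investigator-specific threshold."""
        work_orders = []
        for (site, kit_type), on_shelf in shelf_inventory.items():
            minimum = thresholds.get((site, kit_type), 0)
            if on_shelf < minimum:
                work_orders.append({'site': site, 'kit_type': kit_type,
                                    'build': minimum - on_shelf})
        return work_orders

    # Example: site 102 has dropped to two 'week 3' kits against a
    # threshold of five, so three replacement kits are ordered.
    inventory = {(102, 'week 3'): 2, (102, 'baseline'): 6}
    print(nightly_kit_reorder(inventory, {(102, 'week 3'): 5,
                                          (102, 'baseline'): 5}))
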

5.4.3 Using the time and events matrix to monitor receipt and validity
checking of incoming specimen collection kits
For outgoing kits, laser scanning of the barcoded accession number
provides several key quality checks and permanently documents these
checks in the Patient Event audit trail. This same process is used to register
receipt of each incoming specimen collection kit shipped from the
Figure 5.5 Photograph showing laser scanning of incoming specimen collection kits.
Incoming specimen collection kits are laser-scanned to document that all necessary items have
been received, to determine their condition upon receipt, and to ensure that all barcoded
accession numbers match the laboratory requisition. Missing or unauthorized items are
documented in the client's database audit trail for future reference.

investigator site. Due to the prospective linking of time and event elements
from the protocol, it is possible to complete and document an array of
functions simply by scanning the incoming accession number. As shown in
Figure 5.7, these functions include:
• Decrementing of shelf inventory at the investigator site
• Recording of time and date of specimen receipt
• Validation of the accession number on each item in the kit to ensure
that this linchpin number is consistent on all items and on the
laboratory requisition
• Ensuring that all expected items have been received and documenting
any items that are missing
• Recording receipt of any unauthorized or unexpected items
• Recording the incoming airway bill number
• Recording all of the above in a permanent Patient Event audit trail.
The ability to undertake highly automated validity checking of the
incoming specimens is an important benefit to the central laboratory. By
doing so, the accessioning process and analysis of specimens can occur in a
timely manner, thereby allowing for the transmittal of critical patient alerts
to the investigator sites within hours of specimen receipt.
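
A rough sketch of this receipt-scanning logic follows, under the assumption
of simple in-memory structures and of item barcodes that embed the accession
number (the actual QLIMS audit trail and inventory files are of course more
elaborate):

    import datetime

    def register_kit_receipt(accession, site, kit_type, scanned, expected,
                             airbill, shelf_inventory, audit_trail):
        """Single-scan registration of an incoming kit: check contents,
        log the airway bill, decrement the site's shelf inventory and
        write everything to the Patient Event audit trail."""
        missing = sorted(set(expected) - set(scanned))
        unexpected = sorted(set(scanned) - set(expected))
        # Assumption: each item barcode begins with the kit's accession number.
        mismatched = [item for item in scanned if not item.startswith(accession)]
        shelf_inventory[(site, kit_type)] -= 1
        audit_trail.append({
            'accession': accession,
            'received': datetime.datetime.now().isoformat(),
            'airbill': airbill,
            'missing': missing,
            'unexpected': unexpected,
            'mismatched': mismatched,
        })
        return not (missing or unexpected or mismatched)
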
Figure 5.6 Photograph showing laser scanning of outgoing specimen collection kits. Specimen
collection kits are manufactured using barcoded work orders. Prior to shipment to
investigator sites, kits and work orders are laser-scanned to record the airway bill and
accession numbers in the client's database audit trail and to increment the QLIMS shelf
inventory of each kit type on file at the investigator site.

5.5 Managing protocol-driven time and events using matrix-dependent
control structures

Clearly, the time and events matrix (TEM) is a useful LIMS structure for
managing those elements of the sponsor's protocol which can be
predefined in the database in advance of study start-up. The manufacture,
distribution and receipt of protocol-specific clinical trial materials is an
example of how a well-defined time and events matrix can be used to
manage efficiently those events in the protocol that are prospective in
nature.
While, indeed, many times and events in a research protocol can be
foreseen, it is the nature of other events in the clinical trial to be
extemporaneous or ad hoc. Examples of protocol events that are
[Flowchart boxes include: 'Record date/time of kit receipt' and 'Scan each kit content accession number'.]

Figure 5.7 Flowchart showing validation of kit contents using protocol-synchronous control
structures. Specimen kits and contents are laser-scanned to validate and capture relevant time
and transport information in a regulatory-compliant audit trail. Control structures are used to
detect protocol violations and thereby initiate an automated fax request for data clarification
to the investigator, either at the time of kit receipt or during sample accessioning.

extemporaneous and, therefore, cannot be foreseen at the time of database
construction are: (1) key-entry of patient identification schema during
accessioning, (2) automatic ordering of pregnancy tests for selected
patients based on criteria provided by the sponsor, and (3) reflex ordering
of additional analyses based on results obtained from the regularly
scheduled testing. The well-designed clinical trial LIMS must be able to
manage both prospective and extemporaneous events in a fully validated,
automated and reliable manner. QLIMS accomplishes this task at the time
of database construction by loading protocol-synchronous control structures
that are harmonized with the protocol's time and event matrix. After all
elements of the TEM (protocol, study, investigator, visit name, visit type
and test schedule) are in place, QLAB's Clinical Data Services personnel
use menu-driven functions to define protocol-synchronous control structures
which provide a broad range of validity checking, range checking and other
event-driven actions critical to management of the sponsor's clinical trial.
Thus, while extemporaneous events themselves cannot be foreseen,
protocol-synchronous control structures can be used to manage extemporan-
eous events when they do occur. By using this approach, the same degree
of automation and validity checking for unforeseen events is provided as is
available in the LIMS for events which are predefined in the TEM itself.
Table 5.3 lists some of the most fundamental protocol-synchronous control
structures (and the corresponding protocol-synchronous time and event
structures) used to manage extemporaneous events in the clinical trial
laboratory.

Table 5.3 Extemporaneous protocol events

Extemporaneous event             Corresponding time and         Corresponding control
                                 event structure                structure

Scanning of kit contents         Protocol-driven kit            If plasma level tube
with scan guns                   contents module                scanned, call site.

Key entry of patient             Investigator name and          Pattern matching of patient
identification at time of        number, visit name and         identification schema;
accessioning                     visit type                     reasonability limits on
                                                                kilogram entry (QLIMS will
                                                                trigger verification on an
                                                                entry of >130 or <30 kg)

Automatic ordering of            Test schedule                  Age group 18-50 for visits
pregnancy tests for                                             1 and 7 only.
selected patients

Investigator requests            Allowable non-protocol         Non-protocol testing limited
additional testing               testing defined in TEM         to T4 levels only.

Reflex ordering of additional    Reflex test schedule           Medical technician confirms
test analyses based on                                          elevated total bilirubin
results obtained from                                           >2.0, which activates a
regularly scheduled testing                                     trigger routine within QLIMS
                                                                to automatically order
                                                                direct bilirubin.
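
The reflex-ordering row of Table 5.3 can be expressed as a small trigger
routine. The following is a hedged Python paraphrase of that rule (QLIMS
itself implements this in VGL, and the field names here are invented):

    def reflex_orders(result, confirmed_by_technician):
        """Trigger routine: a technician-confirmed total bilirubin above
        2.0 mg/dl automatically orders a direct bilirubin on the same
        accession number."""
        orders = []
        if (result['test'] == 'TBIL' and result['value'] > 2.0
                and confirmed_by_technician):
            orders.append({'accession': result['accession'],
                           'test': 'DBIL',
                           'reason': 'reflex: total bilirubin > 2.0 mg/dl'})
        return orders

    print(reflex_orders({'accession': 'AB12CD34', 'test': 'TBIL',
                         'value': 2.4}, confirmed_by_technician=True))
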

5.5.1 Control structures used in the accessioning process


The accessioning (or registration) of samples into QLIMS is illustrative of
the manner in which control structures are used in concert with the time
and events matrix in a protocol-synchronous LIMS environment. As
described previously, incoming kits are scanned by operations personnel to
validate both the content and the condition of the samples upon receipt at
the central laboratory. After this process is completed, laboratory
requisitions are forwarded to the accessioning staff for final registration of
the samples into the sponsor's database. The accessioning process is
initiated by scanning the barcoded accession number which appears on the
laboratory requisition distributed in each of the kits at the time of study
start-up. Recall that because all time and event structures are defined in
the time and events matrix prior to production of kits and requisitions, the
barcoded accession number is already prelinked to the protocol, the
investigator, the visit and the visit type. After scanning the accession
number, the QLIMS operator need only input demographic and secular
data (e.g. collection date and time, or time of last meal) as shown on the
completed requisition. QLIMS immediately responds with a series of
protocol-specific queries that are derived from the control structures
predefined in the sponsor's database. Because the accessioning process
involves a series of extemporaneous events, a corresponding series of
protocol-synchronous control structures is used to ensure that key-entered
data (or LIMS-generated events) are compatible with the protocol's time
and events matrix. Among the most important of the protocol-synchronous
control structures used in the accessioning process are: (1) minimum and
maximum allowable age of the patient at enrollment; (2) pattern matching
on screening and randomization schema; (3) assignment and checking of
blocks of patient identification schema; (4) phraseology mapping of visit
name and visit type to match those specified by the sponsor; (5) restriction
of optional tests allowed per protocol; (6) control and restriction of non-
protocol laboratory testing; and (7) if requested by the sponsor, the
automatic ordering of selected tests based on patient profiles, e.g.
pregnancy testing. For visits occurring subsequent to the initial visit,
additional protocol-synchronous controls are used to ensure that the visits
are in the sequence specified in the protocol and that patient demographics
and identification schema match those previously key-entered for the
patient whose samples are being accessioned. Generic control structures
(i.e. not specific to a particular research protocol) are used in concert with
protocol-synchronous controls to direct actions such as automatic cancella-
tion of testing for specimens which were received beyond the established
stability for each analyte.
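
Several of these controls reduce to pattern and range checks on key-entered
fields. A compact sketch, with the regular-expression schema and limits
invented purely for illustration:

    import re

    controls = {
        'patient_id_pattern': r'\d{3}-\d{4}',    # sponsor's schema
        'min_age': 18, 'max_age': 65,            # enrollment limits
        'visit_names': {'screening', 'baseline', 'week 3'},
    }

    def validate_entry(entry, controls):
        """Return the list of protocol-specific queries raised by the
        key-entered data; an empty list means the entry is accepted."""
        queries = []
        if not re.fullmatch(controls['patient_id_pattern'], entry['patient_id']):
            queries.append('patient identification does not match schema')
        if not controls['min_age'] <= entry['age'] <= controls['max_age']:
            queries.append('patient age outside allowable enrollment range')
        if entry['visit_name'] not in controls['visit_names']:
            queries.append('visit name not defined in the time and events matrix')
        return queries

    print(validate_entry({'patient_id': '123-4567', 'age': 72,
                          'visit_name': 'baseline'}, controls))
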

5.5.2 Managing the data clarification and data revision processes through
control structures
Although protocol-synchronous control structures can be used to support a
wide range of protocol-specific functions, one of the most important is to
provide automated validity checking of key-entered data at the time of
receipt of the samples from the investigator sites. Requisitions which fail to
pass protocol-specific queries during the accessioning process (e.g. due to
missing or conflicting information) are automatically sequestered by
QLIMS within a Data Clarification Queue (DCQ). Accession numbers
entered into the DCQ are subject to immediate follow-up and clarification
with the investigator. Sequestration of the accession number in the DCQ
provides several important controls: (1) printing of the laboratory report is
temporarily suspended pending outcome of the clarification query; (2) a
GCP-compliant audit trail linked to the accession number is created which
registers the reasons for requesting the data clarification and the resulting
clarification itself; and finally (3) a queue of accession numbers is created
within QLIMS which is used to initiate a fax-gateway transmission directly
from the VAX to the investigator requesting clarification of the Patient
Event.
By predefining control structures within QLIMS for the data clarification
process, QLAB's operations staff can ensure that the database accumulated
throughout the course of the trial is clarified and documented while the
information is still readily available from the investigator's staff. Because
requests for data clarification frequently result in a need to amend the
database, a separate control structure is used to manage the data revision
process efficiently. Functioning in a manner similar to the data clarification
process, the data revision control structures create an automated entry of
the accession number into a data revision queue (DRQ). By monitoring
the DRQ throughout the day, the QLAB medical records group can
promptly issue revised laboratory reports to the field in an automated
manner that meets the distribution criteria specified by the sponsor. The
systematic use of protocol-synchronous control structures to initiate and
complete the data clarification and data revision processes results in a
validated and edited database that meets sponsor specifications and
ensures that marginally performing investigator sites can be provided with
remedial training and intervention.
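
The queueing behaviour described above might be sketched as follows; the
queue names (DCQ, DRQ) follow the text, while everything else is a
hypothetical simplification:

    from collections import deque

    data_clarification_queue = deque()   # DCQ
    data_revision_queue = deque()        # DRQ
    suspended_reports = set()

    def sequester(accession, reason, audit_trail):
        """Hold the report, log the reason and queue the accession
        number for a faxed clarification request."""
        suspended_reports.add(accession)
        audit_trail.append({'accession': accession, 'event': 'DCQ',
                            'reason': reason})
        data_clarification_queue.append(accession)

    def resolve(accession, amends_database, audit_trail):
        """Release the report; if the clarification amends the database,
        queue a revised laboratory report via the DRQ."""
        suspended_reports.discard(accession)
        audit_trail.append({'accession': accession, 'event': 'resolved'})
        if amends_database:
            data_revision_queue.append(accession)
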

5.5.3 Using control structures to support multilevel range-checking
functionalities
One of the most basic responsibilities of the clinical laboratory is to provide
printed annotations (e.g. flags or messages) on laboratory reports
whenever a test result violates a critical value. For the clinical trial
laboratory, report annotations are frequently used to initiate an action by
the investigator, or to draw attention to results that are still within the
reference range but have changed significantly relative to an earlier value.
To fulfil its mission, a central laboratory LIMS must have flagging
capabilities that are both more flexible and more expansive than that used
in the reference laboratory. For example, using protocol-specific control
structures, QLIMS provides five categories of flagging to assist the
investigator and the sponsor in managing patients participating in the
protocol. These five categories are high/low, telephone high/
low, panic high/low, exclusion flagging and delta flagging. As described
below, telephone and panic flag violations are distinguished from an
ordinary high/low flag in that the result must be phoned to sponsor-
specified parties on the same day as testing takes place. Exclusion flagging,
on the other hand, is used to exclude patients from enrolment in the study,
or to withdraw patients from studies in which they are already a
participant. Delta flagging identifies a result on a subsequent visit which is
significantly different from a prior result for that particular patient.
Although some traditional, sample-oriented LIMS can readily accommodate
multiple levels of flagging, two requirements for report annotation
in the clinical trial environment present a challenge that most protocol-
asynchronous LIMS are unable to meet: (1) the need to assign multiple-
level ranges on a protocol-specific basis; and (2) the need to maintain
active, historical versions of ranges whenever edits to the range limits are
made during the course of the clinical trial. Because most traditional LIMS
hold ranges in a one-to-one relationship with 'test codes' (the database
element that defines a test's parameters within the LIMS), new test codes
must be created whenever ranges are needed that are different from those
currently held in the system. In addition, for ongoing protocols, a new test
code must also be created in the protocol-asynchronous LIMS whenever a
range must be edited at the request of the sponsor after commencement of
the study. Failure by the LIMS administrator to create a new test code
would otherwise result in a loss of historical version control and erroneous
laboratory reports would appear should reprinting occur after the date of
the range change. While the creation of new test codes can allow the
protocol-asynchronous LIMS to partially emulate the functions of protocol-
specific control structures, the administrative overhead required to
support and track hundreds of repetitive test codes is both inefficient and
subject to errors at the time of database construction.
In a protocol-synchronous LIMS environment, the situation is consider-
ably simplified by the use of control structures which can allow test codes
to have a 'one to many' relationship with the alert ranges adopted for use
across and within protocols. Using this approach, the sponsor's clinical
team has complete flexibility to assign telephone, panic, exclusion and
delta critical values based on the patient population being studied and the
known toxicity profile of the investigational drug. For example, the
QLIMS test code for serum creatinine (CHTI) can have a telephone high
value of 2.0 mg/dl in a CNS protocol when studying an anxiolytic agent in
patients free of organic disease, and a telephone high of 6.0 mg/dl in
another protocol examining diabetic patients suffering near end-stage renal
failure. Moreover, by linking each accession number in a protocol at the
time of accessioning with the appropriate historical version of the range-
checking control structures, alert ranges can be modified by the sponsor
after the commencement of the trial without the need either to create a
new test code in the system or to suffer loss of historical accuracy on
reprinting of laboratory reports. By using flexible control structures, the
protocol-synchronous LIMS can provide an unlimited array of report
flagging and annotations. Since it is not necessary to create new test codes
in order to emulate version control, data management efforts are
considerably simplified at both the central laboratory and the sponsor's
designated data management group. Thus, the data recipient can be
assured that the test code CHTI will always designate test results for serum
creatinine regardless of the protocol, sponsor-specified alert ranges or
other protocol-specific elements operational for the project.
Control structures can also be used effectively to manage the temporal
aspects of range checking. For example, a common protocol-driven
requirement is the need to provide exclusion flagging at the screening visit
and thereafter suppress exclusion flagging on reports arising from
subsequent visits. While cosmetic suppression of the flag from the printed
page could be accomplished in the laboratory report functionality itself,
the use of control structures within the multilevel flagging routine actually
eliminates the flag from being appended to the sponsor's database when
results are filed from the laboratory. Similarly, the selection of the
reference visit to which subsequent visits will be compared for purposes of
delta flagging is also accomplished by the use of protocol-synchronous
control structures within the multilevel flagging routine. Within QLIMS, a
virtually unlimited pattern of delta checking can be provided to support the
needs of complex protocol designs such as crossover or continuation
studies.

5.5.4 Using control structures to manage and document phone alert and
reflex messaging functionalities
The protocol-synchronous range-checking control structures described
above provide an efficient, reliable and fully automated means to ensure
that critical telephone, panic and exclusionary flags are annotated on the
printed laboratory report. However, since central laboratory data are used
to ensure patient safety throughout the course of the clinical trial,
markedly abnormal results that violate sponsor-approved ranges must be
conveyed to the designated parties in a timely and documented manner.
Using a series of background processes and protocol-synchronous control
structures, QLIMS continuously monitors incoming data from the laboratory
instrumentation for those results which violate critical, protocol-specific
alert ranges. Accession numbers which have laboratory results appended
with an alert flag (telephone, panic or exclusionary value) are automatically
transferred to the protocol services group using a QLIMS-resident Result
Alert Queue (RAQ). The RAQ is monitored throughout the day by
QLAB staff and, based on emergency contact control structures, critical
values are phoned to the investigator, the CRO or the sponsor within
minutes of receipt from the laboratory. Functionalities within QLIMS
allow the Protocol Services representatives to reference recipient phone
and fax numbers easily and then record the identity of the party to whom
the critical result was conveyed. QLIMS automatically time- and date-
stamps the transaction and enters these data elements in an accession
number-linked audit trail for reference as needed. To convey detailed
information which is less 'time-critical' than telephone or panic alerts to
outside parties, a special set of protocol-synchronous control structures can
be utilized to provide 'reflex messaging' on the laboratory report. Based on
sponsor-specified algorithms, reflex messages can be used to initiate an
action at the site or promote compliance to a particular protocol
requirement. Examples of reflex messages used at QLAB have included
advisory statements to initiate vitamin D supplementation for subjects with
25-hydroxyvitamin D levels less than 16 ng/ml, requests for retesting to
confirm abnormal laboratory values, and requirements to undertake
creatinine clearance determinations on subjects with marginally elevated
creatinine and BUN values. Because reflex messages are report annotations
similar to alert flags, full historical control must be maintained in the LIMS
structures to accommodate any edits to the reflex algorithm or to the text
of the messages which may occur at the request of the sponsor throughout
the course of the trial.
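
As a sketch, the background monitor and Result Alert Queue might look like
the following; the queue name (RAQ) comes from the text, while the data
shapes and function names are illustrative only:

    import datetime
    from collections import deque

    result_alert_queue = deque()   # RAQ, polled by Protocol Services

    def monitor_results(incoming_results, alert_ranges, audit_trail):
        """Background pass over instrument results: queue any value that
        violates a critical limit. Only high limits are shown, for brevity."""
        for r in incoming_results:
            limits = alert_ranges.get((r['test'], r['protocol']), {})
            for flag, limit in limits.items():
                if r['value'] > limit:
                    result_alert_queue.append((r['accession'], flag))

    def log_phone_alert(accession, flag, contact, audit_trail):
        """Record who was phoned, time- and date-stamped, in the
        accession-number-linked audit trail."""
        audit_trail.append({'accession': accession, 'flag': flag,
                            'notified': contact,
                            'at': datetime.datetime.now().isoformat()})
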

5.6 Managing protocol-driven time and events using matrix-dependent
output structures

As discussed in the previous section, protocol-synchronous control
structures operate primarily as background processes to manage internal
LIMS functions such as real-time validity checking of key-entered data,
range checking of results, and the creation of action queues used to
manage the laboratory facility. Protocol-synchronous output structures, on
the other hand, are matrix-dependent library routines which function to
provide specialized reporting features used to support the clinical trial
process. Output provided may take the form of printed, facsimile or
electronic formats. Because protocol-synchronous output structures make
calls to the six major elements of the time and events matrix and, where
appropriate, return query statements to the user, specialized output
required to support the clinical trial laboratory can be constructed with
little or no additional coding by the programming staff. Unlike the
traditional clinical laboratory which provides a limited selection of
standard report formats, the use of matrix-dependent, library-based output
structures to automate the production of protocol-specific reports and files
is particularly important to the central laboratory, where each protocol
may require more than a dozen output formats.

5.6.1 Constructing protocol-specific laboratory reports and requisitions
using matrix-dependent output structures
The manner in which protocol-specific laboratory reports are constructed
by using matrix-dependent output structures is illustrative of the utility of
this approach. To construct the format for the laboratory report(s) and
requisition(s) to be used for a particular clinical trial, the user responds to
system-generated queries regarding the protocol identity, patient numbering
schema, visit names and types, investigator identity, and dating formats.
These inputs are provided through 'browse lists', which restrict the choices
to those captured in the time and events matrix for the protocol under
consideration. By using a restricted list of choices, the user cannot specify a
patient identification schema not previously established in the TEM during
the database construction process, nor can errors related to entry of free
text occur. Customized programming is restricted to: (1) literal statements
(determining the content and placement of text or ASCII-based symbols);
and (2) if-conditional statements (e.g. to accommodate reflex placement of
sponsor-specified fields when certain conditions are met on the report),
and thus only limited facility with the VGL programming language is
required. A similar approach is used within QLIMS to construct 'single
use' ad hoc reports easily, or to provide electronic output such as periodic
transmission of laboratory data to data management groups.

5.6.2 Using protocol-synchronous output structures to support the data
clarification process
As discussed earlier, protocol-synchronous controls are used in QLIMS to
initiate requests for data clarifications from the study investigators. This
process is accomplished by using background processes to perform real-
time validity checks as extemporaneous data are entered during the
accessioning process. As shown in Figure 5.8, the accessioning and data
clarification processes are governed by protocol-synchronous control
structures which are predefined in the sponsor's database. Violation of a
relevant control structure results in an automatic sequestration of an
accession number into the DCQ and the creation of an accession number-
linked audit file holding a QLIMS-generated clarification request. By
combining fax-gateway technology with the LIMS-based control structures,
an output structure is created which can be used to format and dispatch the
written DCQ as a facsimile image automatically. A background process
within QLIMS periodically polls the data clarification queue and initiates
the fax transmittal of the request to the investigator site using commercially
available fax-gateway technology (Omtool Inc., Salem, NH 03079, USA).
Since the accession number requiring clarification is linked by protocol-
synchronous structures in the protocol's time and events matrix, the
investigator is provided on the fax transmittal sheet with a QLIMS-
generated history listing all prior dates and visits for the patient in
question. Failure to return the requested information within 8 h results in
an automatic retransmittal of the data clarification request by the same
QLIMS background process which initiated the first transmittal. Third
requests, if needed, are automatically copied to a sponsor-designated
[Flowchart: valid patient information leads to a recorded Patient Event; otherwise clarification is required.]

Figure 5.8 Using control and output structures to support the data clarification process.
Barcoded laboratory requisitions are scanned to activate protocol-synchronous control
structures used to validate operator-entered data. Patient data which violate control
structures are sequestered by QLIMS into a Data Clarification Queue pending contact with
the investigator. Requests for data clarifications are accomplished using protocol-synchronous
output structures, which provide an automatic fax message to the investigator site complete
with patient information held in the LIMS.

recipient for administrative follow-up. Should receipt of the data clarification
result in the need to amend the sponsor's database, the accession number is
placed into the data revision queue (DRQ) for prompt generation of a data
revision report. Additional protocol-synchronous control structures deter-
mine the manner in which data revision reports are to be generated for the
protocol and the identity of recipients for the amended reports. The fax
data clarification utility described is illustrative of the manner in which the
time and events matrix can be used in concert with protocol-synchronous
controls to provide a highly efficient and reliable output structure to
support the special needs of the clinical trial laboratory. In this particular
example, the fax-gateway output structure provides both automation and a
regulatory-compliant audit trail for data clarifications, a process which is
otherwise very time- and labor-intensive. It should be recognized that
while information technologies such as fax gateways can be used in
conjunction with traditional, sample-oriented LIMS, their inherent lack of
TEM-based structures makes it considerably more difficult to achieve a
similar degree of automation for the clinical trial environment.
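
The retransmission logic can be sketched as a periodic poll over the
clarification queue; the intervals follow the text, while the function names
are invented (the real implementation drives a commercial fax gateway):

    import datetime

    EIGHT_HOURS = datetime.timedelta(hours=8)

    def poll_clarification_queue(queue, now, send_fax):
        """Fax each pending request; refax after 8 h without a reply,
        and copy the third request to the sponsor-designated contact."""
        for request in queue:
            if request['answered']:
                continue
            due = (request['last_sent'] is None
                   or now - request['last_sent'] >= EIGHT_HOURS)
            if due:
                request['attempts'] += 1
                request['last_sent'] = now
                send_fax(request['site_fax'], request['accession'])
                if request['attempts'] >= 3:
                    send_fax(request['sponsor_fax'], request['accession'])
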
5.6.3 Supporting electronic data services using protocol-synchronous
output structures
A clinical trial laboratory must provide timely and accurate transmittal of
patient demographic and laboratory data in the format specified by the
sponsor. These electronic data transmittals (EDTs) occur periodically
throughout the trial and can take place via CPU to CPU, magnetic or
optical media. At QLAB, protocol-synchronous output structures are used
to schedule and produce fully automated and validated data transmissions
using interactive, matrix-dependent library routines. Using protocol-
synchronous functionalities, the QLIMS operator can arrange for pre-
scheduled generation of electronic data files on a daily, weekly or other
scheduled basis. Library routines are used to collate the data from the
central database and validate them simultaneously, using modules that
sequester questionable or outlying laboratory values or demographic data.
A separate, protocol-specific, module is used to format the dataset into the
fields specified by the data recipient. An expansion of the concept of using
protocol-synchronous output structures to accommodate the special needs
of the clinical trial process involves the remote transmission of central
laboratory data from the QLIMS database directly to the sponsor's
personal computer via secure modem connections. In this scenario, output
structures within QLIMS are used to create daily transactional data
packets which are encrypted and dispatched on a nightly basis via
automated telecommunications equipment. Using a proprietary interrogat-
ive data management tool, the sponsor can display and print laboratory
values and patient demographics on both a patient and a protocol basis.
Only by combining the time and events matrix with selected protocol-
synchronous controls and output structures can the transactional data
packet be created in a truly automated and reliable manner.

5.7 Summary

The role of the centralized clinical trial laboratory is increasingly
dominated by the need to provide specialized, regulatory-compliant data
management services to the pharmaceutical and biotechnology industries.
This chapter has discussed the merits of using protocol-synchronous LIMS
structures as a means to meet the needs of the clinical trial laboratory more
effectively in the years ahead. Laboratories that support the clinical trial
process are encouraged to investigate this approach in order to better meet
the challenges of this dynamic and important industry.
6 Medical Laboratory Information Systems (LIS)
K.E. BLICK

6.1 History of clinical laboratory computerization

Computerization of clinical laboratories began with attempts to automate
the analysis of specimens in the early 1960s. At that time, laboratory
instruments were being developed that generated analog signals that could
be converted to digital information, which could be processed and stored
on computers. Since computer processing was primitive during this period
and the ability to store and retrieve information was limited to paper tape,
punched cards and magnetic tape, early attempts at clinical laboratory
computerization, while heroic, were not highly successful. Later on,
concomitant with the development of disk storage and retrieval of
laboratory data, a number of commercially available systems emerged
during the decade beginning with the mid-1960s. Commercially available
systems during this period included B-D Spear (Waltham, Massachusetts),
Laboratory Computing, Inc. (Madison, Wisconsin), The Medlab Company
(Salt Lake City, Utah), Community Health Computing (Houston, Texas),
Honeywell (Minneapolis, Minnesota), Diversified Numerical Applications
(Minneapolis, Minnesota), Advanced Medical Systems (New York, New
York), Berkeley Scientific Laboratories (Hayward, California), and
Medical Information Technology (Cambridge, Massachusetts). While
many of these vendors no longer market clinical laboratory systems, a
surprising number of them continue to offer modernized versions of their
earlier systems.
Early systems for the clinical laboratory had many common features:
(1) they were expensive; (2) they were difficult to install, implement and
support; (3) they required a significant amount of manual data entry
because of poor interface capability; (4) they were slow and had to be
purged frequently because of limitations of processing power and disk
storage; (5) they effectively addressed only selected areas of the laboratory
which were number-intensive and automated (such as chemistry and
hematology); (6) they were designed around inflexible hardware, program-
ming languages, and file systems; and (7) they were designed primarily for
batch analysis of clinical specimens. Not surprisingly, many of these
systems actually added to the complexity of laboratory testing and delayed
the flow of specimens and laboratory results through the testing process.
As computer technology has improved, so have the computerization and
automation of clinical laboratories. Clinical laboratories have gone from a
manual system with large numbers of technical and clerical personnel
performing manual tasks to a very highly automated system where
computers communicate with other computers, instruments using robotics
and biotechnology perform assays, and computer 'expert systems' make
both routine and nonroutine decisions based on combinations of events
and data.
Computerization in the past was an option for laboratory professionals.
Today, because of the demands of increasing complexity, variety, and
volume of clinical laboratory tests, together with the requirements for:
(1) improved accuracy and quality and (2) faster turnaround times for
laboratory test results, computerization and automation of the clinical
laboratory are essential for larger and even medium-sized clinical
laboratories, both private and hospital-based.

6.2 Computerization and automation of the 'testing process'

Computerization and automation of the entire 'testing process' is now a
requirement of modern clinical laboratories. The testing process includes:
(1) patient registration: logging of admission, discharge, and transfer
information; (2) order communication: the transmission and receipt of the
physician's order for laboratory services along with selected patient
demographic information; (3) phlebotomy: the logging and scheduling of
specimen collections along with specimen requirements and positive
patient identification; (4) receipt of specimens and prioritization: the
logging and scheduling of specimens for analysis in the laboratory (note:
this may include the transmission or downloading of laboratory test orders
to automated instruments or instrument controllers); (5) specimen analysis
with scheduling and data collection: the actual performance of the ordered
test(s) in real-time or batch mode using automated and nonautomated
methods; (6) quality control and verification: the verification of the
accuracy of the analysis, specimen identification and the integrity of the
specimen; (7) interpretative functions: decision and expert system support
of laboratory test results; (8) reporting of results: the printing or
electronic transmission of laboratory results in real-time and/or batch
mode; (9) statistical and quality assurance functions: logging and tracking
of specimens through the testing process; and (10) billing: the capture and
transmission of billing information for laboratory services provided. Other
aspects of the testing process not included above include handling of
specimens not requiring phlebotomy services, and automated, simultaneous
delivery of specimens and test requisitions to the laboratory for accessioning
and testing via a mechanized process. How computers integrate into and
perform all or portions of the testing process in clinical laboratories
provides the basis for the discussion below.

6.3 How computers function in the clinical laboratory

6.3.1 Admission, discharge and transfer (ADT)


Before clinical laboratory test(s) can be ordered on a LIS, the patient must
be registered on the LIS. ADT data are generally transmitted to the
LIS from the hospital information system (HIS) on new admissions, with
transfer and discharge transactions on the HIS being transmitted to the LIS
as admission updates. As discussed below, patient demographic data (or
admission data) can be transmitted for outpatients as part of the order
communications transaction from the HIS.
Admission, transfer, and discharge information for patients is usually
transmitted to the LIS for all patients, and hence the census files for
patients on the LIS and HIS coincide for inpatients and 'active'
outpatients. Knowledge of the patient's name, identification, etc., is
required (see below).

6.3.2 Order entry and order communication


Most modern hospitals provide for order entry for laboratory services from
patient service areas, including: (1) the patient's bedside or nursing station
for inpatients; (2) remote phlebotomy services for ambulatory patients;
and (3) ambulatory clinics where nurses, attending physicians and house
staff collect specimens for analysis. These orders are placed on a hospital
information system and transmitted to the laboratory information system
by a variety of electronic communication protocols. Along with the
laboratory test order, appropriate patient demographic information is
transmitted to the LIS including patient name, medical records number,
visit number, sex, age, date of birth, patient location, ordering physician,
etc. These data elements are required in order to provide positive patient
identification, phlebotomy services, normal-range interpretations, and
locations and physicians' names for results reporting, billing and special
reporting functions. Some HIS capture billing information along with the
order, and hence charge capture is not done on the LIS. This latter
procedure is generally unacceptable, since it is impossible to predetermine
charges for many laboratory sections such as microbiology, blood bank,
and anatomic pathology, where additional, add-on tests may be required at
the discretion of the laboratory personnel.
6.3.3 Phlebotomy service


Many blood tests are performed in the clinical laboratory. Most of these
blood specimens are collected by the phlebotomy team. The LIS schedules
the collection of all specimens by printing collection labels sorted by
collection routes throughout the hospital. Information regarding these
collections is also downloaded to a special barcode scanner. Details
regarding requirements of specimens (e.g. anticoagulants, volume require-
ments, special handling, etc.) are also printed to sample labels. Barcodes
for the patient's medical records number and the specimen accession
number are also printed on collection labels (see Figure 6.1). At the
bedside, the phlebotomist: (1) scans the barcoded wrist-band on the
patient along with the corresponding barcoded medical records number on
the collection label to ensure positive patient identification; (2) collects the
proper specimens for the test ordered; (3) then affixes the appropriate
barcoded label to each tube of blood at the bedside. The exact time and
date of collection of the samples are stored in the barcode scanner. On
returning to the laboratory, the barcode scanner uploads information on all
specimens collected in order to update the testing files and release these
newly collected specimens for scheduling and actual testing.
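
In outline, the bedside check amounts to comparing the two scanned barcodes
and time-stamping the draw. A toy version follows (the scanner's firmware is
obviously not written in Python, and all identifiers are made up):

    import datetime

    def bedside_collection(wristband_mrn, label_mrn, accession, log):
        """Positive patient identification: the barcoded wristband must
        match the medical records number barcoded on the collection
        label before the draw is recorded."""
        if wristband_mrn != label_mrn:
            raise ValueError('patient/label mismatch - do not collect')
        log[accession] = datetime.datetime.now()   # exact draw time
        return 'collected'

    collections = {}
    print(bedside_collection('1234567', '1234567', '5828', collections))
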

6.3.4 Receipt of specimens and prioritization of testing


As specimens for analysis are received in the laboratory, they are classified
by the LIS as 'available' for testing. At this time, laboratory tests are
encoded as STAT (or priority) and placed in batches ahead of routine
testing on printed worksheets generated by the LIS. Most LIS systems can
handle several levels of priority spanning the 'routine' to the so-called
'crash', the latter test being essential for a patient requiring immediate care
for a life-threatening condition. Turnaround times are logged and
monitored for management purposes for priority testing on most LIS
systems.

6.3.5 Analysis of specimens with scheduling and data collection

Batch analysis, nonautomated. For batch, nonautomated analysis, the
LIS collects requested tests for available specimens into a group and prints
LIS collects requested tests for available specimens into a group and prints
a worksheet. This worksheet should correspond with the specimens
accumulated in the specimen rack for the particular method or assay.
Quality control specimens are also included on the worksheet along with
standards (see Figure 6.2).

Batch analysis, automated/non-'host query'. For automated instruments
without a 'host query' interface (see p. 102), worksheets are downloaded to
the instrument in batch mode. Hence, when the instrument scans the
specimen barcode prior to analysis, the instrument can retrieve which
assays to perform on the specimen from the local database.

Figure 6.1 Barcoded specimen label for phlebotomy and primary tube.

Figure 6.2 LIS-generated worksheet for 'batch analysis'.

Real-time/host query. LIS support of real-time host query automation
requires that the LIS either (1) transmits test request information to the
automated instrument controller (or instrument) as specimens are received
and made available for testing in the laboratory, or (2) stores the test
request in such a manner that it can be transmitted to automated
instruments on demand. Therefore, without intervention from the analyst,
specimens are placed on the instrument and scanned by the barcode
reader, and then requested tests are performed automatically by the
instrument in real time. To obtain test request information from the LIS,
the instrument queries the host computer (or instrument controller) as
barcoded specimens are scanned on the instrument.
Some older instruments require a LIS-generated 'load list', whereby test
results will be transmitted from the automated instrument sequentially by
sample cup position. These interfaces are primitive when compared to the
bidirectional, accession number-oriented interfaces now commercially
available on newer automated analyzers for the clinical laboratory. Some
features of newer automated instruments are listed in Table 6.1.
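
A minimal illustration of the host-query exchange, with the LIS side reduced
to a dictionary lookup (real interfaces speak an instrument-vendor protocol
over serial or network links; accession numbers and test codes here are
illustrative):

    pending_orders = {
        '5818': ['CYCLO'],            # accession number -> tests ordered
        '5827': ['CYCLO', 'CHEM'],
    }

    def host_query(accession):
        """Called when the instrument scans a barcoded specimen: return
        the tests to run, or an empty list if none are pending."""
        return pending_orders.get(accession, [])

    # The instrument scans tube 5818 and asks the host what to run.
    print(host_query('5818'))   # -> ['CYCLO']
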

6.3.6 Quality control (QC) and results verification


A critical step in the analysis of clinical specimens is the verification of both
patient laboratory results and quality control by the technologist. The LIS
maintains quality control ranges on all lots of quality control material for
each analyte. When a QC specimen has results beyond acceptable limits,
the event is flagged by the LIS. The technologist then takes corrective
action prior to reporting of the accompanying patient results. These
occurrences are logged and noted on the LIS. The technologist also verifies
patient results and notes patterns to ascertain that they are 'reasonable',
based on the patient's clinical condition and any prior results obtained on
the patient. The LIS also performs a 'delta check' on each analyte to
indicate that an unexpected variance from previous results on this patient
has occurred. In addition to delta check, the LIS also compares the analyte
results to reference ranges for normal patients of this sex and age, and flags
all test results beyond these reference ranges. After the technologist is
satisfied with all QC and LIS range checking, patient results are verified
for reporting to the patient's chart.
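
Both checks reduce to simple comparisons. A schematic version, with invented
limits, is sketched below:

    def flag_result(value, previous, ref_low, ref_high, delta_limit):
        """Return the LIS flags for one analyte: L/H against the age- and
        sex-matched reference range, plus a delta flag when the change
        from the patient's previous result exceeds the allowed variance."""
        flags = []
        if value < ref_low:
            flags.append('L')
        elif value > ref_high:
            flags.append('H')
        if previous is not None and abs(value - previous) > delta_limit:
            flags.append('DELTA')
        return flags

    # Potassium of 6.1 mmol/l against a 3.5-5.1 range, up from 4.0:
    print(flag_result(6.1, 4.0, 3.5, 5.1, 1.5))   # -> ['H', 'DELTA']
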

Table 6.1 Features of newer systems

(1) Less expensive and more reliable
(2) Good PC connectivity/intelligent workstation
(3) More flexible and adaptable
(4) 4GL programming/SQL
(5) Expert system
(6) Relational database
(7) Connectable/networking and instrument interfaces
(8) Screen-oriented
(9) Multitasking
(10) Portability
(11) Standards
6.3.7 Interpretative functions


Some laboratory results require interpretative comments to be included
along with test results. For example, if the sample is suboptimal due to
hemolysis, lipemia, or icterus, the technologist may comment that a certain
result may not accurately reflect the plasma level in the patient. The LIS
allows for the user to append comments to the analytical results: (1) by use
of a coded comment; (2) by adding free-text comments; or (3) by
programming the LIS's expert program to append a comment to analytical
results when a certain combination of data events occurs on the LIS.

6.3.8 Results reporting


Immediately after results are verified by the analyst, they are transmitted
by the LIS to the HIS for storage and retrieval by physicians and nurses. In
some critical care areas, laboratory results are printed by the LIS in real
time as soon as they are verified. Some hospitals place copies of laboratory
reports in doctors' e-mail (electronic mail) files for review in their offices or
from their home computers. Newer computers also support the automatic
fax of laboratory results to clinic areas and doctors' offices.

6.3.9 Statistics and quality assurance functions


The LIS accumulates various statistics and quality assurance data for
management functions and regulatory review. In many cases, hard-copy
reports are generated by the LIS for permanent records. LIS computers
with ad hoc report generation capability are especially adaptable to the
ever-changing regulations and requirements of laboratory medicine. New
regulations will tend to force computer systems to support a real-time
prospective QC/QA system rather than the current retrospective system.
The system will be required to detect errors and implement corrections
without operator intervention in some cases. Also, newer systems will be
required to support outcome monitoring as part of the QA/QC process.

6.3.10 Billing
The capture of laboratory billing data is an essential function of the LIS.
Billing charges are captured for some tests at test request time; other tests
are billed by the LIS when results are verified. The LIS must handle a
variety of billing discounts, especially in reference laboratories. Billing
transactions are transmitted to the HIS from the LIS on a daily basis. The
actual patient statements and accounts receivable functions are rarely
maintained on the LIS. Rather, these functions are usually performed by
the HIS accounts receivable module.
6.4 Acquisition of a LIS

The selection of a so-called 'turnkey' LIS remains a perplexing problem for
laboratory professionals. There are a number of commercially available
systems which present a variety of approaches to laboratory computerization.
The costs of such 'turnkey' systems span a wide range, and the selection of
a 'poor' solution may negatively impact the laboratory for years.
Therefore, laboratory professionals should pursue the selection process
carefully, making sure that proper selection criteria and goals are clearly
established early in the process.

6.4.1 Areas of the laboratory computerized


The more common areas of the clinical laboratory selected for automation
and computerization include: (1) chemistry; (2) hematology; (3) urinalysis;
(4) serology; (5) coagulation; and (6) microbiology. Computerization of
the blood bank service is now quite common as well. In addition,
computerization of anatomic pathology, which includes (1) surgical
pathology, (2) cytology, and (3) necropsy, is now apparent in many
laboratories. However, 'computerization' to many laboratories for anatomic
pathology may merely mean that the word processing of the reports has
been computerized. Indeed, computerization of anatomic pathology
requires a very different approach from that used in clinical pathology
systems. Successful anatomic systems have been developed specifically for
this area. Some of the more sophisticated laboratories have opted to
address computerization of the laboratory office as well. So-called 'office
automation' is rapidly becoming a necessity, especially in the wake of
burdensome regulations with associated additional paperwork. Automation
of billing functions is also a critical area of focus in most computerized
clinical and anatomic laboratories.

6.4.2 Single vendor versus multivendor


These days, it is highly unlikely that one single vendor can supply the best
solutions to all areas of the clinical and anatomic laboratory. For example,
a vendor may have excellent solutions for automated chemistry and
hematology while the solutions for other areas such as blood bank and
anatomic pathology are suboptimal. Nevertheless, vendors with such
offerings push for a total contract for all areas of the laboratory, touting
that a single-vendor solution is the only acceptable approach. Indeed, in
the past, connectivity and support of multivendor computer environments
were difficult. However, with today's technology (connectivity with
networking and microcomputers), a multivendor solution may even be
preferable, affording the user the option to select the best solutions for
various areas of the laboratory. Also, multivendor solutions can be
installed in a modular fashion and tend to cost much less to purchase and
support.

6.4.3 Computer hardware features


Newer computer hardware is essential for the computerization of modern
laboratories. Computer memory and processing power have increased
substantially in recent years, and these enhancements are now required for
most applications in the laboratory. Computer disk storage and tape
backup capability have improved markedly as well. Computer reliability
and supportability of newer systems are substantially better than old
systems. However, all of these features of newer computer platforms tend
to vary between vendors. The laboratory technologist is advised to select a
hardware platform with a proven track record in a variety of installations.
Many commercial vendors of laboratory systems sell both the hardware
and the software and frequently tend to underconfigure the more tangible
aspects of the system, i.e. the computer hardware. Users are advised to
purchase more hardware capacity than needed at the outset because of
anticipated growth and development on the new system. I have never
heard a laboratory user complain about a system being too powerful, too
fast, able to store and print too much data, etc.; quite the contrary, most
laboratory workers complain of processing being too slow. It should also
be noted that an underconfigured system tends to fail under load, i.e. at
the time when you need the system most. Also, the cost of additional
processing power has been reduced dramatically in recent years, and hence
underconfiguration of the hardware platform is not necessary for a cost-
efficient system.
Hardware should be reliable and must be functional essentially 100% of
the time. Unplanned downtime is no longer acceptable on newer systems.
The risk of data loss due to the inevitable disk failure (or 'disk crash')
should be minimized with: (1) disk mirroring (or shadowing) and (2) a
transaction logging file backup. This latter file allows for an automated
'roll forward' from the last backup in the event of data loss due to a disk
failure. Failures of newer systems tend to focus on the mechanical aspects
of the system, while newer processors and electronic components tend to
be very reliable. Hence, the expense involved in the purchase and support
of a totally redundant or 'fault tolerant' system appears to be no longer
warranted when proprietary minicomputer solutions are purchased. It
should be noted, however, that newer 'open' Unix systems and micro-
computer network solutions can provide a totally 'fault tolerant' system at
an affordable price. Selected features of older laboratory information
systems are listed in Table 6.2 while some aspects of newer systems are
summarized in Table 6.1.
Table 6.2 Features of older systems

(1) Many installed sites
(2) Large minicomputer with 'dumb' terminals
(3) Expensive to buy and support
(4) Inflexible
(5) Field-oriented software
(6) No networking
(7) Proprietary vendor attitude
(8) No intelligent workstations

6.4.4 Minicomputer versus microcomputer solutions


Actually, the best LIS solutions for the laboratory involve a combination of
minicomputers and microcomputers with the latter serving as intelligent
terminals to powerful minicomputers. The minicomputers and micro-
computers are linked together in a network, with the microcomputer
workstations equipped with an emulation program running under a graphic
user interface. Various applications and computer systems in the hospital-
wide network appear as icons on the screen with the workstation
supporting 'hot-key' access to all systems on the network. Newer
minicomputers can also serve as 'client servers' allowing a variety of
different personal computer workstations. For example, such a system may
include the use of IBM personal computers (or clones), Apple Macintosh
and Unix workstations, all serving as intelligent terminals on the host
minicomputer.

6.4.5 Software and database design


Older computer systems and newer solutions for the laboratory are best
distinguished by the appearance and functionality of the software. Older
systems are 'field oriented' while newer systems are 'screen oriented'. For
example, on newer systems, the user fills a screen with data and then
enters it with a single keystroke, while on older systems, the user
enters data one field at a time. Navigation, or moving from one field to
another on a screen, is possible with screen oriented software, while
navigation is limited with field oriented software.
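
The difference can be illustrated with a small hypothetical sketch in
Python (the function and field names are invented): field oriented entry
commits each field in its own round trip to the host, while screen
oriented entry gathers the whole screen locally and commits it in one
action.

    def enter_field_oriented(store, fields):
        # Older style: one host update per field, in a fixed order.
        for name, value in fields.items():
            store[name] = value

    def enter_screen_oriented(store, fields):
        # Newer style: the user fills the whole screen locally, then a
        # single keystroke enters it as one unit.
        store.update(dict(fields))

    record = {}
    enter_screen_oriented(record, {"patient_id": "12345", "test": "glucose"})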
The inclusion of other features in the software of a LIS should be noted.
For example, newer software solutions make use of: (1) function keys for
special tasks; (2) color; (3) so-called pull-down menus and tables along
with user 'help' on manual data entry; (4) multiple keys for navigation
between fields on a screen and between screens in the application;
(5) multitasking with the ability to run several applications from within an
application from menus or command lines; and (6) decision support. The
system software should be user-programmable with a 'user-friendly'
programming language for special ad hoc reports and downloads of data in
batch to microcomputers. Downloads of portions of the database to
microcomputers are essential on newer systems, allowing specialized reports
and graphics to be accomplished on microcomputer software especially
designed for such purposes. The design of applications should be
'intuitive', with screen instructions and help. Older solutions tend to be
more cryptic and user-unfriendly, requiring extensive documentation and
training along with considerable support. Various software issues for
laboratory computer systems are listed in Table 6.3.
Decision support and expert programming should be part of newer
solutions for the laboratory. However, decision support programming
should not degrade overall system performance in a noticeable fashion. In
my experience, decision support features on non-database systems tend to
degrade performance because of substantial additional disk access require-
ments. Expert-system features which allow special programs to be written
to minimize user keystrokes and screens are essential. Using
such tools, users can adapt the system to the needs both of users who are
computer-literate and of those who are not.
The use of a relational database is a desirable feature in any LIS.
Software that is designed independently of file structures is more easily
adapted to the changing laboratory environment. Also, ad hoc reporting is
facilitated by a relational database system.
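
As a minimal sketch of this point, using Python with SQLite standing in
for a commercial relational system (the table and column names are
invented), samples and results live in separate, linked tables, so a new
ad hoc report is simply a new query rather than a change to file
structures:

    import sqlite3

    con = sqlite3.connect(":memory:")
    con.executescript("""
        CREATE TABLE sample (id INTEGER PRIMARY KEY, received TEXT);
        CREATE TABLE result (sample_id INTEGER REFERENCES sample(id),
                             analyte TEXT, value REAL, units TEXT);
    """)
    con.execute("INSERT INTO sample VALUES (1, '1995-01-10')")
    con.execute("INSERT INTO result VALUES (1, 'glucose', 5.2, 'mmol/L')")

    # An ad hoc report requires no change to the underlying structures:
    query = """SELECT s.id, r.analyte, r.value, r.units
               FROM sample s JOIN result r ON r.sample_id = s.id"""
    for row in con.execute(query):
        print(row)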

6.4.6 Most important issues when selecting a LIS


The most important features which must be present in a LIS system are:
(1) processing speed; (2) reliability of the system; (3) connectivity and
networking; and (4) flexibility and adaptability. I have followed these
general criteria in the selection of replacement systems with good success.
As discussed above, newer systems tend to be more powerful and reliable
provided that the system is configured properly. Specifications for
response time under load should require 1-2 s between screens in an
application. Program loads should take no longer than 5 s. Some vendors
provide the illusion of rapid response from screen to screen by spooling

Table 6.3 Software issues

(1) Standard programming language
(2) Database design (file-independent software)
(3) Department subsystems
(4) Instrument interface software
(5) Networking
(6) Portability
(7) Off-the-shelf solutions
(8) Ad hoc query

disk updates. This programming 'technique' is an obviously fraudulent
practice used commonly on many commercially available systems. Users
should be aware that slow response times are more commonly due to
system software design and slow disk access rather than to the processing
power of the system.
Another common problem that users must face is called 'vaporware'.
Many vendors tend to sell future developments rather than their older and
less impressive existing products. Also, vendors try to make anything they
do uniquely in the industry the main issue or criterion for selection of a
system. For example, a vendor with a single-platform solution will try to
convince the potential user that this is the single most important issue to
consider. Others will try to sell the user on the basis of their large number
of installed systems when, in fact, they have been in the business for years
and should have many more installed sites than a newer vendor. Also,
many vendors offer highly proprietary solutions, i.e. solutions where users
have little in-house programming control over how the system operates.
These systems tend to be very expensive in the long run, since the user
must pay the vendor for all programming activity including ad hoc reports.

The sales 'pitch' during demonstrations. The user should beware of being
mesmerized by a gifted salesperson who is adept in presenting the system
in its best light during demonstration sessions. Note that the salesperson
has been trained to maximize the system's strengths while minimizing its
weaknesses. It is the potential user's responsibility to probe with
appropriate questions during such demonstrations and find the weaknesses.

The importance of site visits. The prospective user should pay more
attention to the function of the LIS system during various site visits where
the system is actually being used. Prospective users should talk directly
with site users of the system and log their unbiased remarks. Good systems
tend to have a cadre of devoted users who demonstrate a great deal of
'ownership' during site visits. Because the selection of a LIS involves the
purchase of a highly complex package of technology, the less enlightened
potential user tends to be easy prey to the pitfalls of the selection process.
Certainly, the use of an experienced consultant is a worthwhile investment,
assuming that unbiased advice is obtainable.

6.5 Future of laboratory information systems

With the increasing demands on medical laboratories (see Table 6.4) to
perform more tests with fewer resources, automation and computerization
will be the only means of meeting the demands for high quality testing in
the future. Even in smaller laboratories, automation and computerization

Table 6.4 Business demands on clinical laboratories

(1) Medical technologist shortage
(2) Demands for more testing
(3) Department subsystems
(4) Medicolegal implications
(5) Infectious specimens
(6) Diminishing resources
(7) New regulations
(8) New technologies
(9) New standards
(10) Coverage
(11) Socialized medicine

will be required for survival in the future. Newer computers will continue
to be more powerful and less expensive. Networking will improve along
with better-designed software. New systems will employ more powerful
workstations and networking. Software will be more 'off-the-shelf', much
as personal computing software is presently. Users will become more
sophisticated and will demand more flexibility and adaptability in
commercially available systems. Clearly, vendors will be forced to offer
less proprietary solutions than at present. Many vendors will
join forces to offer a variety of solutions and options to clients as
connectivity becomes easier. Governmental regulations will continue to
expand and will require computerization of quality control and quality
assurance programs. Standards for data interchange between computers
and between computers and automated instruments will continue to
improve the connectivity between systems. Automation and computeriza-
tion will continue until essentially all aspects of the testing process are
included. Users will continue to expand their knowledge and skill in the
use of laboratory information systems. We can anticipate the entire
medical laboratory industry undergoing a vast improvement in the
accuracy and speed of laboratory services because of new capabilities
afforded by automation and computerization.

References

ASTM Committee E-31 (1983) ASTM Standards on Computerized Systems. American
Society for Testing and Materials, Philadelphia, PA.
ASTM Committee E-31 (1991) Specification for Low-Level Protocol to Transfer Messages
Between Clinical Laboratory Instrument and Computer Systems, ASTM Standards vol.
14.01. American Society for Testing and Materials, Philadelphia, PA.
Elevitch, F.R. and Aller, R.D. (1989) The ABCs of LIS. ASCP Press, Chicago, IL.
Johnson, J.L. (1975) Achieving the Optimum Information System. J.L. Johnson Associates,
Northfield, IL.
Johnson, J.L. (1976) Achieving the Optimum Information System (update). J.L. Johnson
Associates, Northfield, IL.
7 EPA's Relational Laboratory Information
Management System: Development and
implementation
W.M. SHACKELFORD, D.M. CLINE and D.L. CLAY

7.1 Introduction

The information produced by an environmental laboratory often results in
regulatory actions or policy decisions. The environmental laboratory also
supplies the information that determines corporate compliance with State
and Federal environmental regulations. That a laboratory is operated by
industry, an independent contractor, or the Government is immaterial to
the laboratory's responsibility to provide information of known quality
within a useful time frame.
In the regulatory arena, that information must also be of documented
integrity, i.e. the information, the sample it represents, and indicators of
data quality must be linked unambiguously. Chain of custody procedures,
good laboratory practices (GLP) and quality assurance plans have
provided this linkage for paper-based systems. Clearly the same principles
apply in the automated laboratory, but the application of these principles
to computer systems is not as obvious.
Key goals of a laboratory information management program are:
• To make data available to the user when needed
• To make sure that the data provided are useful for the purpose for
which they are intended
• To ensure that the data quality and integrity are documented.
The elements of laboratory information include those data needed to
meet the goals of the laboratory:
• Unambiguous sample definitions, required to link samples with results
• Defined procedures for analysis, calculations, and quality control that
provide the documentation of data quality, and allow the user to
determine the utility of the data
• The definition of the resources needed and schedules based on
resource use, and of the required information for effective laboratory
management

• Information on client needs, which tells the laboratory how the
analysis should be carried out, what limits should apply, and how data
are to be distributed.
The EPA's Relational Laboratory Information Management System
(RLIMS) was designed and implemented to answer the specific needs of an
environmental laboratory besides addressing the general goals and
elements of laboratory information management. RLIMS is a flexible
environmental LIMS that supports both planned and ad hoc activities such
as project and sample management, and reporting. It contains a complete,
normalized database that supports core requirements as well as require-
ments for EPA regional and Superfund additions. Since RLIMS was
developed primarily with Fourth Generation Language (4GL) tools, the
modules are easy to customize.

7.2 Development

7.2.1 General principles


Based on the EPA's experience with two prior generations of laboratory
information management software, good application development
practices, and adherence to Agency policies and guidelines, a list of general
criteria was developed. The following were the guiding principles for the
development of RLIMS.
• Use a true vendor-supported RDBMS that supports Structured Query
Language (SQL), and has high-quality application development and
reporting tools.
• Have an open system architecture that could be supported on Agency
standard platforms, could be integrated easily with existing data
collection systems, and could be migrated easily to other technologies
as they matured.
• Have a common database structure at a national level across all
laboratories.
• Include an appropriate balance between common nationally supported
functions and local customization.
• Provide source code to all sites to ease local customization.
• Provide full system documentation and training support including
Functional Specification Manual, Acceptance Test Plan, User Guide,
and a formal training program.
• Support the EPA's Good Automated Laboratory Practices guidance.
• Support the EPA Data Standard for the Electronic Transmission of
Laboratory Measurement Results.
• Use a customer-driven requirements definition and user interface.

7.2.2 Development philosophy/approach


RLIMS was developed using a rapid application development (RAD)
methodology (Figure 7.1). EPA laboratory representatives were heavily
involved in requirements analysis through a series of prototyping workshops
held throughout the development process. The requirements were proto-
typed using Oracle's 4GL tools.
Due to the EPA's prior experience with laboratory information
management systems, there was a good understanding of the types of
automated processes that could be used to support a LIMS. But, because
of the nature of the laboratories' evolving requirements and increasing
capabilities of LIMS technology, the laboratory representatives needed to
identify:
• Which functions should be included in the LIMS
• Which functions would be critical to the successful operation of the
application
• What features would be needed in each function
• What data would be contained in and tracked by the LIMS
• How the user interface should look.

Figure 7.1 RLIMS prototype process.



The methodology used for development of this application had to be
able to determine and specify these requirements uniformly and clearly
across diverse laboratories. This led to the selection of RAD with
prototyping workshops as a way to keep the intense user involvement in
the design and development process that is needed for a successful
application of this type.

7.2.3 Face to face with the user


Scientists are noted for their individuality and independent thinking, and, if
anything, the attendees of the requirements prototyping workshops were
even more so. It seemed that no two laboratories were quite the same in
their requirements or how they operated. This has caused a problem with
the adoption of many LIMS, in that the laboratory must modify its
operations to use the LIMS. Regardless, ground rules were needed to
ensure sufficient agreement concerning the design of the system. The
following complementary rules for the workshops were decided upon and
seemed to work well.

• A function or feature would be included in the system only if 70% of
the workshop participants agreed it was needed
• Any data that any participant wanted to include in the database
structure would be included, regardless of whether it was used by any
of the functions
• The system would be delivered with source code and laboratories
could customize the system to meet their local needs.
The first ground rule allowed the participants to avoid any potentially
lengthy, and probably futile, debate intended to win others over to a single
point of view. Either a 70% consensus for the feature existed or it did not. The
70% figure was arbitrary but was intended to make sure that a significant
majority wanted the function or feature.
The second rule supported the previously stated general principle of
having a common database structure on a national level. Several
advantages accrued from this rule.
• It facilitated national databases and the uploading
and downloading of data to/from them; the goal is that a single
application interfacing to a national database could be written and
used for data transfer at all laboratories
• During the workshops, when a decision was made, based on the first
rule, to omit a function, the relevant data could still be included in the
database for future customization; in this way no participant would
feel disenfranchised by the decision, which made it easier to negotiate
quickly and continue.

• Together with the third rule, it meant that the laboratories could adapt
the system to their needs rather than adapting the laboratory to the
constraints of the LIMS.
The key consequence of the above rules is that all the requirements of all
of the participants could be accommodated in some way.
To help the process along using the ground rules, a list of all the possible
functions that could be identified was developed for the first workshop.
This was the basis from which the system requirements were specified, and
bound the scope of the system at the top architectural level.
A total of six workshops were held over about six months. Between each
workshop, the functions, features and interface specified at the previous
workshop were prototyped for presentation at the next workshop.
Simultaneously the logical design was updated using the RDBMS's CASE
tools, and the RLIMS Functional Specification document and the data
dictionary were modified to reflect the latest requirements. A package
containing the latest documentation was sent to the workshop participants
one to two weeks before the next workshop. At the workshop, the
prototype was demonstrated and the participants exercised the prototype
during hands-on sessions.
Rapid prototyping helps to avoid several major pitfalls common in
analyzing user requirements. Some conditions that rapid prototyping
alleviates are:

• The user has an incomplete understanding of the requirements and/or
is unable to describe them clearly and thoroughly.
• There is disagreement between users and developers over the
interpretation of requirements.
• The user is having difficulty visualizing how the system will work and
look.
A final review of the entire prototype was done at the last workshop.
Related laboratory data were tracked from beginning to end through the
prototype. This allowed the workshop participants to get a feel for how the
entire system would look and function when completed.

7.2.4 Completion and testing


When the final workshop was finished, the system was in varying states of
completion. Some prototyped modules were complete. However, several
functions had been prototyped but not completed. The implementation of
these functions and the incorporation of error processing were then
completed.
In parallel, the design documentation and the EPA RLIMS Functional
Specification document were completed, and the EPA RLIMS Acceptance
Test Plan was written. A 600-page EPA RLIMS User's Reference Guide
was also produced.
The various components of the system were peer-reviewed and any
inconsistencies in appearance or functioning were eliminated. Next, the
components were integrated and tested. The final testing was based on the
EPA RLIMS Acceptance Test Plan. Three rounds of acceptance testing
and subsequent corrections were required to achieve final system accept-
ance.

7.3 Implementation

7.3.1 The finished product


RLIMS is implemented using the Oracle RDBMS with Oracle tools. It
runs both on Digital Equipment Corporation VAXes with VMS and on
Novell NetWare servers. The use of the Oracle RDBMS makes RLIMS
easily portable to other platforms as needed.
RLIMS is a flexible, sample-planning, analysis-tracking, and result-
reporting system specifically targeted at the environmental laboratory.
RLIMS has a comprehensive database with a set of core modules that
address functions that are common to most laboratories (Figures 7.2 and
7.3). The core set of modules and functions includes the following.
• The System Data (or static data) module includes the information that
defines the laboratory operations so that RLIMS can manage the
functions and data for that specific laboratory; it is a way that RLIMS

Figure 7.2 RLIMS functional overview.
Figure 7.3 RLIMS sample data flow.

is customized for each laboratory; major components of the system
data include data items such as service groups (e.g. inorganic or
organic sections), parameters, analyte lists, methods, and statuses;
other system data are accounts, control types, facilities, instruments,
laboratories, matrices, on-line help, result qualifiers, result types,
sampling organizations, and units.
• The optional Project Planning modules provide workload planning
before sample log-in; this module enables the laboratory to schedule
sampling events and forecast laboratory workloads.
• The Sample Management modules handle pre-log-in (for integrating
label generation) and routine sample log-in; log-in can be done with
project data or without project data.
• The Results Management (manual data entry) modules support
results entry by sample or instrument run; the users can also enter
results for tentatively identified compounds from GC-MS instruments;
these modules use an analyte list for data entry; the analyte list allows
easy and flexible reporting of results from instruments such as ICP
metals and GC-MS volatile compounds; the analyte list feature
provides sample batch-specific lists and prevents duplication of
method information.
• The Laboratory Workstation (automated results entry) modules
provide a set of software tools for interfacing instruments to RLIMS
(Figure 7.4) using the EPA Data Standard for Electronic Transmission
of Laboratory Measurement Results, and the modules support
customization to comply with the EPA Good Automated Laboratory
Practices (GALP) guidelines.
• QA/QC support functions and data are specifically designed for
environmental methods; the QC requirements are defined for

Figure 7.4 Workstation data flow.



flexibility and include sample QC, instrument run QC, and prep batch
QC information; they also define the method QC requirements, and
advance analysis statuses with a review option if desired.
• The Archive/Restore modules provide the capability to move historical
data from the database or to the database as required to use storage
space best; the data may be archived or restored based on project,
batch, or instrument run.
Reports are not a part of the core system, but many example reports are
included in the delivery package as a help to the laboratories in designing
and implementing their own laboratory-specific reports.
The normalized, relational data model efficiently represents laboratory
information. RLIMS has a unique way of processing analyses that require
multi-analyte reporting. RLIMS handles these analyses through predefined
or sample-specific analyte lists. The analyte lists are subsets of RLIMS
methods and allow customized reporting of analytes such as metals by ICP
(inductively coupled plasma) instruments.
Another distinctive feature of RLIMS is the instrument run. Laboratory
samples, calibration standards, method blanks, and other QC samples are
stored with the original instrument sequence in the RLIMS instrument run
tables. RLIMS reports can then recreate the instrument run for reporting
samples with the appropriate QC data.
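
A hypothetical sketch in Python (the identifiers are invented for
illustration) of the two ideas just described: an analyte list as a
reportable subset of a method, and an instrument run that preserves the
original sequence of samples and QC items.

    icp_metals_method = ["Ag", "Al", "As", "Cd", "Cr", "Cu", "Pb", "Zn"]
    analyte_list = ["Cd", "Cr", "Pb"]       # subset requested for this batch

    instrument_run = [                      # original instrument sequence
        ("CAL-STD-1", "calibration standard"),
        ("BLANK-1",   "method blank"),
        ("S-1001",    "laboratory sample"),
        ("S-1002",    "laboratory sample"),
        ("QC-DUP-1",  "QC duplicate"),
    ]

    # A report can recreate the run, keeping each sample with its QC
    # context while restricting results to the analytes on the list.
    for identity, kind in instrument_run:
        print(identity, kind, "report analytes:", analyte_list)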
The RLIMS database and core set of modules, delivered with source
code, documentation, training, and flexible Oracle 4GL and reporting
tools, provide a laboratory with a solid base for implementing a LIMS.
And RLIMS fully conforms with the nine general principles listed above
that were the basis for its design and implementation.

7.3.2 Laboratory installation and implementation


The delivery package for RLIMS includes complete installation instructions
and several command files that help in the automatic installation of
RLIMS. Since RLIMS is an Oracle application, it assumes that Oracle has
already been properly installed on the target system. RLIMS is presently
available in both Novell NetWare and VAX/VMS versions. In both
versions the only host or server requirement is that they have sufficient
hardware (i.e. memory and disk storage) for Oracle to run efficiently. For
the Novell NetWare version, the client must be at least a 386 with 4
megabytes of memory to run RLIMS with the required Oracle tools.
The implementation of a major system such as a LIMS requires detailed
planning for the most efficient and successful outcome. An implementation
oversight team should be formed to plot the path to the full implementation
of RLIMS. Implementation and support of a LIMS are an interdisciplinary
effort and the team should include representatives from the laboratory, the
computer support group, and management. The team should develop an
implementation schedule and assign responsibilities for various tasks. The
team must ensure that adequate resources (software, hardware, and
personnel) are dedicated to the implementation effort. While this may
hamper on-going work in the short term, the payoff will be significant.
From experience, the co-ordination and entry of the static or system data
are the most difficult obstacles to implementing RLIMS (or any LIMS
system for that matter). It takes a great deal of time and effort by the
laboratory analysts to plan, organize, and enter all the data that define
their instruments, methods, analytes and procedures to RLIMS. At the
same time they seldom get a respite from their usual workload to work on
the static data.
Another item that the implementation team will need to address is the
types of reports required. Reports were the single area that produced no
agreement among the participants in the workshops on common require-
ments. The reports requested were as individual as the laboratories.
Consequently, no reports are included as a part of the core RLIMS system.
But reports are the most important features for laboratory managers.
Reports on backlogs, workloads, schedules, and quality control are what
make a LIMS a necessity with management. Luckily, reports are easily
developed with the Oracle tools used by RLIMS. And numerous reports
are included with the RLIMS delivery package as examples and starting
points in developing the specific reports for the new RLIMS laboratory.
A vital area to be addressed by the implementation team is what new or
modified functions and features will be required for the laboratory to
operate the way it would like. Plans need to be developed for customizing
the functions of RLIMS for the particular laboratory. This is just a starting
point, since additional ideas will surface during the actual implementation,
and even more after the laboratory personnel have used RLIMS for a
while. With the comprehensive database that resulted from the workshops,
modifications to the database structure should be unnecessary. An
implementation (customization) guide is provided with the RLIMS
delivery package that gives guidelines for function modification or
addition. Following these guidelines will ensure that future upgrades to the
RLIMS core system will have a minimal impact on the customized
software.
A key decision that the implementation team will make is whether to
implement RLIMS for the entire laboratory, or to do it in phases based
on different sections of the laboratory. The differences in the two
approaches are as follows.
• The entire laboratory is moved to RLIMS as a whole: this would entail
entering all the static data, implementing all the customizations and
reports, etc., for the entire laboratory; typically the LIMS functions

are done in order from beginning to end (i.e. project management,
sample tracking, etc.); the advantage is that the complete implementa-
tion of RLIMS would take less calendar time; the disadvantages are
that the implementation would affect, and perhaps disrupt, the entire
laboratory, and it would take longer to see any results from all the
effort and resources being expended.
• The other approach to implementation would be to implement in a
section of the laboratory completely; for instance, the inorganic
section could be moved to RLIMS while the rest of the laboratory
operated as in the past; this would take less calendar time and would
only affect the operations of the one section, and management could
see a quicker payoff.
Of course, one of the above approaches may be more appropriate for a
particular laboratory based on its operational characteristics or organiza-
tion.
During and after implementation of RLIMS, there are some continuing
support requirements that have to be considered by the implementation
team. A database administrator is required to manage the Oracle database
resources and to perform the routine maintenance operations for Oracle.
This role will already be filled at an Oracle installation. An RLIMS
administrator is also needed, and may be the same person who is the
database administrator. This person would control the RLIMS application
and control access to and usage of RLIMS. Another responsibility would
be to monitor and tune the application together with the database
administrator, and respond to user questions and problems. The Novell
NetWare version will require the support of a network manager who, as
with the database administrator, should already be present in a Novell
installation. And finally, support personnel will be needed to provide
routine user support including training, problem resolution, and ad hoc
report development. These personnel may also be involved with periodic
customizations that become necessary over time due to new or changing
requirements and procedures. The personnel numbers and time require-
ments for this role will vary from site to site depending on the number of
users, and frequency of enhancements or modifications to RLIMS.
Throughout the installation and implementation stages at the various
laboratories, the on-site support groups were augmented by an RLIMS
Central Support Group (CSG). Besides supporting the local support
groups, the CSG responded to bug reports and enhancement requests.
And some laboratories contracted with the CSG to do specific customiza-
tions for them. The mix of local and central support provided the most
flexible and efficient implementation support.

7.4 Conclusions

The development and implementation of an environmental laboratory
information management system are complex and difficult tasks. But using
a defined philosophical and technical approach with detailed planning will
greatly enhance the chances for a successful LIMS. Another key is deep
involvement from the users in determining the system requirements and
appearance. And a team approach with a mix of technical and management
skills that is not traditionally found in the laboratory is necessary to utilize
a LIMS fully.

References

Good automated laboratory practices. Recommendations for ensuring data integrity in
automated operations with implementation guidance. Draft (1990) OIRM, US EPA, RTP,
NC 27711, USA.
EPA Standard for electronic transmission of laboratory data. EPA Order 2180.2, OIRM,
Washington, DC 20464, USA.
8 LIMS to robotics interface: A practical approach
R. YUILLE

8.1 Introduction

North West Water Limited is responsible for providing water and
wastewater services to the North West region of England. The company
serves a population of seven million people, comprising 2.7 million
domestic properties and 200 000 businesses in a geographical region
covering 14 000 square kilometres, which includes the major cities of
Manchester and Liverpool. The North West Water laboratories provide an
analytical service to support the core business, which involves the analysis
of water and wastewater samples to satisfy operational and regulatory
requirements. During the period 1988-1991 the workload of the laboratories
increased by 50% and at the same time the number of laboratories was
reduced from 29 to 9. This was achieved by the more efficient use of
analytical equipment and methods, but without a significant reduction in
staff numbers. The staff of 270 includes 50 water quality officers
responsible for the collection of samples across the region.
In 1990 a new strategy for the development of laboratory services within
North West Water was approved with the mission to transform North West
Water into a recognised water industry leader in laboratory analysis. The
goals defined for this new service were:
Provision of a high quality service to the core business
Improvement of turnaround times from sample collection to delivery of
results
Reduction in operating costs even in the face of increasing workloads
Increasing the potential level of revenue and profit by performing
analytical work for external enterprises.
This would be achieved through:
Building a central laboratory with the capacity to accommodate the total
workload
The introduction of new technologies and automation to support the
previous goals
Achievement of National Certification.
The current workload of North West Water laboratory services is 2 000
samples per day, which equates to over 2 500 000 determinations per year.
The types of samples analysed include: wastewater; soils; trade effluent
discharges; and sewage sludges. The range of determinands covers:
microbial agents; metals; biological and chemical oxygen demand;
nutrients; and organic pollutants.

8.2 The case for automation


To achieve these goals the following options were reviewed.

1. Provision of a service based on tried and tested manual methods with
state-of-the-art analytical instruments. This would be achieved by essenti-
ally maintaining the present service, continually looking for improvements
with minimal investment.

2. Provision of a service with standalone automated equipment and
automated instruments. This option required investment in off-the-shelf
automated equipment to cover a limited range of determinands and a tried
and tested Laboratory Information Management System (LIMS) to cater
for the new operation, which might involve a small amount of data capture.

3. Provision of a fully integrated laboratory with a LIMS scheduling and
controlling equipment. This option would be more viable if laboratory
services were centralised in one large, purpose-built facility, designed to accom-
modate the equipment necessary to support fully automated operations.
Automation of a large part of the analytical service proved to be a viable
option, particularly as the vast majority of analytical methods involved a
series of repeatable actions. Methods could be automated using tried and
tested technologies already in operation in other industries. This approach
was justified because recent years have seen an increase in the development
and availability of automated equipment for analytical processes. The
focus has been to reduce traditional labour-intensive manual methods of
analysis by replacing them with robotic systems, which can perform tasks
reliably and with improved precision. Automation has the added advantage
of allowing methods involving potentially hazardous chemicals and
conditions to be executed with the minimal amount of operator interven-
tion.
The increasing demand on laboratories to detect compounds at ever
lower levels has made necessary the purchase of expensive state-of-the-art
analytical equipment. To offset this expenditure laboratories are expected
to produce additional benefits such as:
Increased throughput to reduce cost per sample
Improvement in the quality of results
More efficient employment of personnel
Faster turnaround of analysis.
Centralised laboratory services, with their increased sample throughput,
are able to justify the capital investment required to purchase automated
systems by producing further benefits:
More efficient use of chemicals and resources. Economy of scale reduces
the use of chemicals and resources, particularly if the tests were
originally carried out amongst a number of laboratories.
Reduction in operating costs. On average, salaries account for some
70-80% of laboratory operating costs. More efficient use of the
workforce will produce significant savings in future costs.
Improved quality of results. Automation will enable tasks to be
performed more reliably and with a higher level of precision, leading
to an improvement in the quality of results produced.
Speed of response. Automated analysis can be performed during periods
when the laboratory is unstaffed. Direct transfer of data to the LIMS
database reduces the time taken to report results to customers and
increases analytical capacity.
However, there are a number of disadvantages to complete automation:
Capital funding. Capital funds are required to purchase the necessary
equipment. To guarantee a return on investment the equipment will
need to process a higher volume of samples.
Consumable costs. Depending on the design, the consumable items
needed to operate the equipment may be more expensive, particu-
larly if there is only one supplier. Economies of scale will, however,
reduce the overall cost per determinand. The increased buying power
of automated laboratories should enable the procurement of
materials and chemicals at more competitive rates.
Building modifications. The cost of modifying a building to accom-
modate automated equipment may be significant. Automated equip-
ment will require more floor space and established laboratories tend
to be congested areas. Alterations of the laboratory work area to
accommodate automated equipment may lead to a temporary
reduction in the level of service.
Culture shock. Traditionally, laboratory automation has meant automa-
tion of the instrumental analysis stage of the analytical process, which
has led to the development of islands of automation. The automated
laboratory links these islands of automation by automating the
delivery of the samples to the workstations. The ability to link these
delivery systems with instruments, both mechanically and electronic-
ally, is of fundamental importance. The introduction of a LIMS
system to control operations effectively industrialises the laboratory.
Management of these industrialised laboratories will require a radical
new approach with the emphasis on delivering a quality service. The
impact of these new technologies and management paradigms will
require a fundamental change in the familiar laboratory culture of scientists
and technologists.

8.3 Role of a Laboratory Information Management System

Traditionally LIMS were developed to store, track and report samples,
tests and results. This process usually involved a number of dedicated
personnel continually booking in samples and results and retrieving
information for customers.
With the increase in computer control of instrumentation, laboratory
management systems are being used more frequently in 'real-time' control
of laboratory equipment. Interfacing analytical instrumentation and
automated robotic equipment becomes a crucial part of the process, with a
considerable number of routines being run on the LIMS as background
processes. In addition to electronic connections between LIMS and the
various pieces of equipment, other factors require consideration:
Sample planning and scheduling
Sample container management
Sample shelf-life
Transportation of samples to equipment
Sample and container recognition system
Control of services
Auditability
Error logging.
All the advantages to be gained by automating analytical techniques can
be lost if consideration is not given to sample collection, sample
registration and container management. A well-designed LIMS can gather
and collate information, enabling staff to concentrate on analytical
support.

8.4 Sample planning and scheduling

As the laboratory will be receiving a high regular workload, the
registration of samples in the LIMS can be time-consuming and is a known
source of errors. To enable laboratory staff to prepare equipment for the
day's work, it would be advantageous if the planned workload were known
in advance. The solution is to use a software package that can produce
daily work schedules and electronically transfer the planned work lists to
the LIMS. Considerable benefits derive from this interface, particularly if
there is capacity management functionality in the scheduler. Such
functionality will only schedule the maximum number of samples that the
laboratory can handle within a day. This will give staff assurance that the
amount of work planned for any particular day is achievable, even if an
instrument is out of action, because this too can be taken into account.
Obtaining the fastest possible turnaround of samples must be the ultimate
objective, because eliminating delays in reporting results has the additional
impact of reducing the need for valuable storage space. A good scheduler
must be able to accept any resampling events automatically. These can
result where a regulatory or process limit is exceeded or if the container is
broken during transit or analysis.
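
The capacity-management idea can be sketched as follows in Python (the
figures and names are invented and do not represent North West Water's
actual scheduler): only as many samples are booked for a day as the
laboratory, less any out-of-action instruments, can process, and the
remainder are carried over.

    def schedule(requests, daily_capacity):
        today, carried_over = [], []
        for sample in requests:
            if len(today) < daily_capacity:
                today.append(sample)         # within the day's capacity
            else:
                carried_over.append(sample)  # deferred to a later schedule
        return today, carried_over

    # If an instrument is out of action, capacity is reduced accordingly:
    capacity = 400 - 50    # nominal daily capacity less the idle instrument
    plan, backlog = schedule(["S-%d" % n for n in range(500)], capacity)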
Electronic work sheets of the expected workload can then be generated
by the LIMS and transmitted to instruments and robotic cells.
The large geographic area served by the laboratory necessitates a series
of sample reception centres (SRCs) distributed around the region. Staff at
these centres register samples in the scheduling system and place
containers in crates for transportation to the laboratory. Organising
container distribution and sample registration through the SRCs ensures
that all the activity in the laboratory focuses on analytical effort.

8.4.1 Sample container types


When using robotic cells, the sample container type is fundamental to the
operation of the equipment. Robots (Figure 8.1) are taught to pick up only
containers with a given specification. This can be complicated by the fact
that a number of tests may require containers to be prepared differently.
For example, containers may need to be autoclaved before being used for
microbiological tests or, if a container is to be used for metals analysis, it
will need rinsing with dilute acid to remove possible contamination. One
disadvantage of automation is that it can generate the need for more
container types. The only way of ensuring a reliable service from an
automated laboratory is to supply customers with the appropriate
containers whenever an analysis is scheduled. This will at least ensure that
the sample will arrive in the appropriate container and the analysis is not
subsequently invalidated.
Fast and efficient bottle and container management functions are
necessary to support an automated laboratory. These functions clean and
prepare bottles and containers as well as returning them with their crates to
the SRCs for reuse. If a container has an expiry date, there will be a
requirement to track the prepared container's shelf-life using the LIMS.
Staff and equipment can then be prompted to reject or reschedule samples
that have been taken in containers that have exceeded their expiry date.
This functionality can also be used for stock control, providing details of
broken containers to be replenished by suppliers.
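
A minimal sketch of the expiry check in Python (the dates are invented),
by which a sample taken in an out-of-date container is rejected and a
resampling event scheduled:

    from datetime import date

    def container_usable(expiry, on_date):
        # A container past its expiry date must not be used.
        return on_date <= expiry

    if not container_usable(expiry=date(1995, 3, 1), on_date=date(1995, 3, 8)):
        print("reject sample and schedule a resampling event")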

Figure 8.1 An example of a gantry-type pick-and-place robot.

8.4.2 Sample shelf-life


It is important that LIMS and robotic cells can prioritise analyses because
some sample types are known to have a limited shelf-life or holding time.
This feature will ensure that priority samples are fast-tracked to laboratory
areas. This information must then be made available to the automated
equipment, so that any sample close to its critical time can be analysed
first. Having the facility to prioritise samples will also enable customers to
request a quicker response for special samples resulting from incidents or
emergencies.
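
One simple way to realise such prioritisation, sketched here in Python
with invented holding times, is a priority queue ordered by the time
remaining before each sample reaches its critical time:

    import heapq

    queue = []
    heapq.heappush(queue, (2.0, "S-2001"))   # 2 hours of holding time left
    heapq.heappush(queue, (0.5, "S-2002"))   # urgent: 30 minutes left
    heapq.heappush(queue, (6.0, "S-2003"))

    while queue:
        hours_left, sample = heapq.heappop(queue)
        print("analyse %s (%.1f h remaining)" % (sample, hours_left))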

8.4.3 Transportation of samples to equipment


Transportation of samples to the laboratory and their registration into the
LIMS is another time-consuming task. Transportation of crates containing
containers from the SRCs to the laboratory requires a dedicated courier
system. The courier service must operate within very tight time windows in
order for staff to be confident of getting valid samples to the laboratory.
When samples arrive at the laboratory there are a number of systems that
can be controlled by LIMS for transporting crates and containers to the
equipment, e.g. conveyors or automated guided vehicles. This tried and
tested technology is widely used in manufacturing and can easily be
adapted for use in a large laboratory (Figure 8.2). However, unless a
building is specifically designed with transport systems in mind, the area
available and cost of installation may rule them out as an option. If such a
system is installed in a laboratory then real-time control by the LIMS is a
crucial interface.

8.4.4 Sample and container recognition systems


The input of sample and container information, such as container identity,
date and time, etc., can be time-consuming and potentially a source of
errors, particularly if keyboard entry is used. One solution is the use of
barcodes or electronic tags on containers, which enables information to be
recorded quickly and accurately. This also allows the LIMS to track the
sample's status in the laboratory, in order to provide a very quick and
reliable method for monitoring analytical performance. By design, the use
of automated equipment does not require human-readable labels to
identify samples or subsamples during analytical procedures. The system
will always track containers automatically.

Figure 8.2 Simple conveying system showing a pusher unit, which can be used to transport sample containers or crates.
The use of barcode technology has the additional advantage of reducing
the time taken to register information, enabling samples to be transported
to equipment more rapidly, so improving the level of service.

8.4.5 Control of services and consumable items


Optimisation of the use of analytical instruments and automated equipment
requires the continuous monitoring and control of the consumption of
power, gas and other consumable items, particularly if the equipment is
unstaffed. Regular status information needs to be gathered and reported.
This information can give an early indication of problems with equipment
or workstations and trigger warning messages for display on LIMS
terminals. Accurate costing of analytical tests will yield additional
competitive advantages to the automated laboratory.

8.5 Auditability

Equipment operating with the minimum of human interaction requires all
information to be gathered and transferred electronically. The test result
often forms only a small proportion of the data required to support the test
and to provide an auditable trail. These data are essential if the
laboratory is to comply with the quality standards necessary for certification.
This information may include:
Analytical quality control performance of the method
Equipment identity
Status of peripheral equipment (incubators, ovens, etc.)
Operator identity
Training records of operators
Maintenance logs
Method identity and version number
Standard operating procedures
Chemical and consumable batch number.
If the audit capability of automated equipment is not considered during the
design phase, it will lead to the laboratory having to store large amounts of
paper records to satisfy auditing bodies. This requirement can introduce
inconsistencies, especially if reports are redesigned over time.
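
A hypothetical electronic audit record in Python (the field names and
values are invented for illustration), carrying the kinds of supporting
data listed above and stored alongside the result so that the trail need
not be kept on paper:

    audit_record = {
        "result":          {"analyte": "Cu", "value": 0.08, "units": "mg/l"},
        "aqc_performance": "in control",        # analytical quality control
        "equipment_id":    "ICP-02",
        "operator_id":     "WQO-117",
        "method":          ("M-CU-ICP", "v3"),  # identity and version number
        "sop":             "SOP-041",           # standard operating procedure
        "reagent_batch":   "B-9345",
    }
    print(audit_record["result"], "supported by", audit_record["method"])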

8.5.1 Error logging and reporting


It is important to monitor the status of all equipment and processes when
operating automated methods. All errors must be logged and, depending
upon the severity, may require immediate action to rectify faults. All this
information must be collected in real time and be immediately available.
When analysis is carried out by automated equipment, any exceptions or
failures must be reported immediately either for validation by the operator
or to inform the customer who requested the analysis. LIMS is the ideal
place for the production of all reports and with the appropriate
communication links any reports generated can be sent directly to
customers. All the time advantage gained with automated equipment can
be lost if results and errors are not reported promptly.
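
A minimal sketch of such logging, using the Python standard library (the
cell name and fault text are invented): every fault is recorded in real
time, and severe faults are raised immediately for operator attention.

    import logging

    logging.basicConfig(level=logging.INFO,
                        format="%(asctime)s %(name)s %(levelname)s %(message)s")
    log = logging.getLogger("automated_cell_3")

    def report_fault(message, severe=False):
        if severe:
            log.error(message)    # fault needing immediate rectification
            # ...a warning would also be pushed to the LIMS terminals...
        else:
            log.warning(message)  # logged for later review and reporting

    report_fault("conveyor sensor timeout", severe=True)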

8.6 Information transferred via the interfaces

To enable the laboratory to operate smoothly it is important that a detailed
planning exercise is carried out to define the type and volume of
information to be transferred across the interfaces between LIMS and
other systems (Figure 8.3). The time spent on this activity can have
considerable benefits when the equipment goes into live operation.
When the interface structure has been specified it is essential that it is
documented accurately, particularly if a number of contractors are
supplying different pieces of equipment. When the equipment is eventually
connected to the LIMS it has to be able to communicate. Without a clear,
tight specification a considerable amount of reworking will usually be
necessary. Information will be passed between the LIMS and equipment
using either predefined batches or single data values.
Because the LIMS is central to the whole operation, it makes more sense
to use the LIMS as the master and all other equipment as slaves. The type
of information gathered and transferred must be clearly defined down to
field format and length. Examples of the type of information transmitted at
the three main interfaces are given in the Appendix to this chapter.
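
As an illustration of defining a message 'down to field format and
length', the following Python sketch (the field names and widths are
invented, not North West Water's actual interface) encodes and decodes a
fixed-width message so that every contractor's equipment reads it
identically:

    FIELDS = [("msg_type", 4), ("container_id", 10), ("destination", 6)]

    def encode(values):
        # Pad each field to its agreed width and concatenate.
        return "".join(str(values[name]).ljust(width)
                       for name, width in FIELDS)

    def decode(message):
        out, pos = {}, 0
        for name, width in FIELDS:
            out[name] = message[pos:pos + width].strip()
            pos += width
        return out

    wire = encode({"msg_type": "DLVR", "container_id": "C000012345",
                   "destination": "CELL03"})
    assert decode(wire)["destination"] == "CELL03"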
When the interface specifications have been designed and agreed by all
parties any changes resulting from modifications to equipment or the
introduction of new requirements must be communicated to all contractors
involved. This can be in the form of a quality-controlled documentation
procedure. This approach will ensure that all development work proceeds
smoothly. If the interface is specified correctly then the number of changes
required will be minimal. Uncontrolled changes can have a serious impact
on time scales for installing and commissioning automated equipment. This
will incur extra costs, which will be considerably higher than the cost of the
original interface.

8.7 Laboratory Information Management System network

As a consequence of the decision to invest in automation, a study was
commissioned to evaluate the options of either uprating the existing LIMS
Figure 8.3 Laboratory information management system interfaces supporting automated equipment.

or procuring a new more appropriate system. It soon became clear that the
traditional PC-based LIMS would not be able to support real-time
operation, unless it was running a multitasking operating system and had a
very large database capability. Thus North West Water invested in a new
LIMS system. The preferred system had to support a fully automated
environment.
The selected LIMS package (ChemLMS, Hewlett Packard) conformed
with a functional design specification and technical specification produced
as part of an initial design study, ensuring that the basic required
functionality was available. This included a certain amount of forward
planning to cater for future requirements. Justification of this level of
investment depended upon the business benefits to be derived from
centralisation of the laboratory services. The new LIMS system had to be
reliable with the minimum amount of downtime, particularly as all the day-
to-day operation would be under direct control of the LIMS. The added
risk to the company had to be minimised by providing contingency plans.
This involved having two computers, one on duty and the other on
standby, in order to protect the business against hardware and software
failure. Maintenance and support agreements with suppliers of the
hardware and software have been negotiated to provide for the right level
of response to minimise loss of operation.
Providing the links from equipment to the LIMS required an extensive
data network involving Ethernet and token ring cabling. A schematic
diagram of the LIMS network is presented in Figure 8.4, where it can be
seen that a number of laboratory area networks were installed which, for
resilience and reliability, were interconnected. This architecture reduces
the effect of a failure by a network segment because only a simple patching
exercise is required to reconnect the equipment to another segment. Each
computer has RAID (redundant array of inexpensive disks) storage
facilities. These again are interconnected so the possibility of losing data is
minimised. Connection of equipment to the networks has to be controlled
and sufficient spare capacity made available for expansion. Laboratories
by their nature tend constantly to uprate instruments and equipment, so
there must be spare capacity to accommodate changes.

8.8 Analytical process automation

The design study on which the decision to automate the analytical
processes was based included an assessment of the cost benefit. Some parts
of the analytical methods proved very expensive to automate and the
actual benefits minimal, e.g. the placing of Petri dishes in incubators. This
particular action could be completed manually within seconds by analysts.
It was essential that all designs used in the automated cells were practicable
Figure 8.4 An example of a laboratory information management system network supporting automation.

and provided added value, and were not just designed to look impressive.
Automated equipment had to be reliable and resilient in order to
engender confidence in the North West Water staff. This was achieved by
incorporating tried and tested technology in the design of automated cells.
Part of the initial design study was an evaluation of types and suppliers of
the individual components to be used in analytical cells. The source of
some components came from areas not normally associated with laboratory
functions, e.g. conveyors are normally used in factories and warehouses
(Figure 8.2).
Scientific equipment tends to be expensive. A custom-built robot for use
in a laboratory can cost up to £50 000. A similar robot used in industry
(Figure 8.5) will cost £25 000, so this can represent a considerable saving.
However, these prices do not include the cost of development work needed
to get any system into a production-ready state.
When the equipment has eventually been manufactured and installed
there has to be a commissioning period during which the automated cells
are optimised and their analytical capability validated (Figure 8.6).
Automated equipment only starts to be of value when it goes into service.
To ensure that commissioning was completed within the minimum period
testing of the systems followed a defined plan, enabling progress to be
monitored regularly and accurately. The high level of uncertainty in

Figure 8.5 An automated cell consisting of an articulated robot and a sample container
conveying system.

Figure 8.6 General layout of an automated cell during the commissioning phase.

commissioning usually leads to suppliers insisting on being paid on a time
and materials basis. These resources must be used efficiently if costs are to
be kept down.
When the equipment goes into full operation, maintenance and support
of the automated equipment are crucial, particularly if it is the only way
that the test can be performed. These services can become an additional
revenue cost, which must be included in future laboratory budgets. Any
maintenance contracts must define response times and level of service, so
that the amount of downtime is kept to a minimum.

8.9 Impact on the laboratory working environment

The decision to automate was not undertaken lightly and the laboratory
now has the appearance of a production line in a factory (Figure 8.7). It
was necessary to redefine the skills required by the laboratory personnel
and consider employing staff with different skills to use the equipment.
When all the equipment is fully operational there will be a significant
reduction in staff numbers. To enable this reduced work force to be more
effective, the concept of self-managing teams was introduced, which will enable staff to take more responsibility for the day-to-day running of the laboratory. This approach will also facilitate the culture change required.

Figure 8.7 General layout of a beaker conveyor system under assembly.

Appendix

LIMS to sample transportation system (STS)


Types of messages provided between the LIMS and STS are as follows.
• Send 'system startup/shutdown' request to STS. These messages are
required when there is a need to start or stop the sample transportation
system.
• Send 'state of health' request to STS. A poll message is sent to the
sample transportation system to trigger it to upload equipment status
information.
• Accept 'state of health' reply from STS. This message is received from
the sample transportation system as a response to a poll message.
• Send delivery request to STS. This message is sent to the sample transportation system to request it to move a specific crate or container to a specific destination.
• Accept crate or container delivery confirmation from STS. The LIMS
receives a crate or container delivery confirmation message from the
sample transportation system, each time a crate or container arrives at
a destination point.
• Send maintenance commands to STS. The types of messages are:
Halt system
Remove a crate or container
Add a crate or container
Operate a component
Operate a conveyor or vehicle
Enable or disable maintenance.
These messages allow various maintenance functions to be conducted on the sample transportation system; a minimal sketch of this message exchange follows.
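No concrete wire format for these messages is given here, so the following is a minimal sketch, in Python, of how a LIMS might frame and send one of them; the message tags, the pipe-delimited layout and the socket transport are illustrative assumptions rather than the formats actually used.

import socket

STS_MESSAGES = {
    "STARTUP", "SHUTDOWN",   # system startup/shutdown requests
    "SOH_POLL",              # 'state of health' poll
    "DELIVERY",              # move a crate or container to a destination
    "MAINTENANCE",           # halt, add/remove crate, operate component
}

def frame(msg_type, *fields):
    # Frame a message as pipe-delimited ASCII (an assumed convention).
    assert msg_type in STS_MESSAGES
    return ("|".join((msg_type,) + fields) + "\r\n").encode("ascii")

def request_delivery(sock, crate_id, destination):
    # Send a delivery request and wait for the delivery confirmation,
    # e.g. "DELIVERED|CRATE042|INCUBATOR_BAY_3".
    sock.sendall(frame("DELIVERY", crate_id, destination))
    return sock.recv(1024).decode("ascii").strip()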

LIMS to automated equipment interface


Depending on the number of automated analytical cells, particularly if
there are a number of the same type, it may be advantageous to have the
real time cell controllers under the command of a laboratory area
controller (LAC) which then reports to the LIMS. This can have benefits if
an individual LAC controls the workload in separate laboratories.
Whatever architecture is employed, the messages provided are as follows (a sketch of a message dispatcher appears after the list):
• State of health poll. The LIMS polls each cell controller or LAC for its
state of health.
• State of health reply. A cell controller or LAC responds to a state of
health poll with a status message.
• Sample container worksheet. A sample container worksheet contains
details of all the preparation and/or analysis required on a sample
container.
• Sample crate contents. When an automated cell receives a crate, it confirms that the contents of the crate match the list stored on the LIMS.
• Delete container. If a sample container is missing then a message is
sent to the LAC or cell controller to cancel all tests for that sample
container.
• Request results. The LIMS will request results from the LAC or cell
controller when informed that they are available.
• Container results available. This message is sent from the LAC or cell
controller to LIMS together with the state of health reply when results
are available.
• Results report. The LAC or cell controller generates a result message when an automated cell completes preparation or analysis of an aliquot taken from a sample container or standard.
• Batch end. A LAC or cell controller notifies the LIMS when it has no
more samples to process.
• Equipment or method file transfer. This defines an analytical method
to the LAC or cell controller.
• Startup command. This command will initialise an automated cell and
put it into service.
• Shutdown command. This message will take an automated cell out of
service.
• Maintenance command. This message is used to send free-format
maintenance messages to a LAC or cell controller. Typical messages
are halt, delete crate or operate a subcomponent.
• Receive text message. The LAC or cell controller will inform LIMS
when samples are required to be removed manually from incubators
or other peripheral equipment.
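As a companion sketch, and again with invented message tags and handler names, the fragment below shows one shape a cell controller's dispatch loop for these messages might take.

def fetch_results(container_id):
    return []   # stub: would query the cell's local result store

def handle_soh_poll(body):
    return ("SOH_REPLY", "OK")

def handle_worksheet(body):
    return ("ACK", "worksheet stored")

def handle_delete_container(body):
    return ("ACK", "tests cancelled for " + body)

def handle_request_results(body):
    return ("RESULTS_REPORT", fetch_results(body))

HANDLERS = {
    "SOH_POLL": handle_soh_poll,
    "WORKSHEET": handle_worksheet,
    "DELETE_CONTAINER": handle_delete_container,
    "REQUEST_RESULTS": handle_request_results,
}

def dispatch(message):
    # Messages are assumed to arrive as "TAG|body" ASCII strings.
    tag, _, body = message.partition("|")
    handler = HANDLERS.get(tag)
    if handler is None:
        return ("NAK", "unknown message type " + tag)
    return handler(body)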

LIMS to sample scheduling system


To enable samples to be scheduled, examples of the type of information required by LIMS are listed below (a sketch of such a record appears after the list):
• Monitoring point identity. This is a unique code assigned to all
sampling points.
• Sampling event identity. This is a unique identifier made up of the
monitoring point identification and a date and time stamp.
• Sample material code. This indicates the type of material being
sampled, e.g. raw material, waste, etc.
• Sample purpose code. This indicates the reason why a sample was
scheduled to be taken.
• Sample priority class. An indicator to allow some samples to be
analysed immediately.
• Sampling method. A flag to indicate how a sample was taken.
• Sampling main type. Indication of whether a sample is (e.g.) spot or
composite.
• Project job identity. A unique identifier to allow samples to be
allocated to different projects.
• Sampler identity. A unique name for each person taking samples.
• Sampler's comments field. To allow the samplers to record free text.
• Determinant code. The code to indicate which determinant is specified
for analysis.
• Container identity. Each container has a unique identifier which is
entered into the system when the container is used.
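Taken together, these fields describe one record per sampling event. The sketch below captures them as a Python data structure; the field names paraphrase the list above and are not drawn from any actual LIMS schema.

from dataclasses import dataclass
from datetime import datetime

@dataclass
class SamplingEvent:
    monitoring_point_id: str      # unique code for the sampling point
    taken_at: datetime            # with the point id, forms the event identity
    material_code: str            # e.g. raw material, waste
    purpose_code: str             # why the sample was scheduled
    priority_class: int           # allows some samples to be analysed immediately
    sampling_method: str          # flag indicating how the sample was taken
    main_type: str                # e.g. 'spot' or 'composite'
    project_job_id: str           # allocates the sample to a project
    sampler_id: str               # unique name of the person taking the sample
    sampler_comments: str = ""    # free text
    determinant_codes: tuple = () # determinants specified for analysis
    container_ids: tuple = ()     # unique container identifiers

    @property
    def event_id(self):
        # Sampling event identity: point id plus date and time stamp.
        return "%s-%s" % (self.monitoring_point_id,
                          self.taken_at.strftime("%Y%m%d%H%M"))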
9 Interfacing the real world to LIMS
M.J. CROOK

9.1 Introduction

Computer systems are very adept at taking data that have been entered
and then manipulating, storing and passing them on to other computer
systems. The perennial challenge is getting the data into the system in the
first place.
Originally, punched cards and then paper tape were the only methods
available. Over the last 20 years, interactive data input using a keyboard
has taken over as the method of choice. Any data that needed to be
entered into the computer system were entered by keyboard operators.
This is how the early Laboratory Information Management Systems
(LIMS) operated. The analytical results were obtained from the appropriate
instrument and the data were entered along with sample information into
the LIMS. A summary of the development of manual data entry to LIMS is
shown in Table 9.1.

Table 9.1 Manual data entry to LIMS

Sophistication  Explanation

Least           A simple terminal is connected to the LIMS computer, where the terminal acts only as an input device, and the LIMS computer undertakes all the processing procedures.

Intermediate    A PC/workstation is connected to the LIMS computer, probably as part of a network, and local processing is undertaken prior to downloading the results to the LIMS computer.

Most            A PC/workstation is connected to the LIMS computer, probably as part of a network, and part of the LIMS runs locally, downloading only the information required to operate the LIMS efficiently. This is a client-server LIMS.
Unfortunately, although easy to perform, manual data entry has a number of disadvantages:
• Transcription errors, i.e. the data that are produced by the instrument
are not entered correctly
• Excessive volumes of data; an instrument such as a GC produces a lot
of information that must be processed to give the result(s)
• Laboratories are running increasing numbers of samples per annum, making the operator time needed to enter results and update the LIMS data files a major problem.
The current generation of powerful LIMS runs on computers (or platforms) that allow data to be sent remotely to the computer for processing with varying degrees of sophistication. As more power, and by
definition flexibility, is given to the user's computer, more can be achieved
locally including data logging and instrument control. This has the
advantages of decreased network traffic and faster response time at the
terminal.

9.2 The analysis procedure

As with all systems, the operator likes to be in control. However, in the laboratory where there are large numbers of near-identical samples being analysed, it is more efficient for the LIMS to exercise a certain amount of that control. This can be achieved by entering the sample details once and allowing the LIMS to track the sample container around the analytical process. Barcode labels attached to the sample at this stage of the analysis facilitate the process of sample identification.

9.2.1 Standalone instrumentation


The LIMS can locate any particular sample, but the instrument to perform
the analysis does not have the information about the sample (Figure 9.1).
At the chosen analytical instrument, the operator has to indicate which
sample is to be analysed. In many cases, the data referring to the sample
have to be re-entered to the instrument as well as the information
pertaining to the analysis method itself.
On completion of the analysis, the results are entered on to the LIMS,
probably using a worksheet. There are a number of places where problems
may arise, but all relate to transcription errors. However, if the samples
are labelled with barcodes, using a barcode reader will probably eliminate
sample identification errors.
The use of barcodes is becoming more popular because the samples can
be identified quickly by a simple reading device. There are three major
Figure 9.1 Non-automated analysis procedure. (Sample details are read at the LIMS PC and re-entered at the instrument PC; the measurement result is transferred manually back to the LIMS for reporting.)

methods of reading barcodes: pen-swipe, hand-held and desk-mounted. These are described further in Figure 9.2.

9.2.2 Integrated instrumentation


The alternative is to integrate the LIMS with the analysis instrument
(Figure 9.3). This allows the sample identification and analysis methods to
be passed directly to the instrument and the analysis data to be returned
automatically. Again the use of barcoded sample identification labels
eliminates transcription errors.

9.2.3 Data transfer methods


The ability to transfer the data to and from instruments is handled
differently by each LIMS manufacturer. The general form of the interface
is usually a hardware box with controlling software compatible with the
LIMS. The level of sophistication varies, but features usually include:
• Data acquisition (serial, analogue to digital)
• Results checking
• Calculations from the results
Any discussion of instrument automation must first explore and then
recognise the difference between 'instrument interfacing' and true 'instru-
ment automation'. Interfacing instruments to a LIMS host application is a
well-understood matter and does not address the entire scope of the
instrument automation process. Instrument interfacing, as defined here,
simply provides a means to transfer data files or strings electronically from
an instrument source to a destination. A very simple example of on-line
Figure 9.2 Barcode reading systems: (a) bench-mounted; (b) hand-held scanner; (c) pen and wedge.

Figure 9.3 Fully automated analysis procedure. (Sample details and results pass between the LIMS PC and the instrument PC by automated data transfer.)
Figure 9.4 Very simple instrument automation application used to illustrate the difference between 'interfacing' and true automation: a balance connected through a LIMS link to the LIMS database.

data acquisition is illustrated in Figure 9.4. The discrete components of a true instrument automation process are identified. A balance equipped
with an RS-232 output is connected to an interface device, which in turn is
connected to the LIMS on a host computer by way of a local area network
(LAN). An analyst performs the analysis using the balance and LIMS
interface device. This device hosts a description language, which allows the
interface to be set up to respond to the data-handling requirements of
different instruments.
As differentiated from simple interfacing, the basic components of a true
instrument automation application are as follows:

Verify analyst. Regulated laboratories need to provide an audit trail of all actions associated with sample processing including on-line analytical
testing. Therefore the LIMS interface prompts the user for their user
identification and password. This information is then verified against the
LIMS database. If desired, the analyst's training record may also be
verified. Once identified to the system, LIMS will automatically stamp the
test results with the analyst's user identification.

Identify sample/test. In this simple application, the analyst is prompted to enter the sample identification and test code. Often, this information is
entered using a barcode wand or scanner. The entered sample identification
is then transmitted to the LIMS database to ensure that the sample and test
exist and that the sample is available for testing. The LIMS interface
displays the full sample identification, test code and description, and
status. More complex applications may be employed to download worklists
as opposed to single samples.

Verify instrument. Regulated laboratories must ensure that the instruments used in analyses have been calibrated and maintained. The instrument identification is verified against the LIMS database Instrument Table, where its calibration and maintenance histories are stored. If the
instrument is outside the stated 'grace' period, LIMS can disallow use of
the instrument and present a choice of alternatives.

Replicates. The analyst is prompted to enter the number of replicates for the analysis. Alternatively, the LIMS interface program can query the test
record specification and display and then enforce the number of replicates
to be tested. The LIMS interface then indicates that it is ready to receive
data by displaying a message, e.g. "Awaiting Reading 1 ... ".

Acquire data. The analyst measures solution pH with the pH meter and
after the reading equilibrates, strikes the pH meter's 'send data' key. The
LIMS interface program acquires and decodes the ASCII data string sent
from the pH meter and displays both the pH value and the temperature of
the first replicate with the message, "Awaiting Reading 2 ... ". This
process is continued until all replicates have been analysed.

Execute calculations. After all the replicates have been acquired, the
LIMS interface may execute a simple calculation to determine the average
of the replicates.

Compare against limits. The LIMS interface program queries the LIMS database for the specification limits for this pH test for the specific product or sample type. A test disposition of "Pass, Fail, or Warn" is determined and the status is displayed to the analyst, prior to posting the data to LIMS
permanently.

Accept/reject results. The LIMS interface then presents the analyst with a choice: "Accept" results in the data being transmitted to the database, while "Reject" discards the acquired data. Implementation of this step in the instrument automation process may or may not be desired by
laboratories. Definitions of raw data vary from laboratory to laboratory.
Many laboratories prefer to empower analysts with the authority to
interpret the validity of an analytical result at the instrument level. This
implies that the data do not exist and that the raw data have not been
generated until they have been stored in the LIMS data files. Other
laboratories choose a much more rigorous approach and do not present the
analyst with a choice. This implies that the raw data come into existence
when they are output from the digital instrument. The definition of 'raw
data' is a fundamental responsibility of the laboratory and is usually
documented in the organisation's standard operating procedures.

Update results. The LIMS interface link formats a transaction and transmits the message across the port electronics (host or network connection) to the LIMS. This automatically stores the results against the
appropriate sample and test, assigns a test disposition, and updates the
status of the test. The test results are recorded with a full audit trail of
testing activity including the time-date-user stamp, instrument number,
and all replicate results if desired.
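To make the sequence concrete, the sketch below strings the steps together for the pH example. Every LIMS-facing call (verify_analyst, lookup_test, instrument_in_grace, limits_for, post_results) is a hypothetical stand-in for the database transactions a real interface program would perform, the serial parsing assumes a simple 'pH,temperature' ASCII line, and the 'Warn' band is omitted for brevity.

import serial  # pyserial is assumed for the RS-232 link

def run_ph_test(lims, port="COM1"):
    # 'lims' is a hypothetical gateway object standing in for database access.
    user = input("User ID: ")
    if not lims.verify_analyst(user, input("Password: ")):
        raise PermissionError("analyst not authorised")          # verify analyst

    sample_id = input("Sample ID (scan barcode): ")
    test = lims.lookup_test(sample_id, "PH")                     # identify sample/test
    if not lims.instrument_in_grace("PH-METER-01"):              # verify instrument
        raise RuntimeError("instrument outside its calibration grace period")

    readings = []
    with serial.Serial(port, 9600, timeout=30) as meter:
        for n in range(test.replicates):                         # replicates
            print("Awaiting Reading %d ..." % (n + 1))
            ph, temp = meter.readline().decode("ascii").split(",")
            readings.append(float(ph))                           # acquire data

    mean_ph = sum(readings) / len(readings)                      # execute calculations
    low, high = lims.limits_for(sample_id, "PH")                 # compare against limits
    disposition = "Pass" if low <= mean_ph <= high else "Fail"
    print("Mean pH %.2f -> %s" % (mean_ph, disposition))

    if input("Accept results? (y/n) ") == "y":                   # accept/reject step
        lims.post_results(sample_id, "PH", readings, mean_ph,
                          analyst=user, instrument="PH-METER-01")  # update results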
From this simple example, it is evident that 'instrument interfacing' or data acquisition represents but one component of the complete instrument automation process. Instrument automation automates the laboratory's procedures for analyses, that is, its rules of operation or standard operating procedures (SOPs).
definition of raw data must be recognised and then effectively implemented
in instrument automation.
Instrument automation applications are further complicated by the
range of completely different analyses that are typically performed on a
single instrument. For example, a balance in a laboratory may be used at
different times for simple weight measurements, loss on drying, or content
uniformity. The calculations, logical processing, even the qualified analysts
for each of these methods will vary significantly. Most laboratories
establish basic guidelines for instrument automation applications which
dictate common procedures for user, sample, and test identification,
analyst and instrument verification. A number of different aspects of
instrument interfacing to LIMS are discussed in the following sections
illustrated by the approaches of three of the major LIMS manufacturers.

9.3 Beckman

9.3.1 Instrument automation components


Beckman utilises four basic software modules and special hardware instrument couplers to automate digital and computer-assisted instrumentation with LabManager LIMS, as illustrated in Figure 9.5. These are:
The LIMS Link Instrument Coupler
The Instrument Data Acquisition System (IDAS)
The Transaction Processor (TPO)
The Laboratory Interface Language (LIL)
The flexibility and power provided by the individual components and their
interaction with each other permit a wide selection of instrument
automation alternatives to laboratories. The LIMS Link Instrument
Coupler is particularly well suited to a range of instrument automation
problems, but not as attractive for others. Some laboratories prefer to limit
the user interface to LabManager forms and have no desire, or need, to
distribute processing to the instrument level using LIMS Links. Other
instrument systems such as minicomputer or PC-based chromatography data systems can accept and output files of data or worklists. A different automation solution may be called for in this case. Working as individual modules or in concert, IDAS, LIL, TPO, and the LIMS Link Instrument Coupler provide a flexible and powerful solution for instrument automation.

Figure 9.5 Instrument automation software components and their interaction with each other. (LIL programs can reside in IDAS or in the LIMS Link; IDAS acts as an 'instrument operating system', managing multitasking LIL programs and transaction management to and from TPO.)

9.3.2 LIMS Link Instrument Coupler


The LIMS Link operates under microprocessor control and provides eight
RS232C ports so that it can communicate to a variety of instruments,
autosamplers, robots, etc. The LIMS Link also provides a vital interactive
communications function in real time. An operator can use the keypad,
large LCD display, and function keys to select instruments, schedule tests,
print worklists or reports, or enter sample IDs, user name, the test
disposition or other test-related information. As data are collected, the
operator may review, accept, reject, or process them, and then transmit
them to the LabManager LIMS for storage and further processing. The
LIMS Link also contains a barcode reader port. An important benefit of
the LIMS Link is that it is equipped with its own microprocessor and
resident memory and may therefore operate independently if the host
computer is unavailable. It will continue to collect and store data generated
in real time and then relay them when the host is free. The LIMS Link is
especially well suited to instrument automation applications that involve
multiple instruments in a 'work cell' configuration, as shown in Figure 9.6.
Figure 9.6 Example of a LIMS Link employed in a 'work cell' instrument automation configuration. Three instruments and a barcode wand are connected to the LIMS Link, which communicates with the LabManager CPU over the LAN.

In the example shown, a balance, a Karl Fischer titrator, and a UV/VIS spectrophotometer are used to determine the water-adjusted assay result
or percentage label claim for a sample. The instruments are located near
each other and the results generated from each are used in the calculation
of the final result. The analyst interacts directly with the LIMS Link that
provides bidirectional communication with the LabManager LIMS data
files. These features in combination with the small footprint and
distributed processing capabilities of the LIMS Link provide an excellent
solution for a wide range of laboratory instrument automation applications.

9.3.3 Instrument Data Acquisition System (IDAS)


IDAS was specially designed to simplify the automation of laboratory
instruments. It essentially acts as a multitasking 'instrument operating
system' by storing and managing a directory of all instruments connected to
the system as well as managing the functions of the various LIL programs
that control instrument automation processing. It provides bidirectional
communication with the LIMS Link Instrument Coupler, the Transaction
Processor, the LabManager LIMS application, and LabManager LIMS
forms. It supports network protocols and may in fact reside on a separate
host from the LabManager LIMS application. For flexibility, IDAS
processing is controlled by user-specified programs written in the Labora-
tory Interface Language (LIL). LIL programs developed for both LIMS
Link Instrument Couplers and specified LabManager forms are maintained
and processed by IDAS.

9.3.4 Transaction Processing Option (TPO)


The LabManager LIMS Transaction Processing Option (TPO) is a
standard, integrated transaction-based host interface to the LabManager
LIMS data files and supporting tables. TPO permits users to customise
LabManager LIMS further with their own programs without modification
to the LabManager LIMS source code. Virtually any function executed
manually from a keyboard may be accomplished with a documented
transaction format. This simplifies development and assures users that their custom applications will continue to operate independently of
successive revisions of LabManager. TPO supports interfaces to Fortran,
C++, host computer procedural languages such as Digital OpenVMS
DCL, batch operations, and integration with the IDAS software. TPO
permits users to develop highly specialised LabManager LIMS applications
in a very cost-effective and highly supportable fashion.
Once established, LIL subroutines for these functions may be used
repeatedly for all applications. Only the calculation and data acquisition
portions of the instrument automation program need be modified. This
approach promotes and can enforce consistency in all online instrument
automation while simplifying the task of handling the multitude of
different instrument types and methods.

9.3.5 LIL/forms operation


LIL is a descriptive language that allows the instrument interface to be set
up to respond to the data handling requirements of different instruments.
Another option for instrument automation is the use of LabManager Test
Entry forms. LabManager LIMS FormsBuilder permits users to define
forms for any LabManager LIMS function. These forms reflect the
laboratory's local terminology and cosmetic attributes, and, importantly,
enforce standard operating procedures. LIL programs may be developed
for use with any LabManager LIMS form. The LIL program resides and is
managed by IDAS on the host and is invoked by a soft key that is defined
by the user in the form's definition.
The integration of LIL programs with LabManager LIMS forms
dramatically extends the scope and range of applications that may be easily
integrated into standard sample processing. Some of these applications
include complex secondary calculations, conditional validation rules, and,
as in the following examples, instrument automation.

9.3.6 Typical instrument automation scenarios


The suite of tools and facilities which Beckman supplies provides the framework from which virtually any instrument automation process may be customised to suit a laboratory's local requirements. In practice, there are two basic scenarios that are implemented.

9.3.6.1 LIMS Link Instrument Coupler. Figure 9.7 illustrates the instrument automation scenario utilising a LIMS Link Instrument Coupler. The LIMS Link is a particularly attractive tool for automating 'single-reading' instruments such as balances and pH meters. It is also often employed as a 'work cell' station when two or more instruments are proximate and are used together in related analytical techniques. For instance, a balance and Karl Fischer titroprocessor form a common instrument 'cell' for the analysis of water content.
In the example illustrated in Figure 9.7, a LIL program for the application has been developed and downloaded into the LIMS Link, where it resides. The program typically provides many of the discrete functions such as analyst and instrument verification, acquisition and parsing, database access and transmission of results, etc. The LIMS Link may optionally be equipped with a barcode wand or scanner to facilitate entry of analyst, sample, and/or test information. In the event of a power, network, or computer failure, the battery-backed and buffering capabilities of the LIMS Link can support continued, autonomous operation.

Figure 9.7 Logical processing for the instrument operation scenario employing the LIMS Link Instrument Coupler. (The instrument outputs ASCII results to a serial RS-232 port; a LIL instrument program resident in the LIMS Link reads and parses the data, displays prompts to the user, and communicates with the host using TPO transactions; the LIMS Link provides FIFO buffering; TPO posts results to the LabManager database and allows the LIMS Link to query it.)

After appropriate interaction with the LIMS Link, data from the
instrument are acquired by the LIMS Link in an ASCII format. The LIL
program parses the string or file, calculates final results, and if they are
accepted, creates a test data entry transaction (TDAT). The transaction
(message) is transmitted through the host port of the LIMS Link, which is
defined to the LIMS host. The serial port may be directly connected to the
host or may operate on a terminal server on a local area network. In this
example, IDAS operates as an 'instrument operating system'. It maintains
a directory of all the instruments connected to the LabManager system and
their associated ports, manages the multitasking of all the other LIL
programs that may be resident on the host, and serves as a 'traffic cop' in
directing externally generated transactions to the TPO. Once accepted by
TPO, the transaction is processed by LabManager LIMS. In the TDAT
format, the individual components of the test result are updated into the
appropriate test and sample. The results are compared to specification
limits (if they exist) and assigned a disposition of "pass, fail, or warn". The
result data are tagged with the time-date-user and the instrument number
used in the analysis. A message denoting that the data were acquired
online is also possible. The test status is then automatically updated from
"Awaiting testing ... " to "Tested, pending validation ... ".

9.3.6.2 LIL/forms operation. Another option for instrument automation is the use of LabManager test entry forms. Figure 9.8 illustrates the basic components of a LIL/forms instrument automation application. Proceeding in chronological order, the analyst enters LabManager LIMS
and accesses a Test Data Entry screen. The sample ID is entered on to the
screen and the various tests associated with the sample are displayed. The
analyst may be optionally required to enter the instrument ID for the test
as well as instrumental parameters that will be subsequently downloaded to
the instrument. The analyst then strikes a soft key on the test entry form
defined for data acquisition, perhaps called 'Acquire Data'. The LIL
program defined for the test and instrument combination then assumes
further processing.
The instrument's data are first acquired. The LIL program reads and
parses the data, performs any specified secondary calculations, and then
displays the results in the required format on the test data entry form. This
scenario is executed without the need for a LIMS Link Instrument
Coupler. Once presented on the screen, normal LabManager processing
can qualify the results against predefined specification or control limits and
assign a disposition of "pass, fail, or warn". The LabManager LIMS data
files are updated with the test results by striking the 'Post Results' softkey.
Figure 9.8 Logical processing for the instrument automation scenario employing the LIL/forms approach. (A LIL program reads and parses the instrument's serial ASCII output; a LabManager screen is used to schedule the analysis and enter results; LabManager itself, not TPO, is responsible for posting results to the database.)

9.4 Hewlett Packard

9.4.1 HP ChemLMS
HP ChemLMS is an Oracle-based LIMS that operates within the Hewlett-
Packard HP-UX open systems environment. HP ChemLMS can be
installed on a wide range of hardware platforms, starting from the HP 9000
Series 700 workstation up to the HP 9000 Series 800 symmetric
multiprocessing Corporate Business Server. This product can be used to
create a highly customised, automated LIMS that helps laboratory
personnel to be more productive. HP ChemLMS automates laboratory
methods management and provides a secure data store and integrated
audit trail of all changes to laboratory definitions, processing rules and data. The system can be tailored and automated as required by using its component parts (Figure 9.9).

9.4.1.1 Procedure Specification Language (PSL). PSL is a text-based


scripting language modelled after the terminology chemists use to write
standard test procedure methodologies. PSL allows users to translate
laboratory procedures that currently exist only on paper into an automated test procedure management system that is maintained within the Oracle 7 database under audit trail and revision control.

Figure 9.9 HP LIMS product structure (layers, top to bottom): HP-supplied applications; custom languages (PSL/CPL); base product clients and servers; Oracle relational DBMS; PA-RISC servers/workstations running HP-UX.
This powerful defining language allows chemists to describe what test
measurements to make, how many replicates to carry out, what calculations
to perform, limit specification checks, conditional logic checks, etc.

9.4.1.2 Command Processor Language (CPL). CPL is a text-based scripting language that uses PSL procedure definitions to drive all standard LIMS functions such as sample log-in, work list generation, data entry, sample validation, and reporting. CPL is a Basic-like language which allows commands to be combined into macros that perform specific LIMS activities. Laboratory staff can use CPL with little or no knowledge of the underlying Oracle 7 database or Structured Query Language (SQL).

9.4.1.3 HP ChemLMS clients and servers. HP ChemLMS has been


designed as an integral client-server application that allows laboratories to
leverage available computing power. The HP ChemLMS software licence
provides both the client and server software modules with the right to use
the client modules on an unlimited number of client systems.

9.4.1.4 HP ChemLMS servers. HP ChemLMS provides six background server processes that perform various Oracle database activities on behalf of client CPL macro requests. The DaemonUCP is the server which is responsible for automatic data entry.
NetMgr: Coordinates client to server messaging, starts and stops
background servers
Scheduler: Inserts new sample or result data into the Oracle 7
database
LimitChecker: Performs complex result data limit checks
Evaluator: Performs complex result data calculations through macros
QueueCP: Queues long-running reports or other CPL macros
DaemonUCP: Time-based UCP which will schedule CPL macros
when the arrival of a file is detected in the Unix file
system. It is used primarily for automatic instrument
data posting to the Oracle 7 database.
Client and server messaging is performed through standard network
ARPA messaging protocols and through redundant Oracle 7 table
message storage. This ensures fault-tolerant messaging between clients and
servers. To provide a substantial improvement in server throughput,
multiple copies can be configured to run simultaneously.

9.4.2 Automatic data entry

Automated result entry is accomplished by using CPL commands grouped in the Datamap command set. A set of CPL commands can be combined in a macro to provide the following capabilities:
• File manipulation commands that allow the macro to open an ASCII file, read a record, find a text string, move to a record, and close the file.
• A set of commands that allows the developer to 'map' fields read from
the ASCII file into local CPL variables and from these variables to the
Oracle database.
In general, all devices that are able to communicate with the HP-UX open
systems environment through standard mixed computer networks, which
can include PCs, workstations or minicomputers, etc., or standard RS-232
interfacing, are able to perform automatic data entry into HP ChemLMS.
Instrument result file data 'mapping' can be configured for automatic,
unattended operation by having instrument acquisition systems transfer
ASCII report files to the HP ChemLMS DaemonUCP disc directory. The
DaemonUCP can be configured to schedule the appropriate mapper CPL
macro based on the ASCII report file extension. The DaemonUCP
periodically scans its directory at user-defined intervals, searching for
incoming report files. When a file with a predefined extension is detected,
the mapper CPL macro is run, the data are 'mapped' to HP ChemLMS,
and the report file is moved to an 'error' or 'good' disc directory.
Instrument result files can also be 'mapped' manually through an
interactive CPL macro, with the option of viewing instrument data before
committing them to the HP ChemLMS Oracle 7 database.
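The DaemonUCP behaviour described above, scanning a directory at intervals, matching report files by extension, running a mapper and then moving the file to a 'good' or 'error' directory, is easy to picture as a polling loop. The sketch below reproduces that pattern in Python with invented paths, extension and a trivial stand-in mapper; the real mapping is done by product-specific CPL macros.

import shutil
import time
from pathlib import Path

INBOX = Path("/var/lims/incoming")               # invented paths
GOOD, ERROR = Path("/var/lims/good"), Path("/var/lims/error")

def post_to_lims(record):
    print("posting", record)                     # stub for the database insert

def map_report(path):
    # Trivial stand-in mapper: parse 'field=value' lines into a dict.
    lines = path.read_text().splitlines()
    record = dict(line.split("=", 1) for line in lines if "=" in line)
    post_to_lims(record)

def scan_forever(interval_s=60):
    # Directories are assumed to exist; '*.rpt' is an assumed extension.
    while True:
        for report in INBOX.glob("*.rpt"):
            try:
                map_report(report)
                shutil.move(str(report), str(GOOD / report.name))
            except Exception:
                shutil.move(str(report), str(ERROR / report.name))
        time.sleep(interval_s)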

9.4.3 C to CP communication
The C to CP library is a family of C-callable library routines that allow a C
program to read or write data to ChemLMS Oracle databases (including
array read and write). This application program interface (API) enforces
security and audit trail rules to maintain a secure LIMS data repository.
A C program may use the C to CP library to start the execution of a CPL
macro residing on the ChemLMS machine.
To connect a C application to ChemLMS, one of the following two
communication methods can be used:
• The pipe communication method. Select this method if both the ChemLMS and C application processes run on the same machine.
• The TCP/IP 'sockets' communication method. Select this method
whether the ChemLMS and C application processes run on the same
machine or on different machines (client-server). After the C
application starts, it connects with the ChemLMS background server,
ccpSocketMgr.
The library currently supports HP-UX C programs only.
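The sockets route is ordinary TCP/IP client code. The sketch below shows its general shape in Python rather than C (the C to CP library itself is C-only and its API is not reproduced here); the host name, port and message format are all assumptions for illustration.

import socket

def run_remote_macro(host, port, macro):
    # Connect to a background server and ask it to run a macro.
    # Host, port and the "RUN <macro>" message format are invented.
    with socket.create_connection((host, port), timeout=10) as sock:
        sock.sendall(("RUN %s\n" % macro).encode("ascii"))
        return sock.recv(4096).decode("ascii")

# e.g. run_remote_macro("chemlms-host", 5001, "post_gc_results")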

9.5 LabSystems (Fisons Instruments)

9.5.1 Introduction
Interfacing instruments to the LIMS database is achieved with a Windows-based software package called LabStation running on a 486 IBM-compatible PC. LabStation is a client-server application that can acquire and manage
data from eight devices by way of RS-232 ports when networked to Sample
Manager, the core LIMS module for managing sample, test and result
data. Protocol converters enable instruments with parallel, IEEE or HP-IB
outputs to be interfaced with the LIMS. LabStation can also operate in
standalone mode and include the ability to transfer data by way of DDE
(Dynamic Data Exchange) to other compatible Windows applications
(Microsoft Word, Excel, etc.).

9.5.2 LabStation architecture


LabStation partitions the software that deals with the challenges of
transferring data to and from the LIMS database into a series of layers
(Figure 9.10), each layer dealing with a higher level of data abstraction.
The lowest level is the data stream from the instrument, whilst the sample-
oriented database transactions form the highest level.

Figure 9.10 LabStation architecture (layers, top to bottom): user interface; samples; Methods and Tests; Archiver and IMM; Host Interface and SmartPort; network and communications drivers, linking to the LIMS database over the network and to instruments over RS-232.

9.5.2.1 Communications drivers. Platform-specific communications drivers supplied with the operating system are used by LabStation. These drivers interface with the hardware and service all communications interrupts. The lowest level of LabStation provides an interface to these drivers.

9.5.2.2 SmartPort. This layer is concerned with building message packets from the single-character data streams managed by the lower level. These message packets are more easily managed by the upper layers and normalise the view of the lower level presented to the Instrument Management Module (IMM). A number of parsing parameters is used to specify how the data
streams are packaged into messages. A user interface permits the
SmartPort to be configured.
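A SmartPort-style layer is essentially a configurable packetiser: it accumulates single characters until a parsing parameter, such as a terminator sequence or a maximum length, says a message is complete. The sketch below shows that idea in Python; the parameter names are invented and are not LabStation's actual configuration options.

class Packetiser:
    # Parsing-parameter names (terminator, max_len) are invented;
    # LabStation's actual configuration options are not reproduced here.
    def __init__(self, terminator=b"\r\n", max_len=1024):
        self.terminator = terminator
        self.max_len = max_len
        self._buf = bytearray()

    def feed(self, byte):
        # Feed one byte from the serial driver; return a complete packet
        # (terminator stripped) or None if more bytes are needed.
        self._buf += byte
        if self._buf.endswith(self.terminator):
            packet = bytes(self._buf[:-len(self.terminator)])
            self._buf.clear()
            return packet          # handed up to the IMM as one message
        if len(self._buf) >= self.max_len:   # overlong stream: flush as-is
            packet = bytes(self._buf)
            self._buf.clear()
            return packet
        return None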

9.5.2.3 Instrument Management Module. The IMM layer is specific to the type of instrument. Its purpose is twofold:
• To parse and store data from the instrument
• To provide a full set of commands to communicate bi-directionally
with the instrument. These commands will reflect the capability of the
instrument.
All communication to the instrument is through the SmartPort and
therefore the IMM code is portable between LabStations. This code is also
platform-independent; code written for the PC version of LabStation will
run unmodified on the OS/2 version. An IMM is written with no particular
application specified. The next layer defines how the IMM will be used to
create an application.

9.5.2.4 Methods and Tests. Methods and Tests provide the primary application-specific code which controls and acquires data from instruments
and stores the results in the sample. The role of Methods is to implement
the operation of the instrument(s). Tests are used to specify the
miscellaneous data required to perform the method. The use of Tests
facilitates the creation of generic methods. For example, a Test could
specify the number of replicates required for an average weight analysis
and the same Method would be used to get each of the results.
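One way to picture the Method/Test split is a generic routine parameterised by a test record; the sketch below assumes a simple replicate-weighing method, and all the names in it are invented for illustration.

from statistics import mean

def average_weight_method(balance_read, test):
    # Generic Method: take test["replicates"] readings from the balance
    # (balance_read is a stand-in callable) and average them.
    readings = [balance_read() for _ in range(test["replicates"])]
    return {"readings": readings, "mean": mean(readings)}

# Two Tests reusing the same Method with different replicate counts:
qc_test = {"name": "QC weight", "replicates": 3}
release_test = {"name": "Release weight", "replicates": 10}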

9.5.2.5 Samples. Samples are the standard data representation, all data
being stored within a sample object.

9.5.2.6 Graphical user interface. The customisable interface is used to display and manipulate samples.

9.5.2.7 The Archiver. Once results are available they may be transferred
to the LIMS database. The Archiver gets the data from the Sample
objects, packages the data and manages the transactions within the
database server. The interface enables the analyst to configure how the
data should be transmitted, either on command or automatically on receipt
of the data packet.

9.5.2.8 Host Interface. The Host Interface provides the logical link to
the host computer. There are usually two co-operating components, one
part running on the client and the other on the host. The Archiver uses the
Host Interface to talk to the host and all data transfer and remote
procedure execution are accomplished by using the link.

9.5.2.9 Network drivers. LabStation does not provide its own network
drivers, but instead relies on drivers provided by third-party vendors.

9.5.3 Instrument interfacing process

LabStation has been used to interface a variety of instruments (Table 9.2) to Sample Manager at a research centre in Australia (Alcoa of Australia Ltd), using an architecture summarised in Figure 9.11. The major steps in the instrument interfacing process were as follows.

9.5.3.1 Hardware connections. This stage involved determination of the type of interface required to communicate with the instruments; in this case serial lines are required to allow data transfer between the instrument and a LabStation, and then Ethernet from the LabStation to the LIMS.

Table 9.2 Instruments connected to LIMS in the laboratory at Alcoa

Radiometer autotitration systems (x 3)
Malvern MS20 Mastersizer particle size analyser
Waters Action Analyser (ion chromatograph)
Dohrmann DC-80 total organic carbon analyser
Hewlett Packard HP 5890 gas chromatograph
Siemens D-500 X-ray diffractometer
Varian Cary 3E UV/VIS spectrophotometer
Varian SpectrAA-20 atomic absorption spectrometer
Leco SC-444 carbon/sulphur analyser
Hewlett Packard 3D capillary electrophoresis system
Analytical balances (x 3)

Figure 9.11 Instrument connections via LabStation. (Instruments, usually four to six per LabStation PC, connect through a Digiboard patch panel; an Ethernet line links the LabStations to the LIMS.)

9.5.3.2 Interface specifications. In developing specifications, Matt Sumich at Alcoa wrote "It proved advantageous to fully examine the
functionality of LabStation, because it provided a variety of alternative
work practices from which to choose. For example, LabStation provides
the ability to interact with the LIMS directly; samples can be logged in at
LabStation, tests assigned and results entered manually at LabStation. The
ability to edit samples on worksheets (move, remove, add) in LabStation is
a key feature. If necessary, it enables review of data prior to uploading the
worksheet to LIMS. Results can be displayed on LabStation as received
and analysis components with specified limits are automatically flagged and highlighted if the results are out of limits".

9.5.3.3 Implementation. LabStation is designed to facilitate 'point and click' configuration of an instrument interface. The setting of communications parameters and parsing of instrument output files are achieved without the need for programming expertise.

9.5.4 Examples of some specific instruments

9.5.4.1 Radiometer autotitration systems. There are three autotitration systems in the laboratory. Each system has its own PC attached, containing a Windows-based instrument operating system software package developed in-house.
This package provides the option of sending instrument output to the PC
serial output port, to a hard disc or to a printer. The instrument output
string contains sample results, analysis name, instrument operating
parameters and sample identification. LabStation is configured to acquire
the instrument output in real time (i.e. immediately after each sample is
analysed), extract the required information from the string and send it to
the LIMS.
In the operation of this instrument, the sample identification entered
into the instrument PC prior to analysis by the operator could be a LIMS
unique numeric sample ID (obtained by preanalysis LIMS sample log-in)
or an operator-specific sample ID (sample not logged into LIMS). LIMS
can be customised to recognise if the sample ID is not a unique LIMS
numeric ID, and if so the sample is automatically logged into LIMS as
received. If the operator wishes to enter a LIMS sample ID into the
instrument PC prior to analysis, this can be read using a barcode reader
from a LIMS-generated worksheet which contains barcoded sample IDs.
The autotitrator PC software package has its own statistical control
charting facility. A control sample is analysed automatically at random by
the system. If a control sample result detected by the software is out of
control (criteria from the control chart), then all subsequent sample data
strings have a flag set to indicate that the results are suspect. LIMS is
customised to examine this flag. If it is within the control criteria, the result
is automatically authorised by LIMS. Otherwise the result is suspended by
LIMS, requiring a supervisor to examine the situation. At this point the
supervisor must either cancel the result in LIMS and reanalyse the sample
or unsuspend the test in LIMS and accept the result.
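The suspect-flag rule lends itself to a small piece of validation logic on the LIMS side; the sketch below restates it in Python with invented field and status names.

def disposition_for(result):
    # Authorise results automatically unless the control-chart flag is set.
    if result.get("suspect_flag"):
        return "SUSPENDED"   # supervisor must cancel/reanalyse or unsuspend
    return "AUTHORISED"

assert disposition_for({"value": 4.2, "suspect_flag": False}) == "AUTHORISED"
assert disposition_for({"value": 4.2, "suspect_flag": True}) == "SUSPENDED"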

9.5.4.2 Malvern MS20 Mastersizer (particle size analyser). The Malvern has its own PC attached, containing an instrument operating software package which is based on Microsoft Access. Sample results are stored locally in the database until a batch of samples is completed, whereupon
the batch of results is automatically sent to LabStation. LabStation is
configured to extract the required information from each sample string and
send it to the LIMS database. As with the autotitration system, the
instrument output string contains sample results, analysis name, instrument
operating parameters and sample identification.

9.5.4.3 Waters Action Analyser (ion chromatograph). This instrument is controlled by a PC software package called Maxima. On request LIMS generates a worksheet containing the standards and samples requiring analysis, with sample dilution information and sample and standard ID numbers. The barcoded LIMS numeric standard and sample IDs are read from the worksheet into the Maxima sample ID field.
On completion of a batch of samples the Maxima software calculates the
concentrations and stores the results in a formatted report file. The operator
examines the results and determines from the control sample results
whether to accept each result. If the result is accepted, the operator triggers
a LIMS send command. The report file is sent to LabStation where the
required information and results are extracted and transferred to the LIMS.
LIMS is customised to authorise the batch of results automatically.

9.5.4.4 Dohrmann DC-80 total organic carbon (TOC) analyser. In contrast to the instruments discussed above, this instrument does not have
its own control PC to calculate concentration or store results. It is a dumb
instrument, similar to an analytical balance, and transmits a single piece of
raw data.
Samples requiring TOC analysis are prelogged into LIMS. At analysis
time, a worksheet is created by LIMS containing the required standards,
controls and samples. The worksheet is then downloaded to a LabStation
close to the instrument. The LabStation software clearly shows the order of
the standards, controls and samples. The physical standards, control and
samples are placed in the instrument carousel in the same order as shown on
the worksheet.
The worksheet on the LabStation is activated by the operator as the
instrument results are ready to be generated. The raw results are transferred
in real time to LabStation. In the case of problems occurring at the
instrument (causing a sequence break) the worksheet can be edited at the
LabStation.
Once the operator is satisfied with the results, the completed worksheet may then be uploaded to the LIMS server. A program running on the
LIMS constructs a line of best fit from the standards' results. From the
calculated calibration parameters, the concentrations from the raw data
values are calculated automatically. Prior to the LIMS implementation, the
only output from the instrument was by way of a parallel line to a printer.
Incorporation of a parallel to serial converter has enabled data to continue
to flow to a printer as well as to LabStation.
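The 'line of best fit' step is a standard least-squares calibration; the sketch below derives slope and intercept from the standards and then converts raw instrument readings to concentrations, under the assumption of a linear response (the numbers and units are invented).

def fit_line(concs, responses):
    # Return (slope, intercept) minimising the squared error.
    n = len(concs)
    mx, my = sum(concs) / n, sum(responses) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(concs, responses))
             / sum((x - mx) ** 2 for x in concs))
    return slope, my - slope * mx

def to_concentration(raw, slope, intercept):
    return (raw - intercept) / slope

slope, intercept = fit_line([0.0, 5.0, 10.0], [0.02, 4.87, 10.11])
print(round(to_concentration(7.5, slope, intercept), 2))  # approx. 7.48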

9.5.4.5 Analytical balances. Each analytical balance is connected to LabStation, but associated with each balance are three Intermec devices: a
9512 transaction manager, a 1700 keyboard and a 1545 barcode scanner. The
9512 has a small screen and is programmed to request information about a
sample about to be weighed (e.g. operator, sample ID). The information
may be entered by way of barcode scanner (from a LIMS generated
worksheet) or by way of the 1700 keyboard. The sample information and
weight are transferred to LIMS by way of LabStation.

9.5.5 Future developments

9.5.5.1 Bidirectional communication between LabStation and analytical instruments. Currently all communications between LabStation and
analytical instruments in the laboratory are unidirectional, i.e. data from
instrument to LabStation. A project is planned which will provide
bidirectional communication between LabStation and a Hewlett Packard
gas chromatograph. The LabStation will download QC run data to HP
Chemstation, and when the GC analysis is completed and approved, data
will be sent to LIMS.

9.6 The future

There are a number of technologies that are either available or in development which will change the nature of data entry into LIMS. All
these new technologies will focus on working in the way that humans
perform best rather than computers, namely by the use of human senses
and skills, e.g. writing, speaking and intelligence.

9.6.1 Handwriting recognition


The technology is available for handwriting recognition and a number of
computer manufacturers, including AT&T, Compaq and Apple, offer their
respective hardware with pen input. The pen replaces the use of the
keyboard and mouse. The user 'writes' on the computer screen and by
using one of a number of technologies, the pen movements can be
interpreted by the computer as a mouse movement, textual input or one of
a number of gestures that are translated as cut, copy, paste, etc.
An AT&T pen computer and a Microsoft Windows-based program have
been combined to offer a user interface for both Challenger LIMS (BP)
and Beckman LIMS. The software package has been designed to offer the
user the option to follow samples and procedures and allow the system to
update data files in the LIMS automatically.
As indicated in previous sections, major LIMS vendors offer interface
modules, but few have addressed the problem of how to tackle the large
number of manual laboratory methods that have not yet been automated.
Also, most systems allow a multitude of functions that generally make the
system difficult or time-consuming to use. The most frequent LIMS user is
the laboratory analyst who requires very little functionality; a list of
samples for analysis and a simple method of reporting results are
frequently all that are required. Most laboratories are aiming to introduce
quality assurance systems to their operations and are implementing either
ISO 9000 or BS 5750 systems. Both systems rely on procedures and
auditing. The problem for laboratory management is to develop routines
that ensure that staff follow the prescribed analytical methodology, record
relevant data and archive results. Personal Interface (Process Analysis &
Automation) is an easy-to-use device that interfaces analysts to LIMS and
achieves these objectives.
The Personal Interface development is based on the new generation of
mobile PCs. The AT&T Safari 3115 PC has been selected for the task and
is a full implementation of a 486/25SX computer with 20 Mbyte virtual hard
disc, 4-8 Mbyte RAM, and VGA resolution monochrome liquid crystal
display. The device has been designed to be rugged and can be dropped on
to concrete from 1.5 m. Data input is by way of a stylus which can either
emulate a mouse or be used for alphanumeric input. The system can be
used in either the MS-DOS or the Windows 3.1 environment.
The initial data transfer can be achieved either by Ethernet or more conveniently by WaveLAN. This is a radio-based link between the host computer and a mobile pen computer that allows full communications throughout the building and laboratory without the inconvenience of a cable. Using a PCMCIA WaveLAN card, the PC can be logged on to a site computer system using a radio-based communication system rather than an Ethernet wire.
The laboratory supervisor can download the analyst's daily work list from the LIMS by way of the site Ethernet to the notepad PC using a PCMCIA WaveLAN interface card. A suite of software presents the
operator with electronic data sheets to complete as each analysis is
progressed. These data sheets are identical to their traditional paper
copies, and in the training mode detailed instructions are given for each
stage of the analysis. Results can be compared with specification values, so
facilitating an immediate repeat if necessary. When some or all of the
analyses are completed, the computer transmits the data file of results back
to the LIMS for validation prior to LIMS database updating (Figure 9.12).
Figure 9.12 Pen computer interface to LIMS. The steps are: download the samples to be run from the LIMS samples backlog report on the host computer to the pen PC; the user chooses the sample and analysis to be run; the user is taken through the accredited analysis procedure; at the end of the analysis the results are returned to the host system for approval; and the LIMS database is updated with the results data and comments.

The benefits of the mobile PC link to LIMS are:


• data transcription errors are eliminated
• familiar screens promote greater user acceptance
• no LIMS knowledge required
• automatic data transfer to LIMS improves productivity
• ensures compliance with defined procedures
• allows immediate corrective action to be taken.

9.6.2 Voice recognition


Data input using a microphone is a technology still in limited use in
computer applications. Fast processors are required to process the signal
and to perform the advanced pattern-recognition techniques required to extract critical sounds (phonemes) and construct computer-friendly commands. Software capable of performing these procedures is available, but it does not yet achieve the accuracy required for integration with a LIMS.

Acknowledgements

I wish to acknowledge the contributions of Glen Barton (Beckman Instrument UK Ltd), Dr Michael Kraft (Hewlett Packard GmbH), Dr
James Stafford (LabSystems, Fisons Instruments), and Matt Sumich
(Alcoa of Australia Ltd, Kwinana, W.A.) in the preparation of this
chapter.
10 Replacement LIMS: Moving forward or
maintaining the status quo
T.V. IORNS

10.1 Introduction

When should a Laboratory Information Management System (LIMS) be replaced? How long can your current LIMS last? How much better would a
new LIMS be? How do you justify a new LIMS? These are some of the
questions to be explored in this chapter.
Trying to decide when to replace an existing LIMS is much like trying to
buy a personal computer or an automobile - as soon as it is purchased there
will be a new model available with some new and better feature. Any
LIMS that has been installed for several years is a candidate for
replacement. However, some LIMS have lasted much longer than this,
often because they have valuable features that are still needed and a better
replacement may not yet exist. Any decision to change LIMS must
carefully consider all ramifications.
Defining the scope and objectives of a 'LIMS Replacement Project' is
perhaps the most critical step in the project. This issue will be addressed in
more detail later, but the title of this chapter sets the stage. Deciding
whether LIMS should be an application that helps move the company
forward competitively and strategically is a fundamental decision that
requires careful consideration. Many opportunities can be lost if the
replacement LIMS is defined to merely fit the mold of its predecessor.
Attempting to maintain the status quo is an easy, comfortable approach,
but the reader is encouraged to dig deeper, to find the challenges and make
the creative decisions necessary to move the company forward.

10.2 Why change?

A LIMS, like any other major purchase, should be replaced when it does
not meet current needs, when needs are changing due to an organizational
or re-engineering change, or when it becomes too difficult or expensive to
maintain. Very often more than one of these factors will be involved. Each
one will be explored separately.

10.2.1 Current LIMS does not adequately support business strategy and
user requirements
The laboratory and laboratory information are becoming increasingly
important in helping the corporation to achieve its goals. Most organiza-
tions are recognizing the importance of analytical skills in corporate
performance, and laboratories and their information are playing more
important roles. Laboratories are in transition from being providers of
requested results to being team members in addressing corporate problems,
opportunities and competitive advantage.
Most existing LIMS were implemented to meet the immediate needs of
the laboratory. Often the justification process alluded to benefits outside
the laboratory, but typically cost pressures kept the focus on the immediate
needs of the laboratory and other potential benefit areas disappeared from
the project plan.
LIMS often represents an 'island of automation' within the corporate
infrastructure. As the pressures on the corporation to perform increase,
laboratories must become more cognizant of their ability to assist in
meeting corporate goals. This results in a broadening sphere of influence
and increasing requirements on the system. The insular nature of most
early LIMS projects, and also the design of the early LIMS, results in many
LIMS being incapable of accommodating new requirements.

10.2.1.1 Better linkage to fundamental business strategies. Often the
most important reason to change LIMS is associated with the current
system not adequately supporting fundamental business strategies. Very
often, the existing LIMS was implemented as a laboratory automation
initiative with little consideration for corporate business strategy. The only
strategy considered was the objective of the laboratory manager to reduce
costs, improve productivity, and improve turnaround times. Although such
considerations may have been sufficient to justify the first LIMS,
companies should look beyond these factors when considering a replace-
ment LIMS.
A need that is fundamental to all major projects is a consideration of
business strategy and critical success factors. This involves developing an
understanding of the business strategy of the company, an understanding
of what the company must do well to succeed, and finally, an understanding
of what the laboratory must do well to support the company business
strategy for the company to succeed. Very often, the laboratory must
provide extra or enhanced services for the company to succeed, in direct
conflict with the laboratory manager's strategy of reducing cost (Figure 10.1).
If a company is to develop new products quickly in order to provide a
competitive advantage, then the laboratory must be oriented to providing
fast turnaround times, good results, and information in formats that
support the needs of the researchers in developing new products.

To be strategic, the LIMS must allow the company to achieve important business goals:
• Excellent customer service
• Improved regulatory compliance
- Environmental
- FDA
- Accounting
• Improved product quality
• Reduced new product introduction cycles
• Improved quality and timeliness of product applications

Figure 10.1 Goals of a replacement LIMS.

Increased
laboratory costs are often overshadowed by the benefits of faster service
and more useful presentation of results. In an environment where safety or
risk avoidance is paramount, quality of results and elimination of errors are
critical. In an industry where process control is critical, LIMS should have
excellent trending and statistical process control features. In an industry
which is study-oriented, LIMS must provide information in a format to
support study interpretation and monitoring. In both study and process
environments, information must be presented to support interpretations
across groups of related samples, and the importance of individual samples
may be greatly diminished.
Being a low-cost provider of laboratory services is not a strategic
requirement for most industrial laboratories. Good management should
always pay attention to cost structure but it is rare that the success of a
company hinges on laboratory costs. However, a commercial service
laboratory must be a low-cost, high-quality provider in order to compete.

10.2.1.2 Reducing risk to the business. Reducing risk to a business is a
factor closely related to linkage to fundamental business strategies.
Reducing risk is usually considered tactical in a company, resulting in a
shorter time focus and less visibility to top management - until a problem
occurs! Once a crisis develops, it is usually not practical to modify or
enhance a system to solve the problem. It is important to identify risks to
the business and then ask how LIMS might help to alleviate these risks and
prevent problems from occurring.
In the pharmaceutical industry, critical risks include missing an
important product discovery opportunity, product recalls, and delays in
product approvals. To support product discovery, results must be precise
enough to support critical interpretation and must be presented in useful
formats. To avoid product recalls, it is essential that products be checked
against specifications before release, that reliable methods be used, and
that stability studies be properly conducted. To avoid delays in product
approvals, results must be presented to support rigorous interpretation and
systems must be appropriately validated so that results can be trusted.
In process industries, trends in the quality of raw materials, intermediates
and final products should be monitored by using statistical process control
and trending techniques to help improve processes and products and
identify problems before they occur. In these industries, environmental
issues are also extremely important. LIMS should present information in a
manner useful to support environmental monitoring and reporting. In
addition, LIMS should report specific exceptional conditions automatically
to management, so that excursions from desirable conditions can receive
the appropriate attention without overdependence on the thoroughness
and reliability of specific individuals.
Reducing risk requires that additional user requirements be identified,
prioritized and implemented. This requires additional insight during the
requirements definition process.
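
A minimal sketch in Python, assuming hypothetical limit values and a
notify() routine, of the kind of automatic exception reporting described
above:

ACTION_LIMITS = {"pH": (6.5, 8.5), "lead_ppb": (0.0, 15.0)}

def notify(message):
    # A real system might queue a management report or message here;
    # this sketch simply prints the excursion.
    print("EXCEPTION:", message)

def report_excursions(results):
    # Compare each result against its action limits and report every
    # excursion, without relying on an individual to spot it.
    for analyte, value in results.items():
        low, high = ACTION_LIMITS[analyte]
        if not low <= value <= high:
            notify(f"{analyte} = {value} outside action limits ({low}-{high})")

report_excursions({"pH": 9.1, "lead_ppb": 3.2})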

10.2.1.3 Current system does not meet user requirements. In addition to
the current LIMS not adequately satisfying business strategy and risk
reduction issues, it is very common for LIMS which have been in operation
for a considerable period of time not to meet current and future needs of
daily laboratory operations. This could be because the laboratory has gone
through a major change, because initial user requirements were not defined
adequately, because the laboratory has simply outgrown the current system, or
because the current system has not kept pace with technology and its lack of
features is creating a competitive disadvantage.
A very common situation is that, with the implementation of the initial
LIMS, users have become far more knowledgeable about LIMS capabilities
and recognize the potential for LIMS doing more than originally
envisaged. Unfortunately, it is often the case that, to achieve the full
potential, the initial LIMS must be replaced or at least reimplemented.
This dramatically establishes the need to define both current and future
requirements carefully before embarking on a major project to avoid the
expense of repeating the efforts.
Another common situation is that LIMS was initially implemented to
support sample management and result reporting requirements in a
manual data entry environment. Changes in testing instrumentation and
improvements through laboratory automation projects result in manual
data entry no longer being suitable.
Most LIMS were implemented to address the major sources of pain for
the laboratory manager at the time. As existing pains disappear and new
sources of pain gain attention, changing user requirements often mean
existing systems are obsolete.
This section highlighted the changing requirements on LIMS due to the
recognition of the greater importance of laboratory information and how it
can be used. The next section addresses changing requirements due to the
changing structure of business and new ways of looking at business
processes.

10.2.2 Business process re-engineering or organizational change dictates
new requirements
Companies are under increasing pressures to compete in a global
environment. These pressures are causing fundamental changes in the way
companies operate. Change is occurring everywhere. Business process re-
engineering is being used to break down old organizational barriers and
create new ways of conducting business. Mergers, acquisitions, and
divestitures are altering the fundamental structure of companies.
Companies and departments are reorganizing or downsizing to become
more effective.

10.2.2.1 Business process re-engineering initiatives. Business process re-
engineering (BPR) initiatives are having a major impact on many
companies, large and small, local and global. These initiatives result in
fundamental changes in the way companies do business and how materials,
paper, information, etc., flow. These changes in flow usually have a major
impact on how well existing systems can support the new processes.
Information systems are often an enabling technology to implementing
new business processes. It is critical that BPR projects consider existing
systems and address the need for system changes and replacements. Many
BPR initiatives cannot be successful until information systems to support
the new processes are in place.
Most BPR projects attempt to eliminate work that adds no value and
delays results. One example for a laboratory would be the seemingly
endless checking and verification of information at each step as raw data
are acquired, results calculated, samples released and reports written. If
LIMS and other automation systems can be implemented to automate
these steps, the beginning and ending steps can be less time-consuming and
intermediate steps can be eliminated, significantly reducing costs and
speeding up the process.
Another example involves eliminating unnecessary testing through using
statistical process control and statistical quality control techniques to
understand when processes are 'in control' and when results can be trusted
and products are of good quality. Without databases of appropriate
information provided through LIMS, such programs are very difficult to
implement and trust.
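
By way of illustration, here is a simplified Shewhart-style control check in
Python: historical results establish control limits, and new results are
tested against them. A production SPC system would normally estimate sigma
from moving ranges; the plain standard deviation is used here only to keep
the sketch short, and all values are invented.

from statistics import mean, stdev

def control_limits(history, k=3.0):
    # Centre line plus or minus k standard deviations of past results.
    centre = mean(history)
    sigma = stdev(history)
    return centre - k * sigma, centre + k * sigma

history = [99.8, 100.1, 100.3, 99.9, 100.0, 100.2, 99.7, 100.1]
lcl, ucl = control_limits(history)

for value in [100.0, 101.9]:
    state = "in control" if lcl <= value <= ucl else "OUT OF CONTROL"
    print(f"{value}: {state} (limits {lcl:.2f} to {ucl:.2f})")
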
Still another example involves moving analytical testing from the
laboratory to the plant floor or control room. Such a change has a large
impact on costs and efficiency yet requires excellent information systems
and training to make sure that appropriate quality of testing is maintained.
LIMS must have the flexibility to support activities in the laboratory and
on the plant floor and must have tools to monitor the quality of the results
reported.
BPR initiatives can be a major factor in improving corporate performance
but many initiatives are doomed to failure if appropriate information
systems such as a LIMS are not available. Often the process changes are
such that an existing LIMS cannot continue to meet needs and it must be
replaced with a more versatile system.

10.2.2.2 Corporate infrastructure is changing. A common reason for
LIMS needing to change involves major corporate infrastructure changes,
such as an acquisition, merger, or new line of business, or a corporate
reorganization. These changes result in altered workloads for laboratories
and may result in a new customer base with different requirements. These
changes can result in new requirements and an existing LIMS becoming
unsatisfactory.
One example might be the consolidation of two companies. It could be
decided that two laboratories should be consolidated to attain efficiencies
of scale and elimination of duplicate services. However, this would be
likely to result in the remaining laboratory having a larger volume of work
and a more diverse customer base. Serving customers at two or more
remote locations can be a strain on an existing LIMS.
Another example would be deciding that laboratories should specialize
in the type of services they provide rather than being geographically close
to their customers. For instance, one laboratory might provide mass
spectrometric analyses worldwide and another might provide elemental
analyses, while all laboratories would maintain competency in chroma-
tography. This could require samples to be sent to three locations, yet a
single result report might be needed. Interpretation of results could even
be performed at another site. The requirements for such a distributed
LIMS would severely tax most existing systems.
These types of changes frequently cause a laboratory to scramble to
meet its customers' needs and a replacement LIMS may be necessary.
However, replacing systems takes time and it is seldom that the need to
change is planned early enough for a new system to be operational in time
to meet the new requirements.

10.2.2.3 Changing attitudes or other reasons. Change is healthy for a
variety of reasons. If an organization needs to experience change, the
process of selecting and implementing a new LIMS can provide enough
chaos to force a close look at all operations. Sometimes a laboratory needs
a change in systems in order to accomplish other fundamental change. The
psychological impact of a systems upheaval can be an important catalyst
for fundamentally changing an organization.
Occasionally systems are changed because someone wants to leave a
'mark' on an organization or because it is necessary to move beyond the
influence of a predecessor. These can be good reasons for change or they
can be bad reasons. The cost of the change must always be balanced
against the benefits. The psychology of management can be important.
However, these considerations are usually valid only for the top manage-
ment of a laboratory organization. Beware the system manager who wants
a new system without strong support from the user community. A
successful project can look good on an individual's resume, without doing
the organization any good.

10.2.3 LIMS evolution, technology changes or maintenance
considerations dictate a change
The most common reasons to change LIMS involve issues with a vendor,
the hardware platform, or maintenance considerations. Often the existing
LIMS meets basic needs, or at least the needs that are recognized, but
other considerations dictate an expensive change. Under these conditions,
benefits are not derived from new capabilities, but from reduced costs, cost
avoidance, or 'business necessity'.
It is difficult, although rightfully so, to justify a change on these grounds.
It is difficult to convince management to make a major expenditure on
grounds of cost avoidance rather than benefits to users outside the
laboratory.
The key to changing LIMS in these situations is to be diligent in
gathering reliable facts and performing benefit calculations that are
believable. It is usually a mistake to attach significant weight to 'intangible'
benefits.

10.2.3.1 Difficult or expensive to maintain. Maintenance difficulties
arise from a number of causes. Design limitations, archaic hardware or
software platforms and eroding people skills are common problems. These
problems are often most severe with custom-built or internally developed
systems, but vendor-supplied solutions also have the same problems.
Most early LIMS possessed limited functionality according to today's
standards. In the 1970s, most LIMS were an outgrowth of instrument
automation projects and the systems were designed to automate existing
processes, producing valuable productivity savings. Little attention was
given to addressing the broad needs of a company because there was little
recognition that laboratory information was a company resource that could
be enhanced through proper LIMS implementation. It was also rare that
LIMS implementation was viewed as a vehicle for fundamental change in
how an organization operated. This narrowness of project scope typically
resulted in a design that is difficult to extend to meet broader needs and to
support organizational change.
For instance, early LIMS typically employed a centralized minicomputer
or mainframe architecture. This architecture was well suited to meeting the
needs of the analytical laboratory, but often led to problems when broader
needs were considered. Typically the laboratory purchased enough
capacity to meet its internal needs, but did not install enough capacity to
address the broader needs of its customer community. In fact, corporate
political considerations often resulted in laboratories selecting systems not
compatible with corporate systems so that the laboratory could maintain
control of its system.
Similarly, laboratory scientists preferred systems programmed in Fortran
or another scientific computing language, because they could easily support
and modify the system. They were willing to overlook the shortcomings of
a scientific programming language to meet the needs for information
storage, management and reporting. Scientists were typically making the
decisions and tended to select an architecture they were comfortable with.
Corporate computing departments were more attuned to information
management considerations, but they were easily convinced that scientific
information was different. Most of the early LIMS employed flat-file data
structures, well-suited to the limited needs of the time to store information
and generate sample analysis reports, but not well-suited to the broader
information management requirements of today.
The hardware and software design limitations mentioned above resulted
in systems that met the needs recognized at the time. However, such
systems are often incapable of addressing the additional needs of the
expanded organizations of today. Items as fundamental as hardware
architecture and programming language are difficult to change, resulting in
systems that stop growing to meet new needs.

10.2.3.2 Changing technology. Related to the need for new enhance-


ments, or a new foundation, is an alternative scenario involving the basic
infrastructure within which LIMS operates. Perhaps the current LIMS is
tied into an existing series of corporate applications and system hardware
which has allowed LIMS to meet many needs. If the corporation is
changing its computing infrastructure, perhaps through a downsizing effort
or a move to client-server technology, the impact of this change on the
existing LIMS implementation must be measured. As applications leave
the mainframe, maintenance costs are divided among fewer remaining
applications, resulting in an increasing cost structure. Similarly, as
applications integrated with LIMS migrate to alternative platforms, the
cost of integrating and supporting LIMS with these applications on the new
platform must be considered. Although the cost of integration is usually a
deterrent to change, external forces that require new integrations can often
catalyze the switch to a new LIMS. For instance, in a stable environment,
the users might be ready to let the current system survive for a substantial
additional period of time. However, in a changing environment, the
realization that changes to the existing system will be expensive will often
force management to take a closer look at its overall architecture and
recognize that the existing system should be replaced before making
substantial new investments in integration.
The major factors in this situation are often timing, risk management
and project management. It is beneficial to avoid the cost of integrating a
new LIMS with old applications or the old LIMS with new applications or a
new infrastructure. This situation will often lead to substantial pain and
costs, and the overall objective must be to minimize both the pain and
costs. Although it may be attractive to attempt to co-ordinate the projects
tightly, this often results in projects becoming too big and unmanageable,
resulting in loss of communications, major delays, and higher costs. It is
usually best to keep projects small and manageable, emphasizing
communications to maximize success, and having carefully thought out
contingency plans to handle expected deviations from the overall plan.

10.2.3.3 Product life cycle considerations. Systems also become difficult
to support because the underlying hardware or software packages are
inadequately supported by vendors. Many hardware and software vendors
popular in the 1970s are no longer in the business. Others have migrated to
updated strategic platforms. As vendors update their offerings, support for
older systems deteriorates. Support usually goes through several stages in
the life of a product.
An important message can be learned by thinking about the typical
product life cycle for a LIMS or other application. The terms used here are
harsh, but consider them carefully.
LIMS product life cycle:

• Hype
• Maturation
• Deterioration
• Eradication
What are the strategies of the salesperson, the vendor, and the customer
during each phase of the life cycle? How do the strategies of the
salesperson and the vendor differ from the strategies of the buyer? It is not
unethical for a vendor or salesperson to paint their product in its best light.
However, it is important for the customer to understand the process, the
market, and the technology and to develop a strategy to meet the
company's requirements and protect the company's investment.

During the initial 'hype' stage, the product is immature and the vendor
cannot deliver features fast enough to suit customers, but customers have
confidence they have a system that will last. The vendor needs early sales
to convince customers that the product will be successful. A few good
reference sites are critical.
The second, and fortunately longest, stage involves 'product maturation'.
During this stage, powerful capabilities are delivered, systems meet
expectations, and market saturation gradually occurs. This is usually the
high-point in the popularity of a product, and everyone believes the
product will last forever and that selecting the product is 'safe'. No-one
should be criticized for picking a platform in this important and desirable
stage, but unfortunately the decision is not as safe as it looks if a product is
nearing the end of its life cycle.
The third stage in the product life cycle usually involves 'deterioration'.
Some vendors will attempt to avoid this stage by refreshing their
architecture regularly and providing a steady stream of enhancements.
However, it is very difficult in the software industry to stay out of this stage
because technology is moving so rapidly. Deterioration is a harsh term
since vendors will claim a product is 'fully supported', but from the
customer's perspective, the product is falling behind current technology,
and relative to new products entering the market deterioration is
occurring. It is often common during this stage for established vendors to
defend their products and criticize the competition for using unproved
technology that may be nice, but really is not pertinent to laboratories.
During this stage, vendors may be frantically developing new technology
that will allow them to make a competitive impact on the marketplace. A
new version of the old product may even come out during this stage, giving
the sales force 'evidence' that the product is fully supported. This version
may even have a feature or two that allow the sales force to claim use of the
technology that has bypassed their product. However, this new version
seldom exhibits important 'breakthrough' features of the type evident
during the hype or maturation stages of the product.
The final stage in the product life cycle is 'eradication'. During this stage
the vendor's sales force visits loyal customers to offer them a migration to
the new product. This may even include 'services' to assist in migration.
Vendors can maintain a high degree of customer loyalty by providing a
reasonable migration path to new systems.

10.2.3.4 Need for validation or revalidation. If a company needs to
validate its current system (or revalidate because of changes to capabilities
or regulations), it may be appropriate to consider replacing the system
first. This is because the cost of validation can be very high and it is
appropriate to consider the future likely lifetime for an existing system
before making major investments.

10.2.3.5 Changing regulatory environment. Compliance with changing
regulations is often a strong factor in forcing change. The existing LIMS
probably predates current regulations and was not designed and imple-
mented with present day GALP, ISO 9000, GMP/GLP and FDA
validation requirements in mind. It may be justifiable to implement a new
LIMS rather than face the risk to the business of noncompliance or the cost
of retrospective validation of a system not developed and maintained
following good System Development Life Cycle principles.

10.2.3.6 Dissatisfaction or problems with vendor. Sometimes a vendor
just does not deliver the level of customer service needed, fails to keep
promises or deliver desired enhancements, or changes its product direction
away from your needs; or you change direction away from the vendor; or
the vendor is being left behind by technology changes.
Vendor problems are rarely the primary reason for replacing a system
but they are very often a major consideration in the selection of a new
system.

10.2.3.7 Major enhancements planned - new foundation needed. Often
a company is basically satisfied with its current LIMS, but must at least
consider a change because major new enhancements or capabilities are
planned. These enhancements may come from strategic or business process
re-engineering considerations or they may be part of a general plan to
continue systems development and integration. If these enhancements
require a significant investment, one should evaluate whether this
investment is best made in the existing system or whether a switch to a
new system is warranted. Such an evaluation considers many factors, but
the two most important factors are the position of the current LIMS in its
product cycle (and the potential to upgrade easily) and whether the current
architecture is amenable to the planned enhancements. It is sometimes
recognized that a new platform, such as client-server with a relational
database, will make implementation of the new features drastically easier.

10.3 Why not change?

The previous discussion examined a number of reasons to change from the
existing system. In subsequent sections we will look at the costs and
approaches for successful change. However, it should not be assumed that
all LIMS replacement projects are worthwhile. A serious question that
should be asked is whether a change is really needed.

10.3.1 Investment protection


Most companies have a substantial investment in their LIMS, associated
hardware, infrastructure, training, experience, validation, customizations,
services, etc. A switch to another system is far more costly than the quoted
cost of new hardware and software from the vendors. Any major project
should look at the big picture before deciding to move forward. A careful
consideration of all long-term costs and benefits should lead to a rational
decision of what is best.
It is a mistake to make a decision to replace a LIMS from too narrow a
viewpoint. For instance, if there are problems with a vendor's support, it is
possible to rationalize changing systems based on reduced ongoing support
costs and problems versus the cost of a license for a competing product.
But when the cost of duplicating all features and integrations that users
assume will continue to be available is factored in, incremental benefits
must usually be substantially greater than reduced support costs for the
change to be justified.

10.3.2 Business processes need changing first


The earlier discussion on business process re-engineering focused on BPR
being the reason requirements will change and that a new system is
needed. However, too many companies start too early with a systems
project and try to use the project as a catalyst for change. If possible, BPR
should not be carried out within the constraints of an information system.
Information systems should be considered enablers of new processes,
but it is rare that a system is ideally designed for an organization. It is
preferable to define the ideal business processes, then define the user
requirements to support those business processes best, and finally to select
and implement a system that meets current and future needs.
Major system projects need to be re-evaluated in the light of impending
BPR initiatives. Although it is tempting to get the project underway while
everyone is still anxious and the money exists, it is relatively rare that BPR
projects do not have a major systems impact. The corporation will benefit
if the team redefining new processes is not hampered by a new system just
being installed that supposedly represents the best thinking. In actuality,
the best thinking will occur if all constraints and preconceptions can be
minimized.
Sometimes the new business processes are relatively obvious and the
new LIMS may be flexible enough to enable the project to start with the
understanding that new requirements and resulting changes will have to
occur at an intermediate point in the implementation. In these scenarios,
what works best is to put the LIMS project team on hold while the BPR
team does its analysis. When preliminary ideas start to
reach a steady state, it may be appropriate to let the LIMS team move
forward. Significant interaction between the teams will have to be
maintained to avoid problems as BPR changes are decided upon and
implemented.

10.3.3 Something better to do with the money or not enough money


Prioritization of projects and benefits from projects is a primary considera-
tion in deciding whether to replace a LIMS. LIMS will usually compete for
funds with projects from other departments. Justification for the LIMS
must be presented in terms that the decision makers can understand and
management in competing departments can accept. In most companies,
the laboratory manager will not have sufficient budgetary authority to
approve a LIMS without support from upper management or other
departments. Even within the laboratory organization, there will be
competing projects and priorities. Laboratory management will have to
measure LIMS benefits against the benefits associated with new capabilities
from a sophisticated new instrument or improved accuracy and precision
possible through replacing antiquated instruments. It may be cheaper to
add more people than to take the long-term view and replace the LIMS,
particularly if there is a lack of commitment due to unclear justifications,
an approaching retirement, or unwillingness to recognize benefits through
reducing headcounts or reassigning employees to more important assign-
ments.

10.3.4 Already meets basic needs


Packaged LIMS from all vendors are capable of meeting the basic needs of
most laboratories. If the basic needs are met, the question should be asked
- Why change? This often leads to an inability to justify the major cost of
change when considering only the incremental benefits of a new LIMS.
Difficulty with justification can often be turned to advantage with the
existing or competing vendors to make improvements to the product or to
simplify migration.
Strategy is an important factor in this consideration. Are the laboratory
and its information an important source of competitive advantage? Can
LIMS be linked to important business strategies? If so, the change may be
justified on this basis alone. If not, justification must usually be based on
savings within the laboratory environment. The ability to make maximum
use of information is often a deciding factor in the move to a LIMS with a
relational database or extensive (and expensive!) integration with external
systems. However, if the LIMS must be justified by time savings in the
laboratory or meeting regulatory requirements, a change may not be
warranted.

10.3.5 Customizations expensive to reproduce


Many existing LIMS have a large number of internal developments or
vendor customizations that have been installed to meet user needs. If the
new LIMS does not have these capabilities as standard features, then the
cost of obtaining these customizations must be considered when making a
decision to replace a LIMS. It is rare that users are willing to sacrifice
capabilities they already have!
These customizations may be a major factor even when upgrading to
new versions from the same LIMS vendor. This impediment to change
should be carefully considered when designing customizations and migra-
tions. A slight added cost in an original customization is well justified if it
ensures the ability to accept upgrades, change vendors and change other
components of the application or system architecture.

10.3.6 Interfaced to other systems


Similar to customizations are the interfaces to other systems. This includes
interfaces to laboratory data systems and instruments or interfaces to other
systems which use LIMS data such as results reporting, management
information or MRP systems. These interfaces are usually critical to
benefits, and the costs of duplicating the interfaces must be considered.
Again, it is rare that users are willing to give up capabilities they already
have.

10.3.7 Users and computing staff already trained


Often overlooked by users and management is the cost of training and
experience. This goes far beyond the cost of a few people attending a
couple of courses. Training computing staff to work in a new operating
system, programming language or database environment can be very
expensive. Current skills learned through experience cannot be easily
duplicated through a couple of courses or training sessions. It will take
months of use at decreased productivity levels to regain the basic level of
competency. Training users beyond the fundamental functions of data
entry and reporting will take substantial time at reduced productivity. A
change in system is a change in culture that takes significant pain and time.

10.3.8 System validated


Cost to validate a system is an important factor. The new system may even
be more difficult to validate than the old system. This must be balanced
against any improvements in validation level or avoidance of risk to the
business through using a more fully validated system.
Although validation is often considered a cost only in heavily regulated
industries such as pharmaceuticals, this is incorrect. Validation should be
performed for any system in any industry in order to provide management
and customers with the assurance that the system can be trusted, that data
are reliable, and that information is properly protected. To fail to perform
validation is to subject a business to unwarranted risk.

10.3.9 Difficult to justify


Justification is different from emotional rationalization. Justification
involves evaluating all benefits, risks, and costs on as quantitative a basis as
possible before making a decision. If a decision has already been made to
replace a system, justification is really just an emotional exercise in
rationalization with little fundamental purpose.
Justification of replacement systems creates many philosophical problems
relative to justification of an initial system. If the existing system is meeting
basic needs, then the benefits for the replacement system should be limited
to incremental benefits - those beyond what the current system delivers.
Similarly, the cost side of the justification arguments is often complicated if
the existing system is fully depreciated yet operating satisfactorily, or not
meeting needs but not depreciated yet.
Justification is often based on risk avoidance or a cost of doing business,
particularly if the system can be linked to fundamental business strategies.
Such an approach is only valid if properly acknowledged and if reasonable
decisions are properly considered.

10.3.10 Fully satisfied with vendor


Potential costs and benefits of changing vendors must be carefully
considered. The partnership with the vendor is as important as the features
of the system, particularly when problems occur. It is always attractive to
look at the latest features described in an advertisement for a competing
system and to dream of being able to switch. It is important to consider the
downside of switching vendors. Are you going to exchange features for
poorer service? Will you be as important or strategic a customer to the new
vendor - will your satisfaction be as important? After all, you have just
demonstrated your lack of loyalty to the last vendor. How much is your
loyalty worth to the current vendor? Does a willingness to stick with the
same vendor enhance your bargaining position or weaken it?
Most vendors duplicate important competitor features in a later version.
What is the benefit or cost resulting from being first? It may be better to be
second! Do all these new features just complicate the product and make it
more difficult to maintain? Often the preferred course is to stick with a
reliable vendor that provides some innovation in important areas, but is
conservative when changing proven features.

10.4 How long should a LIMS last?

There is no reliable rule of thumb to guide a company on how long a LIMS
should last. Some LIMS are obsolete the day they are installed. A LIMS
that this author was instrumental in installing in 1981 is still operational in
1995. This latter system has been gradually improved over the years and
migrated to improved hardware and software architectures, although many
systems available today provide superior features in some functional areas.
The major consideration is that the pain level associated with using the
current system has remained less than the perceived pain level of switching
systems.
Architecture issues are frequently very important. A company's change
from VMS to Unix operating systems or from minicomputer to local area
network architectures may create a need to replace a LIMS before its
natural life cycle is exceeded. A movement toward common systems or
enterprise databases similarly may impact the life of a LIMS. Alternatively,
being on the lucky leading edge with a new technology may result in an
environment where the LIMS life can be extended longer than normal.
The specific reasons for wanting to change may offer some ground rules
for what is a reasonable life cycle. If the LIMS is closely associated with a
particular architecture component, such as computer hardware, then the
LIMS lifetime will be closely aligned to the expected hardware. If the
LIMS is closely associated with emerging industry standards, then the
lifetime can be lengthened substantially by technology being developed to
support future migrations. For instance, DEC provided tools to migrate
from PDP-11s to VAX computers, allowing the relatively easy migration of
some LIMS to VAX architectures and extending their lifetimes. However,
many LIMS were developed using advanced technologies that never
became popular, resulting in a substantially shortened lifetime for the fully
supported system.
Vendor support is another significant issue in determining the LIMS
lifetime. Several examples exist of vendors developing 'good' systems that
did not acquire a strong enough following in the marketplace for the
vendor to justify the continued cost of enhancements and support. Vendor
bankruptcy or vendors unexpectedly dropping support for a product result
in shortened LIMS lifetimes. Continuing development by a vendor can
lead to longer LIMS lifetimes, while stagnation in development results in
customers moving to other solutions.
Ease of integration with other systems and ad hoc access to information
are additional factors that affect lifetimes. If customization is required for
integration and the vendor does not provide easy-to-use tools, the LIMS
with the best features initially may be replaced with systems providing
easier ways to handle data. This was particularly the case with many of the
older LIMS based on Fortran file structures versus current systems using
relational databases. A vendor that provides numerous 'hooks' into
systems and standard ways of exchanging information offers a solution with
the potential for a long lifetime.
A final consideration is worthy of careful evaluation. Many companies,
particularly petrochemical and pharmaceutical, make very heavy use of
chromatographic techniques of analysis. It is natural to make the interface
between the LIMS and chromatography data system a very important
factor in selecting a LIMS. However, too much focus on a specific data
system causes the lifetime for the LIMS to be directly tied to the lifetime
for the data system, creating a situation where both systems need to be
replaced together.

10.5 How do you justify a replacement LIMS?

Justification of any LIMS is often a painful process, but justification of a
replacement LIMS can be particularly daunting. There is no magic formula
that is accepted in all companies.
Justification should start at the beginning of a project. Costs and benefits
should be considered at every stage of the project. Projects that do not
consider justification issues continually are usually plagued with expensive
features that delay the project for little benefit. However, a balance must be
maintained between justification and creative insight. Too much emphasis
on early justification results in narrow projects not producing wide
benefits. The nature of justification is that it is easier to identify costs than
benefits. This does not mean the benefits are not there, just that they may
not be understood yet. The diligent search for benefits leads to well-
conceived projects.
Justification should be thorough and not after the fact. It is important
that the justification discuss all issues in as objective a manner as possible.
A justification that attempts to ignore costs such as training, validation and
support will not be accepted by some individuals, leading to controversy
and lack of support. It is usually better to acknowledge an issue than to
deal with criticism that the justification has not been properly performed.
Justification of a replacement system presents some special challenges.
Any justification should attempt to quantify benefits and costs associated
with the new system. Comparison points are difficult to determine. For
instance, are only incremental benefits considered? Or should the baseline
return to that used for the justification of the original LIMS? If the
replacement is necessary because the current system is nearing death, then
delay in replacing the system may lead to the scenario that the current
LIMS dies before a new one can be implemented. In this case, analysts will
have to return to manual procedures and the new LIMS will be justified on
the basis of benefits compared to manual procedures. If the new LIMS is
installed just in the nick of time, does it make sense to give it a weaker
justification by allowing only incremental benefits?
Sometimes either the cost or the benefit side of the equation will be
more clear. But, for a balanced and thorough justification, it is important
to consider all factors. The costs may be very close to the laboratory - in
the form of license fees, hardware, customization, training, validation -
but the benefits may be from other departments and difficult to quantify.
As a typical problem, if a LIMS is being replaced to shorten the new drug
development cycle, what is the benefit? For a $1 billion/year drug, if LIMS
can shorten the development time by a single day, this equates to an
increased revenue potential of $3 million, easily justifying most LIMS
projects. If the benefits can be that large, how can you ever turn down a
project? On the other hand, how often is the laboratory on the critical path
where the development cycle can be impacted by LIMS? Such justifications
are difficult to rationalize and are often scrutinized closely by financial
executives.
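
For the record, the arithmetic behind the figure quoted above is simply
annual revenue divided by days in the year, as this trivial Python snippet
shows:

annual_revenue = 1_000_000_000          # a $1 billion/year drug
revenue_per_day = annual_revenue / 365  # revenue gained per day saved
print(f"${revenue_per_day:,.0f} per day")  # about $2.7 million, roughly $3 million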
Similarly, benefits in the form of savings may be clear but costs may be
obscure. The most difficult cost to quantify is the cost associated with
training and loss of experience. It is very difficult to estimate for how long
typical analysts will be less efficient after a new system is installed. It is also
difficult to estimate the cost of providing ongoing training and encourage-
ment outside the formal classroom.
Unfortunately, there are no easy answers to the justification questions.

10.6 Would a custom system be better?


As the project moves from deciding whether a new LIMS is needed to
which LIMS is best, it is often attractive to consider the benefits of
implementing a custom LIMS that fits user requirements exactly. If this
seems attractive, first go back and read section 10.2.3.1, Difficult or
Expensive to Maintain, and then read on!
The decision to develop a custom LIMS must be approached very
carefully. It is very easy to bias the steps leading up to the decision one way
or another, creating the substantial risk of making the wrong decision. This
decision must be made by a truly objective individual who understands all
the ramifications. Laboratory analysts or supervisors who have talked to a
few vendors or systems development personnel who understand popular
computer system architectures are seldom unbiased.
Faced with license cost estimates, suggested implementation schedules,
customization and system integration fee estimates, and ongoing mainten-
ance costs from vendors, it is attractive to say there must be a cheaper or
better way. System development personnel will look at the user require-
ments and say they can do it in half the time at one-third the cost.
Typically, they are right. But be careful!
The user requirements produced by most companies define the features
users need, but generally assume a full range of standard features that are
of little interest to laboratory people until they are missing. These user
requirements are usually woefully inadequate to serve as the basis for the
design of a custom LIMS. It is not possible for system development
personnel to provide a reliable estimate for the cost and time to develop a
system that is not properly defined. Typically, as development proceeds or
the system is implemented, users will demand additional capabilities that
they assumed would be present - and costs and time estimates will escalate.
LIMS from vendors are designed and built to meet the needs of many
customers. Many features will be perceived as not needed in your
laboratory, or it will be thought that your way of doing it is better. It is
important to be careful on this issue and get feedback from the users of the
system. Very often the collective experience of many companies is superior
to the best ideas of a single company. Also, the feature you do not need
today and leave out of the system may be needed tomorrow.
Vendors typically have a team of developers working on enhancements
and new versions. If you develop a custom LIMS, will you be able to afford
an ongoing development staff to provide new features? That annual
maintenance charge from a vendor that ensures access to new features and
versions is usually the best deal around! Make sure the costs of
maintenance and ongoing development are both included in cost estimates
from your developers for a custom system.
Most LIMS from vendors have a carefully thought out audit-trail
capability designed to meet the requirements of regulatory authorities such
as the EPA and FDA in the USA. This capability is rarely defined
adequately in the user requirements, and development staff will usually
underestimate the cost of providing this capability.
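
As a rough indication of what such a capability involves, here is a minimal
Python sketch of an audit-trail record. Every field name is illustrative, and
a real implementation would also need protected, append-only storage and
hooks into every data-entry path in the system.

from datetime import datetime, timezone

audit_trail = []  # in practice an append-only, protected database table

def record_change(user, sample, field, old_value, new_value, reason):
    # Append a record of who changed what, when, and why; regulators
    # generally expect a reason to accompany every change.
    audit_trail.append({
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "user": user,
        "sample": sample,
        "field": field,
        "old": old_value,
        "new": new_value,
        "reason": reason,
    })

record_change("jsmith", "S-1041", "result", 4.2, 4.3, "transcription error")
print(audit_trail[0])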
Most commercial LIMS have general-purpose tools designed to assist in
integrating the LIMS with other systems, such as data acquisition,
chromatography, MRP and financial reporting systems. These tools are
generally quite versatile, designed to meet the interface needs of many
systems. Interfaces designed by internal development staff will usually be
targeted to the specific system needing the interface and will have to be re-
built if that system is replaced.
The efforts required for validation, training, system documentation, and
user manuals are usually greatly underestimated by internal development
staffs. Ease of validation is strongly influenced by the degree to which
development follows good system development life-cycle and documen-
tation procedures. Your users deserve a professional set of manuals,
whether the system is developed internally or purchased from a vendor.
What if your development staff leave in the middle of the project? Or
three years after the LIMS is completed? You might be faced with
replacing your LIMS yet again!

10.7 Support your vendor

The decision to implement a LIMS from a vendor should be taken as the
beginning of a long partnership with that vendor. In any relationship of this
type, problems are bound to occur. Customers should work with their
vendors to make certain that problems are solved to their satisfaction.
The success of the vendor and its LIMS product should be important to
the company. This does not mean that customers should become the
marketing arm of the vendor. It does mean that the company should be
willing to invest some effort in the future of the LIMS product. Customers
should be proactive in providing feedback and encouragement to their
vendors. Customers should actively support user groups so that vendors
have a clear picture of what their user community needs. This forum also
enables customers to learn about their vendor, the LIMS product, and
future plans.
Companies should carefully consider whether fighting for the best price
from their vendor is in their best interest. Clearly, if they can receive the
same for less, it is in their best interest. But if they drive the price too low
so that the vendor does not receive a reasonable return, the vendor may be
inclined to shortcut the services provided, or the financial health of the
vendor may be at stake. Companies must be willing to support their
vendors or there may not be a choice between competitive products in the
future.

10.8 How to implement a replacement LIMS

There is probably no 'best' way to implement either a new or a
replacement LIMS. However, most companies and consulting firms have
found that a 'structured approach' works best. An alternative approach
utilizes 'fast prototyping', but in an environment where regulatory
authorities insist on thorough validation, the prototyping approach only
applies to a portion of the development cycle, primarily the manner in
which user requirements are defined and early code is written. Steps earlier
and later in the development cycle still need the traditional structure.
Although prototyping has some distinct and important advantages when
applied to system development projects, it is not particularly pertinent to
implementation of a packaged system and will not be discussed further
here.
Even within the various structured methodologies, there are wide
differences in how they might be applied. Portions of up to five
methodologies might be used to support the needs for developing and
documenting business strategy, performing business process engineering,
defining systems strategy, developing the system and for supporting
management needs. Most consulting organizations and many companies
have comparable methodologies.
If possible, a methodology should be used. Use of a structured
methodology moves the company through many steps that might otherwise
be skipped. Although skipping steps may seem more time-effective, in the
long run it is this skipping of steps that leads companies to implement
systems that do not support strategies, user requirements, and new
business processes. Failure to follow a methodology also makes it difficult
to validate a system, train users, and provide documentation and manuals.
Now that we are all convinced that a proven methodology should be
followed, the key points to be considered when following such a
methodology will be covered briefly.

10.8.1 Define strategy


Strategy is the foundation upon which well-managed companies operate. It
is important to place important projects, such as LIMS implementation,
within the framework of the company's strategies. Obviously, LIMS may
not be the determining factor in whether a company attains its corporate
objectives. However, the value of major systems is increased if these
systems support the corporate strategy.

10.8.1.1 Business strategy. Business strategy should be defined at
several levels. Typically, these would include:
• Corporate
• Department
• Laboratory
At the corporate level, statements of the business strategy that will be of
interest to a LIMS project could include excellent customer service, quality
products, fast new-product development, compliance with regulatory
requirements, etc.
At the department level, specific items from the corporate strategy that
apply to the specific department should be described in more detail. In
addition, items specific to that department can be added if they are
important to achieving departmental goals, even if they are not directly
derived from a corporate goal. For instance, reduced raw-material
inventory, reduced manufacturing costs, implementation of statistical
process control techniques, etc., would be included in a manufacturing
strategy.
At the laboratory level, the strategy becomes much more specific.
However, it is important for the goals of the laboratory to support the goals
of the corporation and departments that it serves. Excellent customer
service probably requires easy access to analytical results for products and
good complaint-handling procedures. Quality products require careful
release against established specifications, good testing procedures, and the
ability to identify problems before they cause bad products. Fast new-
product development requires timely analytical services. Reduced raw-
material inventory probably means that the laboratory must provide
definitive testing results promptly on receipt of raw materials. Reduced
manufacturing costs may mean reduced laboratory costs, but it is more
likely that the laboratory will be called upon for improved services in
technical support. Implementation of statistical process control requires
immediate access to precise analytical information. Regulatory compliance
includes laboratory activities to support GMP compliance for the depart-
ment, and internal activities to ensure that the laboratory complies with
requirements for specification development, methods validation, equip-
ment calibration and employee training. These are only a few of the items
which might be important in a laboratory strategy.
In developing strategies, it is useful to identify and document 'Critical
Success Factors'. These are the things that must be done well for the
corporation, department or laboratory to be successful.

10.8.1.2 System strategy. The systems strategy will also include
corporate, department and laboratory levels. The systems strategy must
include applications and how they will interact or exchange information,
hardware and software building blocks designed to make implementation
easier and maintenance less expensive, and the infrastructure needed to
develop, implement, manage, validate, train and document.
The systems strategy must be directly tied to corresponding business
strategies. If systems do not support business strategies, their value is
greatly diminished. Systems strategies must support the 'Critical Success
Factors' identified earlier. It is useful to identify and document 'Critical
Information Requirements'. These are the pieces of information necessary
for the organization to succeed in fulfilling its Critical Success Factors and
specific goals and objectives. Systems must be designed to fulfill these
Critical Information Requirements.
The systems strategy for the laboratory will include systems to manage
samples, acquire analytical information, calculate results, and report
information. This strategy will include a framework within which LIMS,
data acquisition, robots, equipment maintenance, graphics and statistics-
related applications will reside. This strategy must also include how these
applications should interact and exchange information. In addition, this
strategy must include interactions outside the laboratory. These inter-
actions may be with people or other systems. The strategy should define
whether information exchange is to be manual or automated. The strategy
should include an overall plan of when various capabilities are to be
provided or updated, and the approximate cost of each project. The
strategy should also include an analysis of required resources. Projects
which are not part of the strategy should be more difficult to approve and
should not be allowed to detract from the goals of the strategy.
The systems strategy should be completed, understood and supported
before major projects are undertaken. The strategy should be kept
current. This may involve a process of continual revision or else the
strategy should be revisited every few years.

10.8.2 Re-engineer key business processes


It is essential that systems effectively support key business processes rather
than forcing people to work the way the system dictates. Before new
systems are defined and implemented, there must be a careful review of
current business processes.
Business process re-engineering is likely to occur at two levels that could
impact a LIMS project. It is essential that each level be adequately
considered.
At the corporate or department level, major changes to the overall new-
product development, production, or order-fulfillment processes could
impact a LIMS project significantly. These changes are likely to alter
activities and information requirements or create new needs for system
interfaces. Changes in the workflow are often difficult to support after a
LIMS implementation.
At the laboratory level, major changes in workflow or information
requirements must be defined before LIMS implementation to avoid major
losses through wasted efforts. Significant changes in strategy and the re-
engineering of laboratory work processes will drastically impact LIMS
requirements. For instance, elimination of data checking and multiple data
entries will require the LIMS to integrate directly with other systems and to
include some quality checks of the validity of data.
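As a small, hypothetical sketch of such a quality check (the test names and limits below are invented for illustration), the LIMS might screen each incoming result against agreed limits before accepting it:

    # Automated validity check applied as results arrive from another
    # system, replacing manual data checking. Limits are hypothetical.
    LIMITS = {"pH": (5.5, 7.5), "assay": (98.0, 102.0)}  # test: (lo, hi)

    def check_result(test, value):
        lo, hi = LIMITS[test]
        if not lo <= value <= hi:
            raise ValueError(f"{test} result {value} outside {lo}-{hi}")
        return value

    check_result("pH", 6.8)        # accepted silently
    # check_result("assay", 110)   # would raise and flag for review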

10.8.3 Define preliminary requirements and select system


Only after business and systems strategies are fully understood and
business process re-engineering activities are complete should the pre-
liminary requirements for the new LIMS be defined. This is essential for a
LIMS to help a company to achieve its goals.
Preliminary requirements should be defined using a team approach.
Daily users must be fully represented. In addition, someone from business
process re-engineering teams should participate, as well as individuals with
full understanding of business and systems strategies. It is important that
the team be broadly constituted, rather than relying on a few interviews
with outside areas and then forgetting about broader needs.
The capabilities of the previous system should not be used as the starting
point for the system. This would only lead to the new system being a
modest enhancement of the previous system and it would not be likely to
address major shortcomings. It is better to define the preliminary
requirements by using the strategic and re-engineering activities as the
catalyst for identifying major needs. The comparison with the old system
should be done last, to verify that significant capabilities have not been left
out.
During the process of defining preliminary requirements, the team
should constantly challenge ideas put forward to convince themselves that
items are really needed. Only a minimum of capabilities from the previous
system should be carried over as preliminary requirements.
It is important to differentiate between preliminary requirements and
features. Preliminary requirements are what the system must do, while
features often go further and describe how the system performs a given
function. A clear focus on the preliminary requirements is good, while too
much attention to the features will bias the decision process away from
addressing strategic needs.
During the preliminary requirements definition, it is useful to survey
LIMS packages on the market to get ideas on what is possible and what
other companies are doing. This survey is also useful in helping to decide
what is really required, because if a given need is not satisfied by any
LIMS, it will probably be expensive to provide. However, at this stage, it is
important to remain objective and not form opinions about 'how' or 'how
well' a given package addresses preliminary requirements.
Preliminary requirements should be well documented and approved by
the user community. At this point, some attention also needs to be given to
procedures for selecting and validating systems. The actual selection
process should be a formal comparison of capabilities of individual
packages with the preliminary requirements. This may involve a formal
RFP process and competitive bidding. It is often useful to prepare a formal
ranking matrix in which the team assesses how well each package meets
each requirement. It is often useful to agree on the ranking criteria before
evaluating individual packages. It is at this point that the subjective factors
of 'how' a given package fulfills the need should be considered. If
implementation of a custom system is a possibility, this option should be
included among the possible alternative packages.
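To make the ranking matrix concrete, the sketch below shows how the weighted scoring reduces to a simple calculation. The requirements, weights, package names and scores are all hypothetical, chosen only to illustrate the mechanics:

    # Hypothetical ranking matrix: each preliminary requirement carries
    # an agreed weight, and the team scores (0-5) how well each candidate
    # package meets it. Names and numbers are illustrative only.
    requirements = {
        "sample management":      5,
        "instrument interfacing": 4,
        "regulatory compliance":  5,
        "ad hoc reporting":       3,
    }
    scores = {                        # one score per requirement, in order
        "Package A":    [4, 3, 5, 2],
        "Package B":    [5, 2, 4, 4],
        "Custom build": [5, 5, 5, 5],  # a custom system meets all needs
    }
    weights = list(requirements.values())
    for package, row in scores.items():
        total = sum(w * s for w, s in zip(weights, row))
        print(package, total)

A custom build will always score highest on function alone, which is why, as noted below, cost and maintenance factors must enter the decision criteria at a realistic level.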
Once the objective definition of preliminary requirements is completed
and the objective evaluation of each package is concluded, a decision must
be made of which package to implement. Since a custom package can meet
all functional needs 100%, it is important to include in the decision criteria
the various cost and maintenance factors at a realistic level.
Once a preliminary decision is made, it is important to review the
decision. This should include a review of all requirements not met and a
decision whether the requirement can be dropped or whether a custom-
ization is needed. After the review, the decision should be explained to the
user community so that support can be built for the process used and the
actual decision.

10.8.4 Final requirements, plan, develop and implement


The preliminary requirements as defined above should be 'adjusted' to
reflect the actual capabilities of the chosen system and planned enhance-
ments. Any items from the initial requirements which it was later agreed
were unnecessary should be dropped. The final requirements should also
be 'adjusted' to reflect the manner in which the chosen system meets the
requirements. The preliminary requirements were somewhat theoretical in
nature and may have included many 'wish list' items that no system could
deliver or that were too expensive.
The final requirements should be approved by the user community.
These final requirements serve as the basis for detailed design, project
plans, validation efforts, training and documentation.
Project plans must be developed that address the full range of activities
for the project and which establish timing, resource requirements and
dependencies in the plan. The project plan should be an active document
which is updated as the project progresses. Any changes in timing for key
milestones should be fully communicated.
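As a small illustration of handling such dependencies mechanically, the sketch below derives a workable ordering of activities from their stated dependencies; the activity names are hypothetical examples, not a prescribed plan:

    # Derive a workable activity order from project-plan dependencies.
    # Activity names are illustrative; graphlib needs Python 3.9+.
    from graphlib import TopologicalSorter

    depends_on = {
        "install application": ["install hardware"],
        "programming":         ["detailed design"],
        "load data":           ["install application"],
        "testing":             ["programming", "load data"],
        "validation":          ["testing"],
        "training":            ["validation"],
    }
    # static_order() raises CycleError if the plan contradicts itself.
    print(list(TopologicalSorter(depends_on).static_order()))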
Development activities, such as installing the hardware and application,
loading data, detailed design, programming, testing, validation, etc. are all
managed by using the project plan. Implementation, including documenta-
tion and training, must also be included in the project plan.

10.8.5 Change management


An important consideration is change management. This term applies to
the culture changes necessary for the business process changes and systems
changes to be successful. It must be recognized that past performance was
based on an existing evaluation and reward system, and that this reward
system must itself be changed to encourage good performance under the
new systems.
Employees will be uncomfortable with all the changes expected and must
be properly encouraged to implement the new changes. Failure to address
change management issues is probably the major reason why many
business process re-engineering and systems implementation initiatives
fail. Change management includes the diagnostic tools needed to identify
resistance to change and the implementation tools to facilitate the changes.

10.8.6 System integration


A very important aspect of LIMS implementation is the interaction with
other systems. These systems may be within the laboratory or external to
the laboratory. Key systems within the laboratory environment typically
include data acquisition, robotics, document management, maintenance,
statistics, graphics, etc. Key systems external to the laboratory typically
include e-mail, finance, manufacturing, customer service, document
management, statistical process control, and various user and corporate
databases.
Depending on the specific situations, system integration activities may
be handled within the development and implementation portions of the
core LIMS project or as separate projects. It is essential that the timing
aspects be handled within the LIMS project plan.
It is often convenient to handle system integration separately from the
LIMS project, because the skills are different and the timing and validation
may be separate. For instance, the integration of LIMS with an MRP or
manufacturing system often requires more knowledge of the manufacturing
system and how data are created and used in the manufacturing
department than it requires details of the laboratory activities. To be
successful, system integration requires the proper collection of multi-
disciplinary skills.

10.8.7 Training, documentation and validation


Once development of the system is complete, all users need to become
familiar with the system. This requires preparation of documentation and
user manuals useful to each class of user. It also requires training and
practice with the new system in a non-threatening environment. It is
essential that individuals be separated from the pressure of ongoing
operations during training exercises and that adequate time and resources
be provided so that people do not have to do double work.
Validation will only be discussed superficially. Validation includes
multiple components. Validation is basically the process of confirming that
the system is working correctly, that results can be trusted, and that
information will be safeguarded appropriately. Documentation and training
are key aspects of validation. Other key aspects are the operating
procedures, security procedures, and backup/archival procedures. Many
people only consider development of a test suite and performance testing
when thinking about validation. Testing is only a portion of validation.

10.8.8 Maintenance
Maintenance typically begins before the system is implemented. As users
are first exposed to a system, they will start asking for enhancements. Most
enhancements can be delivered on a separate timetable. It is essential to
document the needs, develop the enhancements, and validate their correct
operation in the same manner as the initial system was implemented.
Most systems are replaced because of inadequate maintenance. A
system will not meet expectations unless flexibility is built in and
enhancements can be added. Whether the enhancements are developed
internally or by a vendor, there should be a continuous investment planned
for maintenance and revalidation.

10.9 Keep an eye on the future - is it easier the second time around?

Throughout a LIMS project, whether in the early definition phases, during
implementation, or during maintenance, it is important to keep an eye on
the future. This includes the new capabilities needed, changes in
information system architecture, marketplace changes, strategy changes,
etc. If trends are identified early, it is usually easy to evolve a system so
that it can continue to meet needs and replacement is unnecessary. Many
organizations fail to understand the importance of innovation when it
comes to their information systems. If forward-thinking, innovative people
can be kept in positions of systems responsibility, the pain associated with
replacing systems can often be avoided.
Replacing a system is harder than putting it in the first time.
Expectations for the new system are higher, everybody thinks they know
how to do it, and costs are more difficult to justify. Do it right the first time
and you reap the full potential of LIMS.
11 The promise of client-server LIMS applications
I.R. STORR

11.1 Introduction

As business environments become more complex and turbulent, the need to
develop innovative products frequently and rapidly has become a strategic
concern. In an environment in which efficiency can be measured by the
ability of a health-care company to bring new products to market on ever-
shortening timescales, effective communication and decision-making
processes are of strategic importance.
In a climate of increasing global political, economic, social and
technological change, the health-care industry is not atypical of the
challenges that currently face any market-driven industry. It is as a
consequence of these challenges that many Laboratory Information
Management Systems (LIMS) have been sold with the promise that they
allow laboratory researchers to manage and share information more
effectively. The presumption has been that LIMS allow more rapid and
effective decisions to be made that contribute directly to the speed with
which new products can be brought to market.
The actual payback achieved from the more realistic and achievable view
of LIMS (acting more as an analytical laboratory tool for scheduling work
and reporting results) has been questionable. With current and expected
market pressures and technology developments, we must learn from the
experience with LIMS to date, if future investment is to provide
measurable benefit against actual needs. Moreover, if we are to understand
and quantify the importance of current and future needs of LIMS and the
drivers that will influence their success, then we must be able to provide
some framework for considering the future.
This chapter provides such a framework by discussing the issues that
currently face users and designers of a LIMS in the health-care industry,
exploring the use of client-server technology that will now be used into the
next millennium.

11.2 Review of LIMS development over the last ten years - the story so far

The promise of enhanced competitiveness has driven the rapid expansion
of computerisation throughout the health-care industry. The quantity of
information produced by modern laboratories has risen due to increasing
research efforts and the use of automation. During this same period the
demands placed on the use of this information have increased. In an
attempt to manage this information more effectively and in response to the
changing business environment, many companies and institutions intro-
duced computerised LIMS based on a spectrum of technological solutions.
To meet user requirements, and in parallel with the rest of the computing
industry, the technological solutions used by commercial and bespoke
LIMS have seen a technology-driven migration:
• From proprietary, centralised systems running all software, to
distributed systems
• From terminals, to PCs operating as clients to distributed systems
• From isolated instruments, to networked automation
• From manual operations, to robotics.
However, with many of these systems little emphasis was placed at the
design and implementation phase on how the user would interact with the
information system, or on understanding how and for what purposes the
information would be used by the business into which it was delivered. LIMS
was identified more as a concept having the potential to address some (but
not all) of the demands facing businesses. In implementing a LIMS, a
standard piece of software may have been taken from a vendor and
installed according to current requirements. Sometimes, either the way in
which the laboratory worked may have changed to match what the system
could do, or attempts may have been made to model the system on an
existing paper system. But has this approach worked? Does this allow us to
meet the requirements of our customers? Indeed has LIMS only provided
an electronic means for carrying out current practices, with all their
imperfections, more quickly?
In many reported cases, LIMS have been layered on top of existing
practices or constrained by traditional ways of working, with the result of
emphasising any existing faults in those processes. Consequently, there
have been many LIMS implementations that have failed completely or
have fallen short of expectations. But is this 'failure' due to these poorly
articulated specifications and a reluctance to change, or more closely
associated with changes in the environment, the rate of change and the
resultant changing expectations of the users of the system not being
matched by the usability and usefulness of the LIMS?

11.3 Current trends

Assuming that our business environment will continue to change at the


current rate, a request to prepare strategies or even develop a business-led
vision for a LIMS presents considerable challenges. Clearly, if the full
investment in any information system is to be realised (and previous
mistakes are to be avoided), then it must be capable of handling the
anticipated requirements of all its users. So that we can meet user
requirements with the flexibility demanded, we must be able to identify the
users of the systems that we provide, understand their requirements and
match their requirements to the solution. Although the laboratory can be
expected to continue to fulfil the conventional role of a source of analytical
information, changing business and regulatory requirements necessitate
enhancement of the way in which this information is managed as a part of
any business. With increasing external and internal pressure from our
customers, we have an increasing need to manage the information that our
laboratories produce in more effective ways. Providing information in
different ways to different people, faster and better is a major requirement
for today's laboratory. On this basis, we must focus on the value of the
information that is produced in a particular environment, who produces it
and for whom. The information will only be of value if we can have
confidence in its quality and if it can be presented in the form required at
the time that it is required.
Equally, the degree of assurance that can be placed on data and informa-
tion maintained by computerised systems is of fundamental importance in
the health care industry. Regulated by the Food and Drug Administration
(FDA) in the USA, the EC in Europe and the DoH in the UK,
computerised support systems must be shown to be fit for the purpose for
which they were specified. The key to meeting these requirements and
ensuring the usefulness and usability of the system is understanding and
effectively modelling business and user needs. To make the most effective
use of resources, sponsors, users and developers must, therefore, have
effective means of:
• Recognising and specifying needs at the level of the overall environ-
ment, the business and the individual (meeting requirements for
usefulness)
• Identifying opportunities for change
• Translating key needs into adaptable systems with appropriate
interfaces that al10w users to retrieve information in a timely,
consistent and secure manner (meeting requirements for usability).
• Aligning information technology (IT) and information systems (IS)
strategy with ongoing business and environmental needs (meeting
requirements for strategy and phasing the implementation of critical
business features).
These considerations are discussed in the following sections.
11.4 Regulatory requirements

From past drug-testing design problems and their long-term effect on
society, the international concepts of Good Laboratory Practice (GLP),
Good Manufacturing Practice (GMP) and ensuing regulations have
evolved since 1981. Requiring assurance of the quality, integrity and
validity of data, such regulations have in themselves produced an increase
in the volume of data generated by a modern analytical laboratory and the
procedures required to verify and control the use of the data.
In the USA the introduction of new drug compounds is regulated by the
US Food and Drug Administration (FDA), using the New Drug
Application (NDA) process. By the time the drug is approved, its entire
life history has been thoroughly documented. The NDA process and
corresponding research are part of an 'information factory' that must be
maintained effectively. The future success of today's multinational health-
care industry depends upon how it achieves collaborative management of
these information factories. The key to effective collaborative management
is the sharing of knowledge and information. Contributing significantly to
the investment in information systems is the need for secure, controlled
and consistent access to data (at the lowest level). This need is enforced
rigorously by the FDA during audits of systems used by companies
applying for an NDA. Increasingly, computerised systems have been
scrutinised by FDA inspectors. Even without research efforts being
stopped, a delay in the NDA process can cost a health-care company
millions of dollars in lost revenue. As a consequence, computerised
'information systems' used as an integral part of the drug development
process must be developed, maintained and controlled to the highest
standards.

11.5 Standards for systems analysis and construction of information systems

Construction of information systems refers to the process of defining and
building a computer application. An application development life cycle
(ADLC) provides a structure for the development process which ensures
that a computer application has been designed using a mechanism that
allows the system to be validated. Typically an ADLC consists of the
following phases:
• Functional requirements definition
• System specification
• Software implementation
• Software module testing
• Software integration and system testing
• System installation and check-out.
To allow formal specification of business and environmental require-
ments and translation of these requirements to information systems, a
variety of approaches have evolved. The approaches place different
emphases on the phases of the ADLC and can be viewed in a variety of
ways. Dividing the requirements of physical processes into smaller, more
manageable parts are the systematic analysis techniques such as
SSADM [1]; viewing the problem as a whole and emphasising the role of
perception of users, owners and the systems analyst are systems techniques
such as soft systems analysis; recognising social influences on a technical
solution is the approach adopted by Enid Mumford in her sociotechnical
systems approach; and the human computer interaction (HCI) [2]
approach focuses on physical and mental characteristics of user interaction
with systems. Rapid prototyping compresses the analysis requirements of
the waterfall method (traditionally used by systematic methods) by
allowing users to work with a computerised model of the actual system and
reduces the time taken by traditional ADLC. Object-oriented techniques
encapsulate processing and information requirements of objects manifest
in the 'real' world.
In parallel to the techniques for specifying requirements, techniques for
software development have evolved from sequential, through structured,
to object-oriented techniques. Computer aided software engineering
(CASE) tools are now emerging that support one or more of the phases of
the ADLC. These tools promise compression of the time taken to deliver
software according to a well structured scheme using one or more of the
preceding methodologies.

11.6 Understanding the user

Recognising the needs of the users is the key to implementation of a LIMS
that adds value to a business. By using a user-centred approach to the
design of the human computer interface it is possible to:

• Increase the usability of the system
• Reduce the risk of system failure due to user rejection
• Reduce mistakes in data entry and system operation
• Reduce potential user frustration with the product
• Improve system performance.
Depending on the scope of LIMS it may be used by all levels of
scientific, clerical or manual staff in an organisation. Typically the
information produced by a LIMS will be used by all levels of scientific staff
within that organisation, including individual chemists, analysts and
managers of teams and departments but for different purposes and with
different requirements for the information. These needs will be matched
very much to their particular abilities and skills. From current HCI
research there is a clear match between the type of interface and the type
of user; this does not mean just menu structures. Sutcliffe [2] categorises
users into four distinct groups: naive, novice, skilled and expert. Such user
categories may be estimated by the following criteria: frequency of use,
discretionary usage, computer familiarity, user knowledge, user mental
abilities and user physical abilities and skills. To match user requirements,
these need to be accommodated in a single system with different interfaces
able to reflect different working metaphors supporting physical tasks.
Client-server technology may provide one means of meeting the require-
ments of such challenges.
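As an illustrative sketch only (the interface descriptions are invented), the mapping from Sutcliffe's user categories to interface styles amounts to a simple lookup that a single system could consult when presenting itself to each user:

    # One system, different interfaces per user category (categories
    # from Sutcliffe [2]; the interface descriptions are invented).
    INTERFACE_FOR = {
        "naive":   "guided forms with step-by-step prompts",
        "novice":  "menus with built-in help",
        "skilled": "menus plus keyboard shortcuts",
        "expert":  "command line and scripting access",
    }

    def interface_for(user_category):
        return INTERFACE_FOR[user_category]

    print(interface_for("expert"))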

11.7 Meeting the requirements with appropriate technology: the challenge facing client-server technology

Client-server technology is an IT configuration. There are currently at
least four definitions of client-server. Typical applications that use this
architecture include:
• File sharing
• Database serving
• Computation engine sharing
• X-Window system display serving.
Each of these application areas uses a pair of programs; one a client and
the other a server. Client-server describes the way of structuring
applications to take advantage of networks. In the client-server model, a
developer divides a single application into a client component and a server
component. The client resides on one computer while the server resides on
another. The matched pair of programs use an interprocess communication
scheme. Through this scheme the client makes requests for services and the
server services those requests. A client uses a local area network (LAN) to
request data or services from a server, and a server performs the requested
operation and returns the results via the LAN.
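A minimal sketch of such a matched pair of programs follows, using plain Python sockets; the port number and one-line request text are invented for illustration and stand in for a real interprocess communication scheme running over the LAN:

    # Minimal matched client/server pair. In practice the two halves
    # would run on different machines connected by the LAN.
    import socket
    import threading
    import time

    HOST, PORT = "localhost", 5050

    def server():
        with socket.socket() as srv:
            srv.bind((HOST, PORT))
            srv.listen()
            conn, _ = srv.accept()
            with conn:
                request = conn.recv(1024).decode()   # receive the request
                reply = "result for " + request      # service the request
                conn.sendall(reply.encode())         # return the result

    threading.Thread(target=server, daemon=True).start()
    time.sleep(0.2)                                  # let the server start

    with socket.socket() as cli:                     # the client side
        cli.connect((HOST, PORT))
        cli.sendall(b"sample 42 results")            # request a service
        print(cli.recv(1024).decode())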

11.8 Discussion of relationships and issues

In terms of integrating IT, IS, organisational culture and structure, the
client-server IT solution makes the following promises:
• It provides new opportunities for overcoming organisational bound-
aries to process working and information sharing.
• It provides opportunities to integrate different environments and data
into a single coherent whole as required by the tasks performed by its
users.
In many organisations, LIMS have been implemented to address
intradepartmental and specific interdepartmental functions and not as part
of an overall initiative to facilitate collaborative management. These
systems have involved low and medium levels of business transformation
respectively [3]. Implementation of the systems and co-ordination of any
organisational changes have been controlled by a particular business unit
with minimal impact on the business as a whole. As a consequence, the
benefits may have been great within that business unit but little benefit has
been felt by other areas. Environmental and organisational changes are
increasingly resulting in 'business process redesign' and 'business network
redesign' with changes to the function of departments and the ways in
which departments interact with one another. Opportunities for using
computerised systems to competitive advantage are being lost as a result of
non-alignment of the business strategy, the organisational infrastructure
and IT and IS strategies. The following issues may have to be resolved if
client-server technology is to play its full part in facilitating competitive
advantage.

11.8.1 Internal conflict


Often the mainframe lobby may want a bigger, more stable computer. This
group perceive a loss of control due to the downsizing promises of client-
server technology. In the longer term both the culture and the skills of IT
staff may need to be changed to support client-server technology if
the implementation is to be successful. However, people are more difficult
to convert than technology. Although IT is intimately concerned with
delivery of information to users, many IT personnel are not aware of their
real customers and where they fit into the organisation.

11.8.2 User power


Users want technology to solve their particular problems. Often, as users do
not think globally, it can be difficult to adopt a new corporate architecture.
The move toward client-server technology is driven by users complaining
about lack of user friendly mini-computer and mainframe applications and
the extended ADLC. PCs and the Windows environment have had a massive
impact; users expect applications to be ready to use out of the box, to cost
less than £500 and to be installed in a day. This has given rise to the
perception that IT and IS development are easy. As a consequence,
packages may be installed at short notice on inappropriate platforms, and
may be poorly planned, documented and validated. This approach is not
acceptable in a regulated environment.

11.8.3 Data ownership


Client-server technology can potentially solve new problems. However, if
the user implements client-server technology, then there is a risk of data
anarchy unless a global view is taken. There is also a risk that central
computing functions may be locked out of corporate data resulting in little
or no benefit of computerisation to the business. Of equal importance is
the need for the data to remain secure. IT departments have been
custodial about data; this view now needs to change to being prescriptive
about how users use the data. Users must be made aware of the risks and
dangers of managing data.

11.8.4 False economies


The promise of client-server technology leading to cost reduction by
providing facilities on smaller, more cost-effective server computing
platforms (downsizing) may be misleading. Moving existing minicomputer
and mainframe applications to a client-server architecture will not cut costs
in the short term. The mainframe and minicomputer are not about to
disappear. There will be a long period of time during which corporate data
will coexist on legacy database management systems (DBMS) and open
systems relational database management systems (RDBMS). New client-
server applications will require access to network and hierarchic databases.
Old mainframe applications will require access to RDBMS when the
downsizing process migrates the data before redeveloping the application.
Integrating legacy data and mainframe applications into the client-server
environment is a difficult requirement to meet. Some of the reasons that
make it difficult are:
• Legacy data are not always relational, and not easily accessible
through SQL (Structured Query Language).
• Real-time, high-performance, read/write access to legacy data is
needed for many applications.
Legacy data may be 'locked' behind a series of 'doors' including
communications system security, transaction manager security, operating
system program authorisation, DBMS read/write authorisation, and
DBMS database access control. It is, however, possible to use desktop
computing to add functionality to mainframe applications, as the desktop
machine is as cheap as, if not cheaper than, a terminal. For new users,
however, client-server technology presents considerable advantages.
Development of user-based applications using client-server technology has
seen the demise of applications that have taken many person-years to
implement. However, there can be hidden long-term costs associated with
such solutions. For example, there is no clear idea of the cost of user
disruption while changing to client-server technology.

11.9 Systems analysis, construction of information systems and process re-engineering

Each technique outlined previously has its own advantage for use in
particular situations and phases in the ADLC. Recently all of the
techniques have shown considerable convergence. Eclectic approaches to
systems analysis and software development propose use of those techniques
that are 'best' for a particular phase of the ADLC. 'Best' will be judged
according to user, business and environmental requirements and factors
considered may include speed of development, depth of documentation
and understanding of users. However, without a consistent means of
supporting an ADLC, applications may be developed and maintained to
different levels of regulatory 'completeness'. Often a variety of different
programming styles and paradigms will have been used with little
recognition of human factors. In a number of cases, software may have
been produced prior to significant use of structured programming.
Maintenance may have been performed on these applications, resulting in
applications that are now becoming costly to maintain and whose reliability
is difficult to test and validate. This situation will be accentuated by
applications that are acquired from vendors, where it may be impossible to
apply the ADLC to the software development process directly. In these
cases, the application design, implementation and testing are generally
done by the vendor who provides the application. In a regulated
environment and under these circumstances, the quality of the software
can only be inferred from recognised quality accreditation (e.g. BS 5750,
ISO 9000). In all cases, use of defined minimum standards is required.
However, there is a fine balance that must be achieved between meeting
user, business and social or political requirements. As a direct consequence
of the political environment and the requirement for validated applications,
the ADLC may still cause unacceptable lead times in the delivery of critical
business systems.

11.10 Software development

Critical to the success of a number of client-server applications has been
identifying and implementing key functions by interactively demonstrating
'payback' to the user and business benefit. The rapid prototype develop-
ment method favours development of a small subset of the total
functionality, getting that working, and later developing additional
functionality in the same way. By using such a rapid prototyping method, it
is possible to demonstrate quickly to users the key aspects of a business-
focused system as part of a phased implementation in a way that is easy to
demonstrate and easy for users to understand, and which can be developed
according to changing business and user requirements.
With access to more powerful applications on the desktop and with
prototyping, users are increasingly intimately involved in development of
solutions. However, if users take on development too fast there is a risk
that central computing facilities will need to rewrite the applications, at a
cost, if things go wrong. There is a need for users to understand the
discipline of IS development including support of an ADLC and
recognition of the need for verification and validation. Indeed there is a
need to develop new applications with the global architecture in view.

11.11 Communications

The communications infrastructure must be able to manage the demands
that will be placed upon it by the client-server technology. The simplistic
model of client-server development describes the front-end client as
handling the user interface and the program, and the back-end DBMS
server handling the data. In response to some user request, the
application on the client issues an SQL request to the database server. The
server performs the operation and attempts to return the result to the
client. Problems would not normally be expected if the resulting amount of
information is small. However, when the amount of information is large
the performance of the network and the server's ability to service other
requests can be dramatically affected. Experience elsewhere shows that it is
extremely difficult to tell in advance how much information a typical user
request will return. Some indication will, however, be given
by the types of activities performed and mapped by the users of the system.
To address this problem partially, developers are restricting end-user
access to a set of specific SQL queries. However, the dangers of this
solution in terms of its impact on the usefulness of the system must be
recognised. Limiting the queries that can be performed immediately
restricts the value of the information along predefined dimensions, and
may exclude ad hoc information interrogation.
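A minimal sketch of this restriction follows, using the standard-library sqlite3 module to stand in for the back-end DBMS server; the table, column and query names are hypothetical:

    # End-users pick from named, parameterised queries; free-form SQL
    # is refused. Names are illustrative only.
    import sqlite3

    APPROVED_QUERIES = {
        "results_for_sample":
            "SELECT test, value FROM results WHERE sample_id = ?",
    }

    def run_query(conn, name, *params):
        if name not in APPROVED_QUERIES:        # no ad hoc interrogation
            raise ValueError(f"query {name!r} is not approved")
        return conn.execute(APPROVED_QUERIES[name], params).fetchall()

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE results (sample_id, test, value)")
    conn.execute("INSERT INTO results VALUES (42, 'pH', 6.8)")
    print(run_query(conn, "results_for_sample", 42))

The whitelist bounds the size and cost of any result set, but, as noted above, it also fixes in advance the dimensions along which the information can be interrogated.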
11.12 Implementing client-server technology

When developing new systems to increase access or reach new users, one
approach to implementing client-server technology is to overlay interfaces
on existing applications. A logical starting point is to assess what sorts of
platforms are already instalIed. In some organisations there are elements
of the client-server architecture already in place. In other environments,
users are happy working with graphical front ends like Windows on the PC
or in a Macintosh environment, but they need a server to consolidate and
access information. The type of platform will steer the direction of
development towards certain types of tools. It remains essential that any
existing IT environment is critically assessed to establish whether it will
work with client-server technology.

11.13 Conclusions

The increasing pressures of global competitiveness are causing organisa-
tions to re-examine the ways in which they do business. To improve or
maintain organisational competitiveness, entire business processes from
conception to the consumption of products, as well as services being
offered, are undergoing critical examination. Within organisations in the
developed economies of the world, the widespread use of automation and
information technology has amplified the sense of urgency that companies
feel. In a competitive environment, companies can see competitors
redesigning and streamlining their business processes (re-engineering), and
automating those streamlined processes by using information technologies
such as client-server methods to attain a competitive edge.
Regardless of the method used for process re-engineering, the key to the
successful use of computing to support the business is the matching of the
IT solution to IS requirements. It is essential that future investment in
LIMS is guided by a rational analysis of available IT and IS options aligned
with business and environmental requirements. Clearly for a LIMS
implementation to be successful, it must therefore:
• Meet requirements for data quality, integrity and validity
• Facilitate communication and information sharing between individual
departments
• Provide usable products (reports, graphs, etc.) with the content and
format required by customer departments
• Integrate data from different analytical disciplines involved in the drug
discovery process
• Support the power and reward structure of the organisation by
providing appropriate means of interacting with the system
• Integrate with the prevalent IT environment and with existing
information sources used by the business
• Be flexible enough to adapt to changing business requirements.

To achieve these broad objectives, both identified users and the developers
of LIMS have a proactive role to play in its successful implementation and
use. Each must have a clear understanding of:

• The subject of the laboratory's output
• The value of the output
• The user of that output
• The producer of the output
• The supplier of resources used to generate the output.

Using a human-centred approach, rapid prototyping and client-server
technology, it may be possible to:

• Identify the current LIMS needs of an organisation as part of a
coherent IT and IS strategy.
• Propose a system design that meets the current needs of the primary
users and their customers.

Clearly, client-server approaches have a role to play in achieving these
goals and avoiding the mistakes made with older systems. However, we
may need to temper our enthusiasm for implementing client-server
technology based on the considerations below.

11.13.1 Client-server technology is not a panacea


Any overall LIMS and laboratory automation strategy should be capable
of working across multiple platforms and across multiple DBMS, should
accommodate multiple organisational structures and approval procedures,
and provide access to multiple analysis tools.
This flexibility is not only good from a laboratory perspective, but will
also allow a computing group to make use of new technologies through
prototypes or pilot projects that are still part of an overall strategy. One
such technology is that of client-server systems. However, although
'client-server' was once a useful term, too many people have begun to
believe that it is the only useful distributed computing model capable of
solving such a wide range of user, business and environmental problems.
From current benchmarking, no matter what application-development
tools are chosen, it is extremely difficult to design a high-performance
client-server application.
11.13.2 Application of client-server technology


Every organisation has a number of goals, missions and values. These
values may drive a number of projects. Arguably the most strategic
projects are those which are closest to addressing more than one value. If
projects are begun because IT solutions are now available to solve some
identified problem, then the problem may be solved with the new
technology at the cost of many person-hours and much overhead and
equipment. However, the question remains: 'Could the same effort have
been dedicated more effectively to solving a problem with existing
technology that would have more impact on corporate goals?' With this in
mind and to limit risk of failure, client-server projects must be chosen with
care and approached with caution.

11.13.3 Software development


Breaking a software problem down into smaller, more manageable units
allows the application-development process to be approached more
effectively. Recognising that the systems analyst and software developer
are part of the problem, delivery of the systems must be approached as part
of an enhanced ADLC model. To ensure user acceptance of new and
replacement solutions, a user-centred approach to systems design should
be adopted. The ADLC must support controlled use of rapid prototyping,
thereby giving the opportunity for controlled rapid delivery of applications
to the user that meet their current requirements. With appropriate
software, rapid prototyping has been used effectively as the development
method for client-server systems. The advantage is a faster, more
responsive method of development, with only small differences between
the prototype and the production version. This method is advocated as an
excellent means of building 'opportunistic' applications that need a quick
development cycle and may have a short life-span. However, speed cannot
be a substitute for specification and control of the application development
life cycle. The use of prototypes as models without any documented
specification may not be acceptable for critical applications in a regulated
environment. Additionally, this approach may address usability criteria of
a user, but may miss opportunities for value-added activities through
process re-engineering by being too concerned with the interface.

11.14 The way forward?

If effective client-server application development is an iterative process,
then the key is flexibility. If the IT infrastructure does not allow this
flexibility then use of client-server technology may not be appropriate.
This in turn will lead to companies taking a pragmatic approach to client-
server technology, such as surrounding legacy systems with new desktop-
hosted applications rather than pioneering across-the-board conversion to
client-server architectures.
The high residual value of information maintained by LIMS databases
that have been in use in the health-care industry for the last ten years may
dictate that this pragmatic approach is the only time-efficient way forward
for many companies.

References

1. Ashworth, C. and Goodland, M. (1990) SSADM: A Practical Approach. McGraw-Hill,
Maidenhead, UK.
2. Sutcliffe, A. (1988) Human-Computer Interface Design. Macmillan, London, UK.
3. Scott Morton, M.S. (1991) The Corporation of the 1990s: Information Technology and
Organisational Transformation. Oxford University Press, Oxford.
12 Standards for analytical laboratory data
communications, storage, and archival
R.S. LYSAKOWSKI

12.1 Introduction

Every year many thousands of analytical, clinical, and biological chemistry
laboratories throughout the world generate terabytes of data. Chroma-
tography and spectroscopy are the high-production techniques for most of
these laboratories. In sheer volume alone, more than 70% of all analytical
data are chromatographic data. Instrument manufacturers sell more than
10 000 new chromatography instruments per year, and nearly all of them
are made to meet some set of customer specifications. Virtually all
chromatographic data handling systems are proprietary, and have many
different features and data processing capabilities. Yet these commercial
systems all collect a similar, overlapping base of data elements running on
fewer than six common types of computer operating systems. Logically,
because they collect many very similar data elements, these instruments
should be able to communicate and share data, regardless of the
manufacturer of the equipment.
The need to intercommunicate frequently arises in large corporations
and government laboratories with multiple sites or remote sites that use
equipment from different manufacturers. However, even small laboratories
experience problems resulting from poor data communications capabilities.
These problems lead to unnecessary manual labor and decreased through-
put. Increased regulatory pressures mandate easy retrieval of data from
archives after many years or decades and re-evaluation of those data on
data processing equipment or instruments that will have evolved several
generations ahead in time. Consumer protection, product quality standards,
environmental issues, and quality patient care are but a few of the concerns
dictating requirements for standards for analytical data communication,
storage, and archiving. The need to move and reuse analytical data over
time is now critical in all sectors of research, development, and testing.
The first effort to standardize chromatography data communication and
storage was initiated only a few years ago [1]. Very few attempts were made
to define standards or standards programs for analytical laboratory data.
Previous attempts did not meet both the business and the technical
requirements, and were either unsupported or unsupportable by the
commercial and government sectors. Since then a common standard
framework for the analytical laboratory world has emerged, called ADISS/
netCDF, and it has been applied to chromatography, mass spectrometry,
and infrared spectroscopy. The ADISS/netCDF framework is currently
being applied to several other analytical techniques. The current progress
on instrument data standardization is remarkable. However, much work
remains to be done if scientists and information technologists are going to
have their wish for open and integrated laboratory data systems,
instruments, and LIMS. A vision and technology statement for open,
integrated laboratory data systems, instruments, and LIMS was clearly
articulated in 1989 [1-3]. Consumer knowledge of the issues and their
articulated demands are still insufficient to compel the instrument industry
to realize this vision.
This chapter covers the following important aspects of analytical data
standards and laboratory automation.
• An introduction to the standards development process, with invest-
ment and payback models for commercial and formal standards
development
• A summary of the Analytical Data Interchange and Storage Standards
(ADISS) Program
• The Analytical Instrument Association (AIA) Standards Program: the
AIA is an international trade association taking more rapid steps
toward the core ADISS vision than any other organization
• The analytical information model and software architecture used for
the standards
• An example of the current application of the chromatography
standard
• A discussion of the need and potential impact of these standards on
the growth of the R&D LIMS and laboratory groupware market
• A discussion of the influence of standards on laboratory automation
market dynamics
• Specific recommendations to help end-users and instrument vendors
to become long-term partners.

12.2 Standards investment and payback

Developing standards is expensive. It can take more time to develop a
standard than to develop a commercial, proprietary product. One must
always remember that standards are the products of standards develop-
ment organizations and committees. Formal standards development
organizations such as the International Standards Organization (ISO), the
American Society for Testing and Materials (ASTM), the Institute of
Electrical and Electronics Engineers (IEEE) and the many others at
national or international levels are commercial ventures that sell their
standards documents as products. These standards organizations must
make money or they will go out of business. The fact that these
organizations adhere to self-defined open consensus processes is an
advantage to ensure the ultimate neutrality of the product. For this last
reason, I am a strong participant in formal standards organizations.
Even so, my experience has shown that there are definite advantages to
having commercial consortia and industry associations produce standards,
rather than formal standards development organizations. The natural
constraints that always exist on commercial development projects, e.g. tight
budgets, tight schedules, and only one organization setting the agenda,
work in favor of them producing standards more quickly than formal
standards development organizations can. Without these fiscal constraints,
most standards development organizations do not have the urgency needed
to complete a standard in a timely manner. A tight budget is a blessing in
disguise here, because it creates a strong motivation to use a no-nonsense
approach and not let too many philosophical discussions waste valuable
development time. Having one organization set the project management
agenda is a constraint that can become a problem if it is self-serving or
obstructionist. However, users will discover such agendas before long.
Lastly, the best technical, business, and human-resources talents and skills
are as crucial for standards development as they are for product
development [4].
A standards development effort is a product development effort; very
few differences exist, except that the development team is bigger and has
additional consensus-building rules. All project phases are the same as a
product development effort:
(1) A need is felt (but not yet quantified)
(2) Market requirements are collected and prioritized
(3) Funding is sought from those most positively affected by having it
(or those most negatively impacted without it)
(4) A draft specification is written and circulated for review
(5) A prototype system is built that embodies the draft specification
(6) The specification and prototype are refined to satisfaction
(7) The standard or product is approved
(8) The standard or product is mass marketed.
Several standardization efforts in years past have tried to create common
definitions for communciating data between independent analytical
systems. An article describing an approach to infrared spectroscopy data
exchange called JCAMP-DX was published in 1988 [5]; this approach
succeeded in achieving widespread acceptance in the infrared spectroscopy
community. It was the first such effort ever, and it was successful because it
taught the industry much about standardization. The AIA superseded
JCAMP-DX in 1993 by applying ADISS/netCDF to mass spectrometry
and infrared spectroscopy [6,7]. For years before JCAMP-DX there were
false starts that are almost too insignificant in result to mention, were it not
for the valuable human energy expended with each effort to try to define
something useful. Market dynamics, politics, or lack of required talent and
commitment at the right time dictated that these past efforts would not
make it to market.

12.3 The ADISS Program

The ADISS Program is a global program that provides a joint development
framework to create uniform standards for worldwide analytical laboratory
data portability. It is global in two respects: (1) it is germane to all major
analytical instrument techniques of commercial importance; and (2) it is
worldwide in reach. It helps to guide the Analytical Instrument Association,
the American Society for Testing and Materials (ASTM), branches of the
US government, and end-users to work together within a common
standards framework targeting the same core set of goals. The ADISS
Program consists of several multiyear projects being executed simul-
taneously by different organizations. The technical solution includes two
major pieces: (1) the ADISS Analytical Information Model; and (2) a
standard code base that implements the ADISS software interface
architecture. The ADISS Analytical Information Model and software
implementation form the common framework for standards projects by the
AIA for chromatography, mass spectrometry, and infrared spectroscopy
[6-10], and by the ASTM for chromatography, UV-visible, NMR, IR, TA, AA,
ICP, MS, X-ray, and surface chemical analysis [11-14]. Several information
technology (IT) groups within major corporations are now taking the lead
and using the ADISS/netCDF framework to implement data communica-
tions and archiving that include all of these techniques [15,16]. The result
has been co-operation and synergism to focus the problem-solving energy
and resources of all these interested groups on the completion of the
ADISS Program's mission: to standardize analytical data communication
and storage so that scientists can move and reuse data easily.

12.3.1 Technology goals must meet the needs of people and businesses
Early in the ADISS Program, it became clear that an integrated approach
was needed to ensure success for end-users and suppliers. Because the
overall mission of the program was to facilitate the tasks of people and to
make businesses more productive, simply addressing technology issues
would not solve the problem. A limited business perspective could
conceivably lead to a solution that was technologically sound and effective,
but that could not or would not be implemented because it did not meet
specific user needs or take business realities into account.
The approach, therefore, was first to define project goals that related to
people and businesses. Technology goals were developed and defined in
light of their ability to support the people and business goals. 'Business'
here means any for-profit or non-profit organization that creates or uses
analytical instrument test information to conduct operations.
For people and businesses, the ADISS Program goals are to:
• Eliminate the time and money wasted by current data incompatibility
problems in the analytical instrument and scientific software industries.
• Provide consistency in analytical data and information description and
interchange for the most common uses of analytical information,
including data collection, data analysis, publishing and reporting
results, and document databases.
• Create standards that are acceptable to users, instrument vendors and
software vendors.
• Create industry-wide conditions conducive to shifting the focus of
scientific instrument and software suppliers away from time-
consuming, low-level computer-science issues (proprietary file
systems, data structures, file formats, data management systems, etc.)
and return it to problems of physical and life sciences.
• Gain wide user acceptance of the standards through ease of use and
adaptability to established work procedures for existing and future
technologies.
• Identify, evaluate, and recommend software engineering centers of
excellence that can provide well-tested, supportable standard software
implementations that can be used worldwide, along with their long-
term technical support, development and maintenance resources.

To meet these people and business goals, the technology solution defined in
the ADISS Program must:
• Provide a consistent systems architecture that can be used for:
- Communicating data
- Storing and archiving data
- Building analytical applications from standardized, extensible soft-
ware modules
- Building analytical databases that can be rapidly searched and
accessed
- Information management, presentation, and visualization tools of
the future
- Reporting and publishing of analytical information.
• Specify, design, implement, and establish robust data interchange,
storage, and archival standards for heterogeneous laboratory software
systems.
• Make it easy to build the ADISS architecture into products running on
all major laboratory computer operating systems, including DOS,
OS/2, Unix, VMS, MS-Windows 3.1 and Windows NT, Macintosh,
and future ones.
• Make the ADISS systems architecture inherently extensible and long-
lived by basing it on abstract concepts independent of software
implementation.
• Provide the technical design base that developers can use to eliminate
data incompatibility problems for future generations of workstations
and servers.
• Make sure powerful tools and utilities are available to help developers
implement the standards rapidly and reliably.
• Leverage existing de facto national and international standards
whenever possible so that no time is wasted in 'reinventing the wheel'.

12.3.1.1 Laboratory data handling problems. The most common data
handling problems encountered when using heterogeneous systems are
difficulties with consistent and easy access to data. Lack of standard
interfaces for analytical data communications causes much time and data to
be lost in the aggregate by researchers using file translators and editors.
Computer programmers lose time writing point-to-point translators and
porting software. Reinvention and duplication of effort are still happening
across all industries with maddening regularity.

12.3.1.2 Standard file format problems. Major problems exist with the
'standard file format' approach. Formats change constantly for each
application, because performance of applications depends greatly on how
files are accessed on disk. As new applications require additional features,
changes to the file format are needed. Thus a 'standard file format'
approach is not easy to maintain. File formats in the past have typically
been too 'byte-oriented', or 'file position-oriented', which is extremely
tedious for developers. Because file formats are typically created by
independent developers, they are often too focused on a particular data
type (e.g. IR, NMR, spreadsheets, graphics, etc.) or vendor (e.g.
instrument vendor, computer vendor, etc.), which makes retrofit to other
data types or applications difficult. File formats are designed by many
different people for many different applications and for many different
systems, and they are rarely designed to be modules in global 'systems'.
They have limited extensibility. The widespread heterogeneity of file
design approaches most often leads to the use of the lowest common
denominator for a standard, typically a structured ASCII flat file.
Another common problem with so-called standard file formats is that an
organization will create a file format and publish it in the literature, often
without defining a totally rigorous and unambiguous Backus Naur Form
(BNF) grammar, and then leave it up to developers to implement their
own format translators and other tools. This is wasteful of time and
resources, especially for standards designed to be used by more than one
organization, because each organization must implement its own tools. It
also leads to more software bugs and greater chances of file reader errors,
because the toolkits are implemented at different times, in different places,
by different individuals.

12.3.2 Portable laboratory data objects


What is really needed is a higher level of abstraction. Data should be
treated in aggregates as abstract objects, from which data are accessed
through an abstract interface. Supported systems and tools should be made
available, including code generators for data converters to help developers
to create interfaces from their own proprietary applications to the standard
communication vehicle. Systems and tools should also be generic, yet
customizable for specific scientific domains. Eventually the systems and
tools should be made an extension of current operating systems. These
things are now all possible, especially with the newer object-based and
object-oriented file systems designed into the newer generation of
commercial operating systems.
The ADISS Architecture specifies two major architectural concepts: (1)
the ADISS Analytical Information Model (ADISS AIM), which specifies
the analytical data objects for a 'generic' analytical technique and all
analytical techniques which can be derived from it; and (2) a structured set
of software tools, starting with a simple, high-level application programming
interface for accessing data in ADISS data objects, which take advantage
of low-level tools that handle data encoding and decoding details
transparently.

12.3.3 The ADISS Analytical Information Model (ADISS AIM)

The ADISS AIM is a simple yet comprehensive information model that
defines a taxonomy of analytical data objects across and within the
targeted analytical technique families. The ADISS AIM also contains
some exemplary usage models (data processing models) to provide a
sample context for turning the analytical data into analytical information.
The ADISS Analytical Information Model is a high-level conceptual model
of a 'virtual analytical instrument dataset'. This conceptual model is turned
into a generic logical model in the ADISS Generic Data Dictionary, which
is then refined into data dictionaries for specific techniques. The
technique-specific data dictionaries contain the ADISS Generic Data
Dictionary elements, plus specialized ones needed by that family. The
ADISS Generic Data Dictionary is used along with a specific standard
methodology to apply the ADISS Analytical Information Model to new
analytical techniques [13]. Table 12.1 shows the set of top-level information
classes within the ADISS AIM and the ADISS Generic Data Dictionary.
The set of information classes listed in Table 12.1 contains all data
attributes common to the most prevalent types of analytical experiments.
Through an approach called 'class inheritance', object-oriented systems
allow specific analytical techniques to inherit these generic information
classes [17]. Specific analytical techniques are then defined as subclasses of
the generic instrument dataset class.
Class inheritance also applies to experiments within a technique family
which inherit information classes from the generic instrument dataset
'superclass'. Experiments within a technique family have the same basic
attributes essential to that technique. They may also have additional,
'private' data fields specific to a particular instrument or software vendor.
These additional fields are included by developers in incrementally built
objects.
The inheritance mechanism of object-oriented systems fully supports the
specialization of the generic classes to suit a particular vendor's imple-
mentation, without compromising the ADISS standards. The Virtual
Analytical Dataset comprises the ADISS Information Classes. These
Information Classes are inherited by each one of the analytical techniques
listed below, as instances of them. Data Interchange Objects are

Table 12.1 Top-level ADISS information classes

Class                     Description
Administrative            Identification information for administrative tracking
                          of datasets
Sample description        Nominal sample information, separate from method
Instrument description    Nominal instrument information, separate from method
Test method               All sample preparation information, instrument and
                          computer system settings used to generate raw data
Raw data                  Scan information and unprocessed data collected during
                          the experiment
Data handling method      The composite of methods for transforming raw data
                          into results
Calibration method        Information needed to assign correct chemical and
                          physical identities to instrument output
Processed data            Processed results information derived from applying
                          the data-handling method
Sequence information      Information needed to track a sequence or batch of
                          sample runs
Conclusions               Decisions arrived at from human interpretation of the
                          results
distributed to instrument and software vendors, and become the basis of
the application data interchange.
The ADISS Class Inheritance Model also makes class inheritance for
hyphenated (combined) techniques possible, as a simple extension of the
single inheritance model. Hyphenated technique datasets generally include
the datasets of each technique, plus data objects to describe the interfaces
that connect the instruments together. Instruments are combined as
objects; so are their data interchange and storage objects. This is important
because it allows data interchange and storage standards to remain fairly
constant across generations of analytical software used originally in non-
hyphenated techniques. Data interchange and storage software can thus be
reused more easily in new hyphenated techniques.
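As a loose illustration of this inheritance idea, the sketch below uses struct
embedding in ANSI C (the implementation language of the netCDF tools
described later) to mimic the superclass, subclass and hyphenated-technique
relationships. C has no true class inheritance, and every type and field name
here is invented for illustration; none of it is part of the ADISS data
dictionaries.

/* Illustrative only: struct embedding standing in for class inheritance. */
typedef struct {                 /* generic instrument dataset 'superclass'  */
    char dataset_owner[64];      /* administrative information class         */
    char sample_id[16];          /* sample description class                 */
    char test_method[128];       /* test method class                        */
} GenericDataset;

typedef struct {                 /* chromatography 'subclass'                */
    GenericDataset base;         /* inherits the generic information classes */
    float *ordinate_values;      /* technique-specific raw data              */
    char   retention_unit[16];
} ChromatographyDataset;

typedef struct {                 /* mass spectrometry 'subclass'             */
    GenericDataset base;
    float *intensity;            /* technique-specific spectra               */
} MassSpecDataset;

typedef struct {                 /* hyphenated GC-MS dataset combines the    */
    ChromatographyDataset gc;    /* datasets of each technique, plus a       */
    MassSpecDataset       ms;    /* description of the interface that        */
    char interface_desc[64];     /* connects the instruments together        */
} GcMsDataset;

The single-technique types are reused unchanged inside the hyphenated
dataset, which is exactly the kind of software reuse described above.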
The following analytical technique families are targeted for technique-
specific ADISS Data Dictionaries. (Asterisks mark those that have active
standard working groups at present.)

• Chromatography*
• Mass spectrometry*
• Optical spectroscopy (infrared and ultraviolet-visible)*
• Nuclear magnetic resonance spectroscopy*
• Surface chemical analysis techniques*
• Secondary ionization mass spectrometry
• Atomic absorption and emission spectroscopy*
• Inductively coupled plasma spectroscopy*
• X-ray spectroscopy
• Thermal analysis techniques

The ADISS Program provides for a gradual implementation of the
ADISS Architecture within analytical technique families. The analytical
technique families were prioritized based on business needs for standard-
ization, including the sheer volume of data generated per year by a
technique family, the number of those instrument types sold per year, and
the relative commercial importance of the techniques. Techniques scoring
high in all areas were logical candidates for the ADISS Program, because
their standardization would have the greatest impact and deliver the most
substantial benefits in the short term. Other analytical techniques
important to clinical and biotechnology markets will be added as needed.

12.3.4 Software implementation of the ADISS architecture

To achieve its mission, the ADISS Program requires a crucial second piece
- a practical software implementation. The implementation must have two
primary software interfaces, shown in Figure 12.1. These are an abstract
application programming interface (the ADISS API) for accessing data,

[Figure 12.1 ADISS application programming interface architecture (layers,
top to bottom: analytical chemistry applications; developer's tools, utilities
and ADISS/netCDF; the ADISS API (netCDF API); XDR encoding; operating
system; computer hardware). Copyright © 1994 Optimize Technologies.]

and a set of low-level tools for byte-stream encoding and decoding, called
the ADISS Toolkit, which makes the data machine-independent. The
ADISS API simplifies data access by allowing programmers to input and
output data by name, as logical entities, according to data objects defined
by the ADISS data model, rather than as a stream of bytes. The netCDF
system described in the next section gives first-level conformance to the
ADISS API definition.
The ADISS Toolkit is a lower-level interface that shields scientists and
programmers from the details of how data are formatted for interchange
and storage. The low-level (netCDF) ADISS Toolkit ensures data object
portability and independence of both computer hardware and operating
systems. Both interfaces require design and implementation using object-
oriented concepts to achieve the modularity needed for graceful extension
and evolution [17].
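
To make the division of labour between the two interfaces concrete, here is a
minimal, hypothetical sketch of reading one named data object through the
version 2 netCDF C interface described in the next section. The dataset and
variable names are borrowed from the chromatography template of Figure 12.2
and are assumptions, not prescribed usage; all XDR encoding and decoding
happens below the API, invisibly to this code.

#include <stdio.h>
#include "netcdf.h"    /* netCDF version 2 C interface */

int main(void)
{
    float ordinates[1000];     /* buffer sized to the template's point_number */
    long  start[1] = { 0 };    /* read from the first point...                */
    long  count[1] = { 1000 }; /* ...through the last                         */
    int   ncid, varid;

    /* Open an (assumed) ANDI chromatography dataset read-only. */
    ncid = ncopen("chrom.cdf", NC_NOWRITE);
    if (ncid == -1)
        return 1;

    /* Data are requested by name, as a logical entity, never by byte offset. */
    varid = ncvarid(ncid, "ordinate_values");
    if (varid == -1 || ncvarget(ncid, varid, start, count, ordinates) == -1) {
        ncclose(ncid);
        return 1;
    }

    printf("first raw data point = %g\n", ordinates[0]);
    ncclose(ncid);
    return 0;
}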

12.3.5 NetCDF - the de facto standard implementation of the ADISS architecture

NetCDF stands for the network Common Data Form. NetCDF was
designed and implemented by professional scientific software engineers
and Unix programmers at the Unidata Corporation of the University
Corporation for Atmospheric Research in Boulder, Colorado, which
supports the US National Center for Atmospheric Research and many
universities with atmospheric research programs [18-21]. It is an evolution
of the Common Data Form developed by NASA for scientific visualiza-
tion.
The Unidata Corporation is a non-profit, university co-operative corpora-
tion with a staff of twelve engineers and scientists who develop, enhance,
and maintain netCDF. Unidata receives funding of over $2 million
per year from the National Science Foundation. NetCDF is a critical part
of their mission to provide data portability and software applications
to the scientific community. In 1993, Unidata received funding from the
US National Science Foundation until the end of 1998, making it a viable
and well-supported scientific software engineering center at least until
then.
The netCDF data interchange system was evaluated and recommended
to the analytical laboratory community by researchers in the ADISS
Program. It has sufficient functionality to meet the needs of data
interchange and storage. NetCDF meets the requirements of robustness,
supportability, timeliness, public-domain availability, and conformance to
the ADISS Architecture design. It was designed to be a 'global' file system
to be used by many different types of applications. As such, netCDF is the
first public-domain ADISS data access interface and low-level toolkit. The
ADISS Program Office has received permission from Unidata to use it
universally as the default public-domain ADISS Architecture implementa-
tion. The AIA chose netCDF as the software implementation for its data
standard [6,7,9]. The primary reasons were that it meets certain scientific
requirements (see above), and it has robust code, good performance, code
availability on many platforms, and good, on-going support that is well
funded by the US Government.
The first implementation of the netCDF tools includes many object-
oriented design features within a more traditional implementation, written
in fully-compliant standard ANSI C. This approach was specifically chosen
because it will lead to fully object-oriented systems in the near future. A
prototype C++ interface is available. After extensive testing by Unidata
and its user community, Unidata will bundle and support the C++
interface along with the standard ANSI C implementation.
The netCDF system provides the following features:
• Direct access to data on disk, instead of having to parse through the
file. Parsing through sequential ASCII files is much slower than using
direct-access files; this becomes a particular problem for larger
datasets (a code sketch follows this list).
• Data handling for very large (hundreds of megabytes) multi-
dimensional data sets. There is no limit to the number of dimensions
for variables in netCDF data sets. It works for two-dimensional and
three-dimensional NMR, GC-MS, surface science, and other 'large
data' techniques.
• An abstract, high-level Data Access Interface or Application Program-
ming Interface (API) that is independent of the format of the
underlying data files. If you standardize on the API, rather than just
the file format, then both the data and the portion of the application
using the API become portable. Of course, other application services
besides data interchange and storage must use standard toolkits for
applications to be portable with no rewriting.
• A full ANSI C programming language source-code set (with Fortran
jackets) with a public domain implementation that runs on MS-DOS,
OS/2, Compaq 386/ix, Macintosh, HP-UX, DEC-ULTRIX, DEC
Alpha OSF/1, DEC VAX/VMS, DEC Alpha/VMS, SunOS, Sun
Solaris, IBM AIX, SGI, Cray Unicos, and NeXT-MACH. It has also
been ported to other platforms of interest to analytical scientists. Most
of these are obtainable free of charge over the Internet. In some cases
there may be software media charges only.
• Well-written documentation and publications that explain what netCDF
is, how and why it was designed, and how to use it. The netCDF
User's Guide describes how to use netCDF in much detail [18].
Several journal articles, published papers, and presentations on
netCDF give overviews of the system [19-21].
• A data description language, called the Common Data Language
(CDL); this language is highly readable by humans (see Figure 12.2) and
has many features for defining new data structures and records that
include data attributes (metadata).
• A binary data encoding scheme that is network-, machine-, and vendor-
independent; it is based on the same data encoding mechanism as the
Network File System (NFS), called the eXternal Data Representation
(XDR). The netCDF source code distribution includes the XDR data
encoding and decoding libraries. XDR is now being balloted as a
formal IEEE standard.
• Binary-to-ASCII and ASCII-to-binary conversion utilities. NetCDF
has an ASCII dump utility that allows the binary files to be dumped to
the human-readable CDL form for creating, debugging, and viewing
datasets. This human-readable form can be edited with a regular text
editor. The CDL form can then be fed into another utility (called
'ncgen') to generate the binary version of the data set. The ncgen
utility has additional options to generate C or Fortran library routines
to read and write particular datasets directly. Programmers can then
cut and paste generated code directly into an application [18].
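
As promised above, here is a sketch of the direct-access and hyperslab
features in use: reading a single spectrum out of a large two-dimensional
GC-MS-style array without parsing anything else in the file. The file name,
dimension sizes and variable name are all hypothetical.

#include <stdio.h>
#include "netcdf.h"

int main(void)
{
    float spectrum[2048];      /* one scan's worth of mass channels */
    long  start[2], count[2];
    int   ncid, varid;

    ncid = ncopen("gcms_run.cdf", NC_NOWRITE);
    if (ncid == -1)
        return 1;

    /* Hypothetical 2-D variable: intensity(scan_number, mass_value). */
    varid = ncvarid(ncid, "intensity");
    if (varid == -1) {
        ncclose(ncid);
        return 1;
    }

    /* A hyperslab: scan 500 only, all 2048 mass channels. The library
       goes directly to the data on disk; nothing is read sequentially. */
    start[0] = 500;  count[0] = 1;
    start[1] = 0;    count[1] = 2048;
    if (ncvarget(ncid, varid, start, count, spectrum) == -1) {
        ncclose(ncid);
        return 1;
    }

    printf("scan 500, first channel: %g\n", spectrum[0]);
    ncclose(ncid);
    return 0;
}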
Product developers are conducting performance tests on the
netCDF system to determine what, if any, limitations exist for use as a
native storage mechanism for commercial high performance NMR soft-
ware products [11].
NetCDF has the significant advantage of being a 'systems' solution. It is
not just a format; it includes some very powerful software tools that are
easy to use. With the ADISS Architecture and netCDF implementation,
people can complete the data interchange, storage, and archival part of
their scientific software applications faster than with previous approaches.
ADISS and netCDF together also help to create better scientific software
systems integration across the industry.
The ADISS/netCDF approach gives every organization and user the
same default, base-level ADISS implementation. Previous approaches had
ambiguous or incomplete protocol specifications that caused developers to
implement the protocols slightly differently from one another. The result
was that people had problems reading each other's files. Such unclear
specifications lead to many 'dialects' or variations, producing 'non-
standard' standards. Having the publicly available and supported netCDF
reference implementation solves this problem simply.

12.4 Application of the ADISS Information Model to chromatography

Using netCDF's Common Data Language (CDL) to construct a template
for chromatography is straightforward. The Analytical Instrument Associ-
ation implemented a subset of the proposed ADISS specification for
chromatography in their Analytical Data Interchange (ANDI) Chromato-
graphy Standard. Below are extracts from the ANDI chromatography data
communication template. This is neither the full template required to
implement data exchange among AIA-compliant applications, nor does it
indicate which data elements are required or optional. It merely
illustrates how the netCDF tool is used to implement a small segment of
the ADISS AIM as applied to chromatography. Interested readers should
purchase the AIA's ANDI Standard to receive the full software set, the
documentation, the templates, and technical support.
NetCDF files contain two types of basic objects - multidimensional data
arrays and data attributes. Data attributes can be attached to a data array
or they can be 'free attributes', meaning they are global to the file, and
accessible with their own software routines. Programmers give values to
data attributes when defining a template; these can be changed at program
run time. The netCDF libraries include data access interface routines to
allow programs to query for the names of all existing variables and
attributes, the values of single or aggregated netCDF variables (data
structures), or to access cross-cuts across multidimensional arrays, known
as 'hyperslabs'.
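
The fragment below sketches how a program might query and read one of the
global ('free') attributes just described, again using the version 2 C
routines; the attribute name follows the template extract in Figure 12.2,
and the file name is an assumption.

#include <stdio.h>
#include <stdlib.h>
#include "netcdf.h"

int main(void)
{
    nc_type type;
    int     len, ncid;
    char   *value;

    ncid = ncopen("chrom.cdf", NC_NOWRITE);
    if (ncid == -1)
        return 1;

    /* First query the type and length of the global attribute... */
    if (ncattinq(ncid, NC_GLOBAL, "operator_name", &type, &len) == -1) {
        ncclose(ncid);
        return 1;
    }

    /* ...then allocate a buffer and fetch its value. */
    value = (char *) malloc((size_t) len + 1);
    ncattget(ncid, NC_GLOBAL, "operator_name", value);
    value[len] = '\0';
    printf("operator_name = %s\n", value);

    free(value);
    ncclose(ncid);
    return 0;
}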
NetCDF objects have three definition sections (Figure 12.2). The first
section defines the 'dimensions' for array variables. The actual values do
not have to be defined in the dimension section; they can be set by the
user's program. The second section is where netCDF 'variables' or objects
are defined; again they can be given preset values or changed at run time.
The third section is the 'data' section where the actual data array values are
put.

12.5 Future ADISS extensions for chromatography and other techniques

Over the next two years, the ADISS and AIA Programs will begin to focus
on data elements beyond those needed to transfer raw data and results
information for single sample chromatography runs. They will extend the
chromatography standard to include other detector types such as mass
spectrometry, LC-diode array, infrared, NMR, and possibly others. The
AIA's plan is to move into a second phase of data standardization for
chromatography, if the demand is high enough and more customers inform
the AIA of their applications of the existing standards. This section covers
the projected enhancements to the AIA Chromatography Data Standard
scheduled for completion between late 1994 and mid-1995.
On-going work of major international instrument vendors to apply the
ADISS/netCDF approach to infrared spectroscopy (IR), nuclear magnetic
resonance (NMR) spectroscopy, and inductively coupled plasma (ICP),
direct coupled plasma (DCP), atomic absorption (AA) and atomic
emission (AES) spectroscopies is also briefly covered.

12.5.1 Future extensions for chromatography

The AIA Data Standard extensions include seven work items, divided into
two 'waves' of development. Customer input received over the past three
years has driven the selection of these particular enhancements. The full
ADISS Chromatography Specification proposal contained several of the
short proposals. However, the AIA pushed these out beyond version 1.0
of the ANDI/Chromatography Standard because of the 'time-to-market'
issues - there was simply too much to consider for version 1.0.
The issue of uncertain customer demand for and usage of the standard
has been reducing the AIA's momentum in chromatography since about
mid-1993. A 'chicken-and-egg' problem exists to a certain extent: until the
proposals have been completed and accepted, vendors and end-users
cannot commit themselves to implementing them. Vendors are now trying
to answer the question of why they should invest in a product (standard)
if customers are not demanding it. Users are reluctant to demand a
product (standard) that does not fully exist, and which they may not fully
understand. A Catch-22!
Alternatives exist for end users: (1) pay directly for completion of the

netcdf chromatography_template_name
dimensions:
// Dimensions for arrays within netCDF objects are defined in this
// section.
// Comments are denoted by a double-slash anywhere on a line.
_16_byte_string = 16;      // 16 character string length
_64_byte_string = 64;      // 64 character string length
_128_byte_string = 128;    // 128 character string length
_255_byte_string = 255;    // 255 character string
point_number = 1000;       // point number dimension for raw data
peak_number = 12;          // number of peaks
error_number = 1;

variables:

// Administrative Information Class - Category 1 & 2 Data Elements
:dataset_completeness = "C1+C2";   // Categories 1 & 2, raw & result data,
                                   // are the only ones fully agreed to now.
:aia_template_revision = "1.0";
:netcdf_revision = "2.0";
:languages = "English";
:administrative_comments = "this is just an example";
:dataset_origin = "the ADISS laboratory";
:dataset_owner = "the public domain";
:dataset_date_time_stamp = "19940130123030-0600";
:injection_date_time_stamp = "19940130123030-0600";
:experiment_title = "chromatography exchange";
:operator_name = "Joe Chemist";
:separation_experiment_type = "gas chromatography";
:company_method_name = "company method XYZ";
:company_method_id = "N35-C01";
:pre_experiment_program_name = "setup";
:post_experiment_program_name = "environmental report";
:source_file_reference = "HEALTH::DKA100:[NETCDF]CHROM.COL";
char error_log(error_number, _64_byte_string);

// SAMPLE-DESCRIPTION Information Class - Category 1 Data Elements
:sample_id_comments = "normal soil sample";
:sample_id = "N1010";
:sample_type = "control";
:sample_injection_volume = "3.0";  // microliters
:sample_amount = "3.0";            // micrograms

Figure 12.2 NetCDF object definition. Copyright © 1994 Optimize Technologies.

standards work; (2) make purchases of equipment totally contingent on
completion and implementation of the standards (and do not bend until
demonstrations are shown); (3) voice your demands for applications of the
standards so loudly to the AIA that the need is clear; (4) do nothing and let
the business priority of the AIA's Standards Program decrease over time
because the AIA cannot measure strong interest or need. It is now up to
users to motivate their vendors.

12.5.1.1 The first wave of enhancements to the Chromatography Data
Standard. The first wave has four proposed sets of information that will

// RAW-DATA Information Class - Category 1 Data Elements
:raw_data_table_name = "test raw data set";
:retention_unit = "time in seconds";
float actual_run_time_length;            // in retention_unit
float actual_sampling_interval;          // in retention_unit
float actual_delay_time;                 // in retention_unit
float ordinate_values(point_number);     // ordinate values array
ordinate_values:uniform_sampling_flag = "Y";
float raw_data_retention(point_number);  // abscissa values array

// PEAK-PROCESSING-RESULTS Information Class - Category 2 Elements
:peak_processing_results_table_name = "a partial example";
:peak_processing_results_comments = "5 level calibration";
:peak_processing_method_name = "processing method";
:peak_processing_date_time_stamp = "19940130123030-0600";
float peak_retention_time(peak_number);
char peak_name(peak_number, _32_byte_string);
float peak_amount(peak_number);
:peak_amount_unit = "micrograms";
float peak_start_time(peak_number);      // unit = retention_unit
float peak_end_time(peak_number);        // unit = retention_unit
float peak_width(peak_number);
float peak_area(peak_number);
float peak_area_percent(peak_number);
float peak_height(peak_number);
float peak_height_percent(peak_number);
float baseline_start_time(peak_number);
float baseline_start_value(peak_number);
float baseline_stop_time(peak_number);
float baseline_stop_value(peak_number);
char peak_start_detection_code(peak_number, _2_byte_string);  // baseline type
char peak_stop_detection_code(peak_number, _2_byte_string);   // baseline type
float retention_index(peak_number);
float migration_time(peak_number);
float peak_asymmetry(peak_number);
float peak_efficiency(peak_number);
float mass_on_column(peak_number);
short manually_reintegrated_peaks(peak_number);  // 0=no, 1=yes

data:
// The user's program inserts data here. NetCDF datasets typically
// contain much more data than shown here! This is just an example.
ordinate_values = 40983, 41009, 45345, 65212, 87766, 67543, 43421,
                  32345, 24349, 23943, 21023, 98321;
peak_retention_time = 105;
peak_name = "The Main Peak";

Figure 12.2 Continued

significantly enhance the AIA Chromatography Data Standard in the areas
of system and sample logging and performance tracking.
• Chromatography Column Information
• Sample Description Information
• Instrument Description Information
• System Suitability Information
The Chromatography Column Information Class and Instrument
Description Information Class will make it possible for users to build and
transfer instrument system log books more easily, i.e. to track chromato-
graphy columns and other instrument components. The Sample Description
Information Class records significantly more sample information, so that
samples described by different vendors' systems can be cross-referenced
more easily. Finally, the addition of the System Suitability Information
Class enhances the AlA Chromatography Data Standard to allow
customers to track performance of separation systems over time and across
different systems better. The goals were to deliver the first set of
enhancements by mid-1994. However, now the market-awareness and
demand-measurement programs have taken precedence over enhance-
ments. This has delayed delivery of the first wave of enhancements by
many months. These first enhancements are now expected to be delivered in
vendors' products in early to mid-1996, along with support for diode array
detectors and more general multi-dimensional detectors (see below).

12.5.1.2 The second wave of enhancements to the Chromatography Data
Standard. The second wave of work items covers three problem areas
that are considerably more difficult than the first wave of enhancements.
The second wave enhances the AIA Chromatography Data Communication
Standard to include:
• Additional Chromatography Detector Types
• System Validation Information
• Sequenced Run Information
The additional chromatography detector types targeted include mass
spectrometry, UV-visible diode array, and infrared spectroscopy detectors.
The AIA will develop data descriptors for these detectors that are generic
enough to apply to most other types of detectors used in chromatography-
based analytical systems. Secondly, the System Validation Information will
make it easier for laboratory personnel to validate their instrument
systems, because AIA instrument manufacturers will supply some standard
data elements for recording and tracking that information. Finally, the
Sequenced Run Information will allow users to track and cross-reference
groups of sample runs that are part of a sequence or a batch.
The AIA is proceeding more cautiously with the second wave of
enhancements. This wave requires the AIA to pay much greater attention
to creating and maintaining consistency across chromatography-based
analytical techniques. Also, the exact contents of Systems Validation
Information are still open to some interpretation, because validation is so
specific to each application of a system. Again the 80/20 rule will be
applied, i.e. standardize the 20%+ core of information used by at least
80% of the intended applications. Sequenced Run Information is not easy
to standardize, because there is no 'right' way to record sequences of
sample run data. Each approach has its own tradeoffs. Vendors have
developed many approaches that are correct for the particular applications
needed by their customers. These approaches must be accommodated by a
neutral model for sequences to allow interoperability among products.

12.5.2 Other analytical techniques being completed now


Instrument and software vendors are working together now to write
specifications to apply the ADISS Information Model and netCDF
software system to additional analytical techniques, including IR, UV-
visible, AA, ICP, DCP, secondary ion mass spectrometry, and NMR.
For the optical spectroscopy family of techniques (IR, NIR, UV-visible,
and others), the AIA is sponsoring a committee of end-users, software
developers, and the major instrument companies to develop a specification.
These companies have completed final testing of the specification and
netCDF software interface, and the ANDI/Infrared Data Exchange
Standard is now available from instrument manufacturers. Interested
readers are encouraged to contact their vendors to see it working.
For atomic emission (AE), atomic absorption (AA), direct-current
plasma (DCP) and inductively-coupled plasma (ICP), the US Army Corps
of Engineers has funded the Laboratory Automation Standards Foundation
(LASF, Groton, Massachusetts, USA) to develop a comprehensive
specification. The specification is designed for data transmission, data
storage, and data archiving. This specification work was first targeted for
their environmental testing laboratories, but is expected to fit many other
applications. Environmental software developers will implement the
specification using netCDF. The starting point for the work was the
ASTM's ADISS AIM Specification. The finished document will be
contributed back to the ASTM. The AIA has agreed to use the LASF-
developed standard when it is ready, since most of the vendors developing
the standard with LASF are long-standing members of the AIA. This
typifies the fast-track approach of the AIA, which tries to adopt advanced
work wherever it exists.
AT&T Bell Laboratories are working on an ADISS specification for
secondary ionization mass spectrometry (SIMS). This specification is based
on the AIA Standard for Mass Spectrometry data interchange and the
American Vacuum Society's specification for Auger and X-ray photo-
electron spectroscopy data publication. The SIMS specification was
expected to be available early in 1995.
The most intractable ADISS data standard to date has been nuclear
magnetic resonance (NMR) spectroscopy, because of its high-performance
and multidimensional data requirements. NMR was particularly difficult
because releasing a data exchange format that was ASCII-based or
inherently one-dimensional would have been very troublesome, owing to
poor performance and limited scope of applications. The greatest advances in
NMR over the past two decades have been in exploiting multidimensional
techniques. The NMR techniques vary so much that designing information
classes and data structures to support these multidimensional techniques
was very difficult. However, NMR data experts have completed a suitable
design, and the full specification was to be finalized by the beginning of
1995.
It is somewhat unfortunate for the NMR community that a version of
JCAMP-DX for NMR was developed and published in the literature in
1993 [22], despite the direction and strong commitment of the AIA and the
major international instrument manufacturers for the other very popular
analytical techniques. This JCAMP-DX NMR publication covers only one-
dimensional data and is an ASCII-only format. This version of JCAMP-
DX for NMR may confuse end users and waste their time, because it is an
approach with limited applications scope and poor performance relative to
ADISS/netCDF. However, JCAMP-DX can provide a channel for moving
data out of very old data systems, for which no C programming language
compilers are available, and into ADISS/netCDF on platforms where C
compilers are commonly available.

12.5.3 Formal standardization of the ADISS specifications

The American Society for Testing and Materials E49.52 'Committee on
Analytical Sciences Data' is finishing balloting of the final specifications for
the ADISS Systems Architecture, the ADISS Analytical Information
Model, and the ADISS Generic Data Dictionary. The goal is to have an
international formal standards organization with a full consensus process
approve the specifications after they have been tested for robustness by the
marketplace. The approval of a full consensus organization is important,
because that represents all viewpoints within the scientific community,
rather than just those of vendors and large end-user corporations.

12.5.4 NetCDF v2.3.3 upgrade


A major upgrade of netCDF has been available since the AIA implemented the
Chromatography Communication Data Standard v1.0, which uses netCDF
v2.0.1. The significant enhancements made by Unidata to the netCDF
software system in version 2.3.3 improve performance, add support for
more computer platforms, and extend the netCDF interface to include
support for complex data records (complex data structures as objects) for a
variety of data portability problems.
Using this upgrade requires a significant investment of time and energy
by software developers, because it includes recoding and retesting all
conformant products for interoperability and data interchange fidelity. The
logical time for all AIA members to upgrade is when the extensions for
chromatography-MS, chromatography-IR, chromatography with diode
array detectors, and others are done. That is the earliest time that the
chromatography, mass spectrometry, and optical spectroscopy data
standards must use the same netCDF version. However, preliminary
testing of netCDF v2.3.3 and later versions was to have been completed on
an individual basis by AIA Chromatography Committee members before
the beginning of 1995. New standards for IR and atomic spectroscopic
techniques will use the latest version available when they are introduced,
expected to be in early 1995. A major design goal of netCDF is full
backward compatibility with all previous versions of netCDF, i.e. the
ability to at least read files from all previous versions, and the ability to
write them out in the current format. This is important because of the need
to read historical meteorological datasets from many years past.
The benefits of the new netCDF system (version 2.3.3) are:
(1) Record-level access, to allow access to groups of netCDF variables
as individual records, which will also help with sequences of sample
and spectral data
(2) Improved performance, as much as 40 times better for some
operations (e.g. cross-cuts across hyperslabs), which will be
important for GC-MS, GC-IR, NMR etc., and other 'large-data'
techniques
(3) New command line options; ncdump supports several options for
selectively dumping variables and comments
(4) A new C++ interface prototype, which simplifies the usage of the
data access library somewhat (this is still prototype code; it is being
tested extensively before it is finalized in a near future version)
(5) More computer platforms supported (HP-UX, Alpha OSF/1, MS-
DOS 5.0).
Unidata released netCDF v2.3.3 in June 1993. All reported problems
with the beta version were fixed, and the software was ported to several
additional computer architectures. NetCDF Version 2.3.3 was built and
tested successfully on the platforms listed in Table 12.2.
Some users have ported netCDF v2.3.3 to other platforms, including
Apple Macintosh and Microsoft Windows NT. Unidata does not maintain
these ports, but they can be obtained over the Internet from the netCDF
user community. Totally new ports are now easier due to a new 'configure-
based' approach to installation, which adapts more easily to what an
operating system provides. A number of systems and utilities built on top
of netCDF are available as source code, including tools for scientific
visualization and laboratory data handling, and converters between
spreadsheet data and netCDF data. The next version of netCDF (v2.4) is in
the works now,
expected to be released by the end of 1995 on the computer platforms

Table 12.2 Currently supported netCDF operating platforms

Hardware            Operating system
Cray Y-MP           UNICOS 6.1.6
DEC Alpha           OSF/1 V1.2
DEC Alpha           OpenVMS V1.5
DEC VAX             VMS V5.5-2
DEC VAX             ULTRIX V4.3
DECstation          ULTRIX V4.3
HP-9000/7xx         HP-UX 9.0
IBM PS/2            MS-DOS 5.0
IBM PS/2            OS/2 1.3
IBM RS-6000         AIX 3.2
NeXT                NeXT-OS 3.0
SGI Iris            IRIX 4.0.5
Sun SPARCstation    Solaris 2.1
Sun SPARCstation    SunOS 4.1.3

mentioned earlier. The most significant enhancement in v2.4, from a
chemist's point of view, is its ability to handle nested arrays, including
jagged or irregular arrays (these are necessary to handle multidimensional
arrays coming from hyphenated analytical techniques such as GC-MS or
MS-MS). Other enhancements are being made to the data reporting utility
called 'ncdump'. Other utilities such as a full UNIX perl scripting language
and an array algebra language are also now available. For the latest
information on netCDF and Unidata, use a WWW browser to access the
Unidata Home Page using the URL http://unidata.ucar.edu/packages/
netcdf/whatsnew.html, which contains a wealth of information about
netCDF and related software packages, and gives pointers to dozens of
free software packages for accessing, visualizing and manipulating data in
netCDF files.

12.6 Future influence of ADISS standards on LIMS in R&D

The LIMS software industry is at a crossroads at present. Most LIMS
systems on the market today are basically sample tracking systems with a
strong emphasis on quality assurance and quality control functions. A few
of the more sophisticated systems include laboratory planning and
scheduling functions, and can possibly support strategic planning processes.
However, for data management, few if any systems deal with anything
other than the final, reduced result of analytical testing. They do not
manage instrument raw data or methods very effectively. They are just
'laboratory results managers' and not more general-purpose laboratory
data managers. Today's LIMS help laboratory supervisors monitor
laboratory productivity and report results to clients, but they do not help
R&D workers to collect, store, and manage the many heterogeneous data
types generated during R&D (textual data, data arrays, tables of numbers,
spectra, molecular structures, images from CAT scans or fluorescence gel
scans, and DNA sequences are but a few of the data types encountered).
What we really need is a comprehensive system that I call the 'R&D
LIMS'.
The next generation of laboratory data management software (the R&D
LIMS) will include an 'analytical laboratory data repository' or 'data
librarian'. Laboratory data repositories will include all laboratory raw
data, methods, and processed data held in a central repository in a
standard form and accessed using a standard interface. Scientists and
managers will be able to access and cross correlate results from datasets
across sample runs, across instruments, across laboratories, across labo-
ratory protocols, across clinical trials for drug development, or across
environmental pollutant tracking studies, over many years, instantly. This
is possible because data are stored, interchanged, and archived in common
data repositories using standard information models, held in standard
forms and accessed by using standard software interfaces.
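
To suggest what such repository access could look like in practice, here is
a deliberately small sketch that scans a handful of standard datasets and
cross-references one sample attribute with one result variable. The file
names are invented, the attribute and variable names follow the
chromatography template of Figure 12.2, and a real 'data librarian' would
of course use an index rather than opening every file.

#include <stdio.h>
#include "netcdf.h"

int main(void)
{
    /* A stand-in for a query over a laboratory data repository. */
    const char *files[] = { "run001.cdf", "run002.cdf", "run003.cdf" };
    int i;

    for (i = 0; i < 3; i++) {
        char  sample_id[64] = "";  /* assumes the attribute fits; production
                                      code would call ncattinq first        */
        float amount = 0.0f;
        long  start = 0, count = 1;
        int   varid;
        int   ncid = ncopen(files[i], NC_NOWRITE);

        if (ncid == -1)
            continue;              /* skip unreadable datasets */

        ncattget(ncid, NC_GLOBAL, "sample_id", sample_id);
        varid = ncvarid(ncid, "peak_amount");
        if (varid != -1)
            ncvarget(ncid, varid, &start, &count, &amount);

        printf("%s: sample %s, first peak amount %g\n",
               files[i], sample_id, amount);
        ncclose(ncid);
    }
    return 0;
}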
A few laboratory data management products are beginning to support
input and output of data from a variety of data systems and store them in a
standard form. Standard repositories based on the ADISS/netCDF
specifications and tools are an important step toward an R&D LIMS that
can support teams of scientists directly, not just their managers or
customers.

12.7 The influence of standards on market dynamics

The computer and electronics industries have long been ahead of the
laboratory instrument industry in addressing future needs for standards,
perhaps because they are much larger, are more widespread, and have felt
a dire need for standards much sooner. Witness the early creation of
standards such as the Electronics Industry Association's EIA-232 (RS-
232), IEEE-488 (GPIB), IEEE 802.3 (Ethernet), IEEE 1003.X (POSIX),
X/Open XPG3 and XPG4, and many others. Some of these efforts began
five to ten years before the market demanded or was fully ready for them.
The instrument industry has much to learn from the foresight (the cynics
would say stupidity!) of these high-technology industries, which have
become commodity producers. Standardization in these industries has in a
large part produced this commoditization and the economic problems from
the resulting restructuring. However, it is also standardization and
commoditization that have produced the rapid growth rates these
industries have enjoyed over the past 20 years.
What follows is not an indictment of the analytical instrument
manufacturers, although they could have been more proactive in creating
open, forward-looking, formal standards for all aspects of laboratory
automation (including data exchange, storage and archiving, device
interfacing, and sample and materials handling). Developing truly 'plug-
and-play' systems is a very expensive proposition. This is a major goal of
the Consortium on Automated Analytical Laboratory Systems (CAALS)
from the US National Institute of Standards and Technology (NIST), and
the Contaminant Analysis Automation (CAA) Project of the US Depart-
ment of Energy [23]. If instrument manufacturers were very proactive in
developing standards, this traditionally closed business would be undercut
until a new market equilibrium was established. The new market
equilibrium could take as long as three to seven years after the standard
products are introduced to become established, during which time some
companies might lose money. However, sometimes undercutting one's
own product line is important. It happens routinely in other high-
technology industries such as computers, biotechnology, and pharma-
ceuticals. The reason is because 'if you do not do it your competitors will',
and it is better to keep customers than have them and their purchase orders
move to competitors.
Some business activities will diminish in importance. For example, the
need will decrease for custom translation packages to interconvert
proprietary data formats and totally custom device interfaces for production
(non-research) systems. Some business activities will die completely, e.g.
constant marketing one-upmanship in trying to claim that a proprietary
product is 'the industry standard', all the while deviating from the
established standards so much as to be incompatible with them. Intelligent
customers will not tolerate that strategy for very long, perhaps for only one
or two buying cycles.
Some business activities will flourish, e.g. competition based on the best
implementation of a standard's features (most completely implemented,
most usable, best performance, etc.), or new applications built on top of
the standard. We will also see convergence of two or more standards to
create new markets - witness the creation of the videocassette player
market. In the portable instrument industry, we will soon see the use of the
recent global positioning standards with data standards and device
interfacing standards.
Perhaps the biggest beneficiaries will be the scientists' managers who
must integrate and reintegrate laboratory components into systems,
systems with other systems, and into the infrastructure of their organization
in order to meet the changing business goals of the organization. An
interesting perspective is that instrument systems are just information
sources, no matter how large or small they are. It may be uncomfortable to
instrument manufacturer executives to think that their $2 million NMR
spectrometer is considered to be a sensor connected to the information
infrastructure, but it is a welcome perspective to executives of instrument-
consumer companies. Instrument information sources plug into the
company's communication infrastructure and generate numbers needed to
run the business. This is true no matter what business the company is in -
government R&D, regulatory affairs, academic R&D, commercial product
development and sales, or contract services.
Information sources, consumers and infrastructure - this paradigm
makes a lot of sense for business, but cannot occur without proactive
standardization. The complexity of today's automated systems is extreme.
Without standards it takes too long to integrate systems, and the usability
is far too low for heterogeneous laboratory components in multivendor
systems. Though some laboratory instruments may always be standalone
or manually operated, automated systems of the future must increasingly
be measured on their usability as parts of larger systems. Standards make
systems more easily integrated and hence more usable.

12.8 Summary and recommendations

Throughout development of these analytical data standards there was
considerable healthy debate about the best technical solutions and market
dynamics for implementing them. Now, the ADISS/netCDF framework
solution has lived up to promises to solve major problems untouched by
earlier approaches.
Innovations in the ADISS Program include: (1) technical analysis and
discovery of good, supported tools to address the majority of the technical
and business requirements; (2) creation of a comprehensive information
architecture and software architecture; and (3) finding and encouraging
confluence of the right talented resources. These things together will lead
to fulfilment of the vision of integrated systems to aid scientific
collaboration.

12.8.1 Industry standards status


The analytical instrument industry now has a proven and open process for
'fast-track' creation of instrument data standards and tools. However, the
analytical data standards work is still far from complete. Users of
instrument data systems and LIMS must prove to the instrument
manufacturers that demand for such standards is high enough to fund on-
going developments, concurrent with each vendor's own creation of
innovative and competitive system features. There are things that end-
users should know and can do in this regard.
The AIA and ADISS Standards Programs together have cost approxi-
mately $3 million since late 1988. Revenues from the purchases of the
AIA Chromatography Data Standard have been about $30 000 in total.
From July 1993 to January 1994 revenues were about $7000. This does not
nearly cover the expenses of packaging and shipping the standard kit,
creating and distributing an awareness newsletter, or other incidentals.
Before significantly expanding the work, the AIA and its member
companies must see a significant demand for and application of the
standards by end-users. The AIA also wants to know what is missing, in
order to make it easier for end-users (and developers) to use the standards.
But first end-users must know what to request by learning what exists
today.
Once end-users know that the standards exist and can be used to solve
some of their key data handling problems, they must push all their
instrument manufacturers to complete the standards work, to become
compliant and to support them. This is accomplished by making standards
compliance a condition of purchase for any new instrument. Start this
process by educating your local instrument sales force. If they do not know
about the standards, feign ignorance, or show lack of support, then there
are other vendors from whom you can purchase instruments supporting
them. Your business may soon depend on the ability to communicate
freely between your various computer systems.
Wave 1 and Wave 2 extensions must be completed to meet future needs.
Proposals are now pending for most of this information; consult the
Appendix of the current AIA Chromatography Data Standard Specifica-
tions for details [7]. However, lead times are long for these standard
features. If demands are not made now, they will not make it into the
standards for one or two more years. Being a partner with the instrument
vendors is important. Vendors need to be able to measure demand clearly,
and to see many applications of the standards before they can commit to
the significant investment that it takes to implement more. Being a
demanding partner helps the standard agreement process to speed along,
helps the industry to grow, and helps to return more focus to scientific
innovation.
Benefits from the ADISS Program include giving businesses the proper
justifications to request these standards from vendors and to implement
the standards themselves. The ADISS Program's results will make data
access more consistent and easier, and will improve software and data
reusability, especially in the long term. It will protect the enormous data
and software investments being made by major institutions and corpora-
tions. It will increase productivity by using time more effectively and help
to shift scientists' focus back to science, away from the data communications
difficulties that are typical of distributed, multivendor systems. Finally, it
will reduce systems integration costs for analytical instruments, LIMS, and
other systems, and help to improve product and data quality [24].

12.8.2 Call to action


End-users must educate each other and their managers about these issues.
They must drive the ADISS concepts into their organizations and into the
market. Help is needed in completing the ADISS Specifications for new
techniques, including data modeling, writing, reviewing, and testing. The
ASTM E49.52 Committee on Computerization of Analytical Sciences
Data is also a focal point for the application of ADISS to chromatography
and other techniques. For more details, contact the author at Optimize Technologies, Sudbury, Massachusetts, USA (phone 508-443-4771); the Laboratory Automation Standards Foundation in Groton, Massachusetts, USA (phone 508-448-6130); or the Analytical Instrument Association in Alexandria, Virginia, USA (phone 703-835-1360) [8].
For those who would like to use or experiment with netCDF version 2.3.3 (or later versions), it is available via anonymous FTP from the Internet host unidata.ucar.edu, in the file pub/netcdf/netcdf.tar.Z. Questions or comments about netCDF that are of general interest may be e-mailed over the Internet to netcdfgroup@unidata.ucar.edu; specific questions or bug reports should be sent to netcdfsupport@unidata.ucar.edu.
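Once the toolkit is installed, reading an AIA-format chromatogram takes only a few calls to the netCDF C interface. The sketch below is a minimal illustration written against the version-2 C API shipped with netCDF 2.3.3; the file name is hypothetical, error checking is omitted for brevity, and the variable name ordinate_values is taken from the AIA chromatography data model (consult the Specification [8] for the full set of names).

    #include <stdio.h>
    #include <stdlib.h>
    #include "netcdf.h"                  /* netCDF 2.3.3 C interface */

    int main(void)
    {
        int     cdfid, varid, ndims, natts;
        int     dimids[MAX_VAR_DIMS];
        nc_type datatype;
        char    varname[MAX_NC_NAME], dimname[MAX_NC_NAME];
        long    start = 0, npoints;
        float   *trace;

        /* Open an AIA/ANDI chromatography file read-only;
           "run001.cdf" is an illustrative name.                      */
        cdfid = ncopen("run001.cdf", NC_NOWRITE);

        /* The raw detector trace is held in the variable
           "ordinate_values" in the AIA chromatography data model.    */
        varid = ncvarid(cdfid, "ordinate_values");
        ncvarinq(cdfid, varid, varname, &datatype, &ndims, dimids, &natts);
        ncdiminq(cdfid, dimids[0], dimname, &npoints);

        /* Read the whole trace in a single hyperslab access.         */
        trace = (float *) malloc(npoints * sizeof(float));
        ncvarget(cdfid, varid, &start, &npoints, (void *) trace);

        printf("%ld points; first ordinate = %g\n", npoints, trace[0]);

        free(trace);
        ncclose(cdfid);
        return 0;
    }

Because the file format is self-describing, the same program should read any standard-conforming file regardless of which vendor's data system wrote it; compiling against the toolkit library (typically cc -o dump dump.c -lnetcdf) is all that is required. This vendor independence is precisely the interoperability the standard is meant to deliver.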

Note

Permission was granted by the AIA to reprint sections of the AIA's netCDF CDL template for chromatography. NetCDF is copyrighted by the Unidata Program Center.

References

1. Lysakowski, R. (1989) Proposed Plan for Industry-Wide Analytical Data Standards - the ADISS Architecture and Project Overview. Digital Equipment Corporation, Four Results Way, MR04-3/C9, Marlboro, MA 01752, USA.
2. Lysakowski, R. (1991) The global standards architecture for analytical data interchange
and storage, ASTM Standardization News, March, pp. 44-51.
3. Lysakowski, R. (1993) NetCDF - a de facto standardized framework for analytical data exchange, storage and retrieval. In Computerized Chemical Data Standards: Databases, Data Interchange, and Information Systems, ASTM STP 1214 (eds. R. Lysakowski and C.E. Gragg). American Society for Testing and Materials, Philadelphia, PA, USA, pp. 57-74.
4. Digital Equipment Corporation (1991) The Open Systems Handbook: A Guide to
Building Open Systems, reference number EC-HI089-48/91. Digital Equipment Corpora-
tion, Marlboro, MA 01752, USA.
5. McDonald, R.S. and Wilks, P.A. (1988) JCAMP-DX for infrared spectroscopy. Applied Spectroscopy, 42 (1), 151-162.
6. Stranz, D., Campbell, S., Christopher, R., Watt, J. and Zakett, D. (1994) ANDI/Mass Spectrometry: Data Interchange for Mass Spectrometry. The Analytical Instrument Association, 225 Reinekers Lane, Suite 625, Alexandria, VA 22314, USA.
7. Mattson, D. (1994) ANDI/Infrared Spectroscopy: Data Interchange for Infrared Spectroscopy. The Analytical Instrument Association, 225 Reinekers Lane, Suite 625, Alexandria, VA 22314, USA.
8. Analytical Instrument Association Committee on Data Communications Standards (1991) AIA Chromatography Data Standard - Specification. (Gives full definitions of data elements and data models used by the AIA for chromatography data.) The Analytical Instrument Association, 225 Reinekers Lane, Suite 625, Alexandria, VA 22314, USA.
9. Analytical Instrument Association Committee on Data Communications Standards (1991) AIA Chromatography Data Standard - Implementation Guide. (Contains an overview of netCDF and how to implement the content of the AIA Specification with netCDF.) The Analytical Instrument Association, 225 Reinekers Lane, Suite 625, Alexandria, VA 22314, USA.
10. Analytical Instrument Association Committee on Data Communications Standards (1991) Proposed Global Extensions to the AIA Data Interchange Standard. The Analytical Instrument Association, 225 Reinekers Lane, Suite 625, Alexandria, VA 22314, USA.
11. Macur, A. (1993) Private communication. Tripos and Associates, St Louis, Missouri,
USA.
12. Gaarenstroom, S. and Lee, R. (1993) Journal of Surface Science Spectra, 2 (4), 5-22.
13. ASTM (1993) E49.52.002.R003, Draft Standard Specification for the Analytical Information Model for Analytical Data Interchange and Storage. American Society for Testing and Materials, 1901 Race St., Philadelphia, PA 19103-1187, USA.
14. Weimar, R.D. (1993) Future ADISS architecture for materials properties and analytical testing, and The demand for and value of a standard unified data architecture for analytical testing data. Both in Computerized Chemical Data Standards: Databases, Data Interchange, and Information Systems, ASTM STP 1214 (eds. R. Lysakowski and C.E. Gragg), American Society for Testing and Materials, Philadelphia, PA, USA, 96-107 and 84-95.
15. MacLaughlin, D. (1992) Integrating molecular structure and spectral databases with
LIMS, Eastman Kodak, presented at a conference on Scientific Computing and
Automation, Washington, DC, October.
16. Wollenberg, G. (1994) Private communication. Merck & Co., Rahway, NJ, USA.
17. Meyer, B. (1988) Object-Oriented Software Construction. Prentice-Hall International, New York, NY, USA.
18. Rew, R.K., Davis, G. and Emmerson, S. (1994) netCDF User's Guide. An interface for
data access. Unidata Program Center, University Corporation for Atmospheric
Research, P.O. Box 3000, Boulder, CO 80307-3000, USA.
19. Rew, R.K. and Davis, G.P. (1990) netCDF, an interface for scientific data access, IEEE
Computer Graphics and Applications, July, 76-82.
20. Rew, R.K. and Davis, G.P. (1990) The Unidata netCDF: software for scientific data access. In Proceedings of the Sixth International Conference on Interactive Information and Processing Systems for Meteorology, Oceanography and Hydrology, American Meteorological Society, Anaheim, CA, February, 33-40.
21. Sherretz, L. and Fulker, D.W. (1988) Unidata: enabling universities to acquire and analyze scientific data. Bulletin of the American Meteorological Society, 69 (4), 373-376.
22. Davies, A.N. and Lambert, P. (1993) JCAMP-DX for nuclear magnetic resonance.
Applied Spectroscopy, 47 (8), 1093-1099.
23. Salit, M.L., Guenther, F.R., Kramer, G.W. and Griesmeyer, J.M. (1994) Integrating
automated modular systems with modular architecture. Analytical Chemistry, 66 (6),
361A-367A.
24. McDowall, R.D. (1993) A matrix for the development of a strategic LIMS. Analytical Chemistry, 65 (20), 896A-901A.