Advanced LIMS Technology - Case Studies and Business Opportunities
Edited by
J.E.H. STAFFORD
Consultant
Fisons Instruments
Cheshire
3.1 Introduction 37
3.1.1 The role of a forensic laboratory 37
3.1.2 Automation 39
3.1.3 Analytical data 40
3.2 Objectives of a LIMS 40
3.2.1 Efficient interactive sample log-in 41
3.2.2 Sample tracking 41
3.2.3 Utilisation of barcodes 41
3.2.4 Automatic analytical data transfer and result entry 41
3.2.5 Less clerical work for scientists 41
3.2.6 Application of GLP principles 42
3.2.7 Laboratory performance statistics 42
3.2.8 No typing delays 42
3.3 The system 42
3.3.1 Sample receipt 45
3.3.2 Sample analysis 47
3.3.3 Analytical result review 48
3.3.4 Reporting 49
3.3.5 General management 50
3.4 The future 51
3.5 Conclusions 52
Acknowledgement 53
Index 233
Glossary
Active database That part of a LIMS database used for work in progress.
Chain of custody The complete record of the life cycle of an item under
analytical investigation, from the point of collection and preservation
through to its storage, transfers, analysis and disposal.
Committed database That part of a LIMS database used to store data for
samples on which work has been completed. Splitting the LIMS database
into active and committed is a historical device employed to maintain
system response time. Should not be confused with the process of data
archiving.
IND Investigational New Drug submission. The FDA review stage used
to regulate the testing of pharmaceuticals in human volunteers.
LAN Local Area Network. Term given to a small grouping of PCs linked
together to enable sharing of peripherals and data. The network is
managed using dedicated application software.
NDA New drug application. The FDA review stage used to regulate the
introduction of pharmaceuticals into the US marketplace.
PK see Pharmacokinetics.
SAS A trademark of the SAS Institute Inc. and used as a prefix for
identifying their products and services for data analysis.
STAT A sample with the highest priority for testing. Sample requiring
immediate testing.
Static data Data which defines the laboratory environment, e.g. instruments, personnel, analyses.
1.1 Introduction
Figure 1.2 Top-level laboratory process: samples (documents) and process parameters enter the laboratory process, which produces data, with the LIMS database at its centre.
Figure 1.4 Hypothetical R&D-based company and overlapping requirements for LIMS: sample management, analysis and test systems span clinical research, veterinary, agrichemical, toxicology and analytical laboratory functions, together with quality assurance, regulatory affairs, inventory and manufacturing.
the test result, will be readily accessible from within the review screen (see
chapter 3).
The need for standards may progress beyond the analytical data itself
and address the definition of the laboratory environment. In this area the
clinical laboratories are already leading the way [24], and the experience of
the EPA is discussed in chapter 7. Whilst vendors of sample management
systems (most LIMS systems are sample management systems) may view
this development with some trepidation it offers the users and commercial
software developers potentially a great number of advantages. Software
applications that manipulate laboratory data and create information will be
able to take advantage of standardisation to interact cleanly and more
securely with LIMS from different vendors and enable a more sound
business case to be made for undertaking product development. The
laboratory community would benefit by having more variety and choice of
software applications to help them to analyse their analytical processes and
test systems. Currently the availability of application-specific LIMS
modules reflects the viability of vendors able to develop products for niche
markets, e.g. stability testing, materials and formulation management, lot
disposition, drug metabolism and SOc.
Perhaps the greatest change necessary is in the way instruments interact
with the LIMS database. Instruments can no longer be implemented as
islands of automation, but must form part of the LIMS strategy (chapter
8). In an informating LIMS environment instruments are true clients of the
LIMS database server. The analytical scientist or technician will manage
their activities by way of the instrument interface. The LIMS database will
be invisible. It will become a transparent medium by which analytical data
are shared in a controlled and secure manner between instrument systems.
References
12. Computerised Data Systems for Nonclinical Safety Assessment. Current Concepts and
Quality Assurance (1988). Drug Information Association, Maple Glen, PA 19002, USA.
13. Good Laboratory Practice Advisory Leaflet, No. 1 (1989) The Application of GLP
Principles to Computer Systems. UK GLP Compliance Programme, Department of
Health, London.
14. Good Automated Laboratory Practices (1990) Recommendations for ensuring data
integrity in automated laboratory operations with implementation guidance. Draft.
OIRM, US EPA, Research Triangle Park, NC 27711, USA.
15. Shah, V.P., et al. (1992) Analytical methods validation: bioavailability, bioequivalence
and pharmacokinetic studies. Pharm. Res., 9 (4), 588-592.
16. Rogalski, W. (1994) Global electronic submissions, part I: The DAMOS transfer
interface standard. Applied Clinical Trials, 3 (11), 30-40.
17. Mathieu, M.P. (ed.) (1992) CANDA: A Regulatory, Technology and Strategy Report.
Parexel Int. Corp., Waltham, MA, USA.
18. Rank, M. (1993) Automating procedure management for regulated industries through
LIMS. LC-GC Int., 6 (6), 342-344.
19. Bell, R.A. (1993) FDA Perspective on CANDAs. In CANDAs: Evolution & Revolution,
Conference Proceedings, Amsterdam, pp. 124-139.
20. Kimbrell, J.Y. (1993) CANDA as a strategic initiative for the pharmaceutical industry. In
CANDAs: Evolution & Revolution, Conference Proceedings, Amsterdam, pp. 22-32.
21. Trigg, J.F. and Smith, J.A.P. (1994) Do end-users meet the LIMS requirements?
Chemometrics and Intelligent Laboratory Systems: Laboratory Information Management,
26 (3), 181-187.
22. Long, T.J. (1992) Human issues and impact on quality. Chemometrics and Intelligent
Laboratory Systems: Laboratory Information Management 17 (3), 289-294.
23. Herzberg, F. (1966) Work and the Nature of Man. World Publishing Co.
24. ASTM E 1639 (1995) Standard Guide for Functional Requirements of Clinical Laboratory
Information Management Systems. American Society for Testing and Materials,
Philadelphia, PA 19103, USA, in press.
25. Gorry, G.A., Long, K.B., Burger, A.M., Jung, C.P. and Meyer, B.D. (1991) The virtual
notebook system: an architecture for collaborative work. J. Organisational Computing, 1
(3), 233-250.
26. Rubinson, L., personal communication. Megalon, Novato, CA 94949, USA.
2 A model for a comprehensive LIMS
R.D. McDOWALL
2.1 Introduction
A framework now exists upon which to build a LIMS model. This model
will eliminate some of the shortcomings of the literature definitions [1-3]
discussed previously. The model is based on the objectives that define a
laboratory environment, which must be addressed prior to a functional
definition of a system. Although conceptual in nature, it will define a LIMS
so that it encompasses effectively the total automation requirements for a
laboratory. Moreover, it will provide a basis for a LIMS in both the Data
and Information Domains, i.e. an integrated information management
environment. As a laboratory is charged with producing information, a
LIMS, using today's computer technology, can provide this integrated
framework to link all aspects of the laboratory environment and thereby
increase the productivity of all functions inside and outside the laboratory
[4, 5].
The idea of an architecture takes these issues further, with a systematic
and unifying organization of LIMS functions and their interactions. This is
important, as it provides the framework for the future development of
LIMS. However, the model can be viewed with different perspectives
depending on the viewpoint of the reader: user, designer or vendor. For
example, consider an architectural analogy.
The model consists of two principles that form the basis of an architecture
for a comprehensive LIMS:
• First, a LIMS is divided into a set of functional components. This
means that the boundaries and scope of each of these functional areas
are clearly understood.
• Second, the interaction between these functional areas is defined by a
common method that allows each area to grow and expand independ-
ently of any other, thus preserving a path for incremental growth,
upward compatibility and future expansion of the model itself.
A pictorial representation of the LIMS model is shown in Figure 2.1.
Here the centre represents the LIMS database which is surrounded by the
four functional components of the LIMS. The boundaries between these
functions are well defined and the major connection between the
functional areas is via the database. In this way it can express the following
major elements of a LIMS:
For example, a laboratory may have a large emphasis on data capture and its
subsequent analysis with rudimentary reporting and management facilities.
Therefore, the model is sufficiently flexible to cope with different
situations.
2.5.2 Functional levels
To facilitate the understanding of the model and to use it as an effective
tool, each of the four 'user function' areas is divided into three levels,
which represent the scope of implementation. These range from the
fundamentals required for a computer application to be classified as a
LIMS to possibilities for the future.
Within any of the user functional areas described above, the actual
implementation may occur at various levels of complexity. The following
broad descriptions of these levels help to describe the idea of a complete
LIMS framework and to facilitate discussion of LIMS in general.
Level I Minimum functions for a LIMS required to meet the basic
requirements of a functional area. This level of implementa-
tion is usually manual and is only minimally, if at all, helped by
the technology and computer tools used to implement a LIMS.
A system with Level I functions would be aimed at the Data
Domain only.
Level II Functions at this level are intermediate and generally represent
known applications of computer tools in a laboratory environ-
ment. They are generally automated functions and are usually
more complex and hence more costly to develop and imple-
ment into a LIMS, but it is only as higher level functions are
added that the system begins to address the Information
Domain.
Level III At this level, functions are advanced features that are at the
leading edge of technology, which may be high-risk and high-
cost, and are not available on most systems at present.
However, they should generate competitive advantage for the
organization if successful. Implementation of functions at this
level will fuel the development of LIMS in the future.
Having classified functions into areas and levels, we are ready to define a
LIMS. A LIMS is a computer application that meets the requirements
outlined by the model: the Level I functions in each of the four areas, with
a database. A system that does not meet the minimum requirements of this
model may help automate the laboratory, but is not a LIMS. A LIMS,
therefore, is a computer system that can capture, analyse, report and
manage data and information by way of a database; anything less is not a LIMS.
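This classification lends itself to a simple check. The sketch below is purely illustrative (the dictionary layout and names are assumptions, not part of the published model): a candidate system qualifies as a LIMS only if it has a database and at least Level I functionality in each of the four user areas.

# Illustrative sketch only: the area names and levels come from the model
# described above; the data layout and function names are hypothetical.
USER_AREAS = ("data capture", "data analysis", "reporting", "management")

def is_lims(system: dict) -> bool:
    """True only if the system has a database and at least Level I (1)
    functionality in every one of the four user areas."""
    if not system.get("database", False):
        return False
    return all(system.get(area, 0) >= 1 for area in USER_AREAS)

# A chromatography data system with no management functions is not a LIMS:
cds = {"database": True, "data capture": 2, "data analysis": 2, "reporting": 1}
print(is_lims(cds))  # False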
meet laboratory IT requirements with current technology, but also lays the
foundation with which to expand the model in a coherent and controlled
manner.
Level II. A LIMS operates best when results are transferred electronically
and the functions at this level reflect this; however, transfer of data is only
one-way to the LIMS. The ease of interfacing analytical instruments varies
greatly, and therefore at this level instruments with low data capture, such
as chromatographs, atomic absorption and emission spectrometers and
balances can be interfaced. Data can be captured on-line with well proven
peripheral equipment and software. The LIMS model, being technology-
independent, does not stipulate whether the computer running the LIMS
or an ancillary computer acquires data. Therefore, a network of processors
or a single system can still be classified as a LIMS, provided that it still
Table 2.1 Elements of the LIMS model
Level II. Data-analysis functions at this Level bring more power to the
system: either files imported from instruments can be converted into
database files, or the results contained within the original file can be
abstracted by a utility program prior to insertion. Furthermore, this
process can work in the opposite direction and results files can be exported
from the LIMS for graphical or statistical interpretation.
Reduction of data is an important function at this Level of LIMS
operation: an example might be the reduction of chromatographic data to
amount or concentration of analyte. Included in this process is also the
interpretation of results, where a chromatogram is viewed and an analyst
makes a judgement on whether any further interpretation is required.
Level III. Functions here involve the use of more sophisticated graphical
interpretation packages, e.g. three-dimensional plotting, or the use of
chemical substructure searching and interpretative packages used for
converting data to information.
Level II. At this Level there is improved reporting flexibility with the
ability to present results graphically, or to merge the LIMS output with
electronic mail facilities for faster distribution of reports or with desktop
publishing packages for professional reporting. Some laboratories may
require chemical substructure searching and drawing as part of their
operation, and an interface with such a package would be a feature at this
level. At this Level the functions that enable a LIMS to conquer the
Information Domain begin to emerge.
Level III. A LIMS with functionality at this Level has the ability for
natural-language interrogation of the database with on-line expert assistance
for ad hoc searches. The system would be interfacing with expert systems
to help the inexperienced user.
The system would be integrated into the Information Domain, and its
value to the organization would be enhanced by interaction with other
applications, e.g. production information systems or manufacturing systems
to provide the key information to Computer Integrated Manufacture.
Level II. Here the ability to track sample location is integrated with the
sample-management functions of Level I and the data-capture functions of
Level III. Functions at this Level are concerned with planning of resources
and running the laboratory on a long term basis: on-line scheduling of work
from other computer applications and systems, interaction of the LIMS
with project-management software and electronic laboratory notebooks.
Laboratory requirements mapped onto the model:
• Data capture (Level III): bidirectional communication between the LIMS and the data-acquisition systems for immunoassay and chromatography, for downloading of files containing sample identities and uploading of files containing the same identities with the associated results. This is a key requirement to ensure sample continuity and efficient use of resource.
• Data analysis (Level II): two-dimensional graphics for the visualization of analytical results, plus utilities for the conversion of files transmitted from the data-acquisition systems for inclusion in the database.
• Reporting (Level II): merging of LIMS output with a document management system for electronic distribution of reports.
• Management (Level III): software to manage the protocols required for bioanalytical studies, which means that Level III facilities are required. This facility was not included in the original definition of the LIMS model, but experience has indicated that it is prudent to include it now.
Figure 2.2 The LIMS model based on the requirements for the Department of Bioanalytical
Sciences, Wellcome Research. From reference 11.
Figure 2.3 The LIMS model from (a) vendor A; (b) vendor B; (c) vendor C; (d) vendor D;
and (e) vendor E. From reference 11.
commercial systems. Interestingly, there are some systems that have very
little in common with the laboratory. Therefore, it is important that the
laboratory knows its minimum requirements before approaching potential
vendors, in case compromises need to be made about specific functions.
Discussing each functional area in turn:
• Data capture: three suppliers were able to provide a full solution on
paper to the laboratory requirements. However, one vendor did not
have sufficient communications capability for serious consideration as
a contender within this category. Another could meet the laboratory's
requirements, but made it clear that the software did not exist and
would be written if an order were placed.
• Data analysis: The majority of suppliers could meet the laboratory
requirements for integrating files from chromatographic and immuno-
assay data-acquisition systems, but two did not have on-screen
graphics capability to present the results.
• Reporting: All vendors could meet the requirements for reporting.
• Management: Only two suppliers could provide study protocol
management facilities; the others had plans to develop them but with
varying timescales.
Taken overall, the model showed that only two of the five systems were
close to the requirements of the laboratory and hence should be evaluated
further. One system could meet the laboratory requirements through
additional programming, and whether this or any of the other systems
should be evaluated was left to the judgement of the project team.
The American Society for Testing and Materials (ASTM) has a LIMS
committee, E31.40, that has been developing a LIMS Guide. The Guide
was published in 1994 and covers standard definitions, the definition and
scope of a LIMS and the system-development life cycle [13].
When the LIMS model was used in practice, it was static and did not
cover such areas as support [11]. One element of the LIMS Guide is the
further development of the model to address such issues.
The first area was the introduction of global issues that impact the whole
of the LIMS model. Some of the global issues are:
• Change control or configuration management This covers many
aspects of a LIMS such as the hardware configuration, software
version and revision change control, and the storage and approval of
results within the database. Formal change control is essential for data
integrity.
• Communication infrastructure The means of transferring data to and
from instruments in the laboratory and to and from the LIMS and the
organization.
• Documentation and Training To operate and maintain the system.
• Security Both the physical security of the equipment and the logical
security built into the operating and application software.
• Validation Any LIMS operating under a quality scheme such as
Good Laboratory Practice (GLP), Good Manufacturing Practice
(GMP) or International Standards Organization (ISO) should be
validated to demonstrate and document that the tested functions work
as designed.
• Performance The responsiveness of the whole system when used in
practice.
The LIMS model was enhanced, as the authors intended [4], by the
addition of another functional segment of system management. This is
aimed at the functions of monitoring and maintaining the LIMS. Table 2.3
summarizes the functions in this new group and Figure 2.4 shows the
updated LIMS model.
Table 2.3 Additional functions for system management in the ASTM LIMS concept model
Level I: backup and recovery.
Level II: archiving; manual performance tuning; system fault tolerance.
Level III: dynamic performance tuning; advanced fault tolerance; redundant systems; advanced communications links to external systems.
Figure 2.4 The updated LIMS model, with global issues surrounding the functional areas around the laboratory database.
Level I. The functions here concern backup and recovery of the existing
data and software on the system.
Level II. This covers such items as archiving of data, which many systems
are good at. However, most commercial systems are very poor at restoring
archived data back into the database. Manual performance tuning and system fault
tolerance are the other two functions at this level.
Level III. At this level of the model there are dynamic performance
tuning and advanced system fault tolerance. These are the tools to improve
the performance of a LIMS in response to an increasing user load and the
use of hardware solutions such as clustering or CPU redundancy to keep a
system operational after hardware problems have emerged. There are also
advanced links to external systems and applications where a user can cross
seamlessly from one application to another.
2.10 Summary
All major functions of a LIMS can be classified into one system- and four
user-related areas and conceptualized in the LIMS model. The system
function is the data repository or database. The four user areas are data
capture, data analysis, reporting and management. The LIMS model
provides the means for introducing the concepts of LIMS, promoting a
common vocabulary for multidisciplinary communication and visualizing
LIMS requirements, and as an aid in the selection and acquisition of a
system. The original model has been enhanced in the ASTM LIMS
standard and improved through incorporation of system management and
global issues.
References
3.1 Introduction
level and this cost. It is the third factor in the statement, namely efficiency,
which is so crucial in achieving the desired goal.
The total workload of any laboratory is the challenge against which it must pit
its resources in order to succeed. A racing laboratory faces not only inflationary
pressures, but also other factors specific to the current racing scene.
These factors are as follows:
(1) More samples submitted for analysis
(2) A wider range of drugs being tested for
(3) Ever increasing need for tighter forensic audit
(4) Resource implication for QC/QA procedures
(5) Compliance with Accreditation schemes
The first two factors mentioned need to be investigated in more detail.
3.1.1.1 More samples submitted for analysis, to screen for a wider range of
drugs. Forensic laboratories are today faced with an ever increasing
workload, and are screening for a wider range of drugs. The combination
of these two factors has therefore caused an increase in the numbers of
analyses performed. As recently as ten years ago, the analytical techniques
available to the laboratories were technically simpler and less abundant.
For example, each biological sample would have been subjected to no
more than five to eight tests in total. These would have probably involved
liquid-liquid extraction followed by UV spectroscopy and crystal micro-
scopy testing. This testing did not, in general, involve the ubiquitous
computer and so the data produced, although quite possibly incomplete,
were very concise. It was all very labour-intensive, requiring a high level of
personal skill, but ultimately only capable of detecting a relatively small
range of substances.
In contrast, the present-day laboratory will do between ten and fifteen
tests per sample involving a battery of analytical techniques. These may
include solid-phase extraction (automated), followed by chromatographic
procedures, such as gas and liquid chromatography, in addition to
extensive immunologically based analyses, including RIA and ELISA. The
use of GC-MS has also developed into a fundamental method for both
confirmatory work and a wide range of screening analyses. The use of
autosamplers, working day and night, has become essential, as has the use
of robotic sample dispensers that efficiently process large numbers of
samples through a range of different immunoassays. Newer techniques are
also being introduced or are under investigation. Because of these
developments, the number of substances tested for, and therefore
requiring some interpretational action, has increased dramatically in recent
years.
As an example of this, we can look at the increase in HFL's workload.
The number of samples received for analysis at HFL has increased steadily
at 5% per annum since it started in 1963. This increase, however,
3.1.2 Automation
Clearly, the application of automated procedures is essential to contain the
enormous increase in sample test numbers. With the development of the
microprocessor, and later the personal computer, control of 'robotised'
instruments has become a reality.
Autosamplers for analytical instruments such as gas and liquid chromato-
graphs were among the first developments; they seemed to appear in the
early 1980s and have significantly improved to give the models that are
available today. Automation for other techniques soon followed; automatic
spotters for TLC plates were produced but received mixed reviews, with
enthusiastic acceptance by some and rejection by other analysts. Signifi-
cantly, the use of automation for sample extractions, arguably the most
labour intensive area, did not become a reality until the introduction of the
modern solid-phase extraction cartridge. Despite their ability to reproduce
the same task reliably, early sample-extraction robots were very demanding
upon laboratory space, very expensive to install and slow in performance.
Versions are now available which are extremely compact, inexpensive and
moderately quicker. Their strength lies in the fact that unlike the human
equivalent, they will work reproducibly around the clock. Immunoassay
equipment manufacturers were quick to adopt the automatic approach,
initially with radioactivity counters and later with robotic sample processors.
The sample dispensers in use today are complete with multiple robotic
arms and on-board barcode readers.
Figure 3.1 Increase in HFL workload (samples and tests), 1967 to 1991.
On average, scientific staff were spending 20% of their time on clerical work.
All instruments were producing some form of hard copy, and all these
printouts had to be collated into the correct files. These files were then
second and third checked by different members of staff to ensure that each
file contained the correct printouts and each printout the correct result.
Although we had a good idea of what we wanted from our system, we were
largely ignorant of what was actually feasible or currently available from
LIMS vendors. To this end, we appointed a computer-system expert in the
role of consultant for compiling an invitation for companies to tender for
supply, and for developing a specification and evaluating the systems on
offer. This appointment proved to be a prudent decision and the consultant
was involved right through from feasibility study to acceptance of the
system.
A short list of options for finding a suitable system was considered. In
1987, when these decisions were being made, we did not have any staff
with the necessary computing knowledge to design such a complex system.
dependent upon the effort put into the training of users and managers.
There will always be a learning curve once a new system becomes
operational, but this can be reduced to a minimum with a thoroughly
prepared training program. Once the laboratory is locked into a computer-
controlled environment, the need to keep the system operating effectively
becomes essential. Not only is it prudent to invest in both hardware and
software maintenance support; more important still is the provision of a
competent system manager. System housekeeping, database tuning and
refinement add up to a full-time job and are frequently underestimated.
The design, development, installation and system validation elements
generate a complete project that is extremely time-consuming. For the
HFL system, this took three years from inception to going live. This was
longer than anyone envisaged at the outset. There were problems along
the way, as there will be with any new implementation, but due to the
experience gained by both ourselves and the vendors, I am confident that
this timescale would be shortened if we were starting today.
Looking at the current hardware configuration shows how the system
has grown since it went live in 1990 and how dependent the laboratory has
become on it. The dramatic increase in networked personal computers and
on-line disk capacity reflects the growth of the laboratory's PC network
(Table 3.1).
The two major software components of the system are the VG
Laboratory Systems Sample Manager and the VG Data Systems Multichrom
chromatography software. Ethernet communications and terminal emulation
for the PCs are achieved by a combination of DECnet and TCP/IP
protocols, with Digital Pathworks and Reflections 4+ software.
the log-in information, are created the night before receipt to speed up the
log-in process.
3.3.1.7 Sample volume tracking. The volume of sample required for the
analyses is also stored on the system. This allows the system to decrement
the sample volume automatically after the initial volume has been entered
at log-in.
3.3.1.8 Sample storage and location. Control of sample storage was a
major problem without LIMS. We receive about 30 000 samples per year,
nearly all of which are stored in freezers, and most produce negative
results. As we have limited freezer space, it is imperative to dispose of
negative samples as soon as possible after reporting, but it would be a
catastrophe to dispose of samples producing positive results before all legal
options have been completed. Consequently, the LIMS controls all storage
management. All freezers are barcoded, as is each of the many storage
trays within each freezer. Using these barcodes and those on the samples,
all samples are logged into the freezer using a portable computer with a
built-in barcode reader. The completed data are then uploaded into the
LIMS. Each day on request, the LIMS will produce a list of samples ready
for disposal in both barcoded and human-readable form. Samples for
disposal are removed from their location and, using the portable computer
and barcoded labels, their correct identity and numbers of samples
available for disposal are verified, before disposal takes place.
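The storage and disposal workflow just described can be sketched in outline. The code below is a hypothetical illustration (the data structures and names are invented; the real system is built on Sample Manager, not Python): scans uploaded from the portable barcode reader are applied to the sample records, and the daily disposal list contains only reported negatives.

from dataclasses import dataclass
from typing import Optional

@dataclass
class Sample:
    barcode: str
    result: str = "pending"          # becomes "negative" or "positive"
    reported: bool = False
    location: Optional[str] = None   # freezer or tray barcode
    disposed: bool = False

def upload_locations(samples, scans):
    """Apply scans from the portable barcode reader; each scan pairs
    a sample barcode with a freezer/tray barcode."""
    for sample_code, location_code in scans:
        samples[sample_code].location = location_code

def disposal_list(samples):
    """Reported negatives may be disposed of; positives are retained
    until all legal options have been completed."""
    return [s for s in samples.values()
            if s.reported and s.result == "negative" and not s.disposed]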
3.3.4 Reporting
Case reporting is perhaps the area where the users have observed the most
benefits from the LIMS. Before use of the LIMS, all reports leaving the
laboratory were sent to the secretarial staff for typing. Not only was this a
time-consuming exercise, but it also introduced a problem of induced
errors at times of greatest pressure. Transcription of sample code number
digits, which were typically between six and twenty characters, was one of
the most commonly encountered errors and required many checks and
cross-checks before the reports could leave the laboratory.
Secretarial involvement has been removed completely from the reporting
procedure and the features provided by the reporting function are listed
below.
(a) Automatic generation of negative reports
(b) Production of composite report for Jockey Club
(c) Production of Certificates of Analysis for private clients
(d) Production of positive report via the network to PCs running word-
processing software
(e) Generation of hard copy data for positives
(f) Interaction with Accounts for invoice generation
3.3.5.2 Audit trail. The system allows us to audit the reading, modifica-
tion, addition or deletion of any data. The only restriction we have is the
availability of disk space to store all these transactions. In reality the
actions that need to be audited are reduced because the operator identity
and date are recorded for most database entries.
table within minutes of the request being entered on the first day of each
month.
Besides the normal upgrading of the system as and when newer versions
are developed by the vendor, enhancement will be achieved by introducing
new features.
Faxes transmitted and received directly by the system will further improve efficiency.
We currently have a direct link to the UK Jockey Club, but ultimately
direct communication with other customers' computers is a possibility.
This would also make the uploading of sample information a reality,
dramatically reducing the sample receipt procedure.
Incorporation into the system of incoming and other externally
3.5 Conclusions
Acknowledgement
4.1 Introduction
The studies that yield samples for analysis in the Drug Metabolism and
Pharmacokinetics Department can generally be defined as being of several
distinct types. The main types of studies, the types of samples generated
and the analyses carried out are summarised in Table 4.1. The first four of
these studies are designed to determine the fate of a test drug following
administration to a test subject and can be further broadly classified as
follows.
administered with the drug, or determine the effect of adding drug on the
activity of enzymes derived from the liver of control subjects. In either
case, the results of these analyses are generally enzyme activities compared
across various drug treatments.
(a) Study. This is the unique identifier of the Study Protocol and is the
primary key to the stored data. All studies in the Department are
carried out under a unique Study Protocol.
(b) Treatment groups. For the most part, for studies carried out in vivo,
this identifies groups of subjects that receive the same treatment. Some
studies may only have a single treatment group, for example, a simple
excretion/balance study, but in general, most have several treatment
groups, which may have different routes of treatment (intravenous, oral,
topical), different doses, or different formulations (e.g. tablet, capsule). In
Figure 4.1 Outline of a study design. The numbers shown are those commonly encountered.
Although not shown, each Group will have an identical hierarchy of components.
(c) Subjects. This details only the subjects studied in a given treatment
group.
(d) Samples. These are the samples collected from the subjects in a
specific treatment group. Not all subjects necessarily have the same
samples collected from them, but the protocol defines which samples are
collected from which subjects.
Where several samples of the same type are collected from a single
subject, e.g. sequential plasma samples collected from each subject in a
pharmacokinetic study, the samples are differentiated by the time of
collection of the sample in reference to the previous dose of the test
compound.
The matrix shown can be used effectively to represent the sample matrix
in any in vivo or ex vivo experimental protocol in current use. The template
can also be extended to in vitro studies: although not all of the above
parameters are required to define the sample matrix in that case, such
samples can still be included in the matrix. How this is done in practice is
covered below.
There are a small number of study protocols which either do not fit the
matrix shown above, or contain such a large number of alternatives in the
way in which the work is executed that they preclude precise specification
of the sample matrix in the protocol study plan. Examples of this type of
study are analytical method development, or metabolite characterisation.
In the former case, there is generally no fixed study plan, or even sample
matrix, and only a general approach is specified at the outset. In the latter
case, although the sample matrix is generally well specified, the tests, or
analyses, which are used first to purify potential metabolites and
subsequently to characterise their structure are not. These types of studies
are thus not generally entered or contained in the LIMS database.
Figure 4.2 The database and its ancillary tables (Protocol, Test, Calculation, Sample and Test Records, Treatment and Home Office). The Protocol, Test and Calculation tables are required to support application functionality; the remaining tables are user-defined, providing application-specific functionality.
(b) Treatment group. In all but crossover-type studies, this is simply the
treatment given. For crossover studies, this entry is generally the leg of the
crossover study (e.g. Phase 1). The treatment given in that leg of the study
is entered to a separate treatment field in the sample record.
(d) Sample type. This records plasma, blood, dose solution, etc.
(e) Nominal time. This is a composite field that contains two pieces of
information. Both relate to the time the sample was collected in reference
to any preceding treatment. The Dose Nominal Time specifies the time
from study initiation that the reference dose was given. The Sample
Nominal Time specifies the time after the reference dose that the sample
was collected. The times are both described using a specific notation (see
section 4.3.3 below) and are designated as target, or protocol, times.
The descriptions given above are the most commonly used descriptions
for the Sample ID fields, but the fields are configured as simple
alphanumeric fields and under special circumstances the field contents are
other than those specified above.
specifically because the Department has responsibility both for the analysis
of the samples and for the subsequent pharmacokinetic data analysis,
LabManager has been configured to allow expression of the various times
relevant to a sample in a number of ways (Figure 4.3). In this scheme,
given the study protocol information of Study Start Date (Dose Regime
Start Date) and the Dose and Sample Nominal Times, functionality within
the system calculates the Date/Time of the last dose and the theoretical
(i.e. target) Sample Time (as a Date and Time field). The user is then
provided with the ability to update the theoretical sample time to an actual
sample time, and a user-defined procedure calculates two values for the
Time Post-Dose and enters the data to the sample record.
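The relationships described here amount to simple date arithmetic. The sketch below is only an illustration of the calculation (it is not the user-defined LabManager procedure, and the hour-based inputs are an assumption about the notation of section 4.3.3):

from datetime import datetime, timedelta
from typing import Optional

def sample_times(study_start: datetime, dose_nominal_h: float,
                 sample_nominal_h: float,
                 actual_sample: Optional[datetime] = None):
    """dose_nominal_h: hours from study start to the reference dose;
    sample_nominal_h: hours from that dose to the target sample."""
    last_dose = study_start + timedelta(hours=dose_nominal_h)
    target_sample = last_dose + timedelta(hours=sample_nominal_h)
    # Use the actual collection time if the analyst has entered one,
    # otherwise fall back to the theoretical (target) time.
    observed = actual_sample or target_sample
    time_post_dose = (observed - last_dose).total_seconds() / 3600.0
    return last_dose, target_sample, round(time_post_dose, 1)

# Dose 9 h after study start, sample nominally 2 h post-dose,
# actually collected seven minutes late:
print(sample_times(datetime(1994, 6, 1, 8, 0), 9, 2,
                   datetime(1994, 6, 1, 19, 7)))   # ... 2.1 h post-dose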
(a) Logged. The sample record has been created in the database, i.e. the
system 'knows' about the sample, but the sample has not been received.
The Study (Dose Regime) Start Date/Time and the Nominal Dose Time give the Date/Time of the Last Dose; adding the Nominal Sample Time gives the theoretical Sample Date/Time, which may be updated to an Actual Sample Date/Time, from which the Time Post-Dose [xx.x] is calculated.
Figure 4.3 Sample times in the LabManager database. All time values, except the two Time
Post-Dose fields, are component parts of LabManager with the Bioassay option. The Time
Post-Dose fields are user-defined functionality.
Figure 4.4 The sample status field. As samples progress through the laboratory, various
functions progress the status, as shown. Certain actions, such as adding tests to a sample,
result in retrograde changes to the status.
(b) Received. When samples are delivered to the laboratory for analysis,
analysts will use a specific 'Sample Receipt' screen to change sample status
to Received from Logged.
(c) Tested. When all tests assigned to a sample are completed, the status
changes to Tested.
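The status progression can be summarised as a small state machine. The sketch below is an assumption based on Figure 4.4 and the descriptions above, not LabManager's actual implementation, and it models only the statuses mentioned here.

class SampleRecord:
    def __init__(self, sample_id):
        self.sample_id = sample_id
        self.status = "Logged"       # record created; sample not yet received
        self.tests = {}              # test name -> result (None until entered)

    def receive(self):
        self.status = "Received"     # set via the 'Sample Receipt' screen

    def add_test(self, name):
        self.tests[name] = None
        if self.status == "Tested":  # retrograde change: new work outstanding
            self.status = "Received"

    def enter_result(self, name, value):
        self.tests[name] = value
        if self.tests and all(v is not None for v in self.tests.values()):
            self.status = "Tested"   # all assigned tests completed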
(a) Priority. This is a value assigned to samples from a given study at log-
in, which allows samples scheduled for analysis to be sorted in order of
priority. Thus, when an analyst carries out a retrieval in preparation for
analysing samples, the high-priority samples are at the head of the hit list.
(b) Sex. This is specified in the protocol (see below) and assigned
automatically to the samples at log-in, to allow sorting of results by sex.
GR=PHASE 1
TR=TABLETS
ID=AA001,CC003,EE005
SP=PLASMA,PHENYTOIN
SP=SALIVA,PHENYTOIN
DO=09HR
TI=1HR
TI=2HR
TI=4HR
TI=8HR
TI=12HR
TR=CAPSULES
ID=BB002,DD004,FF006
SP=PLASMA,PHENYTOIN
SP=SALIVA,PHENYTOIN
DO=09HR
TI=1HR
TI=2HR
TI=4HR
TI=8HR
TI=12HR
Figure 4.5 Example of a protocol script. Interpretation of the keywords in terms of Sample
Leader fields is as follows: GR = Group, TR = Treatment, ID = Subject Identifiers, SP =
Sample Type, DO = Dose Nominal Time (after study start), TI = Sample Nominal Time
(after last dose).
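A script of this kind lends itself to mechanical expansion into sample records. The parser below is purely illustrative (the real log-in is performed within LabManager and none of this code comes from it); it expands each treatment block into one record per subject, sample type and nominal sample time.

def expand_protocol(script):
    """Expand a protocol script (Figure 4.5) into one record per
    subject x sample type x nominal sample time."""
    group = treatment = dose_time = None
    subjects, sample_types, records = [], [], []
    for line in script.strip().splitlines():
        line = line.strip()
        if not line:
            continue
        key, value = line.split("=", 1)
        if key == "GR":
            group = value
        elif key == "TR":
            treatment, subjects, sample_types, dose_time = value, [], [], None
        elif key == "ID":
            subjects = [s.strip() for s in value.split(",")]
        elif key == "SP":
            sample_types.append(tuple(value.split(",")))   # (matrix, analyte)
        elif key == "DO":
            dose_time = value
        elif key == "TI":
            for subject in subjects:
                for matrix, analyte in sample_types:
                    records.append((group, treatment, subject,
                                    matrix, analyte, dose_time, value))
    return records

# Using the PHASE 1 / TABLETS block from Figure 4.5:
# 3 subjects x 2 sample types x 5 times = 30 sample records.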
samples which will be in existence up to the following day, that is, the
calculated sample time is before the next day's date.
Following sample logging, subsequent sample processing follows an
essentially conventional pattern (Figure 4.5).
4.4.6 Security
The security functions in the system are implemented in a way designed not
so much to control access as to permit access to functionality and data in an
ordered way. In essence, these approaches are the same, but the
philosophy is totally different. In the former sense, access control amounts
to restricting data access strictly on a 'need to know' basis, whereas in the
other sense, data are made available to anyone who may want to know,
may benefit from knowing, but may not strictly need to know.
In terms of the functionality that controls sample logging through to
approval, access is defined on an operator's experience, training and
management authority to carry out a particular function. The appropriate-
ness of the level of access granted to any group can be monitored through
LabManager's audit functions, and the identity of operators
carrying out the main operations on a sample (logging, testing, validation,
etc.) is detailed in the sample record for later review, if necessary. In
addition, the ability to modify or print worksheets, or print reports of
unapproved data, can be strictly controlled.
In terms of the results data in the database and data held in the ancillary
tables, such as study status information, broader access can be granted. It
has been proposed that there is little need to control access to status
information, or approved results. Thus, access can be granted to scientists
and managers outwith the department, for example, in the Clinical
Department, or QA Group, which allows them to look at status
information and even approved study data. Under normal circumstances,
pharmacokinetics staff would have access only to approved test data, but
under exceptional circumstances may be granted access to unapproved, or
As with all systems that are not custom designed, there will be
shortcomings. In the current setting, LabManager provides all of the major
functionality required, but some operations are cumbersome and some are
only achieved via a work-around. It would be desirable to review analyte
concentration-time profiles graphically at the validation stage. Such in-
context review is achieved more easily in a graphical setting, but this is not
possible without custom coding on the current system. In-context review is
restricted to review of tabular data.
At present, derived data such as pharmacokinetic parameters are not
submitted to LabManager. To do this would be straightforward, but
entering the data seamlessly without the need for manual transcription
would be problematic, given the various options available for generation of
the derived data. Alternative options for storage of such data are under
consideration.
References
5.1 Introduction
Clinical trials involving human subjects are a major step in the long process
toward approval of new therapeutic agents by the FDA or equivalent
agencies in other countries. Increasingly, pharmaceutical and biotechnology
firms rely on large central laboratories capable of providing analytical and
data services designed exclusively to meet the exacting requirements of
Phase I through Phase IV clinical trials. Using a variety of sophisticated,
fully integrated information technologies, a state-of-the-art central
laboratory can provide pharmaceutical clients with an expanded array of patient
and data management services. The advantages afforded by use of
advanced information technologies from the central laboratory include
improved patient management, more effective day-to-day management of
the study, and the assurance that an audited and edited database is always
available - from study inception to database lock. This chapter will discuss
the unique services provided by the centralized clinical trial laboratory and
the approach used by one of the larger central laboratories in the USA to
meet the needs of its pharmaceutical clients. A functional analysis is
presented which compares the implementation of a protocol-synchronous
LIMS with traditional sample-oriented systems in meeting the needs of the
clinical trial client. The use of off-the-shelf information technologies is also
discussed as a readily available means to ensure efficient and reliable
laboratory services.
Table 5.1 The expanding role of the central laboratory in the clinical development program
Traditional central laboratory function | Trends suggesting an expanded role for the central laboratory
continue during the coming decade and will largely be driven by the
availability of new information technologies and approaches.
To gain access to the comprehensive clinical trial services that are
currently available, pharmaceutical and biotechnology companies are
increasingly seeking preferred or sole-vendor relationships with those
laboratories which have developed both the capacity and the functionality
to provide value-added analytical and data services. To meet the
challenges of providing expanded services while at the same time
addressing the unique requirements of each protocol, the well-designed
central laboratory moving into the next century must use highly integrated
Protocol-specific design
• Visit sequence: visit sequence definition is not inherent to a traditional LIMS; the investigator, visit and test schedule are not predefined at the time of database construction.
• Visit-specific kit design: generic kits are used, with no pre-definition; study co-ordinators would need bulk supply materials and would add barcoded labels to all samples to correspond with the requisition accession number.
• Deferred testing: testing is completed on all samples within 24-96 h of receipt, which dictates, rather than accommodates, protocol testing requirements.
• Batch testing: batch testing occurs at the convenience of the LIMS, not as defined by the protocol; the system does not route and monitor batch testing based on quantity and a predefined schedule.
• Reflex testing: reflex testing is general, rather than protocol-specific, so there may be additional non-protocol testing or missed testing.
Result reporting
• Sponsor-specified design: the report format is specified by the LIMS, so information must be transcribed from the standard laboratory report to the CRF form.
• Trigger statement: not applicable to a traditional LIMS; the investigator would have to remember particular actions to take based on a test result, e.g. 'place patient on vitamin supplementation' or 're-order given test before enrolment'.
• Alert flagging: standard high/low flags print on the report; the impact of a high or low flag would need to be deciphered based on the protocol therapeutic area.
• Delta assignment: delta flags are not inherent to a traditional LIMS because visit sequence is not a standard function of the LIMS; patient laboratory reports would have to be monitored manually for delta changes between visits.
Protocol management
• Specimen management: not an inherent part of a traditional LIMS; the investigator would need to store samples or ship them to a variety of destinations.
• Result management: not an inherent part of a traditional LIMS; results captured at the investigator site would need entry into a common database at the end of the study.
Database validity
• Patient information: straightforward key entry of patient information with no validity checking for consistency between visits; patient identification is not guaranteed at the time of database transfer and lock.
• Visit sequence: visit sequence definition is not inherent to a traditional LIMS; the integrity of the visit sequence is not guaranteed at the time of database transfer.
Figure 5.1 Laboratory Events Schedule, a component of most clinical trial protocols which is
provided to the laboratory by the sponsor prior to study commencement, listing the scheduled
Patient Events. The Laboratory Events Schedule represents the functional specifications
upon which the Time and Events Matrix (TEM) is based. As described in the text, the TEM is
one of three protocol-synchronous structures which are used to provide the specialized
functionalities needed to support the clinical trial process.
Time and Event structures: Protocol; Study; Investigator; Visit name; Visit type; Test schedule.
Control structures: Min/max allowable age; Pattern matching; Assign/check patient number; Visit sequence verification; Optional test restriction; Non-protocol test restriction; Reflex test ordering; Demographic-based test ordering.
Output structures: Record a data clarification; Delay medical report until resolved; Activate VAX fax-gateway and send clarification to investigator; Receive clarification; Update QLIMS; Release medical report.
Figure 5.2 Three major types of protocol-synchronous structures upon which the Quintiles
LIMS is based. Time and Event structures are derived from the sponsor's Laboratory Events
Schedule (see Figure 5.1). Control structures provide real-time validation and error checking
during entry of extemporaneous data. Output structures interact with both the Time and
Events Matrix and Control Structures to provide the specialized reports and other output to
clinical trial clients.
Figure 5.3 Multidimensional matrix. The Time and Events Matrix (TEM) is a multi-
dimensional matrix which holds the six major elements of the sponsor's protocol (protocol,
study, investigator, visit name, visit type and test schedule, e.g. hematology, chemistry,
urinalysis, weight collection and fasting information), linked to accession numbers such as
A0107266, A0109397 and A0104418. The
accession number represents a discrete address in the matrix which defines a particular
Patient Event. Because all protocol design elements are linked in the TEM, numerous
protocol-synchronous Control and Output Structures can be devised to support the special
services of the clinical trial laboratory.
Figure 5.4 Linking protocol Time and Events through the accession number. All elements of
the protocol's Time and Events Matrix are prospectively linked at the time of database
construction via a unique eight-character accession number. As shown, the accession
number's address defines the visit and test schedule, patient demographics, collection dates
and times, and other information related to a patient's visit to an investigator site.
5.4.2 Using the time and events matrix to manage the production,
distribution and inventory of clinical trial materials
Clinical trial materials (CTMs) which are produced by the full-service
central laboratory include protocol-specific investigator instruction manuals,
visit-specific collection materials, mailers, and express courier airway bills.
Functioning as a linchpin for the time and event elements in the protocol,
QLAB's eight-character alphanumeric accession number is produced in
both human-readable and barcoded format on all CTMs that are produced
at the facility. This process is facilitated by high-resolution thermal graphic
printers (Zebra Technologies Corporation, Vernon Hills, IL, USA) which
can use specialized labels able to withstand extremes of temperatures and
humidities for both frozen and ambient specimens. Interfaced directly with
the Fisons LIMS, the thermal graphic printers and conventional graphical-
mode dot-matrix printers (Okidata 3410s, OKI America Inc., Japan) are
used to produce kit building work orders, laboratory requisitions,
barcoded tube and exterior kit labels, and preprinted airway bills for each
of the investigators.
As one of only three central laboratories supporting global clinical
trials, QLAB serves in excess of 2 600 physicians throughout the USA,
Canada and Europe. Thus, systematic tracking and inventory control of
clinical trial materials is an important undertaking. Due to the protocol-
synchronous LIMS structures used in QLIMS, shelf inventories of all
investigators for a particular protocol can be automatically tracked via a kit
inventory module (KIM). The kit inventory module is directly interfaced
with the specimen receipt area of the facility where, using hand-held laser
barcode scanning guns (Symbol Technologies Inc.), QLAB personnel can
register receipt of a particular kit type by scanning the accession number
which appears on the exterior of the kit (Figure 5.5). This process registers
receipt of the kit into the facility, records the incoming airway bill number
and decrements the shelf inventory at the investigator's site. At night, a
background process monitors shelf inventories of kits and, based on
protocol- and investigator-specific thresholds for each kit type, work
orders are produced and printed in the kit building facility. The kit building
work orders are used by the laboratory operations personnel to produce
replacement kits for shipment on the following day. As with all specialized
functions within QLIMS, the kit inventory module was written using
Fisons' proprietary programming language, VGL. The open structure of
the Fisons LIMS allows the efficient development of new functionalities as
the needs of the central laboratory expand. Quality checking and
distribution of outgoing CTMs are also monitored within QLIMS using the
laser scan gun approach. As kits are constructed for each investigator, each
kit is quality checked by scanning the kit contents prior to packing into
large shipping cartons. As shown in Figure 5.6, this process ensures that
the kits contain the appropriate materials (e.g. laboratory requisition,
blood and urine collection tubes), that all accession numbers on the various
items in the kit match the laboratory requisition, that the shelf inventory at
the investigator site is automatically incremented, and that records of the
outgoing airway bill number are made for the purposes of audit-trail
documentation. It is the use of protocol-synchronous data structures
during the construction of the client's database that allows a broad array of
quality checks to be carried out with a single scan of the accession number.
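In outline, the kit inventory logic amounts to decrementing and replenishing per-site counts keyed on investigator and kit type. The sketch below is a hypothetical illustration only (QLIMS implements this in VGL; the names and data structures here are invented).

audit_trail = []
shelf_inventory = {}    # (investigator, kit_type) -> kits on the shelf at the site
reorder_threshold = {}  # (investigator, kit_type) -> protocol-specific threshold

def register_incoming_kit(investigator, kit_type, airway_bill):
    """Scanning an incoming kit's accession number registers its receipt,
    records the airway bill and decrements the site's shelf inventory."""
    key = (investigator, kit_type)
    shelf_inventory[key] = shelf_inventory.get(key, 0) - 1
    audit_trail.append(("kit received", key, airway_bill))

def nightly_replenishment():
    """Background process: compare shelf inventories with the thresholds and
    emit kit-building work orders for the following day's shipments."""
    return [key for key, count in shelf_inventory.items()
            if count < reorder_threshold.get(key, 0)]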
5.4.3 Using the time and events matrix to monitor receipt and validity
checking of incoming specimen collection kits
For outgoing kits, laser scanning of the barcoded accession number
provides several key quality checks and permanently documents these
checks in the Patient Event audit trail. This same process is used to register
receipt of each incoming specimen collection kit shipped from the
Figure 5.5 Photograph showing laser scanning of incoming specimen collection kits.
Incoming specimen collection kits are laser-scanned to document that all necessary items have
been received, to determine their condition upon receipt, and to ensure that all barcoded
accession numbers match the laboratory requisition. Missing or unauthorized items are
documented in the client's database audit trail for future reference.
investigator site. Due to the prospective linking of time and event elements
from the protocol, it is possible to complete and document an array of
functions simply by scanning the incoming accession number. As shown in
Figure 5.7, these functions include:
• Decrementing of shelf inventory at the investigator site
• Recording of time and date of specimen receipt
• Validation of the accession number on each item in the kit to ensure
that this lynchpin number is consistent on all items and on the
laboratory requisition
• Ensuring that all expected items have been received and documenting
any items that are missing
• Recording receipt of any unauthorized or unexpected items
• Recording the incoming airway bill number
• Recording all of the above in a permanent Patient Event audit trail.
The ability to undertake highly automated validity checking of the
incoming specimens is an important benefit to the central laboratory. It
allows the accessioning process and analysis of specimens to proceed in a
timely manner, so that critical patient alerts can be transmitted to the
investigator sites within hours of specimen receipt.
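To make the single-scan receipt step concrete, the following short Python sketch mimics the checks listed above. It is an illustration only, not QLIMS code (the production modules are written in Fisons' VGL); the manifest structure, field names and example values are assumptions made for the sketch.

from datetime import datetime

# Hypothetical protocol-derived manifest: accession number -> items expected in the kit.
KIT_MANIFESTS = {
    "10001": {"requisition", "serum tube", "urine cup", "transfer vial"},
}

shelf_inventory = {"site_14": 6}   # kits of this type currently on the shelf at the site
patient_event_audit = []           # simplified stand-in for the Patient Event audit trail

def receive_kit(accession_no, scanned_items, site_id, airway_bill):
    # Register receipt of one incoming kit from a single accession-number scan.
    expected = KIT_MANIFESTS.get(accession_no, set())
    missing = expected - set(scanned_items)
    unexpected = set(scanned_items) - expected
    shelf_inventory[site_id] -= 1  # the site has used one kit from its shelf stock
    patient_event_audit.append({
        "accession": accession_no,
        "event": "kit received",
        "received_at": datetime.now().isoformat(timespec="seconds"),
        "airway_bill": airway_bill,
        "missing_items": sorted(missing),
        "unexpected_items": sorted(unexpected),
    })
    return not missing and not unexpected  # True if the kit passes validity checking

ok = receive_kit("10001",
                 ["requisition", "serum tube", "urine cup", "transfer vial"],
                 "site_14", airway_bill="123-45678901")
print("kit valid:", ok, "| shelf inventory now:", shelf_inventory["site_14"])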
Figure 5.6 Photograph showing laser scanning of outgoing specimen collection kits. Specimen
collection kits are manufactured using barcoded work orders. Prior to shipment to
investigator sites, kits and work orders are laser-scanned to record the airway bill and
accession numbers in the client's database audit trail and to increment the QLIMS shelf
inventory of each kit type on file at the investigator site.
Clearly, the time and events matrix (TEM) is a useful LIMS structure for
managing those elements of the sponsor's protocol which can be
predefined in the database in advance of study start-up. The manufacture,
distribution and receipt of protocol-specific clinical trial materials are an
example of how a well-defined time and events matrix can be used to
manage efficiently those events in the protocol that are prospective by nature.
While, indeed, many times and events in a research protocol can be
foreseen, it is the nature of other events in the clinical trial to be
extemporaneous or ad hoc. Examples of extemporaneous protocol events,
together with the control structures used to manage them and a
representative instance of each, are listed below.
Figure 5.7 Flowchart showing validation of kit contents using protocol-synchronous control
structures. Specimen kits and contents are laser-scanned to validate and capture relevant time
and transport information in a regulatory-compliant audit trail. Control structures are used to
detect protocol violations and thereby initiate an automated fax request for data clarification
to the investigator, either at the time of kit receipt or during sample accessioning.
Event: Scanning of kit contents with scan guns
Control structure: Protocol-driven kit contents module
Example: If plasma level tube scanned, call site.

Event: Key entry of patient identification at time of accessioning
Control structure: Investigator name and number, visit name and visit type
Example: Pattern matching of patient identification schema; reasonability limits on kilogram entry (QLIMS will trigger verification on an entry of >130 or <30 kg).

Event: Automatic ordering of pregnancy tests for selective patients
Control structure: Test schedule
Example: Age group 18-50 for visits 1 and 7 only.

Event: Investigator requests additional testing
Control structure: Allowable non-protocol testing defined in TEM
Example: Non-protocol testing limited to T4 levels only.

Event: Reflex ordering of additional test analyses based on results obtained from regularly scheduled testing
Control structure: Reflex test schedule
Example: Medical technician confirms elevated total bilirubin >2.0, which activates a trigger routine within QLIMS to automatically order direct bilirubin.
Having scanned the requisition's barcoded accession number, the QLIMS operator need only input demographic and secular
data (e.g. collection date and time, or time of last meal) as shown on the
completed requisition. QLIMS immediately responds with a series of
protocol-specific queries that are derived from the control structures
predefined in the sponsor's database. Because the accessioning process
involves a series of extemporaneous events, a corresponding series of
protocol-synchronous control structures is used to ensure that key-entered
data (or LIMS-generated events) are compatible with the protocol's time
and events matrix. Among the most important of the protocol-synchronous
control structures used in the accessioning process are: (1) minimum and
maximum allowable age of the patient at enrollment; (2) pattern matching
on screening and randomization schema; (3) assignment and checking of
blocks of patient identification schema; (4) phraseology mapping of visit
name and visit type to match those specified by the sponsor; (5) restriction
of optional tests allowed per protocol; (6) control and restriction of non-
protocol laboratory testing; and (7), if requested by the sponsor, the
automatic ordering of selected tests based on patient profiles, e.g.
pregnancy testing. For visits occurring subsequent to the initial visit,
additional protocol-synchronous controls are used to ensure that the visits
are in the sequence specified in the protocol and that patient demographics
and identification schema match those previously key-entered for the
patient whose samples are being accessioned. Generic control structures
(i.e. not specific to a particular research protocol) are used in concert with
protocol-synchronous controls to direct actions such as automatic cancella-
tion of testing for specimens which were received beyond the established
stability for each analyte.
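A minimal Python sketch of such accessioning checks is given below; the protocol values, field names and messages are illustrative assumptions, and the real controls are defined per protocol in the sponsor's QLIMS database rather than in code.

# Illustrative protocol-synchronous accessioning controls (not actual QLIMS structures).
PROTOCOL = {
    "min_age": 18,
    "max_age": 50,
    "visit_sequence": ["screening", "visit 1", "visit 2"],
    "weight_limits_kg": (30, 130),   # reasonability limits on key-entered weight
}

def check_accession(age, weight_kg, visit, previous_visits):
    # Return a list of data-clarification reasons; an empty list means the entry passes.
    problems = []
    if not PROTOCOL["min_age"] <= age <= PROTOCOL["max_age"]:
        problems.append("age outside protocol range")
    low, high = PROTOCOL["weight_limits_kg"]
    if not low <= weight_kg <= high:
        problems.append("weight entry requires verification")
    expected = PROTOCOL["visit_sequence"][len(previous_visits)]
    if visit != expected:
        problems.append("visit out of sequence (expected " + expected + ")")
    return problems

print(check_accession(age=23, weight_kg=72, visit="visit 2", previous_visits=["screening"]))
# prints: ['visit out of sequence (expected visit 1)']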
5.5.2 Managing the data clarification and data revision processes through
control structures
Although protocol-synchronous control structures can be used to support a
wide range of protocol-specific functions, one of the most important is to
provide automated validity checking of key-entered data at the time of
receipt of the samples from the investigator sites. Requisitions which fail to
pass protocol-specific queries during the accessioning process (e.g. due to
missing or conflicting information) are automatically sequestered by
QLIMS within a Data Clarification Queue (DCQ). Accession numbers
entered into the DCQ are subject to immediate follow-up and clarification
with the investigator. Sequestration of the accession number in the DCQ
provides several important controls: (1) printing of the laboratory report is
temporarily suspended pending outcome of the clarification query; (2) a
GCP-compliant audit trail linked to the accession number is created which
registers the reasons for requesting the data clarification and the resulting
clarification itself; and finally (3) a queue of accession numbers is created
that can be monitored to ensure every outstanding clarification is pursued to
resolution.
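A rough sketch of this sequestration logic follows, assuming hypothetical data structures and field names; the real Data Clarification Queue is an internal QLIMS facility with a GCP-compliant audit trail.

from collections import deque
from datetime import datetime

data_clarification_queue = deque()   # accession numbers awaiting clarification
report_hold = set()                  # accessions whose printed reports are suspended
audit_trail = []                     # simplified audit records

def sequester(accession_no, reasons):
    # Place an accession number in the DCQ, hold its report and record why.
    report_hold.add(accession_no)
    data_clarification_queue.append(accession_no)
    audit_trail.append({"accession": accession_no,
                        "queued_at": datetime.now().isoformat(timespec="seconds"),
                        "reasons": reasons})

def resolve(accession_no, clarification):
    # Record the investigator's clarification and release the report for printing.
    data_clarification_queue.remove(accession_no)
    report_hold.discard(accession_no)
    audit_trail.append({"accession": accession_no, "clarification": clarification})

sequester("10002", ["visit out of sequence (expected visit 1)"])
resolve("10002", "investigator confirmed visit 2; visit 1 specimen lost in transit")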
5.5.4 Using control structures to manage and document phone alert and
reflex messaging functionalities
The protocol-synchronous range-checking control structures described
above provide an efficient, reliable and fully automated means to ensure
that critical telephone, panic and exclusionary flags are annotated on the
printed laboratory report. However, since central laboratory data are used
to ensure patient safety throughout the course of the clinical trial,
markedly abnormal results that violate sponsor-approved ranges must be
conveyed to the designated parties in a timely and documented manner.
Using a series of background processes and protocol-synchronous control
structures, QLIMS continuously monitors incoming data from the laboratory
instrumentation for those results which violate critical, protocol-specific
alert ranges. Accession numbers which have laboratory results appended
with an alert flag (telephone, panic or exclusionary value) are automatically
transferred to the protocol services group using a QLIMS-resident Result
Alert Queue (RAQ). The RAQ is monitored throughout the day by
QLAB staff and, based on emergency contact control structures, critical
values are phoned to the investigator, the CRO or the sponsor within
minutes of receipt from the laboratory. Functionalities within QLIMS
allow the Protocol Services representatives to reference recipient phone
and fax numbers easily and then record the identity of the party to whom
the critical result was conveyed. QLIMS automatically time- and date-
stamps the transaction and enters these data elements in an accession
number-linked audit trail for reference as needed. To convey detailed
information which is less 'time-critical' than telephone or panic alerts to
Figure 5.8 Using control and output structures to support the data clarification process.
Barcoded laboratory requisitions are scanned to activate protocol-synchronous control
structures used to validate operator-inputted data. Patient data which violate control
structures are sequestered by QLIMS into a Data Clarification Queue pending contact with
the investigator. Requests for data clarifications are accomplished using protocol-synchronous
output structures, which provide an automatic fax message to the investigator site complete
with patient information held in the LIMS.
5.7 Summary
[Example laboratory report excerpt: the current cyclosporin result is reported together with previous cyclosporin results for the same patient.]
the instrument in batch mode. Hence, when the instrument scans the
specimen barcode prior to analysis, the instrument can retrieve which
assays to perform on the specimen from the local database.
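As a rough illustration of this batch-download-and-scan pattern (the actual LIS-to-instrument download protocol is vendor-specific), the instrument's local database can be pictured as a table keyed by specimen barcode:

# Hypothetical local worklist, downloaded in batch from the LIS before the run.
local_worklist = {
    "SPEC-0001": ["sodium", "potassium", "creatinine"],
    "SPEC-0002": ["glucose"],
}

def on_barcode_scan(barcode):
    # Called when the instrument scans a specimen barcode prior to analysis.
    assays = local_worklist.get(barcode)
    if assays is None:
        return []   # unknown specimen: the instrument queries the LIS or skips it
    return assays

print(on_barcode_scan("SPEC-0001"))   # prints: ['sodium', 'potassium', 'creatinine']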
6.3.10 Billing
The capture of laboratory billing data is an essential function of the LIS.
Billing charges are captured for some tests at test request time; other tests
are billed by the LIS when results are verified. The LIS must handle a
variety of billing discounts, especially in reference laboratories. Billing
transactions are transmitted to the HIS from the LIS on a daily basis. The
actual patient statements and accounts receivable functions are rarely
maintained on the LIS. Rather, these functions are usually performed by
the HIS accounts receivable module.
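The capture-then-transmit pattern can be sketched as follows; the charge records, discount handling and export format are illustrative assumptions rather than any particular LIS or HIS interface.

from datetime import date

billing_transactions = []   # captured charges awaiting the daily transfer to the HIS

def charge(accession_no, test_code, amount, discount=0.0):
    # Capture one billing charge, applying any reference-laboratory discount.
    billing_transactions.append({
        "accession": accession_no,
        "test": test_code,
        "amount": round(amount * (1 - discount), 2),
        "billed_on": date.today().isoformat(),
    })

def daily_export():
    # Return (and clear) the day's transactions for transmission to the HIS.
    batch = list(billing_transactions)
    billing_transactions.clear()
    return batch

charge("10003", "GLU", 12.00)                   # captured at test request time
charge("10003", "HBA1C", 30.00, discount=0.2)   # captured when the result is verified
print(daily_export())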
The sales 'pitch' during demonstrations. The user should beware of being
mesmerized by a gifted salesperson who is adept in presenting the system
in its best light during demonstration sessions. Note that the salesperson
has been trained to maximize the system's strengths while minimizing its
weaknesses. It is the potential user's responsibility to probe with
appropriate questions during such demonstrations and find the weaknesses.
The importance of site visits. The prospective user should pay more
attention to the function of the LIS system during various site visits where
the system is actually being used. Prospective users should talk directly
with site users of the system and log their unbiased remarks. Good systems
tend to have a cadre of devoted users who demonstrate a great deal of
'ownership' during site visits. Because the selection of a LIS involves the
purchase of a highly complex package of technology, the less enlightened
potential user tends to be easy prey to the pitfalls of the selection process.
Certainly, the use of an experienced consultant is a worthwhile investment,
assuming that unbiased advice is obtainable.
will be required for survival in the future. Newer computers will continue
to be more powerful and less expensive. Networking will improve along
with better-designed software. New systems will employ more powerful
workstations and networking. Software will be more 'off-the-shelf', much
as personal computing software is presently. Users will become more
sophisticated and will demand more flexibility and adaptability in
commercially available systems. Clearly, vendors will be forced to offer
less proprietary solutions than they presently offer. Many vendors will
join forces to offer a variety of solutions and options to clients as
connectivity becomes easier. Governmental regulations will continue to
expand and will require computerization of quality control and quality
assurance programs. Standards for data interchange between computers
and between computers and automated instruments will continue to
improve the connectivity between systems. Automation and computeriza-
tion will continue until essentially all aspects of the testing process are
included. Users will continue to expand their knowledge and skill in the
use of laboratory information systems. We can anticipate the entire
medical laboratory industry undergoing a vast improvement in the
accuracy and speed of laboratory services because of new capabilities
afforded by automation and computerization.
References
7.1 Introduction
7.2 Development
[Flowchart: documentation completed during the logical design update: functional specification document, data dictionary, acceptance test plan and user's reference guide, supported by CASE tools.]
• Together with the third rule, it meant that the laboratories could adapt
the system to their needs rather than adapting the laboratory to the
constraints of the LIMS.
The key consequence of the above rules is that all the requirements of all
of the participants could be accommodated in some way.
To help the process along using the ground rules, a list of all the possible
functions that could be identified was developed for the first workshop.
This was the basis from which the system requirements were specified, and
bound the scope of the system at the top architectural level.
A total of six workshops was held over about six months. Between each
workshop, the functions, features and interface specified at the previous
workshop were prototyped for presentation at the next workshop.
Simultaneously the logical design was updated using the RDBMS's CASE
tools, and the RLIMS Functional Specification document and the data
dictionary were modified to reflect the latest requirements. A package
containing the latest documentation was sent to the workshop participants
one to two weeks before the next workshop. At the workshop, the
prototype was demonstrated and the participants exercised the prototype
during hands-on sessions.
Rapid prototyping helps to avoid several major pitfalls common in
analyzing user requirements. Some conditions that rapid prototyping
alleviates are:
Test Plan was written. A 600-page EPA RLIMS User's Reference Guide
was also produced.
The various components of the system were peer-reviewed and any
inconsistencies in appearance or functioning were eliminated. Next, the
components were integrated and tested. The final testing was based on the
EPA RLIMS Acceptance Test Plan. Three rounds of acceptance testing
and subsequent corrections were required to achieve final system accept-
ance.
7.3 Implementation
flexibility and include sample QC, instrument run QC, and prep batch
QC information; they also define the method QC requirements, and
advance analysis statuses with a review option if desired.
• The Archive/Restore modules provide the capability to move historical
data out of the database, or back into it, as required to make best use of
storage space; the data may be archived or restored based on project,
batch, or instrument run.
Reports are not a part of the core system, but many example reports are
included in the delivery package to help the laboratories design and
implement their own laboratory-specific reports.
The normalized, relational data model efficiently represents laboratory
information. RLIMS has a unique way of processing analyses that require
multi-analyte reporting. RLIMS handles these analyses through predefined
or sample-specific analyte lists. The analyte lists are subsets of RLIMS
methods and allow customized reporting of analytes such as metals by ICP
(inductively coupled plasma) instruments.
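The analyte-list idea can be sketched briefly; the method contents, list names and analytes below are invented for illustration and are not taken from RLIMS itself.

# A method carries a full analyte set plus named sub-lists used for customized reporting.
ICP_METHOD = {
    "analytes": ["Al", "As", "Cd", "Cr", "Cu", "Pb", "Ni", "Zn"],
    "analyte_lists": {
        "priority metals": ["As", "Cd", "Cr", "Pb"],
    },
}

def analytes_to_report(method, sample_specific=None, list_name=None):
    # Choose the analytes to report: a sample-specific list, a predefined list, or all of them.
    if sample_specific:
        return [a for a in sample_specific if a in method["analytes"]]
    if list_name:
        return method["analyte_lists"][list_name]
    return method["analytes"]

print(analytes_to_report(ICP_METHOD, list_name="priority metals"))   # ['As', 'Cd', 'Cr', 'Pb']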
Another distinctive feature of RLIMS is the instrument run. Laboratory
samples, calibration standards, method blanks, and other QC samples are
stored with the original instrument sequence in the RLIMS instrument run
tables. RLIMS reports can then recreate the instrument run for reporting
samples with the appropriate QC data.
The RLIMS database and core set of modules, delivered with source
code, documentation, training, and flexible Oracle 4GL and reporting
tools, provide a laboratory with a solid base for implementing a LIMS.
RLIMS fully conforms to the nine general principles listed above
that were the basis for its design and implementation.
7.4 Conclusions
References
8.1 Introduction
samples per day, which equates to over 2 500 000 determinations per year.
The type of samples analysed include: wastewater; soils; trade effluent
discharges; and sewage sludges. The range of determinands covers:
microbial agents; metals; biological and chemical oxygen demand;
nutrients; and organic pollutants.
Figure 8.2 Simple conveying system showing a pusher unit, which can be used to transport
sample containers or crates.
8.5 Auditability
[Figure 8.3 shows, in schematic form, the interfaces between the sample planning and scheduling system, the customer mainframe computer, the laboratory management system, analytical instrumentation, automated equipment and barcode readers.]
Figure 8.3 Laboratory information management system interfaces supporting automated equipment.
or procuring a new, more appropriate system. It soon became clear that the
traditional PC-based LIMS would not be able to support real-time
operation, unless it was running a multitasking operating system and had a
very large database capability. Thus North West Water invested in a new
LIMS system. The preferred system had to support a fully automated
environment.
The selected LIMS package (ChemLMS, Hewlett Packard) conformed
with a functional design specification and technical specification produced
as part of an initial design study, ensuring that the basic required
functionality was available. This included a certain amount of forward
planning to cater for future requirements. Justification of this level of
investment depended upon the business benefits to be derived from
centralisation of the laboratory services. The new LIMS system had to be
reliable with the minimum amount of downtime, particularly as all the day-
to-day operation would be under direct control of the LIMS. The added
risk to the company had to be minimised by providing contingency plans.
This involved having two computers, one on duty and the other on
standby, in order to protect the business against hardware and software
failure. Maintenance and support agreements with suppliers of the
hardware and software have been negotiated to provide for the right level
of response to minimise loss of operation.
Providing the links from equipment to the LIMS required an extensive
data network involving Ethernet and token ring cabling. A schematic
diagram of the LIMS network is presented in Figure 8.4, where it can be
seen that a number of laboratory area networks were installed which, for
resilience and reliability, were interconnected. This architecture reduces
the effect of a failure by a network segment because only a simple patching
exercise is required to reconnect the equipment to another segment. Each
computer has RAID (redundant array of inexpensive disks) storage
facilities. These again are interconnected so the possibility of losing data is
minimised. Connection of equipment to the networks has to be controlled
and sufficient spare capacity made available for expansion. Laboratories
by their nature tend constantly to uprate instruments and equipment, so
there must be spare capacity to accommodate changes.
Figure 8.4 An example of a laboratory information management system network supporting automation.
and provided added value, and were not just designed to look impressive.
Automated equipment had to be reliable and resilient in order to
engender confidence in the North West Water staff. This was achieved by
incorporating tried and tested technology in the design of automated cells.
Part of the initial design study was an evaluation of types and suppliers of
the individual components to be used in analytical cells. Some components
came from areas not normally associated with laboratory work, e.g.
conveyors, which are normally used in factories and warehouses
(Figure 8.2).
Scientific equipment tends to be expensive. A custom-built robot for use
in a laboratory can cost up to £50 000. A similar robot used in industry
(Figure 8.5) will cost £25 000, so this can represent a considerable saving.
However, these prices do not include the cost of the development work needed
to get any system into a production-ready state.
When the equipment has eventually been manufactured and installed
there has to be a commissioning period during which the automated cells
are optimised and their analytical capability validated (Figure 8.6).
Automated equipment only starts to be of value when it goes into service.
To ensure that commissioning was completed within the minimum period
testing of the systems followed a defined plan, enabling progress to be
monitored regularly and accurately. The high level of uncertainty in
Figure 8.5 An automated cell consisting of an articulated robot and a sample container
conveying system.
Figure 8.6 General layout of an automated cell during the commissioning phase.
The decision to automate was not undertaken lightly and the laboratory
now has the appearance of a production line in a factory (Figure 8.7). It
was necessary to redefine the skills required by the laboratory personnel
and consider employing staff with different skills to use the equipment.
When all the equipment is fully operational there will be a significant
reduction in staff numbers. To enable this reduced work force to be more
effective, the concept of self-managing teams was introduced, which will
enable staff to take more responsibility for the day-to-day running of the
laboratory. This approach will also facilitate the culture change required.
Appendix
References
9.1 Introduction
Computer systems are very adept at taking data that have been entered
and then manipulating, storing and passing them on to other computer
systems. The perennial challenge is getting the data into the system in the
first place.
Originally, punched cards and then paper tape were the only methods
available. Over the last 20 years, interactive data input using a keyboard
has taken over as the method of choice. Any data that needed to be
entered into the computer system were entered by keyboard operators.
This is how the early Laboratory Information Management Systems
(LIMS) operated. The analytical results were obtained from the appropriate
instrument and the data were entered along with sample information into
the LIMS. A summary of the development of manual data entry to LIMS is
shown in Table 9.1.
[Figure: manual analysis procedure, in which results are transferred manually from the instrument PC to the LIMS PC for reporting.]
Figure 9.2 Barcode reading system: (a) bench-mounted; (b) hand-held scanner; (c) pen and
wedge.
Figure 9.3 Fully automated analysis procedure.
Figure 9.4 Very simple instrument automation application used to illustrate the difference
between 'interfacing' and true automation.
Table, where its calibration and maintenance histories are stored. If the
instrument is outside the stated 'grace' period, LIMS can disallow use of
the instrument and present a choice of alternatives.
Acquire data. The analyst measures solution pH with the pH meter and
after the reading equilibrates, strikes the pH meter's 'send data' key. The
LIMS interface program acquires and decodes the ASCII data string sent
from the pH meter and displays both the pH value and the temperature of
the first replicate with the message, "Awaiting Reading 2 ... ". This
process is continued until all replicates have been analysed.
Execute calculations. After all the replicates have been acquired, the
LIMS interface may execute a simple calculation to determine the average
of the replicates.
Compare against limits. The LIMS interface program queries the LIMS
database for the specification limits for this pH test for the specific product
or sample type. A status of "Pass", "Fail" or "Warn" is determined and
displayed to the analyst, prior to posting the data permanently to
LIMS.
Accept/reject results. The LIMS interface then presents the analyst with a
choice: "Accept" results in the data being transmitted to the database, while
"Reject" discards the acquired data. Implementation of this step
in the instrument automation process may or may not be desired by
laboratories. Definitions of raw data vary from laboratory to laboratory.
Many laboratories prefer to empower analysts with the authority to
interpret the validity of an analytical result at the instrument level. This
implies that the data do not exist and that the raw data have not been
generated until they have been stored in the LIMS data files. Other
laboratories choose a much more rigorous approach and do not present the
analyst with a choice. This implies that the raw data come into existence
when they are output from the digital instrument. The definition of 'raw
data' is a fundamental responsibility of the laboratory and is usually
documented in the organisation's standard operating procedures.
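A minimal Python sketch of the pH example above is given below. The ASCII string format, specification limits and warning margin are assumptions made for illustration; a real interface program would acquire the string from the meter's serial port and look the limits up in the LIMS database.

def parse_reading(ascii_string):
    # Decode one hypothetical meter string such as "pH 7.02 25.1 C".
    fields = ascii_string.split()
    return {"pH": float(fields[1]), "temperature_C": float(fields[2])}

def disposition(mean_pH, low, high, warn_margin=0.1):
    # Compare the averaged result against specification limits.
    if low <= mean_pH <= high:
        if mean_pH < low + warn_margin or mean_pH > high - warn_margin:
            return "Warn"
        return "Pass"
    return "Fail"

# Replicates acquired as the analyst presses the meter's 'send data' key (canned here).
replicates = [parse_reading(s) for s in ("pH 7.02 25.1 C", "pH 7.05 25.2 C", "pH 7.03 25.1 C")]
mean_pH = sum(r["pH"] for r in replicates) / len(replicates)
status = disposition(mean_pH, low=6.8, high=7.2)
print("mean pH %.2f -> %s" % (mean_pH, status))   # the analyst may then Accept or Reject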
connection) to the LIMS. This automatically stores the results against the
appropriate sample and test, assigns a test disposition, and updates the
status of the test. The test results are recorded with a full audit trail of
testing activity including the time-date-user stamp, instrument number,
and all replicate results if desired.
From this simple example, it is evident that 'instrument interfacing', or
data acquisition, represents but one component of the complete
instrument automation process. Instrument automation automates the
laboratory's analytical procedures: its rules of operation, or standard operating
procedures (SOPs). Even a concept as fundamental as the
definition of raw data must be recognised and then effectively implemented
in instrument automation.
Instrument automation applications are further complicated by the
range of completely different analyses that are typically performed on a
single instrument. For example, a balance in a laboratory may be used at
different times for simple weight measurements, loss on drying, or content
uniformity. The calculations, logical processing, even the qualified analysts
for each of these methods will vary significantly. Most laboratories
establish basic guidelines for instrument automation applications which
dictate common procedures for user, sample, and test identification,
analyst and instrument verification. A number of different aspects of
instrument interfacing to LIMS are discussed in the following sections
illustrated by the approaches of three of the major LIMS manufacturers.
9.3 Beckman
Figure 9.5 Instrument automation software components and their interaction with each
other.
Figure 9.6 Examples of a LIMS Link employed as a 'work cell' instrument automation
configuration. Three instruments and a barcode wand are connected to the LIMS Link.
9.3.6.1 LIMS Link Instrument Coupler. Figure 9.7 illustrates the instru-
ment automation scenario utilising a LIMS Link Instrument Coupler. The
LIMS Link is a particularly attractive tool for automating 'single-reading'
instruments such as balances and pH meters. It is also employed often as a
'work-cell' station when two or more instruments are proximate and are
used together in related analytical techniques. For instance, a balance and a
Karl Fischer titroprocessor form a common instrument 'cell' for the analysis
of water content.
In the example illustrated in Figure 9.7, a LIL program for the
application has been developed and downloaded into the LIMS Link,
where it resides. The program typically provides many of the discrete
functions such as analyst and instrument verification, acquisition and
parsing, database access and transmission of results, etc. The LIMS Link
may optionally be equipped with a barcode wand or scanner to facilitate
entry of analyst, sample, and/or test information. In the event of a power,
network, or computer failure, the battery-backed and buffering capabilities
of the LIMS Link can support continued, autonomous operation.
Figure 9.7 Logical processing for the instrument operation scenario employing the LIMS
Link Instrument Coupler.
After appropriate interaction with the LIMS Link, data from the
instrument are acquired by the LIMS Link in an ASCII format. The LIL
program parses the string or file, calculates final results, and if they are
accepted, creates a test data entry transaction (TDAT). The transaction
(message) is transmitted through the host port of the LIMS Link, which is
defined to the LIMS host. The serial port may be directly connected to the
host or may operate on a terminal server on a local area network. In this
example, IDAS operates as an 'instrument operating system'. It maintains
a directory of all the instruments connected to the LabManager system and
their associated ports, manages the multitasking of all the other LIL
programs that may be resident on the host, and serves as a 'traffic cop' in
directing externally generated transactions to the TPO. Once accepted by
TPO, the transaction is processed by LabManager LIMS. In the TDAT
format, the individual components of the test result are updated into the
appropriate test and sample. The results are compared to specification
limits (if they exist) and assigned a disposition of "pass, fail, or warn". The
result data are tagged with the time-date-user and the instrument number
used in the analysis. A message denoting that the data were acquired
online may also be appended. The test status is then automatically updated from
"Awaiting testing ... " to "Tested, pending validation ... ".
Figure 9.8 Logical processing for the instrument automation scenario employing the
LIL/forms approach.
9.4.1 HP ChemLMS
HP ChemLMS is an Oracle-based LIMS that operates within the Hewlett-
Packard HP-UX open systems environment. HP ChemLMS can be
installed on a wide range of hardware platforms, starting from the HP 9000
Series 700 workstation up to the HP 9000 Series 800 symmetric
multiprocessing Corporate Business Server. This product can be used to
create a highly customised, automated LIMS that helps laboratory
personnel to be more productive. HP ChemLMS automates laboratory
methods management and provides a secure data store and integrated
audit trail of all changes to laboratory definitions, processing rules and
data. The system can be tailored and automated as required by using the
component parts shown in Figure 9.9.
[Figure 9.9 layers, from top to bottom: HP-supplied applications; custom languages (PSL/CPL); base product clients and servers; Oracle relational DBMS; PA-RISC servers and workstations running HP-UX.]
9.4.3 C to CP communication
The C to CP library is a family of C-callable library routines that allow a C
program to read or write data to ChemLMS Oracle data bases (including
array read and write). This application program interface (API) enforces
security and audit trail rules to maintain a secure LIMS data repository.
A C program may use the C to CP library to start the execution of a CPL
macro residing on the ChemLMS machine.
To connect a C application to ChemLMS, one of the following two
communication methods can be used:
• The pipe communication method. Select this method if both the
ChemLMS and C application processes run on the same machine .
• The TCP/IP 'sockets' communication method. Select this method
whether the ChemLMS and C application processes run on the same
machine or on different machines (client-server). After the C
application starts, it connects with the ChemLMS background server,
ccpSocketMgr.
The library currently supports HP-UX C programs only.
9.5.1 Introduction
Interfacing instruments to the LIMS database is achieved with a Windows-
based software package called LabStation running on a 486 IBM-compatible
PC. LabStation is a client-server application that can acquire and manage
data from eight devices by way of RS-232 ports when networked to Sample
Manager, the core LIMS module for managing sample, test and result
data. Protocol converters enable instruments with parallel, IEEE or HP-IB
outputs to be interfaced with the LIMS. LabStation can also operate in
standalone mode and includes the ability to transfer data by way of DDE
(Dynamic Data Exchange) to other compatible Windows applications
(Microsoft Word, Excel, etc.).
Figure 9.10 LabStation architecture.
application specified. The next layer defines how the IMM will be used to
create an application.
9.5.2.4 Methods and Tests. Methods and Tests provide the primary
application-specific code which controls and acquires data from instruments
and stores the results in the sample. The role of Methods is to implement
the operation of the instrument(s). Tests are used to specify the
miscellaneous data required to perform the method. The use of Tests
facilitates the creation of generic methods. For example, a Test could
specify the number of replicates required for an average weight analysis
and the same Method would be used to get each of the results.
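The division of labour between a generic Method and a Test can be sketched as follows; the average-weight example, parameter names and stand-in balance reader are assumptions and do not reflect LabStation's actual object model.

from itertools import cycle

# A Test supplies the miscellaneous parameters; a generic Method does the work.
average_weight_test = {"name": "average weight", "replicates": 5, "units": "g"}

def run_average_weight(test, read_balance):
    # Generic Method: acquire the number of replicates the Test asks for, then average them.
    readings = [read_balance() for _ in range(test["replicates"])]
    return {"test": test["name"],
            "result": sum(readings) / len(readings),
            "units": test["units"],
            "raw": readings}

# Stand-in for the balance driver; a real Method would read the RS-232 port instead.
fake_weights = cycle([10.02, 10.05, 9.98, 10.01, 10.03])
print(run_average_weight(average_weight_test, lambda: next(fake_weights)))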
9.5.2.5 Samples. Samples are the standard data representation, all data
being stored within a sample object.
9.5.2.7 The Archiver. Once results are available they may be transferred
to the LIMS database. The Archiver gets the data from the Sample
objects, packages the data and manages the transactions within the
database server. The interface enables the analyst to configure how the
data should be transmitted, either on command or automatically on receipt
of the data packet.
9.5.2.8 Host Interface. The Host Interface provides the logical link to
the host computer. There are usually two co-operating components, one
part running on the client and the other on the host. The Archiver uses the
Host Interface to talk to the host and all data transfer and remote
procedure execution are accomplished by using the link.
9.5.2.9 Network drivers. LabStation does not provide its own network
drivers, but instead relies on drivers provided by third-party vendors.
[Figure: instruments are connected through a Digiboard patch panel to LabStation PCs, with usually four to six instruments per LabStation.]
case serial lines are required to allow data transfer between the instrument
and a LabStation, and then Ethernet from the LabStation to the LIMS.
only output from the instrument was by way of a parallel line to a printer.
Incorporation of a parallel to serial converter has enabled data to continue
to flow to a printer as well as to LabStation.
been combined to offer a user interface for both Challenger LIMS (BP)
and Beckman LIMS. The software package has been designed to offer the
user the option to follow samples and procedures and allow the system to
update data files in the LIMS automatically.
As indicated in previous sections, major LIMS vendors offer interface
modules, but few have addressed the problem of how to tackle the large
number of manual laboratory methods that have not yet been automated.
Also, most systems allow a multitude of functions that generally make the
system difficult or time-consuming to use. The most frequent LIMS user is
the laboratory analyst who requires very little functionality; a list of
samples for analysis and a simple method of reporting results are
frequently all that are required. Most laboratories are aiming to introduce
quality assurance systems to their operations and are implementing either
ISO 9000 or BS 5750 systems. Both systems rely on procedures and
auditing. The problem for laboratory management is to develop routines
that ensure that staff follow the prescribed analytical methodology, record
relevant data and archive results. Personal Interface (Process Analysis &
Automation) is an easy-to-use device that interfaces analysts to LIMS and
achieves these objectives.
The Personal Interface development is based on the new generation of
mobile PCs. The AT&T Safari 3115 PC has been selected for the task and
is a full implementation of a 486/25SX computer with 20 Mbyte virtual hard
disc, 4-8 Mbyte RAM, and VGA resolution monochrome liquid crystal
display. The device has been designed to be rugged and can be dropped on
to concrete from 1.5 m. Data input is by way of a stylus which can either
emulate a mouse or be used for alphanumeric input. The system can be
used in either the MS-DOS or the Windows 3.1 environment.
The initial data transfer can be achieved either by Ethernet or more
conveniently by WaveLAN. This is a radio-based link between the host
computer and a mobile pen computer that allows full communications
throughout the building and laboratory without the inconvenience of a
cable. Using a PCMCIA WaveLAN card, the PC can be logged on to a
site computer system using a radio-based communication system rather
than an Ethernet wire.
The laboratory supervisor can download the analyst's daily work list
from the LIMS by way of the site Ethernet to the notepad PC using a
PCMCIA WaveLAN interface card. A suite of software presents the
operator with electronic data sheets to complete as each analysis is
progressed. These data sheets are identical to their traditional paper
copies, and in the training mode detailed instructions are given for each
stage of the analysis. Results can be compared with specification values, so
facilitating an immediate repeat if necessary. When some or all of the
analyses are completed, the computer transmits the data file of results back
to the LIMS for validation prior to LIMS database updating (Figure 9.12).
Acknowledgements
10.1 Introduction
A LIMS, like any other major purchase, should be replaced when it does
not meet current needs, when needs are changing due to an organizational
or re-engineering change, or when it becomes too difficult or expensive to
maintain. Very often more than one of these factors will be involved. Each
one will be explored separately.
10.2.1 Current LIMS does not adequately support business strategy and
user requirements
The laboratory and laboratory information are becoming increasingly
important in helping the corporation to achieve its goals. Most organiza-
tions are recognizing the importance of analytical skills in corporate
performance, and laboratories and their information are playing more
important roles. Laboratories are in transition from being providers of
requested results to being team members in addressing corporate problems,
opportunities and competitive advantage.
Most existing LIMS were implemented to meet the immediate needs of
the laboratory. Often the justification process alluded to benefits outside
the laboratory, but typically cost pressures kept the focus on the immediate
needs of the laboratory and other potential benefit areas disappeared from
the project plan.
LIMS often represents an 'island of automation' within the corporate
infrastructure. As the pressures on the corporation to perform increase,
laboratories must become more cognizant of their ability to assist in
meeting corporate goals. This results in a broadening sphere of influence
and increasing requirements on the system. The insular nature of most
early LIMS projects, and also the design of the early LIMS, results in many
LIMS being incapable of accommodating new requirements.
To be strategic, the LIMS must allow the company to achieve important business goals.
• Excellent customer service
• Improve regulatory compliance
- Environmental
- FDA
- Accounting
• Improve product quality
• Reduce new product introduction cycles
• Improve quality and timeliness of product applications
can be used. The next section addresses changing requirements due to the
changing structure of business and new ways of looking at business
processes.
deterrent to change, external forces that require new integrations can often
catalyze the switch to a new LIMS. For instance, in a stable environment,
the users might be ready to let the current system survive for a substantial
additional period of time. However, in a changing environment, the
realization that changes to the existing system will be expensive will often
force management to take a closer look at its overall architecture and
recognize that the existing system should be replaced before making
substantial new investments in integration.
The major factors in this situation are often timing, risk management
and project management. It is beneficial to avoid the cost of integrating a
new LIMS with old applications or the old LIMS with new applications or a
new infrastructure. This situation will often lead to substantial pain and
costs, and the overall objective must be to minimize both the pain and
costs. Although it may be attractive to attempt to co-ordinate the projects
tightly, this often results in projects becoming too big and unmanageable,
resulting in loss of communications, major delays, and higher costs. It is
usually best to keep projects small and manageable, emphasizing
communications to maximize success, and having carefully thought out
contingency plans to handle expected deviations from the overall plan.
• Hype
• Maturation
• Deterioration
• Eradication
What are the strategies of the salesperson, the vendor, and the customer
during each phase of the life cycle? How do the strategies of the
salesperson and the vendor differ from the strategies of the buyer? It is not
unethical for a vendor or salesperson to paint their product in its best light.
However, it is important for the customer to understand the process, the
market, and the technology and to develop a strategy to meet the
company's requirements and protect the company's investment.
During the initial 'hype' stage, the product is immature and the vendor
cannot deliver features fast enough to suit customers, but customers have
confidence they have a system that will last. The vendor needs early sales
to convince customers that the product will be successful. A few good
reference sites are critical.
The second, and fortunately longest, stage involves 'product maturation'.
During this stage, powerful capabilities are delivered, systems meet
expectations, and market saturation gradually occurs. This is usually the
high-point in the popularity of a product, and everyone believes the
product will last forever and that selecting the product is 'safe'. No-one
should be criticized for picking a platform in this important and desirable
stage, but unfortunately the decision is not as safe as it looks if a product is
nearing the end of its life cycle.
The third stage in the product life cycle usually involves 'deterioration'.
Some vendors will attempt to avoid this stage by refreshing their
architecture regularly and providing a steady stream of enhancements.
However, it is very difficult in the software industry to stay out of this stage
because technology is moving so rapidly. Deterioration is a harsh term
since vendors will claim a product is 'fully supported', but from the
customer's perspective, the product is falling behind current technology,
and relative to new products entering the market deterioration is
occurring. It is often common during this stage for established vendors to
defend their products and criticize the competition for using unproved
technology that may be nice, but really is not pertinent to laboratories.
During this stage, vendors may be frantically developing new technology
that will allow them to exhibit a competitive impact on the marketplace. A
new version of the old product may even come out during this stage, giving
the sales force 'evidence' that the product is fully supported. This version
may even have a feature or two that allow the sales force to claim use of the
technology that has bypassed their product. However, this new version
seldom exhibits important 'breakthrough' features of the type evident
during the hype or maturation stages of the product.
The final stage in the product life cycle is 'eradication'. During this stage
the vendor's sales force visits loyal customers to offer them a migration to
the new product. This may even include 'services' to assist in migration.
Vendors can maintain a high degree of customer loyalty by providing a
reasonable migration path to new systems.
services, etc. A switch to another system is far more costly than the quoted
cost of new hardware and software from the vendors. Any major project
should look at the big picture before deciding to move forward. A careful
consideration of all long-term costs and benefits should lead to a rational
decision of what is best.
It is a mistake to make a decision to replace a LIMS from too narrow a
viewpoint. For instance, if there are problems with a vendor's support, it is
possible to rationalize changing systems based on reduced ongoing support
costs and problems versus the cost of a license for a competing product.
But when the cost of duplicating all features and integrations that users
assume will continue to be available is factored in, incremental benefits
must usually be substantially greater than reduced support costs for the
change to be justified.
new LIMS does not have these capabilities as standard features, then the
cost of obtaining these customizations must be considered when making a
decision to replace a LIMS. It is rare that users are willing to sacrifice
capabilities they already have!
These customizations may be a major factor even when upgrading to
new versions from the same LIMS vendor. This impediment to change
should be carefully considered when designing customizations and migra-
tions. A slight added cost in an original customization is well justified if it
ensures the ability to accept upgrades, change vendors and change other
components of the application or system architecture.
installed just in the nick of time, does it make sense to give it a weaker
justification by allowing only incremental benefits?
Sometimes either the cost or the benefit side of the equation will be
more clear. But, for a balanced and thorough justification, it is important
to consider all factors. The costs may be very close to the laboratory - in
the form of license fees, hardware, customization, training, validation -
but the benefits may be from other departments and difficult to quantify.
As a typical problem, if a LIMS is being replaced to shorten the new drug
development cycle, what is the benefit? For a $1 billion/year drug, if LIMS
can shorten the development time by a single day, this equates to an
increased revenue potential of $3 million, easily justifying most LIMS
projects. If the benefits can be that large, how can you ever turn down a
project? On the other hand, how often is the laboratory on the critical path
where the development cycle can be impacted by LIMS? Such justifications
are difficult to rationalize and are often scrutinized closely by financial
executives.
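As a rough check on the arithmetic: $1 000 000 000 of annual sales divided by 365 days is approximately $2.7 million of sales per day, which rounds to the $3 million of additional revenue potential quoted for each day saved from the development cycle.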
Similarly, benefits in the form of savings may be clear but costs may be
obscure. The most difficult cost to quantify is the cost associated with
training and loss of experience. It is very difficult to estimate for how long
typical analysts will be less efficient after a new system is installed. It is also
difficult to estimate the cost of providing ongoing training and encourage-
ment outside the formal classroom.
Unfortunately, there are no easy answers to the justification questions.
user community so that support can be built for the process used and the
actual decision.
10.8.8 Maintenance
Maintenance typically begins before the system is implemented. As users
are first exposed to a system, they will start asking for enhancements. Most
enhancements can be delivered on a separate timetable. It is essential to
document the needs, develop the enhancements, and validate their correct
operation in the same manner as the initial system was implemented.
10.9 Keep an eye on the future - is it easier the second time around?
11.1 Introduction
11.2 Review of LIMS development over the last ten years - the story so far
Each technique outlined previously has its own advantage for use in
particular situations and phases in the ADLC. Recently all of the
techniques have shown considerable convergence. Eclectic approaches to
systems analysis and software development propose use of those techniques
that are 'best' for a particular phase of the ADLC. 'Best' will be judged
according to user, business and environmental requirements and factors
considered may include speed of development, depth of documentation
and understanding of users. However, without a consistent means of
supporting an ADLC, applications may be developed and maintained to
different levels of regulatory 'completeness'. Often a variety of different
programming styles and paradigms will have been used with little
recognition of human factors. In a number of cases, software may have
been produced prior to significant use of structured programming.
Maintenance may have been performed on these applications, resulting in
applications that are now becoming costly to maintain and whose reliability
is difficult to test and validate. This situation will be accentuated by
applications that are acquired from vendors, where it may be impossible to
apply the ADLC to the software development process directly. In these
cases, the application design, implementation and testing are generally
done by the vendor who provides the application. In a regulated
environment and under these circumstances, the quality of the software
can only be inferred from recognised quality accreditation (e.g. BS 5750,
ISO 9000). In all cases, use of defined minimum standards is required.
However, there is a fine balance that must be achieved between meeting
user, business and social or political requirements. As a direct consequence
of the political environment and the requirement for validated applications,
the ADLC may still cause unacceptable lead times in the delivery of critical
business systems.
'payback' to the user and business benefit. The rapid prototype develop-
ment method favours development of a small subset of the total
functionality, getting that working, and later developing additional
functionality in the same way. By using such a rapid prototyping method, it
is possible to demonstrate quickly to users the key aspects of a business-
focused system as part of a phased implementation in a way that is easy to
demonstrate and easy for users to understand, and which can be developed
according to changing business and user requirements.
With access to more powerful applications on the desktop and with
prototyping, users are increasingly intimately involved in development of
solutions. However, if users take on development too fast there is a risk
that central computing facilities will need to rewrite the applications, at a
cost, if things go wrong. There is a need for users to understand the
discipline of IS development including support of an ADLC and
recognition of the need for verification and validation. Indeed there is a
need to develop new applications with a view of the global architecture.
11.11 Communications
When developing new systems to increase access or reach new users, one
approach to implementing client-server technology is to overlay interfaces
on existing applications. A logical starting point is to assess what sorts of
platforms are already installed. In some organisations there are elements
of the client-server architecture already in place. In other environments,
users are happy working with graphical front ends like Windows on the PC
or in a Macintosh environment, but they need a server to consolidate and
access information. The type of platform will steer the direction of
development towards certain types of tools. It remains essential that any
existing IT environment is critically assessed to establish whether it will
work with client-server technology.
11.13 Conclusions
To achieve these broad objectives, both identified users and the developers
of LIMS have a proactive role to play in its successful implementation and
use. Each must have a clear understanding of:
References
12.1 Introduction
12.3.1 Technology goals must meet the needs of people and businesses
Early in the ADISS Program, it became clear that an integrated approach
was needed to ensure success for end-users and suppliers. Because the
overall mission of the program was to facilitate the tasks of people and to
make businesses more productive, simply addressing technology issues
would not solve the problem. A limited business perspective could
To meet these people and business goals, the technology solution defined in
the ADISS Program must:
• Provide a consistent systems architecture that can be used for:
- Communicating data
- Storing and archiving data
- Building analytical applications from standardized, extensible soft-
ware modules
- Building analytical databases that can be rapidly searched and
accessed
- Information management, presentation, and visualization tools of
the future
- Reporting and publishing of analytical information.
12.3.1.2 Standard file format problems. Major problems exist with the
'standard file format' approach. Formats change constantly for each
application, because performance of applications depends greatly on how
files are accessed on disk. As new applications require additional features,
changes to the file format are needed. Thus a 'standard file format'
approach is not easy to maintain. File formats in the past have typically
been too 'byte-oriented', or 'file position-oriented', which is extremely
tedious for developers. Because file formats are typically created by
independent developers, they are often too focused on a particular data
type (e.g. IR, NMR, spreadsheets, graphics, etc.) or vendor (e.g.
instrument vendor, computer vendor, etc.), which makes retrofit to other
data types or applications difficult. File formats are designed by many
different people for many different applications and for many different
systems, and they are rarely designed to be modules in global 'systems'.
They have limited extensibility. The widespread heterogeneity of file
design approaches most often leads to the use of the lowest common
denominator for a standard, typically a structured ASCII flat file.
• Chromatography*
• Mass spectrometry*
• Optical spectroscopy (infrared and ultraviolet-visible)*
• Nuclear magnetic resonance spectroscopy*
• Surface chemical analysis techniques*
• Secondary ionization mass spectrometry
• Atomic absorption and emission spectroscopy*
• Inductively coupled plasma spectroscopy*
• X-ray spectroscopy
• Thermal analysis techniques
[Figure: layered architecture with developer's tools and utilities above the ADISS/netCDF layer (XDR encoding), which in turn sits on the operating system and the computer hardware.]
and a set of low-level tools for byte-stream encoding and decoding, called
the ADISS Toolkit, which makes the data machine-independent. The
ADISS API simplifies data access by allowing programmers to input and
output data by name, as logical entities, according to data objects defined
by the ADISS data model, rather than as a stream of bytes. The netCDF
system described in the next section gives first-level conformance to the
ADISS API definition.
The ADISS Toolkit is a lower-level interface that shields scientists and
programmers from the details of how data are formatted for interchange
and storage. The low-level (netCDF) ADISS Toolkit ensures data object
portability and independence of both computer hardware and operating
systems. Both interfaces require design and implementation using object-
oriented concepts to achieve the modularity needed for graceful extension
and evolution [17].
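As an illustration of what 'access by name, as logical entities' looks like in practice, the short sketch below uses the modern Python netCDF4 binding (not part of the original C-based ADISS Toolkit); the file name, variable name and attribute name are illustrative stand-ins rather than the exact identifiers defined by the ANDI/chromatography specification.

# Reading an interchange file by named data objects rather than by byte offsets.
# Requires the netCDF4 package; the file and names below are illustrative only.
from netCDF4 import Dataset

with Dataset("chromatogram.cdf", "r") as nc:
    ordinates = nc.variables["ordinate_values"][:]         # detector trace, accessed by name
    sample_name = getattr(nc, "sample_name", "unknown")    # a global attribute, if present
    print(sample_name, len(ordinates), "data points")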
Over the next two years, the ADISS and AlA Programs will begin to focus
on data elements beyond those needed to transfer raw data and results
information for single sample chromatography runs. They will extend the
chromatography standard to include other detector types such as mass
spectrometry, LC-diode array, infrared, NMR, and possibly others. The
AlA's plan is to move into a second phase of data standardization for
chromatography, if the demand is high enough and more customers inform
the AlA of their applications of the existing standards. This section covers
the projected enhancements to the AlA Chromatography Data Standard
scheduled for completion between late 1994 and mid-1995.
Ongoing work by major international instrument vendors to apply the ADISS/netCDF approach to infrared (IR) spectroscopy, nuclear magnetic resonance (NMR) spectroscopy, and the inductively coupled plasma (ICP), direct coupled plasma (DCP), atomic absorption (AA) and atomic emission (AES) spectroscopies is also covered briefly.
The AIA Data Standard extensions comprise seven work items, divided into two 'waves' of development. Customer input received over the past three years has driven the selection of these particular enhancements. The full ADISS Chromatography Specification proposal contained several of these items as short proposals; however, the AIA pushed them out beyond version 1.0 of the ANDI/Chromatography Standard because of 'time-to-market' pressures: there was simply too much to consider for version 1.0.
Uncertain customer demand for, and usage of, the standard has been sapping the AIA's momentum in chromatography since about mid-1993. To some extent a 'chicken-and-egg' problem exists: until the proposals are completed and accepted, neither vendors nor end-users can commit themselves to implementing them. Vendors ask why they should invest in a product (a standard) that customers are not demanding. Users are reluctant to demand a product (a standard) that does not yet fully exist and that they may not fully understand. A Catch-22!
Alternatives exist for end users: (1) pay directly for completion of the
netCDF chromatography_template_name
dimensions:
variables:
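As a hedged illustration of how the dimensions and variables declared in such a template map onto the programming interface, the sketch below creates a minimal file with one unlimited dimension and one variable, using the Python netCDF4 binding. The file, dimension and variable names are invented for the example and are not the ANDI-specified ones.

    # Hedged sketch: writing a minimal template-like netCDF file.
    # Names ("point_number", "ordinate_values", "sample_name") are illustrative only.
    from netCDF4 import Dataset

    with Dataset("chromatography_example.cdf", "w") as ds:
        ds.createDimension("point_number", None)               # unlimited (record) dimension
        trace = ds.createVariable("ordinate_values", "f4", ("point_number",))
        trace.units = "detector counts"                        # variable attribute
        ds.sample_name = "example sample"                      # global attribute
        trace[0:4] = [0.0, 0.1, 0.4, 0.2]                      # a few data points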
The Instrument Description Information Class will make it possible for users to build and
transfer instrument system log books more easily, i.e. to track chromato-
graphy columns and other instrument components. The Sample Description
Information Class records significantly more sample information, so that
samples described by different vendors' systems can be cross-referenced
more easily. Finally, the addition of the System Suitability Information Class enhances the AIA Chromatography Data Standard so that customers can better track the performance of separation systems over time and across different systems. The goal was to deliver the first set of enhancements by mid-1994; however, the market-awareness and demand-measurement programs have since taken precedence, delaying delivery of the first wave of enhancements by many months. These first enhancements are now expected to appear in vendors' products in early to mid-1996, along with support for diode array detectors and more general multi-dimensional detectors (see below).
sample run data. Each approach has its own tradeoffs. Vendors have
developed many approaches that are correct for the particular applications
needed by their customers. These approaches must be accommodated by a
neutral model for sequences to allow interoperability among products.
A logical time for all AIA members to upgrade is when the extensions for chromatography-MS, chromatography-IR, chromatography with diode array detectors, and others are done. That is the earliest point at which the chromatography, mass spectrometry, and optical spectroscopy data standards must use the same netCDF version. However, preliminary testing of netCDF v2.3.3 and later versions was to have been completed on an individual basis by AIA Chromatography Committee members before the beginning of 1995. New standards for IR and atomic spectroscopic techniques will use the latest version available when they are introduced, expected to be in early 1995. A major design goal of netCDF is full backward compatibility with all previous versions, i.e. the ability at least to read files from every previous version and to write them out in the current format. This is important because of the need to read historical meteorological datasets from many years past.
The benefits of the new netCDF system (version 2.3.3) are:
(1) Record-level access, allowing groups of netCDF variables to be read and written as individual records, which will also help with sequences of sample and spectral data (illustrated in the sketch after this list)
(2) Improved performance, as much as 40 times better for some operations (e.g. cross-cuts across hyperslabs), which will be important for GC-MS, GC-IR, NMR and other 'large-data' techniques
(3) New command line options; ncdump supports several options for selectively dumping variables and comments
(4) A new C++ interface prototype, which somewhat simplifies use of the data access library (this is still prototype code and is being tested extensively before being finalized in a future version)
(5) Support for more computer platforms (HP-UX, Alpha OSF/1, MS-DOS 5.0).
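The sketch below illustrates the record-oriented and hyperslab access referred to in items (1) and (2). It uses the modern Python netCDF4 binding for convenience rather than the C library of the day, and the file and variable names are invented, only loosely modelled on a GC-MS-style layout rather than the ANDI-defined one.

    # Hedged sketch: hyperslab and record-oriented reads from a large 2-D variable.
    # "gcms_run.cdf" and "intensity_values" are assumed names for illustration.
    from netCDF4 import Dataset

    with Dataset("gcms_run.cdf", "r") as ds:
        intens = ds.variables["intensity_values"]   # e.g. shaped (scan, channel)

        # Hyperslab cross-cut: every scan, a single channel, read in one call.
        channel_42 = intens[:, 42]

        # Record-oriented access: process one scan (record) at a time instead of
        # loading the whole array into memory.
        for scan in range(intens.shape[0]):
            record = intens[scan, :]
            # ... process a single scan here ...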
Unidata released netCDF v2.3.3 in June 1993. All reported problems
with the beta version were fixed, and the software was ported to several
additional computer architectures. NetCDF Version 2.3.3 was built and
tested successfully on the platforms listed in Table 12.2.
Some users have ported netCDF v2.3.3 to other platforms, including
Apple Macintosh and Microsoft Windows NT. Unidata does not maintain
these ports, but they can be obtained over the Internet from the netCDF
user community. Totally new ports are now easier due to a new 'configure-
based' approach to installation, which adapts more easily to what an
operating system provides. A number of systems and utilities built on top of netCDF are available as source code, including tools for scientific visualization, laboratory data handling, and conversion between spreadsheet and netCDF data. The next version of netCDF (v2.4) is now in the works, expected to be released by the end of 1995 on the computer platforms
laboratory productivity and report results to clients, but they do not help
R&D workers to collect, store, and manage the many heterogeneous data
types generated during R&D (textual data, data arrays, tables of numbers,
spectra, molecular structures, images from CAT scans or fluorescence gel
scans, and DNA sequences are but a few of the data types encountered).
What we really need is a comprehensive system that I call the 'R&D
LIMS'.
The next generation of laboratory data management software (the R&D LIMS) will include an 'analytical laboratory data repository' or 'data librarian'. Such repositories will hold all laboratory raw data, methods, and processed data centrally, in a standard form, accessed through a standard interface. Scientists and managers will be able to access and cross-correlate results instantly across sample runs, instruments, laboratories, laboratory protocols, clinical trials for drug development, or environmental pollutant tracking studies, over many years. This becomes possible because data are stored, interchanged, and archived in common repositories using standard information models, held in standard forms and accessed through standard software interfaces.
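A hedged sketch of the kind of cross-run query a standard repository makes possible is shown below: it walks a directory of standard-format run files and collects one summary value per run. The directory layout and the attribute and variable names ('sample_name', 'peak_area') are assumptions made for the example, not names taken from the ADISS/ANDI specifications.

    # Hedged sketch: cross-correlating results across many standardized run files.
    # "repository/*.cdf", "sample_name" and "peak_area" are assumed names.
    import glob
    from netCDF4 import Dataset

    results = []
    for path in sorted(glob.glob("repository/*.cdf")):
        with Dataset(path, "r") as ds:
            sample = getattr(ds, "sample_name", path)
            if "peak_area" in ds.variables:
                total = float(ds.variables["peak_area"][:].sum())
            else:
                total = 0.0
            results.append((sample, total))

    # One table of results drawn from many runs, instruments or laboratories.
    for sample, total in results:
        print(sample, total)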
A few laboratory data management products are beginning to support
input and output of data from a variety of data systems and store them in a
standard form. Standard repositories based on the ADISS/netCDF
specifications and tools are an important step toward an R&D LIMS that
can support teams of scientists directly, not just their managers or
customers.
The computer and electronics industries have long been ahead of the
laboratory instrument industry in addressing future needs for standards,
perhaps because they are much larger, are more widespread, and have felt
a dire need for standards much sooner. Witness the early creation of
standards such as the Electronic Industries Association's EIA-232 (RS-232), IEEE-488 (GPIB), IEEE 802.3 (Ethernet), IEEE 1003.X (POSIX), X/Open XPG3 and XPG4, and many others. Some of these efforts began
five to ten years before the market demanded or was fully ready for them.
The instrument industry has much to learn from the foresight (the cynics
would say stupidity!) of these high-technology industries, which have
become commodity producers. Standardization in these industries has in large part produced this commoditization and the economic problems from
the resulting restructuring. However, it is also standardization and
commoditization that have produced the rapid growth rates these
industries have enjoyed over the past 20 years.
mately $3 million since late 1988. Revenues from purchases of the AIA Chromatography Data Standard have been about $30 000 in total; from July 1993 to January 1994 they were about $7000. This comes nowhere near covering the expenses of packaging and shipping the standard kit, creating and distributing an awareness newsletter, or other incidentals. Before expanding the work significantly, the AIA and its member companies must see substantial demand for, and application of, the standards by end-users. The AIA also wants to know what is missing, in order to make the standards easier for end-users (and developers) to use. But first end-users must learn what exists today, so that they know what to request.
Once end-users know that the standards exist and can be used to solve
some of their key data handling problems, they must push all their
instrument manufacturers to complete the standards work, to become
compliant and to support them. This is accomplished by making standards
compliance a condition of purchase for any new instrument. Start this
process by educating your local instrument sales force. If they do not know
about the standards, feign ignorance, or show lack of support, then there
are other vendors from whom you can purchase instruments supporting
them. Your business may soon depend on the ability to communicate
freely between your various computer systems.
The Wave 1 and Wave 2 extensions must be completed to meet future needs. Proposals are now pending for most of this information; consult the Appendix of the current AIA Chromatography Data Standard Specifications for details [7]. However, lead times for these standard features are long: if demands are not made now, the features will not make it into the standards for another year or two. Being a partner with the instrument vendors is important. Vendors need to be able to measure demand clearly, and to see many applications of the standards, before they can commit to the significant investment required to implement more. Being a demanding partner helps to speed the standards agreement process, helps the industry to grow, and helps return the focus to scientific innovation.
Benefits from the ADISS Program include giving businesses the proper
justifications to request these standards from vendors and to implement
the standards themselves. The ADISS Program's results will make data
access more consistent and easier, and will improve software and data
reusability, especially in the long term. It will protect the enormous data
and software investments being made by major institutions and corpora-
tions. It will increase productivity by using time more effectively and help
to shift scientists' focus back to science, away from the data communications
difficulties that are typical of distributed, multivendor systems. Finally, it
will reduce systems integration costs for analytical instruments, LIMS, and
other systems, and help to improve product and data quality [24].
Note
References
1. Lysakowski, R. (1989) Proposed Plan for Industry-Wide Analytical Data Standards - the ADISS Architecture and Project Overview. Digital Equipment Corporation, Four Results Way, MR04-3/C9, Marlboro, MA 01752, USA.
2. Lysakowski, R. (1991) The global standards architecture for analytical data interchange and storage. ASTM Standardization News, March, pp. 44-51.
3. Lysakowski, R. (1993) NetCDF - a de facto standardized framework for analytical data exchange, storage and retrieval. In Computerized Chemical Data Standards: Databases, Data Interchange, and Information Systems, ASTM STP 1214 (eds. R. Lysakowski and C.E. Gragg). American Society for Testing and Materials, Philadelphia, PA, USA, pp. 57-74.
4. Digital Equipment Corporation (1991) The Open Systems Handbook: A Guide to Building Open Systems, reference number EC-HI089-48/91. Digital Equipment Corporation, Marlboro, MA 01752, USA.
5. McDonald, R. and Wilks, P. (1988) JCAMP-DX for infrared spectroscopy. Applied Spectroscopy, 42 (1), 151-162.
6. Stanz, D., Campbell, S., Christopher, R., Watt, J. and Zakett, D. (1994) ANDI/Mass Spectrometry: Data Interchange for Mass Spectrometry. The Analytical Instrument Association, 225 Reinekers Lane, Suite 625, Alexandria, VA 22314, USA.
7. Mattson, D. (1994) ANDI/Infrared Spectroscopy: Data Interchange for Infrared Spectroscopy. The Analytical Instrument Association, 225 Reinekers Lane, Suite 625, Alexandria, VA 22314, USA.
8. Analytical Instrument Association Committee on Data Communications Standards
(1991) AlA Chromatography Data Standard - Specification. (Gives full definitions of data
elements and data models used by the AlA for chromatography data.) The Analytical
Instrument Association, 225 Reinekers Lane, Suite 625, Alexandria, VA 22314, USA.
9. Analytical Instrument Association Committee on Data Communications Standards
(1991) AlA Chromatography Data Standard - Implementation Guide. (Contains an
overview of netCDF and how to implement the content of the AlA Specification with
netCDF.) The Analytical Instrument Association, 225 Reinekers Lane, Suite 625,
Alexandria, VA 22314, USA.
10. Analytical Instrument Association Committee on Data Communications Standards
(1991) Proposed Global Extensions to the AlA Data Interchange Standard. The Analytical
Instrument Association, 225 Reinekers Lane, Suite 625, Alexandria, VA 22314, USA.
11. Macur, A. (1993) Private communication. Tripos and Associates, St Louis, Missouri,
USA.
12. Gaarenstroom, S. and Lee, R. (1993) Journal of Surface Science Spectra, 2 (4), 5-22.
13. ASTM (1993) E49.52.002.R003. Draft Standard Specification for the Analytical Information Model for Analytical Data Interchange and Storage. American Society for Testing and Materials, 1901 Race St., Philadelphia, PA 19103-1187, USA.
14. Weimar, R.D. (1993) Future ADISS architecture for materials properties and analytical
testing and The demand for and value of a standard unified data architecture for
analytical testing data. Both in Computerized Chemical Data Standards: Databases, Data
Interchange, and Information Systems, ASTM STP 1214 (eds. R. Lysakowski and E.
Gragg), American Society for Testing and Materials, Philadelphia, PA, USA, 96-107 and
84-95.
15. MacLaughlin, D. (1992) Integrating molecular structure and spectral databases with
LIMS, Eastman Kodak, presented at a conference on Scientific Computing and
Automation, Washington, DC, October.
16. Wollenberg, G. (1994) Private communication. Merck & Co., Rahway, NJ, USA.
17. Meyer, B. (1988) Object-Oriented Software Construction. Prentice-Hall International,
New York, NY, USA.
18. Rew, R.K., Davis, G. and Emmerson, S. (1994) netCDF User's Guide. An interface for
data access. Unidata Program Center, University Corporation for Atmospheric
Research, P.O. Box 3000, Boulder, CO 80307-3000, USA.
19. Rew, R.K. and Davis, G.P. (1990) netCDF, an interface for scientific data access, IEEE
Computer Graphics and Applications, July, 76-82.
20. Rew, R.K. and Davis, G.P. (1990) The Unidata netCDF: software for scientific data access. In Proceedings of the Sixth International Conference on Interactive Information and Processing Systems for Meteorology, Oceanography and Hydrology, American Meteorological Society, Anaheim, CA, February, pp. 33-40.
21. Sherretz, L. and Fulker, D.W. (1988) Unidata: enabling universities to acquire and analyze scientific data. Bulletin of the American Meteorological Society, 69 (4), 373-376.
22. Davies, A.N. and Lambert, P. (1993) JCAMP-DX for nuclear magnetic resonance.
Applied Spectroscopy, 47 (8), 1093-1099.
23. Salit, M.L., Guenther, F.R., Kramer, G.W. and Griesmeyer, J.M. (1994) Integrating
automated modular systems with modular architecture. Analytical Chemistry, 66 (6),
361A-367A.
24. McDowall, R.D. (1994) A matrix for the development of a strategic LIMS. Analytical
Chemistry, 65 (20), 896A-901A.
Index
support key business processes 187
time consuming 44
user interface 193
within company business strategy 185-186
instrument management 145-146
interface device 145-146
investment lessons from past 192
justification 1, 192
based on business benefits 124-125
incremental benefits and LIMS replacement 179
and laboratory automation 15
life-time 180-181
data systems' life-cycle 181
ease of integration 180
linked to system architecture 180
vendor support 180
Link bi-directional communication with LabManager 148-149
independent operation 148, 151
automating 'single reading' instruments 151-152
bar code recognition devices 151
Instrument Coupler 148-149
Instrument Coupler distribution of processing to instruments 152
operator interaction with 151-152
work cell configuration 149
management of automated methods 10
model
basis for interdisciplinary discussion 25
commercial system comparisons 31-33
data base medium for communication 25
definition 21-24
definition functional components 22-23
definition interaction between components 23-24
functional complexity 30
global issues 34-35
importance of perspective 20-21
laboratory environment definition 25
organization 24
package systems selection 31-33
potential for expansion 24, 25
system management 35
technological independence 25
next generation 13
packages
Challenger 163
ChemLMS 133, 153
LabManager 147
SampleManager 156
purchase objectives 40-42
real time control of instruments 126
replacement
change management 189
reasons for 165-175
see also replacement of LIMS
reasons not to change 175-179
see also reason not to replace LIMS
result of inadequate maintenance 190-191
requirements
definition adjustments 189
definition and features 188
definition packaged systems survey 188
definition rapid prototyping benefits 203
definition team approach 187-189
in water laboratory 126
non-functional hard- and software 43
non-functional system management 44
non-functional training 43-44
robotic interface 123-137
scheduling and controlling instruments 127
strategic design 16-18
study management requirements 54
technology driven solutions 193
traditional features 75-77
clinical laboratory systems 97
inability to capture protocol design elements 76
result managers 226
see also protocol asynchronous LIMS
traditional functions of 1, 126
sample and specification management 54
transparent medium for sharing data 12
usefulness 1
user most frequent 163
user perception 1
users wrongly identified 1
vendors automated interface modules 147-158
vision and data standards 207
vision in rapidly changing business environment 193-194
login xxi
schedulers sample context 11
management of automated laboratories
self-managing teams 136
managerial control of subordinates impact of technology 8
manual data entry
large instrumental data volumes 142
operator analytical work load 142
transcription errors 142
and development of data standards for instruments 227
evaluation LIMS model 32-33
importance of relationship with customer 184
importance of track record of hardware 106
marketing practices sales pitch 109
marketing practices vaporware 109
marketing strategies and LIMS product life-cycle 173-174
need to measure demand for standards 230
return on LIMS investment 184
systems response times 109
VGL sponsor specific reports 94
virtual analytical instrument 212
voice recognition systems limited use 164
WaveLAN radio based protocol for data transfer 163
work flow 6
workforce polarisation 8
workload statistics production 50
worksheet
and chromatography standards 222
automated batch analysis in clinical laboratories
non-host query 100-102
real-time host query 102-103
generation DMPK studies 64
manual batch analysis in clinical laboratories 100, 102
template driven 47
xenobiotic xxiv
Zuboff, S. 8