

GCLP: Method & Systems Validation

Contents
1. Contents
2. Introduction
3. Module Objectives
4. Background
5. Method Validation
1. Analytical Methods
2. Method Development
3. Validation Process
4. Validation Plan
5. Conduct Validation
1. Conduct Validation 1
2. Conduct Validation 2
6. Validation Report and Approval
6. Systems Validation
1. Computer Systems
2. Computer Systems Validation
3. Validation Plan
4. User Acceptance Testing
5. Operational Use
6. Maintenance and Data Storage
7. Audit Trail
7. Summary
8. Final Quiz/Results

Introduction
GCLP: Method & Systems Validation is the fifth module of the Good Clinical Laboratory
Practice (GCLP) course. The course is designed for all those wanting to gain an in-depth
understanding of the principles of GCLP, how each principle is applied in a clinical laboratory and
the impact implementation of these principles has on the laboratory and the clinical trial. It is
highly recommended that you complete the following short courses and modules on the Global
Health Training Centre before starting this module:

Good Clinical Practice

Introduction to Good Laboratory Practice

GCLP: Organisation and Personnel

GCLP: Facilities, Equipment, Materials and Reagents

GCLP: Standard Operating Procedures & Analytical Plan

This module examines GCLP guidance for the selection and validation of the analytical
methods to be used in the laboratory and the computer systems required by the laboratory
to receive, capture, process and report clinical trial data. This module is based on the following
guidelines and materials:

ICH Harmonised Guideline – Integrated Addendum to ICH E6(R2): Guideline for Good
Clinical Practice (ICH GCP E6 R2 2016)

Good Clinical Laboratory Practice (WHO GCLP 2009)

Good Clinical Laboratory Practice (RQA GCLP 2012)

Good Laboratory Practice Number 17 (OECD GLP 2016)

Module Objectives
By the end of this module you should be able to:

Identify the different types of analytical methods

Understand the process involved in selecting an appropriate analytical method

Know when the validation of a method should take place

Identify and understand each step of the method validation process

Define the key elements for the validation of bioanalytical methods

Identify computerised systems in a laboratory

Define the required elements for validation of a computerised system


Understand and develop user acceptance testing

Ensure the integrity of clinical trial laboratory data through effective validation and
operation of computerised systems and analytical methods

Background
So far this course has covered each main section of GCLP in the
order they appear in the GCLP guidelines, but this module will now
address ‘method validation’ and ‘computer systems’, two
major subsections of the 12th section of GCLP ‘Conduct of the
Work’. This is because the course aims to explain GCLP guidance
in a chronological order and so far we have covered:

1. Organisation and Personnel

2. Facilities, Equipment, Materials and Reagents

3. SOPs and Analytical Plan

Therefore, the analytical methods to be used in the laboratory and the computer systems
required by the laboratory must be in place prior to the conduct of the clinical trial work.
Conduct of the work and the management of trial samples, the 7th Principle of GCLP, will be
covered in the next module, Conduct of the Work, Sample Management, Data & Recording.

Method Validation

What the guidelines state:

The selection of instrument platforms and analytical methodologies should take into account
current regulatory standards and sponsor expectations, where appropriate.

Each analytical method used in the analysis of trial materials should be appropriately
documented, validated, controlled and approved. Changes to a method should be controlled
and validated and result in the issue of a further version of the method.

Each analytical method should be appropriately validated to establish and demonstrate its
fitness for purpose.

Records to demonstrate the validity and suitability of such methods within the trial facility
should be retained.

Analytical platforms/methods should not be changed during the course of a trial, without prior
consultation and agreement with the Sponsor. Such changes must be controlled, documented
and appropriately authorised and may result in the need for further method validation.

WHO GCLP 2009

RQA GCLP provides further guidance to WHO GCLP and is highlighted below:

What the guidelines state:

The selection of instrument platforms and analytical methodologies should take into account
current regulatory standards/guidelines and sponsor expectations, where appropriate.

Such guidance includes, but is not limited to: FDA Guidance for Industry: Bioanalytical
Method Validation (CDER 2018); Best Practices for Chromatography and Ligand Binding Assays
(AAPS 2007); and the EMA Guideline on Bioanalytical Method Validation (2011)

Each analytical method used in the analysis of trial samples should be appropriately
documented, validated, controlled and approved with acceptance criteria defined. Changes to
a method should be controlled and validated, resulting in the issue of a further version of the
method.

Records to demonstrate the validity and suitability of such methods within the trial laboratory
should be retained.

Analytical platforms/methods should not be changed during the course of a trial without prior
consultation and agreement with the sponsor. Such changes must be controlled, documented
and appropriately authorised, and may result in the need for further method and equipment
validation.

RQA GCLP 2012


Analytical Methods
Analytical methods refer to the way in which the analysis is performed and should be specified
in documented procedures to ensure consistency of analysis. The analytical method should
describe in detail the steps necessary to perform each analytical test. This may include, but is
not limited to: preparation of the sample, reference standards and reagents; use of the
apparatus; generation of the calibration curve; use of the formulae for the calculations; etc. (ICH
Q2 R1).

Analytical methods can generally be divided into two categories: manual methods and
instrumental methods employing specialised equipment in the analysis. For the purposes of
GCLP, we have split analytical methods into three basic types:

Manual methods
e.g. Dip sticks (i.e. for the assessment of urine), Rapid Diagnostic Testing etc.

Instrumental methods
e.g. routine analysis for biochemistry, haematology, urinalysis, drugs of abuse
testing, serology and virology (Complete Blood Count (CBC), liver function tests,
HIV ELISA, etc.). Typically, these methods will come pre-validated, e.g. standard
kits for the analysis of the sample, developed and validated by the manufacturer.
It may still be necessary to qualify them on individual equipment.

Study or compound specific methods
Specific study methods based on protocol requirements, e.g. for determining the level
of an investigational product in a participant sample for
Pharmacokinetics/Pharmacodynamics purposes.

When selecting an analytical method the laboratory must first ensure it is of appropriate
design and suitable for its intended use; this is achieved through verification. Verification was
explained for selecting laboratory equipment in GCLP: Facilities, Equipment, Materials and
Reagents (see section 6.2). Therefore, the laboratory must determine that the method is
demonstrably fit for purpose:

Demonstrably – there is evidence to support that the method works as intended

Fit – the set characteristics for the method can be achieved

Purpose – the use and scope of the method is clearly defined

The process of validation determines if a method is demonstrably fit for purpose!

Once a method has been selected the process of method validation can begin:

Method Development
The development stage of the method validation process is the activity of obtaining
knowledge and evidence to develop a method that meets the predefined criteria. Method
development is research and not a clinical process so Good Laboratory Practice should be
adhered to and the whole process must be documented using Good Documentation
Practices. The information required for method development can be gathered through a
variety of different sources such as:

Technical literature

Manufacturer’s information

Your own knowledge and experience

Previous methods

Previous experience

Sponsor experience/information

Using this information, combined with a process of elimination and a planned and controlled
process of trial and error, a method is developed that meets the predefined criteria. This entire
process must be recorded and retained as evidence to show how the method was selected
and developed for validation; this will also assist with future assessment of the method.

Validation Process
Once an appropriate method has been developed, validation of this method can take place
according to a Validation Plan. Validation should take place in the laboratories where the
work using this method will take place and will include people, equipment, materials,
samples and reagents. A method validated in one location can be transferred to another
location, but you will need to validate that the method works at the new location and that the
results obtained are comparable between sites. The laboratory personnel involved must be
trained and competent in the use of the equipment and materials required for the method,
and the equipment, materials and reagents must all be verified and validated as per GCLP
(see Facilities, Equipment, Materials and Reagents – sections 6.2 & 6.4).

The validation process follows a standard format:

Validation Plan
The validation plan, also known as the validation protocol, is a controlled and approved
document detailing the entire validation process. It defines the content and lists the
personnel, resources, equipment, materials and reagents to be used. It should also
specify the timelines for completing the entire validation process and must include all the
predefined acceptance criteria for each of the elements included in the validation. These
elements typically include, but are not limited to:

Accuracy

Precision
Repeatability
Intermediate precision
Reproducibility

Sensitivity

Specificity

Selectivity

Analytical range

Linearity

Robustness

The sample size for each element must be adequate to determine a definitive answer and
this will vary from element to element and from method to method. The laboratory should
determine the sample size required and this could be from the manufacturer’s information
and/or the laboratory manual and/or statistically determined.

Conduct Validation

Accuracy – is the degree of closeness of measurements of a quantity to its actual (true) value.
Precision – is the repeatability or reproducibility of the measurement.

Figure 1 shows the relationship between accuracy and precision. Figure 2 shows how an assay
can be accurate but not precise, precise but not accurate, neither, or both:

Fig 1:

Fig 2:

Precision may be considered at three levels:

1. Repeatability – expresses the precision under the same operating conditions over a short
interval of time.

2. Intermediate precision – expresses within-laboratories variations: different days,


different analysts, different equipment, etc.

3. Reproducibility – expresses the precision between laboratories.

ICH Q2 (R1)
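The accuracy and precision definitions above can be illustrated with a short calculation. The following sketch computes accuracy as percent bias against the nominal (true) value and precision as the coefficient of variation (CV%) of replicate measurements; the replicate values and nominal concentration are hypothetical examples, not data from the guideline.

```python
from statistics import mean, stdev

def accuracy_and_precision(measured, nominal):
    """Accuracy as % relative error (bias) against the nominal value;
    precision as the coefficient of variation (CV%) of the replicates."""
    avg = mean(measured)
    bias_pct = (avg - nominal) / nominal * 100
    cv_pct = stdev(measured) / avg * 100
    return bias_pct, cv_pct

# Six hypothetical replicates of a QC sample, nominal concentration 50 ng/mL
replicates = [49.2, 50.1, 48.8, 50.5, 49.6, 49.9]
bias, cv = accuracy_and_precision(replicates, nominal=50.0)
print(f"Accuracy (bias): {bias:+.2f}%  Precision (CV): {cv:.2f}%")
```

Repeatability, intermediate precision and reproducibility differ only in which replicates feed the calculation: the same run, different days/analysts/equipment, or different laboratories.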

Conduct Validation 1
Sensitivity – represents the smallest amount of substance in a sample that can be accurately
and precisely measured by an assay, and is the degree of response to a change in concentration
of an analyte being measured in an assay. This is different to the limit of detection, which is
the lowest amount of an analyte in a sample that can be detected, but not necessarily
quantified with accuracy and precision.
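ICH Q2(R1) also offers a standard way to estimate the detection and quantitation limits from the standard deviation of the response (σ) and the slope (S) of the calibration curve: DL = 3.3σ/S and QL = 10σ/S. A minimal sketch, using hypothetical σ and slope values:

```python
def detection_and_quantitation_limits(sigma, slope):
    """ICH Q2(R1) estimates: DL = 3.3 * sigma / S, QL = 10 * sigma / S,
    where sigma is the SD of the response and S the calibration slope."""
    return 3.3 * sigma / slope, 10 * sigma / slope

# Hypothetical values: blank response SD = 0.02 AU, slope = 0.05 AU per ng/mL
dl, ql = detection_and_quantitation_limits(sigma=0.02, slope=0.05)
print(f"Detection limit ≈ {dl:.2f} ng/mL, quantitation limit ≈ {ql:.2f} ng/mL")
```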

Specificity – is the ability of an assay to measure one particular analyte (organism or


substance), rather than others, in a sample. It expresses freedom from interference by any
element or compound other than the analyte.

Selectivity – is the extent to which an assay can determine particular analytes under given
conditions in mixtures or matrices, simple or complex, without interferences from other
components.

Conduct Validation 2
Analytical range – the interval between the upper and lower concentration (amounts) of
analyte in the sample (including these concentrations) for which it has been demonstrated that
the analytical procedure has a suitable level of precision, accuracy and linearity. This is not to be
confused with reference ranges for clinical safety analysis.

Linearity - is the ability (within a given range) of an assay to obtain test results which are
directly proportional to the concentration (amount per volume) of an analyte in the
sample. Figure 3 below shows linearity between known concentrations of an analyte and the
obtained (‘recovered’) test results.

Fig 3:
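Linearity of the kind shown in Figure 3 is typically assessed with an ordinary least-squares fit of response against known concentration, together with the coefficient of determination (r²). A minimal sketch; the calibration standards and responses below are hypothetical:

```python
from statistics import mean

def linearity_check(conc, response):
    """Ordinary least-squares fit of response vs concentration,
    returning slope, intercept and r^2 (coefficient of determination)."""
    mx, my = mean(conc), mean(response)
    sxy = sum((x - mx) * (y - my) for x, y in zip(conc, response))
    sxx = sum((x - mx) ** 2 for x in conc)
    slope = sxy / sxx
    intercept = my - slope * mx
    ss_res = sum((y - (slope * x + intercept)) ** 2
                 for x, y in zip(conc, response))
    ss_tot = sum((y - my) ** 2 for y in response)
    return slope, intercept, 1 - ss_res / ss_tot

# Hypothetical calibration standards (ng/mL) and instrument responses
conc = [10, 25, 50, 100, 200]
resp = [0.52, 1.24, 2.49, 5.05, 9.98]
slope, intercept, r2 = linearity_check(conc, resp)
print(f"slope={slope:.4f} intercept={intercept:.3f} r^2={r2:.4f}")
```

An r² close to 1 over the claimed analytical range supports linearity; the acceptance threshold itself belongs in the validation plan.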
Robustness – is a measure of an assay's capacity to remain unaffected by small but deliberate
variations in method parameters, and provides an indication of its reliability during normal
usage.

The method is only deemed valid if it meets the acceptance criteria for all of the elements
listed in the validation plan / protocol and failure to meet the criteria may require the
whole process to be re-started!

Validation Report and Approval


Once the analytical validation work as detailed in the validation plan has been successfully
completed, a validation report should be produced. The report should contain all the results
from the validation, together with a conclusion based on the plan and acceptance criteria,
and should be retained indefinitely by the laboratory. Well-documented methods ensure
reproducibility, consistency and data integrity. Once the final report has been approved by the
sponsor and laboratory management, the final documented Analytical Method can be
produced as a written procedure to be followed by the laboratory staff when conducting the
analysis. This may be in the format of an SOP or work sheet. It is important that the laboratory
staff performing the analysis are trained on the analytical method before they start using it.

This validation is specific and only applies to the laboratories where the validation was
performed, including the equipment, materials, reagents and the method criteria. Changes to
the Analytical Method may require a partial or full re-validation, and the decision-making
process should be documented in the protocol. An analytical method should not be changed
during the course of a clinical trial without prior consultation and agreement with the trial
sponsor. All changes need to be controlled, documented and appropriately authorised. A
method validated in one location using the equipment at that location can be transferred to
another location, but you will need to validate that the method works using the equipment at
the new location and that the results obtained are comparable between sites. The degree of
validation may be reduced, but a separate validation plan, report and approval is required;
this can, however, be based on the original validation.

Systems Validation

What the Guidelines state:

Computerised systems should meet the general requirements for equipment as described (in
this booklet). Due to the nature of computerised systems and their key role in the operation of
the laboratory, further requirements apply to their use. In all cases, computer systems should
be appropriately validated and maintained, and be demonstrably fit for purpose.

When a computer system is used to capture raw data, a definition of what constitutes the raw
data should be documented. This is usually documented within the SOP for that system.

Computerised systems used to receive, capture, process or report data should be acquired,
developed, tested, released, used, maintained and retired according to established guidelines
or laws.

These may include the OECD Monograph ‘The Application of GLP Principles to Computerised
Systems’, the FDA ‘21CFR Part 11: Electronic Records, Electronic Signatures, Rule’ and the
‘Guideline for the use of Computer Systems in the Conduct of Clinical Trials’.

Procedures should exist which address the security and operation of the computer systems.
This should include the maintenance of a data audit trail, which includes the date/time and
individual responsible for the collection of the data, system change control procedures,
maintenance and system security procedures that ensure the integrity of trial data.

Access to computer systems should be restricted to authorised personnel.

If data is retained electronically, means should exist to ensure the data held can always be
retrieved.

RQA GCLP 2012

Computer Systems
Computerised System - a system (consisting of one or more hardware components and
associated software) that is involved with the direct or indirect capture of data, processing or
manipulation of data, reporting and storage of data, and may be an integral part of automated
equipment. Examples include a programmable analytical instrument or a personal computer
linked to a Laboratory Information Management System (LIMS).

RQA GCLP 2012

Computer systems incorporate all the processes needed to effectively manage data in a
clinical trial laboratory; this includes both incoming and outgoing participant data and all data
transfers within the laboratory. Computerised systems consist of:

Hardware - the computer, analyser, server, cables etc.

Software - the programs within the hardware that run the system.

Some hardware will be purchased with all the required software already installed, whereas
some hardware may require software to be purchased separately and then installed on the
hardware. A complete computerised system such as a LIMS has the capacity to quickly and
easily manage, analyse, and retrieve data and offers multiple advantages over paper-based
systems such as:

Error reduction
Ability to perform data searching
Built-in quality control management
Generation of reports
Ability to track reports
Ability to track and analyse trends
Integration with additional laboratories
Manufacturer-provided training

Computer Systems Validation

Validation of a Computerised System - a documented process that demonstrates that a


computerised system is suitable for its intended purpose

RQA GCLP 2012

Given that participant test results are the final product of the laboratory for each clinical trial
in which it is involved, and may be critical to subject safety, laboratory management must
ensure that the laboratory has an effective system in place to achieve accessibility, accuracy,
timeliness, security, confidentiality and privacy of participant information. Where possible this
should be in the form of a computerised system; however, this may not be achievable in all
settings, especially resource-poor settings. Wherever participant data is transferred
electronically it is very important to demonstrate the suitability and validity of the data transfer,
to ensure the security and integrity of the data following transfer.

Some laboratories may develop in-house computer systems and use locally developed systems
based on commercially available software, whereas others may choose to purchase completely
developed laboratory systems. Either way, the chosen computer systems, including all
hardware and software, should be treated no differently to any other equipment within the
laboratory, and therefore the laboratory must determine that the system is demonstrably fit
for purpose through validation. (See GCLP: Facilities, Equipment, Materials and Reagents,
section 6.2, for Design, Installation, Operational and Performance Qualification of
equipment/systems.)

Although software vendors usually supply ‘validated’ software, this has only been validated in
their controlled environment on their systems, not in the intended laboratory environment
and system; therefore validation in the laboratory must still be performed. The extent of the
validation will vary depending on the intended use and on whether the system is configured to
meet the requirements of the organisation or user. It may only require installation and
operational qualification to be performed, to ensure that the system has been installed with all
the required services needed to operate it and that it operates as originally intended and
validated. Services can include a reliable supply of electricity, connection to the internet,
operating systems (e.g. MS Windows), etc.

Validation Plan
Once a computer system has been selected and defined, a validation plan must be developed.
The detail of this plan will depend on the function and complexity of the system to be
implemented. There should be an SOP describing the lifecycle of computerised systems, from
initial concept and need, through selection of systems and establishing user requirements,
which can then lead to system specifications, design and function, depending on whether a
commercial ‘off the shelf’ system is to be used as is, configured, or developed from scratch.
The validation plan may include, but is not limited to:

User requirements

System specifications

Vendor supplied system information

Acceptance testing

Operational documents

There will be various validation documentation required for the validation of the system,
including:

Pre-validation

Approved validation plan
Approved acceptance testing plan
System specifications
Test data sets
Test results
Error resolution / change control
Updated system documentation

Post-validation

Acceptance test review summary
Acceptance test report
Approval for use of the system

Once the validation plan and pre-validation documentation are in place, user acceptance
testing can commence.

User Acceptance Testing


The objective of acceptance testing is to ensure that the system is fit for purpose by testing
the computer system, hardware and software, in the end-user environment, by the user.
Testing should be performed by laboratory personnel who are familiar with the system being
tested. If the system has been purchased fully developed, the supplier will normally provide
training. The first step in the acceptance testing process is to ensure that the system conforms
to the system’s required specification and established user requirements. System
specifications should include, but are not limited to:

Outline of the system structure
Boundaries of the system
Definition of the functionality of the system
User environment information
Inputs and outputs
Definition of data
Regulatory requirements

The user acceptance testing plan must be defined and agreed to by the laboratory
management prior to user acceptance testing and should outline all the test descriptions for
the system including:

Criteria for acceptance
Expected results
Change control
Error resolution
Retesting requirements
Documentation of results
Approval and acceptance criteria

The acceptance testing plan should also include unplanned events and potential user
errors where possible. These can include power failures, disconnections, security breaches,
logon discrepancies, stress testing and destruction testing (intentionally attempting to cause the
system to fail).

Operational Use
Upon successful completion of user acceptance testing a report of the entire process must
be produced and should detail all the testing performed and the results obtained. This
report will then be reviewed by the laboratory management and quality assurance personnel
who will decide if the system is acceptable for operational use in the laboratory. There should be
a scheduled formal release of the system for operational use that will include:

Appointment of a System Manager

Training of laboratory personnel

System SOPs, including

End-user operation

Emergency procedures

Data preservation

Reporting archived data during system downtime

Maintenance and calibration

Documentation procedures

Change control procedures

There must be a formal documented request for a change to any hardware or software.

The need for the change must be assessed by the system manager and tested prior to
formal acceptance of the change and implementation.

There must be documentation explaining at what level a revalidation should occur.

Definitions of the raw/source data accepted by the system

It is of utmost importance to safeguard the privacy of participants, and therefore laboratory
management must ensure security measures are in place to protect the confidentiality of
laboratory data. Access to the computer system must be limited to authorised
laboratory personnel only, and policies must be in place to establish the level of
access for each individual. For instance, there may be a different level of access for
personnel entering participant data compared with personnel who review and edit the data.
Logon credentials and passwords must be unique to the individual user and not shared
between staff.
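Such tiered access is commonly implemented as a role-to-permission mapping that is checked on every request. A minimal sketch; the role names and permissions here are hypothetical and would be defined by laboratory policy:

```python
# Hypothetical role-to-permission mapping for a laboratory system.
PERMISSIONS = {
    "data_entry": {"enter_results"},
    "reviewer": {"enter_results", "review_results", "edit_results"},
    "system_manager": {"enter_results", "review_results", "edit_results",
                       "manage_accounts", "change_configuration"},
}

def is_authorised(role, action):
    """Return True only if the user's role grants the requested action;
    unknown roles get no access at all."""
    return action in PERMISSIONS.get(role, set())

print(is_authorised("data_entry", "edit_results"))  # expected: False
print(is_authorised("reviewer", "edit_results"))    # expected: True
```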

Maintenance and Data Storage


As computer systems are considered part of laboratory equipment, there should be both
routine and preventative maintenance procedures in place. All maintenance performed
must be documented and retained. As part of system maintenance, the laboratory must ensure
there are procedures in place for the preservation of electronic data (‘disaster recovery’)
during hardware failures, software failures and unexpected destructive events (e.g. fire, flood,
etc.). Such procedures should include periodic backing up and storing of data, restoring data
from backed-up media, and off-site storage of back-up data. The frequency, media used and
type of secure storage of data back-ups should be agreed with laboratory management
and documented in an SOP. Data storage media, such as tapes, disks, etc., must be properly
labelled, stored (onsite or offsite), and protected from damage and/or unauthorised use. The
process of backing up data must be tested periodically as part of ongoing system maintenance.
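One simple way to test that backed-up data can be restored intact is to compare checksums of the original and restored files. A minimal sketch; the function names are illustrative, not part of any GCLP requirement:

```python
import hashlib

def file_checksum(path):
    """SHA-256 checksum of a file, read in chunks to handle large files."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_backup(original_path, restored_path):
    """A restored backup should be byte-identical to the original."""
    return file_checksum(original_path) == file_checksum(restored_path)
```

A periodic maintenance task could restore a sample of backed-up files and assert `verify_backup` is true for each, documenting the result as evidence.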

The data storage capacity of the system should be sufficient to meet the needs of the
laboratory and the sponsor, and the stored participant results data and archival information
must be easily and readily retrievable. The computer system must be able to reproduce
archived test results completely, including the reference range originally given for the
specific test, and any flags, footnotes, or interpretive comments that were attached to the
specific result at the time of the original report. When multiple identical analysers are used for
reporting participant test results, there should be the ability to trace results back to the
original instrument on which the test was performed.

Audit Trail
Computerised systems that capture raw data should have the ability to create and retain an
audit trail for their entire operation. There are certain points in data handling where errors may
occur, e.g. during a manual transfer of participant data into a system, a user error on the
keyboard during data modification, or a transcription error. The audit trail ensures that any
identified errors and the subsequent changes made are documented. The system must
therefore provide a secure, computer-generated, time-stamped audit trail that records the date
and time of operators’ entries and actions that create, modify, verify or delete data. This will
effectively identify who, when, what, and why any changes to raw data have taken place, and
the reason for change should be automatically requested by the system itself. In addition, all
original records of testing, data capture, inputs, outputs and operation should always be
retained. Metadata should also be retained by the system, and any errors detected during
computer system usage must be documented along with the corrective action taken.
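An audit trail entry of this kind can be modelled as an immutable, time-stamped record of who changed what, when and why. A minimal sketch; the field names and in-memory list are illustrative, and a real system would enforce append-only, secure storage rather than a plain Python list:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass(frozen=True)
class AuditEntry:
    """One immutable audit trail record: who, what, when, and why."""
    user: str
    action: str       # e.g. "create", "modify", "verify", "delete"
    field_name: str
    old_value: str
    new_value: str
    reason: str       # requested from the operator by the system
    timestamp: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat())

audit_trail = []  # append-only by convention in this sketch

def record_change(user, action, field_name, old, new, reason):
    entry = AuditEntry(user, action, field_name, old, new, reason)
    audit_trail.append(entry)
    return entry

e = record_change("jsmith", "modify", "haemoglobin",
                  "13.2", "12.3", "transcription error corrected")
print(e.user, e.action, e.old_value, "->", e.new_value)
```

Because the dataclass is frozen, an entry cannot be altered after creation; a correction is made by appending a new entry, preserving the full history.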

Summary
Analytical methods can be manual or instrumental.

When selecting an analytical method the laboratory must first ensure it is of appropriate
design and suitable for its intended use, through verification.

Each analytical method should be appropriately validated to establish that it is
demonstrably fit for purpose.

Method development is the activity of obtaining knowledge and evidence to develop a
method that meets predefined criteria.

The method validation process follows a standard format:

Validation should take place in the laboratories where the work using this method
will take place.

A method validated in one location can be transferred to another location but you will need
to validate that the method works at the new location and the results obtained are
comparable between sites.

The key elements for validation of bioanalytical methods include:

Accuracy
Precision
Repeatability
Intermediate precision
Reproducibility
Sensitivity
Specificity
Selectivity
Analytical range
Linearity
Robustness

Computerised systems consist of hardware and software.

Computerised system validation is a documented process that demonstrates that a
computerised system is suitable for its intended purpose.

Validation of a computerised system includes:

User requirements
System specifications
Vendor supplied system information
Acceptance testing
Operational documents

Acceptance testing ensures that the system conforms to the system’s required specification
and established user requirements.

The laboratory management must ensure the computer system has sufficient security
measures in place to protect the confidentiality of participants’ data.

Final Quiz/Results
Please ensure you have answered all questions before clicking the ‘submit’ button

1. The process of validation determines if a method is demonstrably fit for purpose.

True
False

2. The information required for method development can be gathered through which of the
following: (please select all that apply)

Technical literature
Previous methods
Validation Plan
Previous experience
Validation report
Manufacturer’s information

3. A method validated in one location can be transferred to another location but you will need
to…

Complete a full revalidation of the method
Validate that the method works at the new location and the results obtained are
comparable between sites
Validate that the results at each site are exactly the same

4. For each of the elements included in the validation plan there must be acceptance criteria
that can be defined before and/or after testing.

True
False

5. In regard to the precision and accuracy, what do A and B reflect in the following figure?
(please select two answers)

A = Not accurate, precise
A = Accurate, not precise
B = Not accurate, precise
B = Accurate, not precise

6. Which of the following best describes specificity of an assay?

The extent to which an assay can determine particular analytes under given
conditions in mixtures or matrices, simple or complex, without interferences from
other components.
The ability of an assay to measure one particular analyte (organism or substance),
rather than others, in a sample. It expresses freedom from interference by any
element or compound other than the analyte.
The smallest amount of substance in a sample that can be accurately and precisely
measured by an assay and is the degree of response to a change in concentration of
an analyte being measured in an assay.

7. A method is deemed valid if it meets the acceptance criteria for the majority of the
elements listed in the validation plan?

True
False

8. The validation report should be retained indefinitely by the laboratory.

True
False

9. Computerised systems consist of: (please select all that apply)

Operators
Software
Hardware
Reagents

10. If software has already been validated by the vendors, it does not require validation in the
laboratory.

True
False

11. The validation plan may include: (please select all that apply)

User requirements
Participant data
System specifications
Acceptance testing
Operational documents

12. The objective of acceptance testing is to ensure that a computer system is fit for purpose
by testing the system in the end user environment, by the user.

True
False

13. The user acceptance testing plan should outline all the test descriptions for the system
including: (please select all that apply)

Criteria for acceptance
Expected results
Change control
Error resolution

14. Once validated, all laboratory personnel must be given full and equal access to the
computer system.

True
False

15. What is the purpose of an Audit Trail?

To identify errors and the laboratory personnel responsible
To ensure that identified errors and the subsequent changes made are fully
documented.
To prevent errors from occurring

