
Computerised System
Validation
Risks, Requirements, Tests and Traceability

Christopher Montgomery and the RQA Computing Committee


Published by Research Quality Association

3 Wherry Lane
Ipswich
Suffolk
IP4 1LG
UK

January 2013
ISBN 978-1-904610-20-5

Copyright © 2013 Research Quality Association Ltd


All rights reserved


CONTENTS
Foreword
Acknowledgments
1. Introduction
2. The Validation Process
3. The Regulations
4. Risk Management
4.1 Risk and Risk Management
4.2 Risk Mitigation
4.3 Risk Avoidance
4.4 Risk Acceptance
4.5 Risk Decision Tree
4.6 Risk Rating
5. Project Deliverables
5.1 Project Initiation
5.2 User Requirements Specification
5.3 Traceability Matrix
5.4 Risk Assessment
5.5 Roles and Responsibilities
5.6 Functional Specification
5.7 System Design Specification
5.8 Software Development and Coding
5.9 Implementation (Architecture)
5.10 Configuration and Final System Testing
5.11 Release into Production
5.12 System Maintenance
5.13 Decommissioning
6. Validation Process
6.1 Validation Assessment
6.2 Supplier Qualification and Audit
6.3 Qualification Plan
6.4 Validation Plan
6.5 Production Readiness Check
6.6 Planned Variances
6.7 Validation Summary
6.8 Change Control
7. Qualifications
7.1 Sequence of Qualification Activities
7.2 Retrospective Evaluation
7.3 Infrastructure Qualification
7.4 Supplier Qualification

7.5 Installation Qualification
7.6 Operational Qualification
7.7 Performance Qualification
7.8 User Acceptance Testing
7.9 Regression Testing
7.10 Test Scripts
7.11 Maintenance
8. Validation Strategy
8.1 Scope
8.2 Responsibilities and Authorities
8.3 System Architecture
8.4 System Interfaces
8.5 Validation Approach
8.6 Installation Qualification Strategy
8.7 Installation Qualification Documentation
8.8 Operational Qualification Strategy
8.9 User Acceptance Testing
8.10 Migration and Conversion Strategy
8.11 Go-Live Performance Qualification Strategy
8.12 Validation Approval Statement
8.13 Acceptance Criteria
8.14 Production Readiness Assessment
8.15 Planned Variance Document
8.16 Validation Summary Report and Project Summary Report
8.17 System Operation and Maintenance
8.18 Documentation
9. Appendix A – Example User Requirements Specification
10. Appendix B – Example Functional Design Specification
11. Appendix C – Example System Design Specification
12. Appendix D – Example Traceability Matrix
13. Appendix E – Example Test Script
14. Appendix F – References
15. Appendix G – Example Architecture Diagram
16. Appendix H – The Iterative or Prototyping CSV Model
17. Acronyms


Foreword
This booklet has been produced by the RQA Computing Committee to provide
guidance on current best practice and the application of risk management
in Computerised System Validation. The principles are applicable to all
regulated computerised systems, particularly those used within the
pharmaceutical industry.

Acknowledgments
Much of the information within this booklet has been developed concurrently
with the RQA courses ‘Computerised System Validation - Risks,
Requirements, Tests and Traceability’ and ‘Auditing Computerised
Systems.’ The RQA is indebted to the whole Computing Committee for their
commitment of time and effort in contributing the material and expertise
that has led to this publication.


1. Introduction
In 1987 FDA published a ‘Guideline on Process Validation’ that defined
validation as: ‘Establishing documented evidence that provides a high degree of
assurance that a specific process will consistently produce a product meeting its
predetermined specifications and quality attributes’.

This definition was later expanded to include computer systems used in the
development and production of pharmaceutical products and medical devices.
In ‘General Principles of Software Validation (2002)’, FDA stated that:

‘The system life cycle model that is selected should cover the software from
specification through to decommissioning. Activities in a typical system life cycle
model include the following:

Quality planning
System requirements definition
Detailed software requirements specification
Software design specification
Development and testing
Installation and configuration
Qualification and release
Operation and support
Maintenance
Decommissioning.

Verification, testing and other tasks that support software validation occur
during each of these activities. Software requirements should be evaluated to
verify that:

There are no internal inconsistencies among requirements
System requirements account for identified risks
Fault tolerance, safety and security requirements are complete and correct
Allocation of software functions is accurate and complete
Software requirements are appropriate for the system hazards
All requirements are expressed in terms that are measurable or
objectively verifiable’.

FDA has also stated that, ‘It is not possible to validate software without
predetermined and documented software requirements’.


2. The Validation Process


Computerised System Validation (CSV) is a project based activity
with defined objectives in terms of timescales and deliverables.
A validation methodology identifies the tasks and deliverables required
to verify through formal (documented) testing and reporting that a
computerised system has been designed, constructed and is capable
of being operated following procedures to ensure that it is suitable for
its intended purpose.

The process can be modelled in a number of ways to illustrate what is
required and the order of delivery. The common models are the time-based
linear version, also known as the ‘waterfall model’; the validation or
‘V’ model variant, which illustrates traceability through requirements and
functions to testing; and the cyclical or iterative variant, also known
as prototyping.

For the purposes of this guide, a linear model will be used to illustrate
a simple, step-by-step process, as illustrated in Figure 1.

Figure 1 – Project Phases and Deliverables

[Figure: the project phases run from Proposal, Requirements, Design,
Implementation, Configuration and Release through Go-Live to Operation.
The corresponding deliverables are the URS, FDS, SDS, Test Scripts,
Test Reports and the Validated System, supported by: Risk Analysis,
Validation Assessment, Supplier Qualification and Audit, Validation
Plan (VPL), Installation Qualification (IQ), Operational Qualification
(OQ) or System Testing, Performance Qualification (PQ) or UAT, Release
into Production (Sign Off), Process Deviations, Validation Summary (VSR),
and Traceability/Change Control.]


3. The Regulations
All of the following regulations require that computerised systems meet
their quality requirements through validation:

FDA 21 CFR Parts 11, 58 and 210
EU GMP Annex 11
ICH E6 GCP
The GLP regulations
EU pharmacovigilance regulations
Sarbanes-Oxley.

Pharmaceutical regulatory inspections typically include review of the
validation status of computerised systems that support drug research,
development or manufacturing processes. This may include examination
of the validation documentation including plans, specifications, traceability
matrices, test protocols, scripts and reports.

4. Risk Management
Risk management is a systematic process for the identification, assessment,
control and documentation of risks to patient safety, product quality, data
integrity and business continuity based on a framework described in ICH Q9
Quality Risk Management.

Project teams are required to undertake formal risk assessment for
the identification and evaluation of risks and their control and mitigation.
In addition, for Information System (IS) projects, the risk management process
must be integrated within the system life cycle, influencing the design of
computer hardware and software to minimise the likelihood of failure, and to
determine the level of validation required for the system through:

Identification of critical performance parameters
Selection of the essential functional requirements
Analysis of the architectural design
Code generation and review
Definition of test methods and the scope of testing
Assessment of security of electronic records and signatures
Change control, disaster recovery and business continuity.

The application of a risk management process enables effort to be focused on
critical aspects of the system life cycle in a controlled and justifiable manner.


Risk scope may be defined for ISs according to business function:

Corporate and financial
Enterprise resource planning
Facilities and equipment
Manufacturing execution
Laboratory analytical
Packaging and labelling
Quality control and assurance
Research and development.

Risk type may be classified according to impact or root cause:

Business and financial – affecting the financial viability of the business
Operational – process inefficiencies, procedural, automation or
managerial failures
Project – affecting the likelihood of success and benefits of a change
initiative
Regulatory and strategic – affecting direction and regulatory compliance
Technological – related to the introduction of new technologies.

Risk severity may be determined according to area of impact as follows:

Patient safety – potential impact on risk to patients
Product quality – potential impact on the quality of a drug product
Data integrity or accessibility – potential impact on the integrity or
availability of critical data
Business continuity – potential impact on the ability of the business to
operate following a system failure
Financial reporting – potential impact on financial regulations compliance
Personal identifiable information – potential compromise of data privacy
and intellectual property.

4.1 Risk and Risk Management


Risk is exposure to the possibility of a negative outcome. Risk management
is deliberate action to shift the odds in favour of a positive outcome.

4.2 Risk Mitigation


The process of reducing the potential impact of risks through prediction
and proactive strategy, such as regularly updating the definition files
used by anti-virus software.


4.3 Risk Avoidance


Avoidance of specific risks requires removal of the root cause of the potential
risk so that it cannot have any impact, e.g. deciding not to implement a new
technology to run a manufacturing process.

4.4 Risk Acceptance


Acceptance of specific risks requires assessment of likelihood and potential
impact, acknowledgement that a negative outcome is possible and
undertaking activity required to mitigate impact.

4.5 Risk Decision Tree


A series of questions used in the risk assessment process to guide the
assessor formally to a risk rating as shown in Figure 2.

Figure 2 – Risk Flow Chart

[Figure: a decision tree starting from the business requirements. Systems
with patient or product regulatory impact, or holding monitoring,
manufacturing or study data, receive a high-impact assessment;
non-regulated systems are assessed for business criticality. Business,
data and system criticality ratings (Low/High) combine into a risk
exposure of Low, Medium or High. Low risk calls for IS best practice
(IQ, disaster recovery, change control); medium risk adds formal testing;
high risk additionally requires risk-based UAT and a validation report.]
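The decision flow in Figure 2 can be sketched as a short function. This is a minimal illustration only: the question names, parameter values and outcomes below are assumptions, since the booklet does not prescribe an algorithm.

```python
def risk_exposure(regulatory_impact: bool,
                  business_criticality: str,
                  data_criticality: str,
                  system_criticality: str) -> str:
    """Illustrative walk through the Figure 2 questions, returning an
    overall risk exposure of 'Low', 'Medium' or 'High'. The specific
    questions and cut-offs here are examples, not a mandated scheme."""
    # Non-regulated systems of low business criticality need least rigour.
    if not regulatory_impact and business_criticality == "Low":
        return "Low"
    # Patient/product impact or highly critical data drives high exposure.
    if regulatory_impact or data_criticality == "High":
        return "High"
    # Otherwise system criticality decides between Medium and Low.
    return "Medium" if system_criticality == "High" else "Low"
```

A validation team would normally document each answer alongside the resulting rating, so the path through the tree is auditable.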


4.6 Risk Rating

An arithmetic value for a specific risk may be generated using an algorithm
to calculate risk as follows:

Risk is the combination of the seriousness of a threat (severity of harm)
with the likelihood, or probability, of occurrence plus the detectability
of the event, using ratings such as 1, 2 and 3 for Low, Medium and High.

Severity (1-3) x Probability (1-3) x Detectability (3-1) = Risk rating

The impact on the validation process of the risk rating is principally the
scope and extent of the system testing, more extensive testing being
required where risk is higher. Typically low-risk systems rely on developer
testing and history of use in the customer community, whilst high-risk
systems are extensively tested by the end-user up to the boundaries
of the system’s capability.

The approach chosen will also impact the level of documentation that is
required to mitigate the risks, e.g. more detailed specification and test
documentation will be required for high-risk systems.
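The rating formula above can be expressed directly in code. Note that detectability is scored inversely (3 for failures that are hard to detect, 1 for those easily detected); the banding thresholds in `risk_band` are an assumption added for illustration, not part of the formula.

```python
def risk_rating(severity: int, probability: int, detectability: int) -> int:
    """Severity and probability are scored 1 (Low) to 3 (High);
    detectability is scored inversely, 3 (hard to detect) down to 1."""
    for factor in (severity, probability, detectability):
        if factor not in (1, 2, 3):
            raise ValueError("each factor must be rated 1, 2 or 3")
    return severity * probability * detectability

def risk_band(rating: int) -> str:
    """Illustrative banding of the 1-27 rating; cut-offs are an assumption."""
    if rating <= 6:
        return "Low"
    return "Medium" if rating <= 12 else "High"
```

A severe, likely and hard-to-detect failure scores the maximum of 27 and would attract the most extensive end-user testing.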

Risk assessments may also be carried out at critical steps in the validation
process to determine the extent of validation as shown in Figure 3.

Figure 3 – Risk Assessment and the Validation Process


[Figure: risk assessments (RA) run alongside the project activities.
An RA of the user requirements specification determines whether validation
is required; if not, the process ends and the decision is documented.
If it is, the scope of validation is determined from the response to the
URS, an RA of the supplier assessment documents the scope of the CSV
approach, an RA of the functional specification updates the validation
plan, and an RA of the system tests drives development of the test plans,
leading to a validated system under change control. Risk assessment is
recommended at some steps and optional at others.]

5. Project Deliverables

5.1 Project Initiation


A project to develop a new or upgraded computerised system can be initiated
from either a strategic or business initiative or a performance or functional
need from the IS infrastructure or operational team to support the business
demand. It may therefore begin with a proposal from IS management, an
end-user department or a vendor.

5.2 User Requirements Specification


The user requirements specification (URS) details the required and desired
functions and features from the end-users’ perspective. The URS is supplied
either by the vendor based on their understanding of the end user needs, or
gathered from selected expert users through meetings and workshops.

Requirements should be ‘SMART’, an acronym meaning Specific, Measurable,
Achievable, Relevant and Time-constrained.

Specific – each requirement must be focused on a clear objective or
broken down further
Measurable – the achievement of the requirement must be measurable
by an objective test
Achievable – requirements must be non-contradictory and be realistic in
terms of the available functionality
Relevant – the requirements must be clearly related to a specific, agreed
business need
Time-constrained – it must be possible to meet all listed requirements
within the time-frame of the current implementation.

The URS must be reviewed and agreed with the end-user group.

See example in Appendix A.

5.3 Traceability Matrix


Each user requirement must be uniquely identified and be traceable through
a functional specification to the system testing as part of the validation
process, normally by using a traceability matrix that links requirement, function
and test identification codes in a table. The matrix is used prior to testing to
verify that every requirement has been addressed through a functional test,
or postponed to a later implementation.

See example in Appendix D.
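A traceability matrix of the kind described can be represented and checked programmatically. The ID formats below (URS-/FS-/TS- prefixes) are invented for illustration; real schemes vary by organisation.

```python
# Each row links a requirement to the function that implements it and
# the test that verifies it; None marks a missing or postponed link.
matrix = [
    ("URS-001", "FS-010", "TS-101"),
    ("URS-002", "FS-011", "TS-102"),
    ("URS-003", "FS-012", None),   # postponed to a later implementation
]

def unverified_requirements(rows):
    """Requirement IDs lacking either a linked function or a linked test,
    used prior to testing to confirm complete coverage."""
    return [req for req, func, test in rows if func is None or test is None]
```

Run before test execution, such a check flags every requirement that must either receive a test or be formally postponed.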


5.4 Risk Assessment


Once requirements have been established the regulatory, safety, financial
and business risks can be identified with a view to mitigation activities. This
should be done following a standard procedure and ideally using a template.
The outputs should be used to direct the validation approach.

5.5 Roles and Responsibilities


Roles and responsibilities should be agreed with the project manager early
in the project, e.g. Table 1.

Table 1 – Team Roles and Responsibilities

Responsibility – Role
Collation of requirements – Project Team
Creation of validation documentation – IS CSV
Technical review of documentation – IS
Approval of documentation – Business Users, Quality Assurance

5.6 Functional Specification


The functional specification (FS) is a technical description of the system
functionality created by the project team in a pre-development phase to
translate all notes, concepts and scope into a complete requirements
document. The functional specification must be clear, consistent, precise
and unambiguous. The document can include anything from text, diagrams,
flowcharts and screenshots to models. At a minimum, an FS will contain an
organised list of requirements that can be used for development testing, and
client sign-off. The FS holds all the critical information for the application
development project. The FS must go through an extensive review and
approval process once the requirements have been collated.

See example in Appendix B.


5.7 System Design Specification

The system design specification (SDS) is a description of the architecture
of the system based on a set of functional requirements that include system
performance criteria such as number and location of users, response
times, data manipulation, capacity and storage, required interfaces, internal
standards and regulatory compliance needs. The design specification should
include type and number of servers, operating systems and application
software including a complete list of vendors.

This document should detail how each requirement of the system is to be met
and is based upon the review and verification of functional requirements, any
constraints imposed upon the system and the project goals.

See example in Appendices C and G.

5.8 Software Development and Coding

Software creation is typically done by a specialist vendor as a bespoke
service, or as part of product development. Software development is a
multi-step process that begins with the approval of the user and/or
business requirements specifications. These are transcribed into technical
descriptions and incorporated into the FS taking into account constraints
such as operating system requirements, performance needs and internal
coding standards. Each functional requirement is broken down into a series
of single or unit operations, and then encoded using an appropriate language
and reviewed against standards and specifications. Lines of code are then
sequenced to form a unit that is tested with defined inputs against expected
outputs. Validated units are assembled into larger modules or ‘routines’
and the integration is validated with more testing. Ultimately the modules
are stored in a library that allows them to be called, linked and exercised by
the supervisory software that responds to requests from the user or other
software. Whatever development methodology is used, the end-user testing
must be traceable to specific requirements.
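The unit-testing step described above, exercising a unit with defined inputs against expected outputs, might look like the sketch below. The function under test is a made-up example, not taken from the booklet.

```python
import unittest

def dilution_factor(stock_conc: float, target_conc: float) -> float:
    """Hypothetical unit under test: factor by which to dilute a stock
    solution to reach a target concentration."""
    if stock_conc <= 0 or target_conc <= 0:
        raise ValueError("concentrations must be positive")
    if target_conc > stock_conc:
        raise ValueError("target cannot exceed stock concentration")
    return stock_conc / target_conc

class TestDilutionFactor(unittest.TestCase):
    def test_defined_input_gives_expected_output(self):
        # A defined input checked against a predetermined expected output.
        self.assertEqual(dilution_factor(100.0, 10.0), 10.0)

    def test_invalid_input_is_rejected(self):
        with self.assertRaises(ValueError):
            dilution_factor(10.0, 100.0)
```

Each such test documents a requirement-derived expectation, which is what makes the later traceability back to the URS possible.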

Agile software development methodologies are now becoming popular in
regulated environments as an alternative to the linear approach typified by
the waterfall method or V-model. Agile methods are more adaptive and less
plan-driven.


Examples of agile methods include eXtreme programming (XP), feature
driven development (FDD), test driven development (TDD), dynamic
systems development method (DSDM) and scrum. These do not mandate
a fixed set or sequence of activities nor even supporting artefacts such as
a user requirements specification document, design analysis or design
architecture.

It is this lack of fixed deliverables and defined task sequence that is of
major concern to regulated companies. The agile methods place the
principles of people and communications ahead of any defined process
and documentation. Therefore when an agile method is used in a regulated
environment, a number of controls and safeguards must be in place and an
understanding of risks must be included in the strategy.

The following controls should be considered:

The business domain expert must have an understanding of the regulatory
requirements and be a full-time resource for the agile based project,
rather than be a consultant only at the beginning and the end as in a typical
waterfall project.

The independent quality function must communicate the need to be included
in stand-up (short duration) progress meetings, to break down any ‘them and
us’ attitude that could arise.

All stakeholders must define and agree up-front the required documentation
to be produced in the modified agile based project, whether it is before,
during or post development iterations.

In summary, the agile methods can be used within the regulated
environment, although not straight ‘out-of-the-box’: a few additional
controls are needed. The benefits of agile methods can then be realised
and such software can be used in a compliant system.


5.9 Implementation (Architecture)

The system build phase is an assembly process during which the system
design is realised in the form of hardware and software, i.e. the server
and application components. This realisation process is documented in a
schematic and the build process recorded in the installation qualification (IQ)
in the form of protocols, scripts and a report.

See example in Appendix G.

5.10 Configuration and Final System Testing

The configuration phase begins when the base system is qualified with all server
and application components installed according to the manufacturer and/or
developer instructions. Configuration is the process of adapting the system
to the business workflow and requires that a system administrator selects
options or sets switches so that the application meets the specific end-user
requirements as far as is practicable. Configuration must be tested through UAT
or performance qualification (PQ) to ensure that critical end-user requirements
have been met.

5.11 Release into Production

Release into production is a business critical process that allows the
system to be operated in a realistic, operational environment with genuine
business data and the intended business workflow. There are associated
data migration and access granting processes that have pre-requisites that
include user training and approval from QA and management.

5.12 System Maintenance

The maintenance phase is also known as operational use and is
characterised by a handover from the development team to the support
staff. The validated status of the system is now maintained through a change
management process that uses formal impact assessment and risk-benefit
analysis to process requests for changes to the system. It is important
to ensure that a change requested by a minority group is not allowed
to compromise essential functionality used by the majority and that the
validation status is not compromised.


5.13 Decommissioning
When a system is no longer required by the business, a plan should be
developed and approval obtained for an orderly transition to an upgraded or
replacement system taking into consideration migration of, and on-going
access to, existing data. Furthermore a mechanism should be in place to
ensure that the new system can be adapted to evolving business requirements.

Figure 4 – Decommissioning Activities

[Figure: decommissioning moves data from the existing system to its
replacement: an audit trail image and a data image are archived as
snapshots, a data copy is migrated to and verified on the replacement
system, and the existing system is then retired.]

6. Validation Process

6.1 Validation Assessment


Validation assessment is a formal risk analysis process to determine
whether a computerised system falls under the scope of regulatory
inspections (e.g. by holding information that could impact product quality,
patient safety or financial results), or is business critical. Typically the
assessment comprises a decision tree, similar to that shown for risk in
section 4. Validation differs from basic acceptance testing (UAT) by virtue of
the formal documentation it generates and the requirement for independent
review of the process, normally by QA.

6.2 Supplier Qualification and Audit


Supplier or vendor qualification and audit are QA processes providing
documented evidence that a third party has a quality system in place which
complies with industry acceptable practices and follows that quality system
to provide quality assured products and services.

The qualification process is an assessment based on responses to enquiries
and/or a summary of previous experience, while an audit requires an on-site
visit with a review of the vendor’s documentation.


6.3 Qualification Plan


A qualification plan (QP) outlines the scope and purpose of activities required
to qualify, or commission, components of the IT network infrastructure
that connect to and support the operation of individual computerised
systems. The QP generally includes references to separate infrastructure,
installation and operational qualification protocols to define tasks required to
complete the qualification. The plan itself does not need to include details of
qualification testing (i.e. qualification test scripts) or system documentation,
but should reference these documents and require their completion.

6.4 Validation Plan


The validation plan describes the tasks and responsibilities associated with
the validation deliverables identified by the risk assessment process and is
applicable for all regulated or business-critical systems acquired, developed,
installed or maintained for use by the business. It should include:

The objectives of the validation
The system description, including a list of system components
and equipment
A schedule for completing all qualifications, explicitly defining the scope
of the validation and the technical rationale and supporting data for any
planned deviation from the applicable standard operating procedure (SOP)
Responsibilities and authorities including approvers
Validation acceptance criteria
A list of documentation to be included in the validation package
High level system requirements and specifications
A list of associated SOPs, e.g. system operation/maintenance, security,
backup recovery, archive, training and change control.


Figure 5 – Validation Deliverables

[Figure: the validation deliverables mapped against the project phases,
as in Figure 1: URS, FDS, SDS, test scripts and test reports leading to
the validated system, supported by risk analysis, validation assessment,
supplier qualification and audit, the validation plan (VPL), IQ, OQ or
system testing, PQ or UAT, release into production (sign off), process
deviations, the validation summary (VSR), and traceability/change control.]

Validation plans generally include reference to separate IQ/operational
qualification (OQ)/PQ/UAT protocols to define tasks required to complete
the validation. They do not need to include details of qualification testing
(i.e. IQ/OQ/PQ/UAT testing scripts) or system documentation, but should
reference these documents and require their completion. The validation plan
objectives must be formally reviewed and used when writing the validation
summary report in order to document that all elements of the plan have
been addressed.

6.5 Production Readiness Check


The purpose of the production readiness check is to provide a means to
evidence that a specific system has been installed and validated according
to the approved validation plan, user requirements, manufacturer
specifications and quality objectives; to confirm that the validation of the
system was conducted according to the validation plan; and to conclude on
the readiness of the system for release into the production environment.

The production readiness assessment is intended to permit the system to be
released into general use prior to completion of all the validation documents
required by the validation plan.


6.6 Planned Variances


Planned variances, or deviations from a prescribed model or methodology,
may occur following a review of project issues and approval of the deviation.
The variances are summarised in a planned variance document. Unplanned
deviations must be documented in the validation summary.

6.7 Validation Summary


The validation summary reports on the success or otherwise of the activities
anticipated in the validation plan in order to document that all elements have
been addressed. Deviations, discrepancies and/or failures must be explained
fully, accepted or rejected and a corrective and preventive action (CAPA) plan
approved to resolve any open issues. Finally the validation summary must
report on the validation status of the system and identify any exceptions or
limitations on its use.

6.8 Change Control

Change control is the process of managing change in order to minimise the
impact on data integrity and system availability. The normal sequence is
change request, impact assessment, approval (or rejection), implementation
in a test environment, then release into production, as shown in Figure 6.

Figure 6 – Change Control Process

[Figure: the change-control cycle runs from change request through impact
assessment, approval, coding and testing to operational use.]
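The change-control sequence described above can be sketched as a simple state machine. The state names follow the text; the exact transition table is an assumption for illustration, since organisations define their own workflow.

```python
# Allowed transitions for a change request; terminal states map to [].
TRANSITIONS = {
    "requested":       ["impact_assessed"],
    "impact_assessed": ["approved", "rejected"],
    "approved":        ["coded"],
    "coded":           ["tested"],
    "tested":          ["in_production"],
    "rejected":        [],
    "in_production":   [],
}

def advance(state: str, new_state: str) -> str:
    """Move a change request to new_state, enforcing the defined sequence
    so that, for example, coding cannot begin before approval."""
    if new_state not in TRANSITIONS[state]:
        raise ValueError(f"cannot go from {state} to {new_state}")
    return new_state
```

Encoding the workflow this way makes it impossible for a tracking tool to record an out-of-sequence change, which supports the audit trail.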


7. Qualifications
The remaining qualification deliverables map to the project deliverables as
shown in Figure 7, to ensure traceability of all components. Traceability is
captured in a matrix that maps requirements to functions and qualifications
or tests to ensure that no requirement is unmet.

Figure 7 – Qualification Activities

[Figure: the V-model mapping. On the left, the project descends from
Proposal through Requirements Definition (URS), Functional Design (FDS),
System Design (SDS) and Unit/Module Design to Code and Test. On the right,
qualifications ascend from Design Qualification through Installation
Qualification (qualified components), Operational Qualification (qualified
functions) and System Testing to Acceptance Testing and Performance
Qualification (accepted, qualified system), with traceability linking
each level.]

7.1 Sequence of Qualification Activities


The sequence of the qualification activities is defined in Table 2.

Table 2 – Qualification Sequence

Qualification Activity – Prerequisite
Risk analysis – Project approved
Vendor assessment – Risk analysis completed
Validation plan – Vendor(s) approved
Purchase and/or development of system – Hardware and software available
Installation qualification – Validation plan approved
Operational qualification (or UAT) – Installation qualification report approved
Performance ‘go-live’ qualification – Operational qualification (or UAT) approved
Production readiness assessment – Go-live qualification report approved
Variance document – Production readiness assessment approved
Validation report – SOP variance document approved
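Table 2 amounts to a chain of prerequisites, so a planned schedule can be checked mechanically. The sketch below abbreviates the activity names and omits the purchase/development step for brevity; both are presentation choices, not part of the table.

```python
# Each activity maps to the activity whose approval it depends on.
PREREQS = {
    "risk_analysis": None,
    "vendor_assessment": "risk_analysis",
    "validation_plan": "vendor_assessment",
    "installation_qualification": "validation_plan",
    "operational_qualification": "installation_qualification",
    "performance_qualification": "operational_qualification",
    "production_readiness": "performance_qualification",
    "variance_document": "production_readiness",
    "validation_report": "variance_document",
}

def order_is_valid(order):
    """True if every activity in the proposed order appears after the
    activity it depends on."""
    position = {activity: i for i, activity in enumerate(order)}
    return all(pre is None or position[pre] < position[act]
               for act, pre in PREREQS.items())
```

Such a check is trivial here, but the same idea scales to project plans where many qualification streams run in parallel.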


7.2 Retrospective Evaluation


Retrospective evaluation, also known as validation of legacy (pre-existing)
systems, may be required for a change of use of the system or simply
because a review indicated that for compliance or business needs it is
necessary to demonstrate suitability for purpose.

The most effective strategy is to seek out as much information as possible
from users and the supplier to build up a portfolio of information. A gap
analysis will indicate whether new system testing is required to demonstrate
that the software works as intended. If this cannot be shown then the system
should be replaced.
Figure 8 – Retrospective Evaluation

(Flowchart: risk analysis first asks whether the system should be maintained; if not, it is retired and replaced. If it is kept and documentation exists, a test coverage analysis follows; if documentation is lacking, a new URS is developed. Where further testing is needed, a validation plan and qualifications are produced; otherwise a validation summary report completes the evaluation.)

7.3 Infrastructure Qualification

Infrastructure qualification establishes documentary evidence that all
hardware, platforms and components are installed and built as specified in the
qualification plan and system design specification (if applicable), and that each
component or sub-system is installed and configured as per manufacturer or
architect instructions. During the system testing phase, system installation
and configuration in the pre-production environment is verified by following
procedures described in the infrastructure qualification protocol.

7.4 Supplier Qualification


Supplier or vendor qualification and audit are QA processes providing
documented evidence that a third party has a quality system in place which
complies with industry acceptable practices and follows that quality system
to provide quality assured products and services. The qualification process
is an assessment based on responses to enquiries and/or a summary of
previous experience, while an audit requires an on-site visit with a review of
the vendor’s documentation.

7.5 Installation Qualification


Installation qualification (IQ) execution establishes documentary evidence that
the system’s architecture has been installed as specified in the validation plan
and system design specification (if applicable) in its intended environment
according to specification, that all components are present and in the
proper condition, and that each component or subsystem was installed per
manufacturer or developer instructions.

7.6 Operational Qualification


Operational qualification (OQ) testing establishes documentary evidence
that each unit or subsystem functions as intended throughout its anticipated
operating range.

7.7 Performance Qualification


PQ testing establishes documentary evidence that the integrated system
performs according to critical user requirements, procedures and processes
in its intended operating environment.

7.8 User Acceptance Testing


User Acceptance Testing (UAT) may be executed in lieu of end-user OQ
and PQ when a software vendor is able to demonstrate to the end-user’s
satisfaction that OQ testing has been completed and documented to an
adequate standard.

UAT establishes documentary evidence that each end-user requirement can
be met by a system function, or functions, as intended in a representative
operational environment and with realistic inputs.

During the UAT, the system is installed in a test environment following procedures
described in the test protocol that also defines how the project team will execute
the system tests. The UAT scripts are listed within the UAT protocol and describe


specific tests that will be conducted to determine whether the system performs in
accordance with the operations defined in the functional specification, which can
be mapped to critical end-user requirements.

The UAT protocol must be written by the project team and approved by QA
prior to initiation of testing. The UAT results are produced by execution of
the UAT protocol. Where UAT identifies functional or software errors these
must be either accepted or corrected according to a formal discrepancy
review procedure. Following any required corrective actions, the UAT report
documents that the system has performed the expected functions to fully
meet the URS.

7.9 Regression Testing


Regression testing may be required following code changes to ensure that
there has been no negative impact on previously tested functionality. The
selective re-testing of software that has been modified to fix bugs must
ensure that no other previously working functions have failed as a result
of changes and that newly added features have not created problems with
related functions.
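In practice this amounts to re-running a recorded baseline after each change and flagging any previously passing case that no longer passes. A minimal sketch (case names and outcomes are invented for illustration):

```python
# Regression check: compare current outcomes against the pre-change baseline.

def regressions(baseline, current):
    """Cases that passed before the change but do not pass now (or have
    disappeared from the current run altogether)."""
    return sorted(
        case for case, result in baseline.items()
        if result == "pass" and current.get(case) != "pass"
    )

baseline     = {"login": "pass", "export_report": "pass", "audit_trail": "pass"}
after_change = {"login": "pass", "export_report": "fail", "audit_trail": "pass"}

print(regressions(baseline, after_change))  # ['export_report']
```

Any case returned is a candidate regression and must go through the discrepancy review process before release.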

7.10 Test Scripts


Test scripts are the underpinning evidence that the qualification steps have
been executed successfully, that test errors (also described as discrepancies
or failures) have been resolved through acceptance or remediation, and that
validation of the system is possible. Test scripts for all qualifications and for UAT
must include in writing a description of each test, an anticipated or expected
outcome, a record of the actual outcome and a conclusion on the pass/fail status
of the test. The tests must be indexed in the associated test protocol.

See example in Appendix E.
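The four mandatory elements of a test script record map naturally onto a structured type, with the pass/fail conclusion derived by comparing the expected and actual outcomes. A sketch (the field names are illustrative, not a prescribed format):

```python
from dataclasses import dataclass

@dataclass
class TestScriptStep:
    """One indexed test: description, expected outcome, actual outcome,
    and a conclusion derived from comparing the two."""
    test_id: str
    description: str
    expected: str
    actual: str

    @property
    def conclusion(self) -> str:
        return "pass" if self.actual == self.expected else "fail"

step = TestScriptStep(
    test_id="IQ-001",
    description="Verify database engine version on the primary server",
    expected="v12.3",
    actual="v12.3",
)
print(step.conclusion)  # pass
```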

7.11 Maintenance
Maintenance is an on-going process that can compromise the validated state
of the system. Re-validation may be required at regular intervals due to
accumulating change implementations, bug-fixes and minor upgrades. The
process is known as periodic review and comprises a formal assessment
of the change records and issue resolution log. Review of corrective actions
such as procedure changes and bug-fixes, and user surveys, may be required
to gather the information. In the system life cycle the processes are as shown
in Figure 9.


Figure 9 – Maintenance Activities

(Cycle diagram: the qualified system in operational use generates change requests; approved changes pass through code and review, regression testing and version upgrade release, while periodic review may trigger re-validation before the system returns to operational use.)

8. Validation Strategy
8.1 Scope
The scope of validation for a specific application should be defined in the
validation plan, e.g.

• The application software
• Server hardware
• Server software
• Database server hardware
• Database engine and specific database instances
• Data conversion and transmission systems.

Out of scope components should also be declared, e.g.

• Unused modules and functionality
• Interfacing systems
• Client supporting software
• Network infrastructure (hardware and drivers shared by applications).


8.2 Responsibilities and Authorities


Responsibility for each element of the project and validation documentation
is indicated in Table 3.

Table 3 – Team Documentation Responsibilities

Responsibility Matrix (columns: IS System Manager, Business System Owner, IS CSV, IS QA)

Validation Plan AP AP W AU

Risk Analysis and Validation AP AP W AU

User Requirements AP AP W AU

Functional Design AP AP W AU

System Architecture AP W AU

System Configuration AP AP W AU

Requirements Tracing Matrix AP AP W AU

IQ Protocol AP W E

IQ Report AP W AU

OQ Protocol AP AP W E

OQ Report AP W AU

PQ Protocol AP E W AU

PQ Report AP AP W AU

Variance Document AP AP W AU

Validation Report AP AP W AU

AP: Approve / AU: Authorise / E: Execute / R: Review / W: Author

The project manager should review all documents listed above.

System-specific SOPs will be the responsibility of the business units to
develop and approve, and may therefore be out of the validation scope for
the project. However, defined business processes need to be in place for the
system to go live.

8.3 System Architecture


The system architecture and high-level functions (e.g. purchase order
creation, transmission and tracking) should be fully described in the
validation plan with any history relating to existing applications, previous
versions or predecessors.


The architecture design features should be illustrated in a high-level
diagram showing the main server components. The system architecture
should be fully described in the system design specification documentation.

8.4 System Interfaces


The system may or may not be designed to interface with other computerised
systems and this must be stated explicitly in the system design specification
document. Even if there is no specific interface there may well be a requirement
to import files of various types from other systems, e.g. Laboratory or clinical
databases, office applications, etc., and export summary information and
submissions via electronic gateways or extranets.

8.5 Validation Approach

The general approach will be to establish through IQ, OQ and PQ (or IQ,
system testing and UAT) that the functionality and performance meets or
exceeds that available with the existing system configuration, and can meet
all requirements as documented in the URS.

8.6 Installation Qualification Strategy


An installation qualification (IQ) should be conducted to provide documented
evidence that the system and its components are installed according to the
specifications and current server standards. The IQ will be limited to the
system components identified in the system architecture diagram in the
system design specification.

Enterprise systems often reside on two subsystems in separate locations,
with the configuration designed to include primary and secondary (backup)
critical servers for database, application and remote desktop functions. The
latter server type may not need to be duplicated if there is redundancy for
load balancing. The IQ documentation will therefore comprise virtually
identical server-specific documents for multiple sites, but these will be
executed by staff local to each one. The IQ is normally executed in the
acceptance (validation) environment.

8.7 Installation Qualification Documentation


IQ and OQ will be executed in an acceptance environment with access limited
to the IT project team and a small number of end-users. IQ documents may
be generated as separate documents such as protocol, test scripts and test
reports using templates. Separate test scripts should be generated for each
server, testing each component listed in the server software component table.


Each server should be qualified using a bundled set of hardware and
software test scripts, linked by a testing matrix in the IQ protocol. The server
qualifications will be summarised in the IQ report.

8.8 Operational Qualification Strategy


The OQ will be used to establish that the system and its components operate
according to the specifications described in the functional specification. The
OQ may be designated as system testing, but the objective is similar.

Initially a ‘sandbox’ environment should be built to allow evaluation of
functionality against the user requirements. Once a vendor is chosen, the
sandbox environment will be rebuilt and used as the validation environment
for OQ testing.

The OQ protocol should include a test coverage analysis to take into account
any functional testing that has already been performed by the developer.
The OQ will therefore verify that all functionality identified in the FS operates
correctly, and additionally will include any regression testing if the system
is an upgrade. The OQ may pre-execute some components of the PQ testing
depending on the final acceptance testing required by the end-users.
The business process should be simulated during this PQ with the aim of
demonstrating that the critical functions are operating correctly.

The OQ will normally be executed in the acceptance (validation) environment.

8.9 User Acceptance Testing


User Acceptance Testing (UAT) may be executed in lieu of end-user OQ and
PQ when a software vendor is able to demonstrate to the end-user’s
satisfaction that OQ testing has been completed and documented to an
adequate standard.

8.10 Migration and Conversion Strategy


Where a system is being upgraded or replaced, if the existing data format is
compatible with the new version, no data conversion will be required. However,
a QC check should be carried out to ensure that files have been correctly
migrated. Any changes to data format or folder structure in line with business
requirements will require verification that the database has not been corrupted
with orphaned data.
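One simple form of this QC check is to compare checksums of each file before and after migration, using only the Python standard library. The paths and pairing below are illustrative; in practice the file list would come from the migration manifest:

```python
import hashlib

def sha256_of(path):
    """SHA-256 of a file, read in chunks so large files fit in memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as fh:
        for chunk in iter(lambda: fh.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

def verify_migration(pairs):
    """Given (source, migrated) path pairs, return the sources whose
    migrated copy does not match byte-for-byte."""
    return [src for src, dst in pairs if sha256_of(src) != sha256_of(dst)]
```

An empty result means every checked file survived the move intact; any listed source warrants investigation before cut-over.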

Initially, the upgraded system will be operated in a development environment
with access limited to the IT project team, while the architecture is evaluated
and initial performance assessments are completed.

Cut-over will normally occur over a weekend and no users should be allowed
to access the system during this period.


8.11 Go-Live Performance Qualification Strategy


A pre-production qualification should be executed in the acceptance
environment as part of the ‘go-live’ and/or PQ, with access granted to key
users of the system to exercise the system with valid data. Once validation is
complete, the wider user community is given access to the production system.

The acceptance environment is essentially a model or replica of the
production environment, providing documented evidence through
appropriate testing that the system meets the business requirements
identified as critical in the URS, in a consistent way.

The go-live PQ consists of a protocol and a final report written by the
validation team.

The content and execution of the go-live PQ should be agreed with the
project team. The business process will be simulated during the go-live/PQ
with the aim of demonstrating the critical system functions as agreed with
the end-users.

8.12 Validation Approval Statement


Once the go-live PQ is complete and assuming all tests have passed without
major or critical exceptions, a validation approval statement should be
issued to confirm the system is fit for use in the production environment by
all users. As all the data will have been migrated into the new system and
the previous system is no longer available for use, the release for production
need not be delayed whilst the results of the final testing are written,
reviewed and approved. As a result, the necessary business processes will
need to be in place prior to executing the go-live/PQ.

8.13 Acceptance Criteria


In order for the validation to be considered successful and complete, all
qualifications must be completed, all required documentation must be
present, accurate and complete and test results must be as expected.
Any deviations encountered must be satisfactorily resolved.

The validation process should be approved according to the acceptance
criteria described in Table 4.


Table 4 – Validation Process Approval

Validation Process Acceptance Criteria

The system is accepted when:
• All qualification activities included in the ‘Validation Plan’ are performed and declared accepted.
The I/O/P Qualification is accepted when:
• The I/O/P Qualification activities are completed and no residual corrective actions are required.
The I/O/P Qualification is conditionally accepted when:
• The I/O/P Qualification activities are completed; corrective actions remain to be completed, but no issues prevent system use or remain unsolved without an obvious solution, and
• An I/O/P Qualification report summarising execution is written and approved.

8.14 Production Readiness Assessment


The production readiness assessment will consist of documented confirmation
that all qualifications and acceptance tests have been executed, reviewed and
approved, that the system architecture is stable and as specified in the SDS,
and that all global users have been granted access to the system.

8.15 Planned Variance Document


A variance document will be included with the validation documentation to
record any deviations from the validation procedure and assess the impact of
the deviations on the validation status of the system.

8.16 Validation Summary Report and Project Summary Report


Depending on allocation of responsibilities, a combined report may be written
concluding on both the project and the validation activities, or separate
validation summary report (VSR) and project summary report (PSR) documents
may be produced. The VSR will summarise the results of the validation
activities and give a status for each acceptance criterion described in the
validation plan. The VSR will take account of any issues noted in the SOP
variance document.

The VSR concludes on the validation status of the system. If the VSR classifies
the system as ‘not accepted’, the IS system manager is responsible for updating
the VSR when the status can be changed to ‘accepted’.

The VSR will also address compliance with the validation plan, and concludes
the validation activities for the project.


8.17 System Operation and Maintenance


In order to keep the system under control and validated, the following
procedures must be followed:
• Changes to the system will be managed through a change control procedure
• Maintenance and use will be performed according to the change control procedure
• Incidents on the system will be managed through the incident management procedure
• User access will be managed according to the system procedures
• Control of the validation status will be performed according to the validation procedure.

8.18 Documentation

Table 5 indicates the documentation which should be included in the system
validation documentation set.

Table 5 – Project Documentation

Title
Risk analysis report
Validation assessment
Project proposal, plan, roles and responsibilities
Validation plan
Vendor assessment reports
User requirements specification
User requirements matrix
Functional requirements specification
System design specification
IQ protocol and report
OQ protocol and report, or user acceptance testing
PQ/go-live protocol and report
Production readiness assessment
SOP variances and deviations
Validation summary
Project summary
Change control log or records


The following SOPs in Table 6 are required for the validation, use and
maintenance of the system:

Table 6 – System Procedures

SOP Title
Information systems security
Project QA or validation process
Procedure variances and resolution
Risk (and validation) assessment
Vendor qualification
Validation (and project) plan
User requirements specification
Functional requirements/design specification
System design specification
Installation qualification protocol, results and report
Operational qualification protocol, results and report
Performance qualification protocol, results and report
User acceptance testing protocol, results and report
Production readiness assessment
Validation summary
Training procedure
User access and administration procedure
User application permissions
Change control procedures

Backup, recovery and archive

Disaster recovery procedure

System administration and maintenance


9. Appendix A – Example User Requirements Specification


1. Document Management
1.00 The system should allow defining a master list of documents.

1.10 The system should allow defining documents related to different stages of the study.

1.20 The system should allow defining documents related to regulatory authorities, labs,
sites, investigators etc.

1.30 The system should allow defining regulatory documents required to be tracked
during the study execution.

1.40 The system should allow defining non-regulatory (contractual, legal) documents
required to be tracked during the study execution.

1.50 The system should allow choosing mandatory documents for all studies for sites,
subjects, IRB/EC, labs, etc.

1.60 The system should allow defining custom fields for all the documents defined.

1.70 The system should allow defining notifications based on status changes.

2. Document Association
2.10 The system should list all the documents defined as ‘mandatory’ that would be
tracked for a study/protocol, including documents related to study, site, IRB, lab,
personnel, investigators, vendors etc.

2.20 The system should allow adding to and/or deleting from the list of documents for a
study/protocol, including documents related to study, site, IRB, lab, personnel,
investigators, vendors etc.

2.30 The system should automatically associate documents when a study/protocol is added.

2.40 The system should automatically associate documents when a country is added to
the study.

2.50 The system should automatically associate documents when a site is added to the
study.

2.60 The system should automatically associate documents when an investigator is added
to the study.

2.70 The system should automatically associate documents when a lab is added to the
study or site.

2.80 The system should automatically associate documents when a regulatory board is
added to the study.

3. Document Tracking and Attaching files

3.10 The system should send notifications based on status changes and other definitions.

3.20 The system should allow attaching files to all the documents being tracked.
Specifically, the following file types can be attached: *.txt (Text), *.doc (Microsoft Word),
*.pdf (Acrobat), *.xls (Microsoft Excel), *.rtf (Rich Text Format), *.mpp (Microsoft Project),
*.htm (Hypertext Markup), *.log (Log files), *.ppt (Microsoft PowerPoint), *.jpg (Joint
Photographic Experts Group).


10. Appendix B – Example Functional Design Specification

Functional Design Specification for a Content Management System

Records Repository

Helps ensure the integrity of the files stored in the repository, and supports
information management policies that consistently and uniformly enforce
auditing and expiration of records.

Email Content as Records

Provides consistent, policy-based solutions for managing e-mail content
across Outlook 2007, Exchange and STS.

Legal Holds

Makes it possible for records to be searched and placed on hold during
litigation discovery to override the retention schedule of the records.

High Fidelity Websites with Consistent Branding

Provides the concept of master pages and page layouts to enforce the
branding and navigation of websites. CSS support gives pixel-level control
over the look and feel of these sites.

Navigation Controls

Out-of-the-box navigation controls that can be easily customised by
end-users.

Content Authoring

Provides the ability for information workers to create content-rich
webpages using a web browser.

Content Publishing and Deployment

Built in approval workflow allows web content to be sent for approval prior
to publishing. Content deployment to production sites can be scheduled by
setting up jobs and a ‘live’ time period for each page can be specified within
which that page is viewable.


Site Templates

Includes support for several new enterprise site templates:

• The enterprise portal template provides a means for a business unit to create and share content that is relevant to the ongoing operation of an enterprise, division, or business unit
• The corporate internet presence site template includes tools and workflows to create and manage web content for products and service descriptions, company news, and public filings, among other things
• The application portal template brings together all of the tools and information related to a particular line-of-business application
• The roll-up portal template consolidates data and content from several applications or locations and presents it in an integrated format.

Page Layouts

Page layouts simplify content authoring and publishing: site administrators
define a structure that guides authors through the publishing process, so
content contributors focus on doing their jobs rather than on the details of
publishing and deployment. Flexible page layouts also allow designers to mix
and match ASP.NET applications, web parts and authoring templates in any
configuration to create customised sites to meet specific business needs.

Site Variations

A new feature of STS: sites can be linked together in a parent-child type
of relationship, providing a one-way orchestration framework for web
content. This feature allows organisations to deploy multi-lingual publishing
sites in a much more structured and manageable environment.

WYSIWYG Web Content Editor

Extends the SharePoint user interface with additional commands and status
indicators for in-context webpage authoring.

Slide Libraries

The repository features in Windows SharePoint Services 3.0 provide the
platform support for slide libraries, a feature of STS. Slide libraries enable
the storage of individual slides in a SharePoint site. PowerPoint slide decks
can be automatically created from a selection of slides in a slide library.


Policies, Auditing and Compliance

Repositories in STS support the following policy, auditing and compliance
features:
• Document retention and expiration policies
• Highly customisable policies
• Workflow process to define expiration
• Access control and security
• IRM policies applied on download to secure the functional access to documents
• Logging of all actions on sites, content and workflows
• Official document-of-record repositories
• Site for storing or archiving enterprise approved content types.

11. Appendix C – Example System Design Specification

Assumptions and Dependencies
• Related software or hardware
• Operating systems
• End-user characteristics
• Possible and/or probable changes in functionality.

General Constraints

Describe any global limitations or constraints that have a significant impact
on the design of the system’s software (and describe the associated impact).
Such constraints may be imposed by any of the following:
• Hardware or software environment
• End-user environment
• Availability or volatility of resources
• Standards compliance
• Interoperability requirements
• Interface/protocol requirements
• Data repository and distribution requirements
• Security requirements (or other such regulations)
• Memory and other capacity limitations
• Performance requirements
• Network communications
• Verification and validation requirements (testing)
• Other means of addressing quality goals
• Other requirements described in the requirements specification.

Goals and Guidelines

Describe any goals, guidelines, principles, or priorities which dominate or
embody the design of the system’s software. Such goals might be:
• The KISS principle (‘Keep it simple, stupid!’)
• Emphasis on speed versus memory use
• Working, looking, or ‘feeling’ like an existing product.

For each such goal or guideline, unless it is implicitly obvious, describe the
reason for its desirability.

Development Methods

Briefly describe the method or approach used for this software design. If one
or more formal methods were adopted or adapted, then include a reference
to a more detailed description of these methods. If several methods were
seriously considered, then each method should be mentioned, along with a
brief explanation of why all or part of it was used.

Architectural Strategies

Describe any design decisions and/or strategies that affect the overall
organisation of the system and its higher-level structures. These strategies
should provide insight into the key abstractions and mechanisms used
in the system architecture. Describe the reasoning employed for each
decision and/or strategy (possibly referring to previously stated design goals
and principles) and how any design goals or priorities were balanced or
traded-off. Such decisions might concern (but are not limited to) things like
the following:
• Use of a particular type of product (programming language, database, library, etc.)
• Reuse of existing software components to implement various features of the system
• Plans for extending or enhancing the software
• User interface paradigms (or system input and output models)
• Hardware and/or software interface paradigms
• Error detection and recovery
• Memory management policies
• External databases and/or data storage management and persistence
• Distributed data or control over a network
• Generalised approaches to control
• Concurrency and synchronisation
• Communication mechanisms
• Management of other resources.

Each significant strategy employed should probably be discussed in its own
subsection, or (if it is complex enough) in a separate design document. Make
sure that when describing a design decision you also discuss any other
significant alternatives that were considered, and your reasons for rejecting
them. Employ a standard format for describing a strategy.

System Architecture

This section should provide a high-level overview of how the functionality
and responsibilities of the system were partitioned and then assigned
to subsystems or components. Don’t go into too much detail about the
individual components themselves. The main purpose here is to gain a
general understanding of how and why the system was decomposed, and
how the individual parts work together to provide the desired functionality.

At the top-most level, describe the major responsibilities that the software
must undertake and the various roles that the system (or portions of
the system) must play. Describe how the system was broken down into
its components/subsystems (identifying each top-level component/
subsystem and the roles/responsibilities assigned to it). Describe how the
higher-level components collaborate with each other in order to achieve
the required results. Provide some sort of rationale for choosing this
particular decomposition of the system (perhaps discussing other proposed
decompositions and why they were rejected). Feel free to make use of design
patterns, either in describing parts of the architecture, or for referring to
elements of the architecture that employ them.

If there are any diagrams, models, flowcharts, documented scenarios
or use-cases of the system behaviour and/or structure, they may be
included (unless you feel they are complex enough to merit being placed
in the detailed system design section). Diagrams that describe a particular
component or subsystem should be included within the particular
subsection that describes that component or subsystem.


12. Appendix D – Example Traceability Matrix


URS ID No. | Requirement Description: Category = Reports | Test ID No.
1.1 | Protocol – cross protocol activity tracking report | REP_PRO_001
1.2 | Study – milestone tracking report | REP_SMT_002
1.3 | Site – subject log report | REP_SSL_003
1.4 | Site – site monitor visit schedule report | REP_SSM_004
1.5 | Site – planned site visits by protocol report | REP_PSV_005
1.6 | Site – list of sites by protocol report | REP_SBP_006
1.7 | Site – subject CRF tracking report | REP_SCR_007
1.8 | Site – planned CRA site visits report | REP_PCR_008
1.9 | Subject – CRF tracking report | REP_CRT_009
1.10 | Subject – subject visits report | REP_SSV_010
1.11 | Subject – adverse event – excel report | REP_AEE_011
1.12 | Subject – CRFs collected report | REP_CFC_012
1.13 | Subject – detailed subject list report | REP_SDS_013
1.14 | Subject – subject summary report | REP_SSS_014
1.15 | Audit trail report | REP_ATR_015
1.16 | SAE status report | REP_SAE_016
1.17 | DCFs outstanding report | REP_DCF_017
1.18 | Payments for a protocol report | REP_PFP_018
1.19 | Scheduled payments for a protocol report | REP_SPP_019
1.20 | Protocol deviations report | REP_PDR_020
1.21 | Visit report status summary report | REP_RSS_021
1.22 | Visit report status report | REP_VRS_022
1.23 | Project review form report | REP_PRF_023
1.24 | Site monitor visit report | REP_SMV_024
1.25 | Task assignment report | REP_TAR_025
1.26 | List of study contacts report | REP_LSC_027
1.27 | List of protocol and personnel per country report | REP_PPC_028
1.28 | Planned subject visits report | REP_PSV_029
1.29 | Milestone report | REP_MSR_030
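A traceability matrix like the one above lends itself to simple automated checks, for example that every requirement is linked to a test and that no test ID is reused. The sketch below is illustrative only (the `matrix` dictionary abridges a few URS/test pairs from the example above; the function name and report format are assumptions, not part of the guide):

```python
# Hypothetical sketch: checking a traceability matrix for coverage and
# duplicate test IDs. The URS/test pairs are abridged from the example
# matrix above; names and output format are illustrative.
matrix = {
    "1.1": "REP_PRO_001",   # Protocol - cross protocol activity tracking report
    "1.2": "REP_SMT_002",   # Study - milestone tracking report
    "1.15": "REP_ATR_015",  # Audit trail report
    "1.29": "REP_MSR_030",  # Milestone report
}

def check_traceability(matrix):
    """Return a list of problems: untested requirements or reused test IDs."""
    problems = []
    seen = {}  # test ID -> first URS ID that claimed it
    for urs_id, test_id in matrix.items():
        if not test_id:
            problems.append(f"URS {urs_id} has no linked test")
        elif test_id in seen:
            problems.append(f"Test {test_id} reused by URS {seen[test_id]} and {urs_id}")
        else:
            seen[test_id] = urs_id
    return problems

print(check_traceability(matrix))  # → [] for the abridged matrix above
```

In practice the matrix would be read from the live URS and test-script documents rather than hard-coded, so the check stays current as requirements change.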


13. Appendix E – Example Test Script

14. Appendix F – References


- FDA 1987 Guideline on General Principles of Process Validation
- FDA 21 CFR Part 11 Electronic Records; Electronic Signatures
- Guidance for Industry - Part 11, ERES - Scope and Application
- Guidance for Industry - Computerised Systems Used in Clinical Investigations, May 2007
- Guidance for Industry - Q9 Quality Risk Management, ICH/FDA, 2006
- ICH GCP 5.5.3 Electronic Systems
- OECD Consensus Document 10 Application of GLP to Computerised Systems
- GMP for Medicinal Products in the EU Annex 11 Computerised Systems, January 1992
- GAMP 5: A Risk-based Approach to Compliant GxP Computerised Systems
- Pharmaceutical Inspection Convention GMP Guide Annex 11 Computerised Systems
- ACDM/PSI Clinical Research Computerised System Validation.


15. Appendix G – Example Architecture Diagram


16. Appendix H – The Iterative or Prototyping CSV Model

The model cycles through six phases, with the whole process controlled
by policies, SOPs and standards:
Initiation → Requirements → Design → Build → Test → Operational Use.

Initiation
   Project request
   High level URS
   Feasibility study
   Risk Assessment
   Validation Plan
   Approval

Requirements
   Project Team
   Detailed URS
   Functional Requirements
   Planning document (VMP, QP)
   Prototyping
   Assess vendor/package
   Approval

Design
   High Level/Detailed design
   Developer test plans
   Project Plan review/revision
   Approval

Build
   Coding
   Programming standards
   Source Code
   Approval

Test
   Developer testing
   Unit/Integration/System test/Acceptance testing
   Test against URS/FRS
   Test plans/scripts/report

Operational Use
   Management approval
   Acceptance Certificate
   Exceptions/Exclusions
   Training
   Procedures
   Maintenance
   Support
   Change Control
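The phase-and-deliverable structure of the model can also be captured as data, for example to confirm that each phase closes with a formal approval or acceptance step before the next iteration begins. The sketch below is an illustration, not part of the guide: the phase and deliverable names come from the model above, but the gate check itself is an assumed convention.

```python
# Illustrative sketch of the iterative CSV model: each phase and its
# deliverables, in cycle order. The "gate" check is an assumption for
# illustration -- the guide itself only lists the deliverables per phase.
PHASES = {
    "Initiation": ["Project request", "High level URS", "Feasibility study",
                   "Risk Assessment", "Validation Plan", "Approval"],
    "Requirements": ["Project Team", "Detailed URS", "Functional Requirements",
                     "Planning document (VMP, QP)", "Prototyping",
                     "Assess vendor/package", "Approval"],
    "Design": ["High Level/Detailed design", "Developer test plans",
               "Project Plan review/revision", "Approval"],
    "Build": ["Coding", "Programming standards", "Source Code", "Approval"],
    "Test": ["Developer testing",
             "Unit/Integration/System test/Acceptance testing",
             "Test against URS/FRS", "Test plans/scripts/report"],
    "Operational Use": ["Management approval", "Acceptance Certificate",
                        "Exceptions/Exclusions", "Training", "Procedures",
                        "Maintenance", "Support", "Change Control"],
}

def phases_without_gate(phases):
    """Return phases whose deliverables include no approval/acceptance step."""
    gate_words = ("approval", "acceptance")
    return [name for name, items in phases.items()
            if not any(w in item.lower() for item in items for w in gate_words)]

print(phases_without_gate(PHASES))  # → [] : every phase has a gate
```

Keeping the model in machine-readable form like this is one way to reuse it across validation plans while checking that no iteration skips its sign-off.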

17. Acronyms
CAPA Corrective and Preventive Action
CSV Computerised System Validation
DSDM Dynamic Systems Development Method
FDD Feature Driven Development
FDS Functional Design Specification
FS Functional Specification
IQ Installation Qualification
IS Information Systems
OQ Operational Qualification
PQ Performance Qualification
PSR Project Summary Report
QA Quality Assurance
QP Qualification Plan
RQA Research Quality Association
SOP Standard Operating Procedure
SMART Specific, Measurable, Achievable, Relevant, Time-constrained
TDD Test Driven Development
UAT User Acceptance Testing
URS User Requirements Specification
VSR Validation Summary Report
XP eXtreme Programming


Research Quality Association (RQA)


An Association dedicated to informing and advancing its members, the RQA
provides status and visibility for individuals concerned with the quality of research
in pharmaceutical, agrochemical and chemical industry sectors. Since its inception
in 1977, the Association has grown and developed to reflect regulatory changes, the
impact of regulatory inspection and the changing structure and needs of the industry.

Who Should Join the RQA?


All those with an interest in the quality and integrity of scientific research and
development and those working with, or interested in, Good Laboratory, Clinical or
Manufacturing Practice regulations or Pharmacovigilance, be they from industry,
government, academia or contract research and wherever in the world they are based.

Benefits of Membership
Quasar (members' quarterly magazine)
Online Members Directory
Website (members' area)
Conferences (discounts for members)
Professional Development Courses and Seminars (discounts for members)
Networking

Professional Development Courses


The RQA’s well established professional development programme offers courses
to suit all levels and disciplines. It includes courses aimed at meeting the needs of
auditors, scientists and managers. The RQA also offers the flexibility of running these
courses ‘in-house’ both in the UK and abroad.

For more information on RQA membership and our professional development
programme, please contact us at:

RQA, 3 Wherry Lane, Ipswich, Suffolk, IP4 1LG, UK


Tel: +44 (0)1473 221411 Fax: +44 (0)1473 221412
Email: rqa@therqa.com
Website: www.therqa.com


Published by Research Quality Association


3 Wherry Lane, Ipswich
Suffolk, IP4 1LG
January 2013
ISBN 978-1-904610-20-5
RRP £10.00
