Computerised System
Validation
Risks, Requirements, Tests and Traceability
3 Wherry Lane
Ipswich
Suffolk
IP4 1LG
UK
January 2013
ISBN 978-1-904610-20-5
CONTENTS
Foreword
Acknowledgments
1. Introduction
2. The Validation Process
3. The Regulations
4. Risk Management
4.1 Risk and Risk Management
4.2 Risk Mitigation
4.3 Risk Avoidance
4.4 Risk Acceptance
4.5 Risk Decision Tree
4.6 Risk Rating
5. Project Deliverables
5.1 Project Initiation
5.2 User Requirements Specification
5.3 Traceability Matrix
5.4 Risk Assessment
5.5 Roles and Responsibilities
5.6 Functional Specification
5.7 System Design Specification
5.8 Software Development and Coding
5.9 Implementation (Architecture)
5.10 Configuration and Final System Testing
5.11 Release into Production
5.12 System Maintenance
5.13 Decommissioning
6. Validation Process
6.1 Validation Assessment
6.2 Supplier Qualification and Audit
6.3 Qualification Plan
6.4 Validation Plan
6.5 Production Readiness Check
6.6 Planned Variances
6.7 Validation Summary
6.8 Change Control
7. Qualifications
7.1 Sequence of Qualification Activities
7.2 Retrospective Evaluation
7.3 Infrastructure Qualification
7.4 Supplier Qualification
Foreword
This booklet has been produced by the RQA Computing Committee to provide
guidance on current best practice and the application of risk management
in Computerised System Validation. The principles are applicable to all
regulated computerised systems, particularly those used within the
pharmaceutical industry.
Acknowledgments
Much of the information within this booklet has been developed concurrently
with the RQA courses ‘Computerised System Validation - Risks,
Requirements, Tests and Traceability’ and ‘Auditing Computerised
Systems.’ The RQA is indebted to the whole Computing Committee for their
commitment of time and effort in contributing the material and expertise
that has led to this publication.
1. Introduction
In 1987 FDA published a ‘Guideline on Process Validation’ that defined
validation as: ‘Establishing documented evidence that provides a high degree of
assurance that a specific process will consistently produce a product meeting its
predetermined specifications and quality attributes’.
This definition was later expanded to include computer systems used in the
development and production of pharmaceutical products and medical devices.
In ‘General Principles of Software Validation (2002)’, FDA stated that:
‘The system life cycle model that is selected should cover the software from
specification through to decommissioning. Activities in a typical system life cycle
model include the following:
Quality planning
System requirements definition
Detailed software requirements specification
Software design specification
Development and testing
Installation and configuration
Qualification and release
Operation and support
Maintenance
Decommissioning.
Verification, testing and other tasks that support software validation occur
during each of these activities.’ Software requirements should be evaluated to
verify that they are complete, unambiguous and testable.
FDA has also stated that, ‘It is not possible to validate software without
predetermined and documented software requirements’.
For the purposes of this guide, a linear model will be used to illustrate
a simple, step-by-step process, as illustrated in Figure 1.
[Figure 1: A linear validation model. The deliverables URS, FDS, SDS, test
scripts and test reports lead to a validated system, supported by risk
analysis, a validation assessment, supplier qualification and audit, a
validation plan (VPL), installation qualification (IQ), operational
qualification (OQ) or system testing, performance qualification (PQ) or UAT,
release into production (sign off), process deviations and a validation
summary (VSR), with traceability and change control applied throughout.]
3. The Regulations
All of the following regulations require that computerised systems meet
their quality requirements through validation:
4. Risk Management
Risk management is a systematic process for the identification, assessment,
control and documentation of risks to patient safety, product quality, data
integrity and business continuity based on a framework described in ICH Q9
Quality Risk Management.
[Figure 2: Risk decision tree. Business requirements are first assessed for
regulatory impact (non-regulated versus patient or product impact; monitoring
information versus manufacturing or study data), then for data criticality and
system criticality (low or high) to derive an overall risk exposure. Low
exposure attracts IS best practice (IQ, DR, change control); medium exposure
adds formal testing; high exposure adds risk-based UAT and a validation
report.]
The risk rating principally affects the scope and extent of system testing:
more extensive testing is required where the risk is higher. Typically,
low-risk systems rely on developer testing and a history of use in the
customer community, whilst high-risk systems are tested extensively by the
end-user, up to the boundaries of the system’s capability.
The approach chosen will also impact the level of documentation that is
required to mitigate the risks, e.g. more detailed specification and test
documentation will be required for high-risk systems.
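The decision logic of Figure 2 can be sketched as a small lookup. This is an illustrative sketch only: the function name is hypothetical, and the middle (‘medium’) band is inferred from the three mitigation columns in the figure.

```python
def risk_exposure(data_criticality: str, system_criticality: str) -> str:
    """Combine data and system criticality ('low' or 'high') into an
    overall risk exposure, following the decision tree in Figure 2."""
    if data_criticality == "high" and system_criticality == "high":
        return "high"
    if data_criticality == "high" or system_criticality == "high":
        return "medium"
    return "low"

# Each exposure band maps to a validation approach from the figure.
APPROACH = {
    "low": "IS best practice: IQ, DR, change control",
    "medium": "IS best practice + formal testing",
    "high": "IS best practice + formal testing + risk-based UAT + validation report",
}
```

For example, high-criticality study data held on an otherwise low-criticality system would attract formal testing in addition to IS best practice.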
Risk assessments may also be carried out at critical steps in the validation
process to determine the extent of validation as shown in Figure 3.
[Figure 3: Risk assessments at critical validation steps. A risk assessment
(RA) of the user requirements specification determines whether validation is
required; if not, the process ends and the decision is documented. If so, a
risk assessment of the functional specification updates the validation plan,
and a risk assessment of the system tests drives development of the test
plans, all under change control.]
5. Project Deliverables
The URS must be reviewed and agreed with the end-user group.
Role                                     Responsibility
Collation of requirements                Project Team
Creation of validation documentation     IS CSV
Technical review of documentation        IS
Approval of documentation                Business Users, Quality Assurance
This document should detail how each requirement of the system is to be met
and is based upon the review and verification of functional requirements, any
constraints imposed upon the system and the project goals.
All stakeholders must define and agree up-front the documentation to be
produced in a modified agile-based project, whether before, during or after
the development iterations.
The system build phase is an assembly process during which the system
design is realised in the form of hardware and software, i.e. the server
and application components. This realisation process is documented in a
schematic and the build process recorded in the installation qualification (IQ)
in the form of protocols, scripts and a report.
The configuration phase begins when the base system is qualified with all server
and application components installed according to the manufacturer and/or
developer instructions. Configuration is the process of adapting the system
to the business workflow and requires that a system administrator selects
options or sets switches so that the application meets the specific end-user
requirements as far as is practicable. Configuration must be tested through UAT
or performance qualification (PQ) to ensure that critical end-user requirements
have been met.
5.13 Decommissioning
When a system is no longer required by the business, a plan should be
developed and approval obtained for an orderly transition to an upgraded or
replacement system taking into consideration migration of, and on-going
access to, existing data. Furthermore, a mechanism should be in place to
ensure that the new system can be adapted to evolving business requirements.
[Figure: Decommissioning data flow - archive, data copy, data migration and
data verification, leading to retirement.]
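The data verification step above can be sketched as a checksum comparison of records before and after migration. This is a minimal illustration, assuming records are keyed by ID and serialised as strings; the function names are hypothetical.

```python
import hashlib

def checksum(data: str) -> str:
    """SHA-256 fingerprint of one serialised record."""
    return hashlib.sha256(data.encode("utf-8")).hexdigest()

def verify_migration(source: dict, migrated: dict) -> list:
    """Return the IDs of records that were lost or altered in migration."""
    return sorted(rid for rid, data in source.items()
                  if rid not in migrated or checksum(migrated[rid]) != checksum(data))
```

Any IDs returned would be investigated and resolved before the legacy system is retired.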
6. Validation Process
[Figure: Change control cycle - operational use generates a change request,
followed by coding and testing before return to operational use.]
7. Qualifications
The remaining qualification deliverables map to the project deliverables as
shown in Figure 7, to ensure traceability of all components. Traceability is
captured in a matrix that maps requirements to functions and qualifications
or tests to ensure that no requirement is unmet.
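The matrix check described here can be sketched as follows. The identifiers and mappings are hypothetical; the function simply flags any requirement with no implementing function or no covering test.

```python
# Hypothetical traceability data: requirements map to functions,
# and functions map to qualification tests.
requirements_to_functions = {
    "URS-001": ["FS-010"],
    "URS-002": ["FS-020", "FS-021"],
    "URS-003": [],            # no function addresses this requirement yet
}
functions_to_tests = {
    "FS-010": ["OQ-101"],
    "FS-020": ["OQ-102", "PQ-201"],
    "FS-021": [],
}

def unmet_requirements(req_map: dict, test_map: dict) -> list:
    """Flag requirements with no implementing function or no covering test."""
    unmet = []
    for req, funcs in req_map.items():
        tests = [t for f in funcs for t in test_map.get(f, [])]
        if not funcs or not tests:
            unmet.append(req)
    return unmet
```

Here URS-003 would be flagged, prompting either a new function and test or a documented decision that the requirement is out of scope.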
[Figure 7: Traceability between project and qualification deliverables. The
proposal maps to the accepted system via acceptance testing, the requirements
definition (URS) to the qualified system via performance qualification, and
the functional design to the tested system and qualified functions via system
testing and operational qualification, with traceability linking each
specification to its tests.]
[Figure: Retrospective evaluation decision flow. If the system is not to be
maintained, it is retired and replaced. If it is maintained and existing
documentation is available, a test coverage analysis determines whether
further testing is needed; where documentation is lacking or further testing
is required, a validation plan and qualifications are executed. The outcome is
recorded in a validation summary report.]
During the UAT, the system is installed in a test environment following procedures
described in the test protocol that also defines how the project team will execute
the system tests. The UAT scripts are listed within the UAT protocol and describe
specific tests that will be conducted to determine whether the system performs in
accordance with the operations defined in the functional specification, which can
be mapped to critical end-user requirements.
The UAT protocol must be written by the project team and approved by QA
prior to initiation of testing. The UAT results are produced by execution of
the UAT protocol. Where UAT identifies functional or software errors these
must be either accepted or corrected according to a formal discrepancy
review procedure. Following any required corrective actions, the UAT report
documents that the system has performed the expected functions to fully
meet the URS.
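The gate this paragraph describes, that every UAT failure must be dispositioned through discrepancy review before the report can conclude, can be sketched as follows. The record fields and names are illustrative only.

```python
from dataclasses import dataclass

@dataclass
class UATResult:
    script_id: str
    requirement: str
    passed: bool
    disposition: str = ""   # 'accepted' or 'corrected' after discrepancy review

def ready_for_report(results) -> bool:
    """True once every failed script has been formally dispositioned."""
    return all(r.passed or r.disposition in ("accepted", "corrected")
               for r in results)
```

A failed script with no disposition blocks the UAT report until the discrepancy review procedure has recorded an outcome.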
7.11 Maintenance
Maintenance is an on-going process that can compromise the validated state
of the system. Re-validation may be required at intervals as change
implementations, bug-fixes and minor upgrades accumulate. This process, known
as periodic review, comprises a formal assessment of the change records and
the issue resolution log; reviews of corrective actions (such as procedure
changes and bug-fixes) and user surveys may be needed to gather the
information. Within the system life cycle the processes are as shown in
Figure 8.
[Figure 8: Maintenance within the system life cycle. A qualified system in
operational use is subject to periodic review; version upgrades are released
with regression testing, and re-validation returns the system to its
qualified state.]
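A periodic review of this kind can be sketched as a simple rule over the change records and issue log. The thresholds and field names here are hypothetical and would be set per system in the review procedure.

```python
def revalidation_needed(change_records, issue_log, change_limit=10) -> bool:
    """Trigger re-validation when changes have accumulated past a limit
    or any critical issue remains unresolved (hypothetical rule)."""
    open_critical = [i for i in issue_log
                     if i["severity"] == "critical" and not i["resolved"]]
    return len(change_records) > change_limit or bool(open_critical)
```

In practice the review would also weigh the nature of the changes, not just their count, before concluding on the validated state.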
8. Validation Strategy
8.1 Scope
The scope of validation for a specific application should be defined in the
validation plan, e.g.
Validation Plan AP AP W AU
User Requirements AP AP W AU
Functional Design AP AP W AU
System Architecture AP W AU
System Configuration AP AP W AU
IQ Protocol AP W E
IQ Report AP W AU
OQ Protocol AP AP W E
OQ Report AP W AU
PQ Protocol AP E W AU
PQ Report AP AP W AU
Variance Document AP AP W AU
Validation Report AP AP W AU
The general approach will be to establish through IQ, OQ and PQ (or IQ,
system testing and UAT) that the functionality and performance meets or
exceeds that available with the existing system configuration, and can meet
all requirements as documented in the URS.
The OQ protocol should include a test coverage analysis to take into account
any functional testing that has already been performed by the developer.
The OQ will therefore verify that all functionality identified in the FS operates
correctly, and additionally will include any regression testing if the system
is an upgrade. The OQ may pre-execute some components of the PQ testing
depending on the final acceptance testing required by the end-users.
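In the simplest case the coverage analysis reduces to a set difference between the functions in the FS and those already covered by documented developer testing. The function names below are illustrative.

```python
def oq_scope(fs_functions, developer_tested) -> list:
    """FS functions still needing OQ testing after crediting developer tests."""
    return sorted(set(fs_functions) - set(developer_tested))
```

For example, oq_scope(["login", "audit trail", "export"], ["login"]) leaves "audit trail" and "export" for the OQ.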
The business process should be simulated during this PQ with the aim of
demonstrating that the critical functions are operating correctly.
Cut-over will normally occur over a weekend and no users should be allowed
to access the system during this period.
The content and execution of the go-live PQ should be agreed with the
project team. The business process will be simulated during the go-live/PQ
with the aim of demonstrating the critical system functions as agreed with
the end-users.
The VSR concludes on the validation status of the system. If the VSR classifies
the system as ‘not accepted’, the IS system manager is responsible for updating
the VSR when the status can be changed to ‘accepted’.
The VSR will also address compliance with the validation plan, and concludes
the validation activities for the project.
8.18 Documentation
Title
Risk analysis report
Validation assessment
Project proposal, plan, roles and responsibilities
Validation plan
Vendor assessment reports
User requirements specification
User requirements matrix
Functional requirements specification
System design specification
IQ protocol and report
OQ protocol and report, or user acceptance testing
PQ/go-live protocol and report
Production readiness assessment
SOP variances and deviations
Validation summary
Project summary
Change control log or records
The following SOPs in Table 6 are required for the validation, use and
maintenance of the system:
SOP Title
Information systems security
Project QA or validation process
Procedure variances and resolution
Risk (and validation) assessment
Vendor qualification
Validation (and project) plan
User requirements specification
Functional requirements/design specification
System design specification
Installation qualification protocol, results and report
Operational qualification protocol, results and report
Performance qualification protocol, results and report
User acceptance testing protocol, results and report
Production readiness assessment
Validation summary
Training procedure
User access and administration procedure
User application permissions
Change control procedures
1.10 The system should allow defining documents related to different stages of the study.
1.20 The system should allow defining documents related to regulatory authorities, labs,
sites, investigators etc.
1.30 The system should allow defining regulatory documents required to be tracked
during the study execution.
1.40 The system should allow defining non-regulatory (contractual, legal) documents
required to be tracked during the study execution.
1.50 The system should allow choosing mandatory documents for all studies for sites,
subjects, IRB/EC, labs, etc.
1.60 The system should allow defining custom fields for all the documents defined.
1.70 The system should allow defining notifications based on status changes.
2. Document Association
2.10 The system should list all the documents defined as ‘mandatory’ that would be
tracked for a study/protocol including documents related to the study, site, IRB, lab,
personnel, investigators, vendors etc.
2.20 The system should allow adding to and/or deleting from the list of documents for a
study/protocol including documents related to the study, site, IRB, lab, personnel,
investigators, vendors etc.
2.30 The system should automatically associate documents when a study/protocol is added.
2.40 The system should automatically associate documents when a country is added to
the study.
2.50 The system should automatically associate documents when a site is added to the
study.
2.60 The system should automatically associate documents when an investigator is added
to the study.
2.70 The system should automatically associate documents when a lab is added to the
study or site.
2.80 The system should automatically associate documents when a regulatory board is
added to the study.
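Requirements 2.30 to 2.80 describe the same behaviour for different entity types, which can be sketched as a lookup of mandatory documents per entity. The mapping and document names below are hypothetical.

```python
# Hypothetical mandatory-document lists per entity type (requirements 2.30-2.80).
MANDATORY_DOCS = {
    "study": ["Protocol", "Protocol Signature Page"],
    "site": ["Site Contract", "Investigator CV"],
    "lab": ["Lab Certification", "Normal Ranges"],
}

def associate(study_docs: list, entity_type: str) -> list:
    """When an entity is added to the study, associate its mandatory
    documents automatically, without duplicating existing entries."""
    for doc in MANDATORY_DOCS.get(entity_type, []):
        if doc not in study_docs:
            study_docs.append(doc)
    return study_docs
```

Adding a site twice, for instance, should not duplicate its mandatory documents in the tracking list.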
3.10 The system should send notifications based on status changes and other definitions.
3.20 The system should allow attaching files to all the documents being tracked.
Specifically, the following file types can be attached: *.txt (text), *.doc (Microsoft
Word), *.pdf (Adobe Acrobat), *.xls (Microsoft Excel), *.rtf (Rich Text Format),
*.mpp (Microsoft Project), *.htm (HTML), *.log (log files), *.ppt (Microsoft
PowerPoint), *.jpg (JPEG).
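Requirement 3.20 implies a whitelist check on attachment types, which might be sketched as follows; the function name is an assumption.

```python
import os

# File extensions permitted by requirement 3.20.
ALLOWED_EXTENSIONS = {".txt", ".doc", ".pdf", ".xls", ".rtf",
                      ".mpp", ".htm", ".log", ".ppt", ".jpg"}

def can_attach(filename: str) -> bool:
    """True if the file's extension is in the permitted set."""
    return os.path.splitext(filename.lower())[1] in ALLOWED_EXTENSIONS
```

A test for this requirement would attempt to attach both permitted and non-permitted file types and confirm the system's behaviour.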
Records Repository
Helps ensure the integrity of the files stored in the repository, and supports
information management policies that consistently and uniformly enforce
auditing and expiration of records.
Legal Holds
Provides the concept of master pages and page layouts to enforce the
branding and navigation of websites. CSS support gives pixel-level control
over the look and feel of these sites.
Navigation Controls
Content Authoring
Built in approval workflow allows web content to be sent for approval prior
to publishing. Content deployment to production sites can be scheduled by
setting up jobs and a ‘live’ time period for each page can be specified within
which that page is viewable.
Site Templates
Page Layouts
Site Variations
Extends the SharePoint user interface with additional commands and status
indicators for in-context webpage authoring.
Slide Libraries
General Constraints
For each such goal or guideline, unless it is implicitly obvious, describe the
reason for its desirability.
Development Methods
Briefly describe the method or approach used for this software design. If one
or more formal methods were adopted or adapted, then include a reference
to a more detailed description of these methods. If several methods were
seriously considered, then each method should be mentioned, along with a
brief explanation of why all or part of it was used.
Architectural Strategies
Describe any design decisions and/or strategies that affect the overall
organisation of the system and its higher-level structures. These strategies
should provide insight into the key abstractions and mechanisms used
in the system architecture. Describe the reasoning employed for each
decision and/or strategy (possibly referring to previously stated design goals
and principles) and how any design goals or priorities were balanced or
traded-off. Such decisions might concern (but are not limited to) things like
the following:
Use of a particular type of product (programming language, database,
library, etc.)
Reuse of existing software components to implement various features of
the system
Plans for extending or enhancing the software
User interface paradigms (or system input and output models)
Hardware and/or software interface paradigms
Error detection and recovery
Memory management policies
External databases and/or data storage management and persistence
Distributed data or control over a network
System Architecture
At the top-most level, describe the major responsibilities that the software
must undertake and the various roles that the system (or portions of
the system) must play. Describe how the system was broken down into
its components/subsystems (identifying each top-level component/
subsystem and the roles/responsibilities assigned to it). Describe how the
higher-level components collaborate with each other in order to achieve
the required results. Provide some sort of rationale for choosing this
particular decomposition of the system (perhaps discussing other proposed
decompositions and why they were rejected). Feel free to make use of design
patterns, either in describing parts of the architecture, or for referring to
elements of the architecture that employ them.
[Figure: Design and build activities. Design: high-level/detailed design,
developer test plans, project plan review/revision, approval. Build: coding,
programming standards, source code, approval.]
17. Acronyms
CAPA Corrective and Preventive Action
CSV Computerised System Validation
DSDM Dynamic Systems Development Method
FDD Feature Driven Development
FDS Functional Design Specification
FS Functional Specification
IQ Installation Qualification
IS Information Systems
OQ Operational Qualification
PQ Performance Qualification
PSR Project Summary Report
QA Quality Assurance
QP Qualification Plan
RQA Research Quality Association
SOP Standard Operating Procedure
SMART Specific, Measurable, Achievable, Relevant, Time-constrained
TDD Test Driven Development
UAT User Acceptance Testing
URS User Requirements Specification
VSR Validation Summary Report
XP eXtreme Programming
Benefits of Membership
Quasar (members’ quarterly magazine)
Online Members’ Directory
Website (members’ area)
Conferences (discounts for members)
Professional Development Courses and Seminars (discounts for members)
Networking