
Method Validation Implementation Scheme

Refer to the Excel sheets containing the process map and the decision tree for the RASCIs below.

RASCI
Responsible – the person who owns the task
Accountable – the person to whom the Responsible party answers, and/or the authority who signs off on the work before it becomes effective
Supportive – a person who provides resources or plays a supporting role in implementation
Consulted – a person who provides information and/or expertise necessary to complete the task
Informed – a person who needs to be notified of results but need not necessarily be consulted

The following roles are involved in the Method Validation Process:

• User Group – (User)
• Analyst(s)
• Peer Reviewers – (PR)
• Project Leaders – (PL)
• Regulatory Affairs – (Reg.)
• Validation – (Val.)
• Quality Assurance – (QA)

Role Definitions:
User Group – Department(s) involved in the generation/execution of protocols and the generation of reports, e.g. Tech Support, NPD
User Group Manager – Individual(s) who are ultimately responsible and accountable for the members of the user group(s), e.g. NPD Manager, QC Manager
Analyst(s) – Personnel involved in executing the protocol, documenting the testing and collating the data.
Peer Reviewer – Personnel who have not taken part in the execution of the protocol, have not been involved in the creation of the protocol and/or the report, and have been trained to review documents and raw data.
Project Leader – Person who generates the protocol and/or report and has been trained accordingly.
Subject Matter Expert – An SME has relevant knowledge and expertise with the process and tasks to be completed. The SME can fulfil a supportive function within the process.

New Process RASCI – No Problems Encountered (refer to process map)

#  | Task                                | R                        | A                            | S    | C              | I
1  | Validation Standards                | Val. Officer             | Val. Coordinator             |      | User, Reg., QA |
2  | Method Validation Template Creation | Val. Officer             | Val. Coordinator             | User | User, Reg., QA |
3  | Template Review                     | Val. PR, QA PR           | Val. Coordinator, QA Manager |      | User, Reg.     |
4  | Draft Update                        | Val. Officer             | Val. Coordinator             |      |                |
5  | Template Approval                   | Val. Officer, QA Officer | Val. Coordinator, QA Manager |      |                |
6  | Draft Protocol                      | User PL                  | User Manager                 |      |                |
7  | Protocol Review                     | User PR                  | User Manager                 |      |                |
8  | Draft Protocol Update               | User PL                  | User Manager                 |      |                |
9  | Protocol Sign-on                    | User PR                  | User Manager                 |      |                |
10 | Protocol Execution                  | Analyst                  | User Manager                 |      |                |
11 | Data Collation                      | Analyst                  | User Manager                 |      |                |
12 | Draft Report                        | User PL                  | User Manager                 | SME  |                |
13 | Report Review                       | User PR                  | User Manager                 | SME  | Reg.           |
14 | Draft Report Update                 | User PL                  | User Manager                 | SME  |                |
15 | Report Approval                     | QA Officer               | QA Manager                   |      |                |
16 | Protocol, Report and Data Filing    | Val. Officer             | Val. Coordinator             |      |                |

As indicated in the RASCI tables, the department heads (managers and coordinators) are ultimately responsible and accountable for their staff carrying out the set tasks and for ensuring that the tasks are completed to an agreed standard and quality. The department heads can delegate this responsibility to others in their department, but they will still be held accountable for task completion. This should be reflected by validation completion and RFT (Right First Time) goals being incorporated into the KPIs and PPPs of department heads and of the staff who report to them. The Sponsors will issue a general directive for KPIs/PPPs to be updated to reflect this at the half-year performance review.
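
For illustration only, a few rows of the 'no problems' table can be read as simple records. The sketch below (Python, purely illustrative and not part of the documented process) uses task and role names taken from the table above; the data structure itself is an assumption made for the example.

    # Illustrative only: a few rows of the "No Problems Encountered" RASCI
    # table expressed as records, so the R/A/S/C/I reading is explicit.
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class RasciEntry:
        task: str
        responsible: List[str]
        accountable: List[str]
        supportive: List[str] = field(default_factory=list)
        consulted: List[str] = field(default_factory=list)
        informed: List[str] = field(default_factory=list)

    entries = [
        RasciEntry("Draft Protocol", ["User PL"], ["User Manager"]),
        RasciEntry("Protocol Execution", ["Analyst"], ["User Manager"]),
        RasciEntry("Report Review", ["User PR"], ["User Manager"],
                   supportive=["SME"], consulted=["Reg."]),
    ]

    # Example query: who is accountable for each task?
    for entry in entries:
        print(f"{entry.task}: accountable = {', '.join(entry.accountable)}")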

New Process RASCI – Problems Encountered (refer to decision tree and the ‘yes’ pathway)

#  | Task                                | R                                 | A                                           | S    | C              | I
1  | Validation Standards                | Val. Officer                      | Val. Coordinator                            |      | User, Reg., QA |
2  | Method Validation Template Creation | Val. Officer                      | Val. Coordinator                            | User | User, Reg., QA |
3  | Template Review                     | Val. PR, QA PR                    | Val. Coordinator, QA Manager                |      | User, Reg.     |
4  | Draft Update                        | Val. Officer                      | Val. Coordinator                            |      |                |
5  | Template Approval                   | Val. Officer, QA Officer          | Val. Coordinator, QA Manager                |      |                |
6  | Draft Protocol                      | User PL                           | User Manager                                |      | Val., QA       |
7  | Protocol Review                     | User PR                           | User Manager                                |      | Val., QA       |
8  | Draft Protocol Update               | User PL                           | User Manager                                |      |                |
9  | Protocol Sign-on                    | User PR, Val. Officer, QA Officer | User Manager, Val. Coordinator, QA Manager  |      |                |
10 | Protocol Execution                  | Analyst                           | User Manager                                |      |                |
11 | Discrepancy/Deviation Generation    | User PL                           | User Manager                                |      |                |
12 | Discrepancy/Deviation Approval      | Val. Officer, QA Officer          | Val. Coordinator, QA Manager                |      |                |
13 | Data Collation                      | Analyst                           | User Manager                                |      |                |
14 | Discrepancy/Deviation Generation    | User PL                           | User Manager                                |      |                |
15 | Discrepancy/Deviation Approval      | Val. Officer, QA Officer          | Val. Coordinator, QA Manager                |      |                |
16 | Draft Report                        | User PL                           | User Manager                                |      |                |
17 | Report Review                       | User PR                           | User Manager                                |      | Reg.           |
18 | Draft Report Update                 | User PL                           | User Manager                                |      |                |
19 | Report Approval                     | QA Officer                        | QA Manager                                  |      |                |
20 | Protocol, Report and Data Filing    | Val. Officer                      | Val. Coordinator                            |      |                |
Implementation Tasks:
• Document RASCI (see above)
• Stakeholder update (purpose of this document and subsequent meeting)
• Create templates
• Review templates
• Update SOPs
• Develop Performance Metrics
• Build Training Modules
• Deliver Training Modules
• Implement Performance Metrics
• Master SOPs
• Monitoring Period

Document RASCI (5th & 6th March 2009)
A process map will be created with a supporting RASCI to outline how the new process will work.

Stakeholder Update (9th to 27th March 2009)
This meeting will give the stakeholders an update on the process and allow them to provide feedback on how the process should be implemented. It will also be used to confirm resourcing for the various steps of implementation.

Create Templates (9th to 27th March 2009)
Create the main protocol and report templates for the tests most commonly carried out, i.e. assay, dissolution, related substances and residual solvents.

Generate Baseline (9th to 27th March 2009)
Baselines to be documented for the current process. These will be used to measure the performance of the new process.

Review Templates (23rd March to 3rd April 2009)
Send the templates out to the SMEs for input and review. Update the templates as necessary.

Update SOPs (30th March to 24th April 2009)
Update the SOPs for the most common tests identified above (SOPs 447, 449 and 1394). Also update SOP 446, which details how to write reports; this will detail the process for creating protocol and report templates.

Develop Performance Metrics (30th March to 9th April 2009)
These will be the metrics used within departments to monitor the performance of those involved in the process and the performance of the process itself. This needs to be discussed with departments; current thoughts are to have RFT monitored internally in user departments, QA monitoring the RFT of documents reaching them, user departments monitoring cycle time for documents (creation to approval), and Validation monitoring the cycle time of the process (validation number issue to validation filing). A sketch of how these metrics could be calculated follows this table.

Build Training Modules (27th April to 15th May 2009)
Training modules will focus on the validation types identified above, providing some general training on why each study in a validation is carried out (SMEs to be consulted), covering how to use the templates created as well as the process for creating templates, and finally what is expected of a document reviewer.

Deliver Training Modules (18th to 29th May 2009)
Departments will nominate individuals who will receive the training and become trainers (prerequisite: staff identified must have completed the 'train the trainer' course). Only trained staff will be allowed to generate, review and approve documents for method validation.

Implement Performance Metrics (18th to 22nd May 2009)
Ensure that the performance metrics above are ready for use and that departments have trained their staff in the correct usage of the system.

Master SOPs (18th May to 5th June 2009)
After all the above steps have been completed, the mastering of the SOPs will be the 'go live'. Once the SOPs are in place, staff will be required to follow the new process.

Monitoring Period (8th June to 9th October 2009)
The monitoring period will consist of two parts. The first 3 months will be used to monitor the usage of the templates and ensure that the content is deemed sufficient; all signatories will still be reviewing/approving the documents created.

Stakeholder Performance Review Meeting (5th to 9th October 2009)
This meeting will be used to discuss the progress of the new process and any issues, review the performance metrics, and evaluate whether the signatories can be reduced or whether the initial monitoring period needs to be extended.

Monitoring Period (12th October 2009 to 13th April 2010)
The second part of the monitoring process will be a 6 month period during which the signatories will be reduced. This will be used to monitor the new process itself and provide metrics to report back to the MC.

Method Validation Implemented (13th April 2010)
If at the end of the monitoring period all relevant groups are happy with how the process is performing, the process will be considered to have been successfully implemented. If not, the process may need to be modified and/or the monitoring period extended.

Report to MC (13th to 20th April 2010)
Once all groups are satisfied the process is performing appropriately, the metrics gathered will be summarised and presented back to the MC.
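
To make the performance metrics described in the table above more concrete, the following is a minimal sketch (Python, purely illustrative) of how the RFT and cycle-time figures could be calculated from simple document records. The field names and example dates are hypothetical and are not taken from any existing system.

    # Illustrative calculation of the proposed metrics:
    #   - RFT (Right First Time): share of documents accepted without rework
    #   - document cycle time: creation to approval
    #   - process cycle time: validation number issue to validation filing
    # All field names and dates below are made up for the example.
    from datetime import date

    documents = [
        {"created": date(2009, 6, 8),  "approved": date(2009, 6, 19), "right_first_time": True},
        {"created": date(2009, 6, 15), "approved": date(2009, 7, 3),  "right_first_time": False},
    ]

    rft_percent = 100 * sum(d["right_first_time"] for d in documents) / len(documents)
    avg_doc_cycle = sum((d["approved"] - d["created"]).days for d in documents) / len(documents)
    print(f"Document RFT: {rft_percent:.0f}%")
    print(f"Average document cycle time: {avg_doc_cycle:.1f} days")

    validations = [
        {"number_issued": date(2009, 6, 8), "filed": date(2009, 9, 1)},
    ]
    avg_process_cycle = sum((v["filed"] - v["number_issued"]).days for v in validations) / len(validations)
    print(f"Average process cycle time: {avg_process_cycle:.0f} days")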

Notes:
• This project will not generate all of the protocol and report templates to be used. Templates for the main validations carried out routinely will be generated; all others will be the responsibility of the Validation Department and the user groups to generate. Any templates that are generated must be approved by Validation and Quality Assurance before they can be used.

• Due to time constraints, only the following main types of method validation will be considered by this project: assay (HPLC), dissolution (HPLC and UV), unspecified impurities (HPLC), specified impurities (HPLC) and residual solvents. Of the 282 validations started between January 2007 and February 2009, 70 were for analytical methods (this number excludes micro method suitabilities, method transfers, standard expiries and identification tests). The breakdown of the 70 analytical methods was as follows:
  - 14 Assays (20%)
  - 9 Dissolutions (13%)
  - 16 Related Substances (23%)
  - 13 Residual Solvents (19%)
  - 11 Moisture Content (16%; protocol template already exists)
  - 7 other various methods (10%), including particle size, in vitro release testing and other methods whose details could not be ascertained from the validation database

• Concentrating on the tests identified means that approximately 70 to 75% of the possible benefit will be realised, which is the majority of the benefit (a quick tally supporting this figure follows these notes).

• This project will not update all SOPs for method validation. The method validation SOPs not identified above will be the responsibility of the user groups to update. The relevant SOPs must be updated and mastered before a new template can be used.

• At the end of the 3 month and 6 month monitoring periods a summary report will be requested from those departments that have been monitoring the process. These reports will be used to evaluate the performance of the new process.

• It would be the responsibility of the Validation Department and the user groups to generate all remaining protocol and report templates. The templates still required would be for Assay (UV), identification tests, GC methods other than residual solvents, Micro Method Suitability and Method Comparisons, Moisture Content and Particle Size. Of these, the main tests would be Assay by UV, ID tests, and Micro Method Suitability and Method Comparisons. All relevant SOPs need to be updated before new templates can be released for use. As the new process frees up resource, this can be used to generate the outstanding protocol and report templates and to update any supporting SOPs. Any other templates not identified above would be expected to be generated as and when they are required.

• All protocols and reports generated are expected to follow the new process.

This is still under consideration. If extra resource can be supplied by the user groups then there is the possibility that more templates and SOPs can be created 'up-front'.
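
As a quick cross-check of the 70 to 75% figure quoted in the notes, the counts already listed above can be tallied directly (a short Python sketch; no new data is introduced):

    # Tally of the 70 analytical method validations (Jan 2007 - Feb 2009)
    # using the counts listed in the notes above.
    in_scope = {"Assay": 14, "Dissolution": 9, "Related Substances": 16, "Residual Solvents": 13}
    out_of_scope = {"Moisture Content": 11, "Other methods": 7}

    total = sum(in_scope.values()) + sum(out_of_scope.values())   # 70
    covered = sum(in_scope.values())                              # 52

    # 52 of 70 is roughly 74%, consistent with the quoted 70 to 75% of the benefit.
    print(f"In-scope validations: {covered} of {total} ({100 * covered / total:.0f}%)")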

Method Validation Process Implementation Timeline:

Resource Requirements:
The following tasks will require resource from the respective departments:

• Baseline Generation (Validation & Quality Assurance)
  - Validation & Quality Assurance to document the time from review until document sign-on/approval, using data from timesheets and metrics
  - Assuming this exercise will take approximately 4hrs over the 3 week period

• Review of protocol and report templates (Tech Support, NPD, Regulatory Affairs, Validation & Quality Assurance)
  - There will be 5 protocols and 4 reports to review (providing the unspecified and specified impurity reports can be merged)
  - Electronic documents will be sent out to all departments and reviewers will be required to provide their feedback using the track changes and comment functions in MS Word
  - Assuming approximately 1hr will be required for each document reviewed

• Update of the relevant SOPs
  - SMEs will need to be consulted for input on the update of SOPs 446, 447, 449 & 1394
  - A series of meetings will be scheduled to gather input into the SOP updates – approximately 4hrs per SOP

• Review of updated SOPs
  - Tech Support, NPD, Regulatory Affairs, Validation and Quality Assurance will all be involved in the review of the SOPs
  - Assuming approximately 1-2hrs per SOP to be reviewed

• Development of Performance Metrics
  - Each group involved in the new process will be responsible for developing its own metrics system (this topic needs further discussion)
  - Tech Support & NPD: develop an internal RFT metric and an internal metric to measure document cycle time
  - Quality Assurance: develop an RFT metric for method validation documents received from user groups
  - Validation: develop a metric to measure process cycle time – from issue of a validation number to the filing of the completed method validation
  - Assuming approximately 8hrs per group will be required to develop performance metrics

• Development of training modules
  - SMEs will be consulted for input into the training modules

• Delivery of Training
  - Trainers required from Tech Support and NPD
  - Training will take as long as required; at the moment the assumption is that there will be three 2hr training sessions

• CRN sign-off and SOP mastering
  - It is assumed that this task will take Quality Assurance approximately 20hrs

• Implementation of performance metrics
  - 2hrs have been planned for Tech Support, NPD, Validation and Quality Assurance to train their staff in how to use the performance metric systems

• Monitoring of Performance Metrics
  - 1hr a week has been planned for updating the performance metrics over the monitoring periods
