Master Test Plan
FOR
<PROJECT NAME>
Prepared by:     Date:
REVISION HISTORY
Table of Contents
A Introduction
A.1 Overview
A.2 Acronyms
B Test Methodology
B.1 Test Objectives
C Test Strategy
C.1 Summary
C.2 Project Test Approach
C.2.1 W Model
C.2.2 Test Approach Overview
C.2.3 Test Planning
C.2.4 Knowledge Transfer
C.2.5 Test Preparation
C.2.6 Internal Review
C.2.7 Client QA Review (Optional)
C.2.8 Test Data Preparation
C.2.9 Test Execution and Administration
C.3 Build Process
C.4 Unit Test Level
C.4.1 Purpose and Objective
C.4.2 Unit Test Approach Description
C.4.2.1 Plan
C.4.2.2 Prepare
C.4.2.3 Execute
C.4.3 Entry/Exit Criteria, Completion Deliverables and Suspension
C.5 Integration/Functional Test Level
C.5.1 Purpose and Objective
C.5.2 Integration/Functional Test Approach Description
C.5.2.1 Plan
C.5.2.2 Prepare
C.5.2.3 Execute
C.5.2.4 Report
C.5.3 Entry/Exit Criteria, Completion Deliverables and Suspension
C.6 User Acceptance Test Level (UAT)
A Introduction
A.1 Overview
The Master Test Plan (MTP) defines and documents the processes and controls to be used for the project/ODC. The
plan provides sufficient checks and cross-checks during the project to ensure that the final deliverable(s) meet the
documented requirements. The MTP describes the scope of testing, the responsibilities of participants, and the
administration of testing for the project/product.
The MTP defines the high-level testing approach and procedures for each test level applicable in the ODC/project.
The types of testing conducted within the ODC may include:
• Unit Testing
• Functional Testing
• Regression Testing
• Any other
This MTP also addresses Test Management, types of tests, Test Tools, and test team requirements.
Note: The types of testing defined here are only examples; they should be updated as per contractual requirements.
The Detail Test Plan (DTP) document further defines the test approach for each level of testing and describes the
specifics of the testing environments. The project/work-pack overview, and the scope of the project/work-pack with
regard to testing, are covered in the DTP. The DTP also includes the test cases and test procedures for test execution,
the entry and exit criteria, and the roles and responsibilities of the test team. The Detail Test Plan is a separate
document and is created for each project/work-pack (executed from the ODC) for which testing needs to be
carried out.
A.2 Acronyms
MTP Master Test Plan
DTP Detail Test Plan
ODC Offshore Development Center
HLD High Level Design
QA Quality Assurance
UTC Unit Test Cases
FTC Functional Test Cases
UT Unit Testing
RTM Requirement Traceability Matrix
UAT User Acceptance Testing
B Test Methodology
B.1 Test Objectives
Testing is defined as the systematic execution of an application, its components and procedures with the intent of
finding defects. Testing also demonstrates that the system works from a functional, performance and hardware
perspective, and that the interfaces between systems function correctly.
C Test Strategy
C.1 Summary
The testing strategy defines the objectives of all test stages and the techniques that apply. The testing strategy also
forms the basis for the creation of a standardized documentation set, and facilitates communication of the test process
and its implications outside of the test discipline. Any test support tools introduced should be aligned with, and in
support of, the test strategy. The sections below summarize the overall test approach and the specifics of the
approach at each level of testing.
C.2 Project Test Approach
C.2.1 W Model
Testing is least costly and most effective if it is performed throughout the whole life cycle, in parallel with every stage of
development. This strand of testing in parallel with development is represented in the W model described below.
The W model is a natural evolution of the V model of testing. The V model illustrates the layered and phased nature
of software testing, but lists only dynamic test stages such as unit and system testing. The W model supports testing
of all deliverables at every stage of the development life cycle: for every activity that generates a project
deliverable, that deliverable should have an associated validation activity. Where the W model differs from the
V model is that it promotes both static testing (through inspections/reviews) of early document or code
deliverables and dynamic test stages for software deliverables.
The value of the W-model approach is that it focuses attention on all project deliverables. By matching the deliverables
with product risks, the types of testing required to be incorporated into the test strategy can be identified at an early
stage.
The importance of intermediate control points is especially noticeable for deliverables that take considerably
longer to produce. The control points allow corrective actions to be performed during the process rather than at the
end, when the deliverable is ready; once the deliverable is produced, the effort required to remove
non-conformities could turn out to be enormous.
C.2.2 Test Approach Overview
The test approach consists of five steps:
1. Test Planning
2. Knowledge Transfer
3. Test Preparation
4. Test Execution
5. Test Reporting
These steps are followed at the overall project/ODC level.
First, a Master Test Plan is developed to identify the test levels (e.g. Unit, Regression, Functional and UAT) and the
strategy and approach for the testing identified.
Next, Detail Test Plans (DTPs) are created for the major project components and for the various types of
testing to be conducted. Each DTP addresses in detail:
1. What is in and out of the scope of testing
2. What is to be tested
3. Where testing is to occur
4. When testing is scheduled
5. Who is testing
6. How it is tested
The detail test plan and test scripts/test cases are prepared by the Quality Assurance Team.
Test scripts/test cases are prepared for each of the functions to be tested and are mapped to the business
requirements/functional specifications. The RTM should be updated with the test scripts/cases to ensure that all
requirements are covered. Test scripts/test cases are created for execution at all levels of testing identified in the
DTP, using a template agreed at the commencement of the project.
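The RTM mapping described above can be kept as simple structured data. The sketch below uses illustrative requirement and test case IDs, not identifiers mandated by this plan:

```python
# Minimal Requirement Traceability Matrix (RTM) sketch.
# Requirement and test case IDs below are illustrative examples only.
rtm = {
    "REQ-001": ["FTC-001", "FTC-002"],
    "REQ-002": ["FTC-003"],
    "REQ-003": [],  # no test scripts/cases mapped yet
}

def uncovered_requirements(rtm):
    """Return requirement IDs that have no mapped test scripts/cases."""
    return [req for req, cases in rtm.items() if not cases]

# A non-empty result means the RTM update is incomplete.
print(uncovered_requirements(rtm))  # ['REQ-003']
```

A check like this can be run whenever the RTM is updated, so that gaps in coverage are caught before test execution begins.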
C.3 Build Process
Note: Explain the build process for testing here: the location of the code, the test bed, who will create the
build, and when and how often builds occur.
C.4.2.1 Plan
White Box Testing is the most common testing technique used by programmers at the Unit test level. In the White Box
test approach, the programmers have an inside view of the system. Their concern is focused on “how it is done” NOT
“what is done”. White Box testing is logic oriented and the focus is on the execution of all possible paths of control flow
through the program. White box testing consists of testing paths, branch by branch, to produce predictable results.
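As an illustration of the path-by-path approach described above, the hypothetical function and test cases below (all names are illustrative, not part of this plan) exercise every control-flow path through two decision points:

```python
def classify(value, limit):
    """Hypothetical unit under test with two decision points."""
    if value < 0:
        raise ValueError("value must be non-negative")
    if value > limit:
        return "over"
    return "within"

# One unit test case per control-flow path through classify():
def test_negative_raises():
    try:
        classify(-1, 10)
        assert False, "expected ValueError"
    except ValueError:
        pass

def test_over_limit():
    assert classify(11, 10) == "over"

def test_within_limit():
    assert classify(5, 10) == "within"

def test_boundary_equal():
    # Exercises the branch where value == limit.
    assert classify(10, 10) == "within"

for test in (test_negative_raises, test_over_limit,
             test_within_limit, test_boundary_equal):
    test()
print("all paths exercised")
```

Note how the number of test cases grows with the number of decision points; this is why, as discussed below, an overly exhaustive combination of techniques quickly becomes unmanageable.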
C.4.2.2 Prepare
Unit test cases prepared by testing team members are reviewed by the development team. While writing UTCs,
developers should select a combination of White box testing techniques appropriate for the application. Too exhaustive
combination of these techniques leads to an unmanageable number of test cases so number of UTC are created
based on function points as well as on complexity of the application. Update the RTM to map UTC with requirements
from SRS/HLD to be assured that all the scenarios and functionality in the requirements is covered by application and
UTC.
C.4.2.3 Execute
Unit Test is the initial check of a program(s) in the development environment. Defects discovered in Unit Testing are
logged into the defect tracking system, corrected by the programmer, and retested. These defects are later quantified
and used to evaluate the quality of the product. When the exit criteria and exit deliverables (see Entry/Exit Criteria
below) have been met, programs are integrated together and migrated to the next level of testing.
Table 2 Unit Test Entry/Exit Criteria, Completion Deliverables and Suspension Criteria
Entry Criteria:
• Completed Design Document
• Source Code Complete
• Unit Test Cases
Exit Criteria:
• Source code has been reviewed, errors corrected, and compilation completed without errors
Completion Deliverables:
• Source code with compile/link options
• Updated design documentation
• Updated Requirement Traceability Matrix
• Unit Test Report
Suspension Criteria:
• Design change/flaw
C.5 Integration/Functional Test Level
C.5.1 Purpose and Objective
Functional Tests demonstrate that the system functions as designed. The focus is on the functionality of the system
as seen externally. The key to success in Functional Test is the existence of well-defined and documented design and
business requirements or Software Requirement Specifications. The main purpose of this testing is to verify that the
software meets its specification.
C.5.2.1 Plan
A testing technique known as Black Box Testing is commonly planned for this level of testing. Black Box Testing is
focused on “what is done” as opposed to the White Box Testing focus of “how it is done”. Functional test analysts
should also plan to utilize various Black Box testing techniques such as Equivalence Partitioning, Boundary Value
Analysis and Comparison Testing, along with error processing techniques. Functional testing will consist of both
automated and manual testing based on project requirements. It is suggested that Regression testing (if required) be
done through automated test scripts.
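The Boundary Value Analysis and Equivalence Partitioning techniques named above can be mechanized for a hypothetical numeric input field; the function names and the [1, 100] range below are illustrative only:

```python
def boundary_values(lo, hi):
    """Classic boundary value picks for an inclusive [lo, hi] range:
    just below/at/above each boundary, plus a nominal mid value."""
    return sorted({lo - 1, lo, lo + 1, (lo + hi) // 2, hi - 1, hi, hi + 1})

def partition(value, lo, hi):
    """Equivalence partitions for the same range:
    one invalid class on each side, one valid class in the middle."""
    if value < lo:
        return "invalid-low"
    if value > hi:
        return "invalid-high"
    return "valid"

# For an input field accepting 1..100, seven values cover all
# boundaries and all three equivalence classes:
print(boundary_values(1, 100))  # [0, 1, 2, 50, 99, 100, 101]
```

Deriving test inputs this way keeps the FTC count small while still exercising the values most likely to expose off-by-one defects.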
C.5.2.2 Prepare
A Functional Test Plan will be created by the Testing Team with inputs from the Development Team. The test plan will
incorporate the test strategy as defined in this document, and the testing techniques identified above. Once the SRS
is reviewed, tester will start FTC creation in parallel to development phase. Gap analysis is prepared and RTM is
updated to map FTC with corresponding requirements. After FTC review, all the functional gaps, inaccurate functional
specs are covered and FTCs are updated accordingly.
C.5.2.3 Execute
Functional Test demonstrates that the system functions as designed and interacts correctly within a simulated
production environment, without any adverse effect on other systems or interfaces. Defects identified are logged
into a defect tracking tool with the severity of each defect.
C.5.2.4 Report
Time taken to write and execute the FTC is tracked. Problem logs are formally recorded and tracked. The process for
recording and tracking problem logs is located in the Test Management section of this document. When the exit criteria
and exit deliverables have been met, functionally tested and integrated code is promoted to the Acceptance environment.
C.5.3 Entry/Exit Criteria, Completion Deliverables and Suspension
Entry Criteria:
• … defined in the design at the Unit level
• Software migrated to system test environment
• The Functional/Integration Test Cases are available
Exit Criteria:
• … have been closed and verified
Completion Deliverables:
• Actual test results
• System Test Summary Report
• System Test Sign-off
• Gap Analysis report to trace out gaps in the development based on the requirements document
• Updated Requirement Traceability Matrix
Suspension Criteria:
• Change in project scope
• Test environment not available
• Large number of high severity or cutover critical problem logs opened in a short time period with no trend towards improvement
• Unacceptable turn-around time
• Gaps in functional areas in requirements and developed code
C.6 User Acceptance Test Level (UAT)
C.6.2.1 Plan
Customer personnel with diverse job functions and methodologies use the system as they would in their specific daily
work environment. Customers simulate live operation of the business system, including all business situations that
might reasonably occur. Testing will focus on verifying that the system meets the business requirements and that the
system performs as expected.
C.6.2.2 Prepare
A UAT Detail Test Plan will be created by the UAT Test Team. Test cases are made available by us to the Client QA
team or created by them, and are provided to system/business users for testing.
C.6.2.3 Execute
Testing is demonstrative rather than destructive, i.e., it demonstrates that the system works rather than focusing on
debugging. Destructive, comprehensive testing and debugging have already occurred during Unit, Integration and
Functional testing. The test cases are executed and the test results reviewed and documented. Any defects are
reported to the UAT Test Coordinator by the customer(s) for handling; these are logged and categorized by severity level.
C.6.2.4 Report
The users log defects in a defect tracking tool and assign them to the Development Team Lead/Project Manager, who
then assigns each defect to the appropriate development team member for fixing. The process for recording defects
is located in the Test Management section of this document. When defects are corrected, the business user who
logged the defect is notified, and the correction is re-introduced into the testing cycle for validation by the
customer(s). Once the User Acceptance Test exit criteria and exit deliverables have been met, customer sign-off is
obtained.
D Test Management
It is the role of test management to ensure that new or modified service products meet the business requirements for
which they have been developed or enhanced. The purpose of test management is to ensure that an efficient,
effective and economical testing strategy is both devised and applied. Test management is also concerned with test
resource and test environment management, as covered in other sections of this document.
• Test monitoring and assessment – ongoing monitoring and assessment of the integrity of the development and
construction. The status of the configuration items should be reviewed against the phase plans, and test
progress reports prepared, providing some assurance of the verification and validation activities.
• Product assurance – the decision to negotiate the acceptance testing program and the release and
commissioning of the service product is subject to the ‘product assurance’ role being satisfied with the
outcome of the verification activities. Product assurance may oversee some of the test activity and may
participate in process reviews.
Most of the test management activities are covered in other parts of this document. This section mainly focuses
on the problem logging and reporting aspects of test management.
A logged defect can be in only one status at any given time. The defect statuses are:
• New (newly logged defect)
• Resolved (resolved by the developer)
• Closed (closed by the tester after retesting)
• Re-Opened (re-opened by the tester after retesting)
Note: Document the defect resolution process here. Enumerate the defect tracking tool to be used; defect categories
and resolution statuses should also be described. Mention how defects will be tracked to closure once identified by
the testers.
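The statuses above form a small state machine. The sketch below assumes only the transitions implied by the list (a real defect tracking tool would define its own workflow and statuses):

```python
# Allowed defect status transitions, per the four statuses listed above.
# This is an illustrative model, not a mandated workflow.
TRANSITIONS = {
    "New": {"Resolved"},
    "Resolved": {"Closed", "Re-Opened"},
    "Re-Opened": {"Resolved"},
    "Closed": set(),  # terminal: defect verified by the tester
}

def move(status, new_status):
    """Validate a status change; raise on an illegal transition."""
    if new_status not in TRANSITIONS[status]:
        raise ValueError(f"illegal transition {status} -> {new_status}")
    return new_status

s = move("New", "Resolved")   # developer resolves the defect
s = move(s, "Re-Opened")      # tester re-opens after a failed retest
s = move(s, "Resolved")       # developer resolves again
s = move(s, "Closed")         # tester closes after a successful retest
print(s)  # Closed
```

Encoding the workflow this way makes illegal moves (e.g. a developer closing a defect directly from New) fail loudly instead of silently corrupting the defect report.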
D.3 Test Tools List
Test Tool    Purpose/Use    Planned Date of Acquisition for New Test Tool
Appendix G
Project Glossary
Term Definition