Project Traceability Matrix
Purpose
This template is used to monitor how customer requirements are met, from definition to final acceptance.
Some fields have a proposed validation (see list in Parameters).
If required, the options can be changed and adapted to project needs.
Field Instructions
ID This ID is automatically generated in the list
Category Category can refer to a business unit or a customer organisational area
Type Examples of group of types:
Functional, Architectural, Others
WRICEFS
Req Description Short description of the requirement; a wider description is in the Ref Document
Ref ID The number by which the requirement is referenced, for instance in a document.
While the ID is generated in this table only, the Ref ID is how the client refers to the requirement.
Ref Document Document in which a detailed description of the requirement can be found
MoSCoW Options (data validation is pre-defined):
Must Have Critical requirement which is required for the project to be successful
Should Have Important requirement which is less critical for the implementation and can be satisfied by workarounds
Could Have Desirable requirement, assumed to require little effort, which may be dropped in case of resource and time limitations
Won't Have Low priority requirement which will not be implemented
They could be added if time and resources allow and other higher-priority requirements are already implemented.
They may be considered for future releases.
Status Green, Yellow, Red (data validation is pre-defined)
Progress Available choices (data validation pre-defined):
Req Definition, Development, Unit Test, Application Test, Integration Test, Acceptance Test, Accepted
Test ID Test case ID
Test result Possible test case results for this requirement (data validation is pre-defined):
Not Ready The test case is not yet ready to be executed
Examples: The test case has not been defined yet, implementation is still in progress, or test data are yet to be prepared
Not Run The test case was not run as planned (different from "Blocked" below)
Examples: The testers were not available or there is a schedule slippage
Blocked It was not possible to execute the test case
Examples: Failures in other parts of the system, pre-required test cases not yet run, or missing input data
Passed The test case worked as expected, with no issues
Caution The test case almost worked as expected, but there are some minor issues (passed with warnings or conditionally passed)
Failed The test case did not work as expected; there are issues
If the test case works as expected for this requirement, but fails in some other part outside the requirement, it is still "Passed".
Test date Date when the test case was last executed or when the info on the test case is provided (e.g. when the test case was Blocked)
Notes Additional Notes, if required.
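The pre-defined validation lists above (MoSCoW, Status, Progress, Test result) can be sketched as a simple lookup check. This is only an illustration outside the template; the actual validation is pre-defined in the spreadsheet itself, and the field and value names below are taken from the lists in this document:

```python
# Allowed values per validated field, as listed in the instructions above.
VALID_VALUES = {
    "MoSCoW": {"Must Have", "Should Have", "Could Have", "Won't Have"},
    "Status": {"Green", "Yellow", "Red"},
    "Progress": {"Req Definition", "Development", "Unit Test", "Application Test",
                 "Integration Test", "Acceptance Test", "Accepted"},
    "Test result": {"Not Ready", "Not Run", "Blocked", "Passed", "Caution",
                    "Failed", "N/A"},
}

def validate(field: str, value: str) -> bool:
    """Return True if the value is allowed for the given field.

    Fields without a pre-defined list (e.g. Notes) accept any value.
    """
    allowed = VALID_VALUES.get(field)
    return allowed is None or value in allowed

print(validate("MoSCoW", "Should Have"))  # True
print(validate("Status", "Blue"))         # False
```

Such a check is useful when matrix rows are exported from the spreadsheet and processed elsewhere, to catch values typed outside the validation lists.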
Customer Sponsor
Customer Project Manager
Project Background
Project Objectives
Updated by
Date
Reviewed / Checked by
Date
Version
Test result	Count	%
Not Ready	0	-
Not Run	0	-
Blocked	0	-
Passed	0	-
Caution	0	-
Failed	0	-
N/A	0	-
Total	0
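The summary counts above can be derived from the Test result column of the matrix. A minimal sketch, assuming the results have been exported as a plain list of strings (the sample values below are hypothetical):

```python
from collections import Counter

# Hypothetical Test result values taken from the matrix rows.
results = ["Passed", "Passed", "Failed", "Not Run", "Blocked", "Passed"]

CATEGORIES = ["Not Ready", "Not Run", "Blocked", "Passed", "Caution", "Failed", "N/A"]

counts = Counter(results)
total = len(results)

# Print one line per category: count and share of the total.
for category in CATEGORIES:
    n = counts.get(category, 0)
    share = f"{100 * n / total:.0f}%" if total else "-"
    print(f"{category:<10}\t{n}\t{share}")
print(f"{'Total':<10}\t{total}")
```

In the spreadsheet itself the same counts would typically come from COUNTIF-style formulas over the Test result column.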