
State of Michigan

(Insert Project, Program or Organization Name Here)


Test Strategy

1. General Information

System or Project ID/Acronym:          Creation Date:
Client Agency:                         Modification Date:
Author(s):                             DTMB Authorized by:

2. Privacy Information
This document may contain information of a sensitive nature. This information should not be given to
persons other than those who are involved with this system/project or who will become involved during
its lifecycle.

3. Revision History
The Project Manager will maintain this information and provide updates as required. All updates to the
Project Management Plan and component plans should be documented in this section.
Revision Date | Author | Section(s) | Summary

4. Overview
The purpose of the Test Strategy is to communicate the overall testing approach to all appropriate
stakeholders (project team, business owners, sponsors, and management). This document identifies the
types of tests that will be performed, any tools needed to execute the tests, when each of those tests is
performed, who is responsible for those tests, and how the results of those tests are to be recorded and
communicated. Identifying and defining these testing assets early in the process assists with budget and
resource estimation and allocation. This document may pertain to a single project, a program, or an
organization.
Definitions for test types and test roles can be found in Appendix A.

5. Testing Strategy
5.1. Test Environments
This section describes the number, name, purpose, and location of each environment in the test
environment landscape. Include a diagram of the test environments, their locations, and their
relationships (how and when code moves from one environment to the next). Replace the example with a
diagram of the project's or organization's environments.



<Insert diagram or a table of test environments with a description for each environment>

5.2. Testing Tools


This section describes the approach being used for testing (manual and automated), traceability,
and test metric reporting. The table listed here has examples of various tools and is for reference. If
the organization does not have a standard tool identified, check the Architectural Roadmap before
making a selection. If the tool selected is not listed here, insert the name of the tool selected.

Tool Function/Purpose | Suggested Tools (currently on the Architectural Roadmap) | Cost
Test Planning; Test Cases; User Story or Requirement to Test Case traceability; Test execution reporting, etc. | Team Foundation Server | Yes
Test Planning; Test Cases; User Story or Requirement to Test Case traceability; Test execution reporting, etc. | HP ALM | Yes
Test Planning; Test Cases; User Story or Requirement to Test Case traceability; Test execution reporting, etc. | Rational Quality Manager | Yes
Test Planning; Test Cases; User Story or Requirement to Test Case traceability; Test execution reporting, etc. | SUITE Templates | No
Test Automation | WorkSoft | Yes
Test Automation | HP UFT | Yes
Test Automation | Selenium | No
Test Automation | Rational Functional Tester |
Test Automation | MS Test Manager | Yes
Performance Testing | Rational Performance Test | Yes
Performance Testing | HP Load Runner | Yes
Performance Testing | DynaTrace | Yes
Performance Testing | Team Foundation Server | Yes
Virtualization | Rational (RTVS) | Yes
ADA Compliance (eMich) | AccVerify, JAWS | Free - limited functionality; Yes - full functionality
<Other> | |
<Indicate the Tool/SUITE Templates that will be used>
Tool Function/Purpose | Suggested Tools (currently on the Architectural Roadmap)
Test Planning | Team Foundation Server, HP ALM, Rational Quality Manager, SUITE Templates
Test Case or User Story creation | Team Foundation Server, HP ALM, Rational Quality Manager, SUITE Templates
Requirement to Test traceability | Team Foundation Server, HP ALM, Rational Quality Manager, SUITE Templates
Test Metrics reporting | Team Foundation Server, HP ALM, Rational Quality Manager, SUITE Templates
Test Automation | WorkSoft, HP-UFT, Selenium, Rational Functional Tester, MS Test Manager
Performance Testing | Rational Performance Test, HP Load Runner, DynaTrace, Team Foundation Server
ADA Compliance | AccVerify, JAWS, eMichigan
<Other> |

5.3. Test Data Management


List the approach that will be used to create and maintain the test data:
• Approach for creating test data (programs, copies of production, etc.)
• Frequency/approach for refreshing test data
• Requirements for data masking when using copies of production test data
• Approach for creating conversion data
• Other information needed to manage test data for the project
<Describe the approach to the creation and maintenance of test and conversion data>

5.4. Roles and Responsibilities


Select from the appendix all of the test roles planned for the project and list the responsibilities of
each role. If the project uses a test role not listed in the Test Roles table, add it to the table below.
Tester Role | Tester Responsibility | Rate

5.5. Estimated Resources


Based on the phases or the number of deploy product increment test cycles that are planned, complete
the table below to account for the resources that will be needed for testing activities.
Phase/Product Increment | Type of resource needed for the test phase or product increment (e.g. Test Analyst, Developer, Business Analyst, Product Owner, End Users, etc.) | Estimated number of resources needed (by type) * | Percent allocation

* Consult Test Center of Excellence or Test Manager for an industry guideline on the ratio of
developers to testers.

5.6. Metrics & Reports


Projects need to provide both test execution and defect metrics throughout each test phase or
deploy product increment test cycle.
Projects should complete the following table to plan for collecting and reporting the metrics
associated with test execution, defect logging, defect correction, etc.
Metric | Phase / Product Increment it will be collected in | Reporting Frequency | Source of Truth | Audience
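For teams that export execution results from their test management tool (for example, as a CSV file), a short script can roll the raw results up into the metrics planned in the table above. The sketch below is illustrative only; the file name and column names (phase, status, defect_severity) are assumptions, not part of this template.

# Illustrative sketch only: aggregates exported test results into the
# execution and defect metrics planned in the table above.
# Assumed input: results.csv with columns phase, status, defect_severity.
import csv
from collections import Counter

def summarize(path="results.csv"):
    executed = Counter()   # tests executed per phase/product increment
    passed = Counter()     # tests passed per phase/product increment
    defects = Counter()    # defects recorded per severity
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            phase = row["phase"]
            executed[phase] += 1
            if row["status"].lower() == "pass":
                passed[phase] += 1
            elif row.get("defect_severity"):
                defects[row["defect_severity"]] += 1
    for phase in executed:
        print(f"{phase}: executed={executed[phase]}, passed={passed[phase]}")
    print("defects by severity:", dict(defects))

if __name__ == "__main__":
    summarize()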

5.7. Test Automation


Describe how test automation will be addressed in each phase/product increment of the project
(refer to the Test Type Detail section below for suggestions).
5.8. Critical Success Factors
Describe the activities or elements that are necessary for the test phases to be successful.
Examples may include:
• Proper staffing of qualified test resources
• Enablement of automation tools
• Availability of stable test environments
• Proper scheduling of test planning and execution activities



6. Test Type Detail
The information contained in this section represents industry best practices. Projects and organizations must decide which test types will be included in the
project or organizational test strategy and whether any test types must be customized to fit the project's or organization's needs.
The entries are separated by development methodology, agile or waterfall. Remove the section(s) that do not apply based on the project approach or the
organization's default development methodology. Modify the remaining sections to reflect the test activities of the project and/or organization.

Sprint Testing / Product Increment Testing (Agile)


Each test type entry below lists: Definition, Entry Criteria, Minimum Exit Criteria, Frequency, Test Creator, Test Executor, Environment, Associated Artifact, Location of Artifacts, Strategy for Issues Found During Stage, Record of Issues, and Metric Reported.
Test Type: Unit Test
Definition: Verification (Test it was built right). Unit Test is the process of testing individual units of functionality. Unit tests verify that individual system components support the related functional, non-functional (technical), and interface requirements as represented in the User Story.
Entry Criteria: Work has begun on a user story
Minimum Exit Criteria: Unit Test Code Coverage goal <xx %>
Frequency: Iterative during the sprint
Test Creator: Developer
Test Executor: Developer
Environment: Developer PC
Associated Artifact: Unit test case
Location of Artifacts: In code repository
Strategy for Issues Found During Stage: Problems resolved within sprint
Record of Issues: None
Metric Reported: % Test Code Coverage

Test Type: Functional User Story Test
Definition: Verification (Test it was built right). Functional testing is the process of testing each function of the user story in accordance with its respective user story acceptance criteria.
Entry Criteria: All code to satisfy the user story has been completed and tested
Minimum Exit Criteria: All documented acceptance criteria met; 100% of Unit Test cases passed; Unit Test Code Coverage goal met
Frequency: Iterative during the sprint
Test Creator: User Story Developer / Tester / Business Analyst
Test Executor: Tester / Business Analyst / Alternate Developer
Environment: Development Sandbox (AKA Dev Test)
Associated Artifact: Written Test Case of the User Story (automate when possible)
Location of Artifacts: In Test Case Repository
Strategy for Issues Found During Stage: All identified problems resolved within sprint before passing to PO
Record of Issues: None
Metric Reported: N/A

Test Type: Product Owner (PO) Functional User Story Acceptance Test
Definition: Verification (Test it was built right). Functional testing of each function of the user story in accordance with its respective user story acceptance criteria by the PO.
Entry Criteria: 100% of documented Functional Test Cases passed
Minimum Exit Criteria: PO executes all documented User Story test cases for all features developed within the sprint; 100% of documented test cases pass
Frequency: Iterative during the sprint
Test Creator: Product Owner / Business Analyst
Test Executor: Product Owner / End User Subject Matter Expert (SME)
Environment: PO Lab
Associated Artifact: Written Test Case of the User Story (automate when possible)
Location of Artifacts: In Test Case Repository
Strategy for Issues Found During Stage: Problems resolved within sprint, before Sprint Review/Demo
Record of Issues: None
Metric Reported: N/A


Test Type: Sprint Demo (Review)
Definition: Verification (Test it was built right). Final review by the PO of the functionality of each user story in accordance with the user story acceptance criteria. Validates that it meets the needs of the end users.
Entry Criteria: User Story has successfully passed: automated Unit Test, Functional Test, and PO User Story Acceptance Test
Minimum Exit Criteria: PO approves/accepts stories in the Backlog Tool, or e-mail sent and acceptance received and stored; all feedback items entered in the Product Backlog as new user stories
Frequency: Once per sprint
Test Creator: Scrum Team (demonstrates the completed work; solution presented for review)
Test Executor: Product Owner and invited observers
Environment: Sandbox or PO Lab
Associated Artifact: User Story and user story acceptance criteria
Location of Artifacts: Automated Backlog tool; project repository
Strategy for Issues Found During Stage: Defects logged as new defect stories in the Product Backlog as a Product Backlog Item (PBI) (or project equivalent); feedback logged as a new user story in the Product Backlog (new PBI)
Record of Issues: Accepted - Product Owner updates status in the tool to "accepted" or responds to e-mail
Metric Reported: 4UP Status Report metrics: Planned; Accepted/Approved; Moved to the next sprint (not complete or defect encountered); reported in story points or number of stories




Test Type: Regression Test
Definition: Verification. Regression testing ensures that existing functionality is not broken with the introduction of new code. Both new and presumably unchanged functionality is executed and validated. If automated, it is run daily.
Entry Criteria: Manual – Product Increment build complete
Minimum Exit Criteria: 100% of expected results achieved
Frequency: Manual – Once per Deploy Product Increment (repeated until all agreed-to defects are corrected). Automated – once per day (generally during non-working hours) and at Deploy Product Increment.
Test Creator: Product Owner / Business Analyst / Test Analyst / Automation Engineer
Test Executor: Automation Engineer / Product Owner / End User SME (may vary depending on what is being tested)
Environment: PO Lab
Associated Artifact: All PO user story tests for User Stories accepted to date (manual or automated)
Location of Artifacts: Test Tool Repository, or Excel Test Case template, or SUITE Test Case Template
Strategy for Issues Found During Stage: Manual – agreed-upon defects resolved before deploying code. Automated – defect logged as defect PBI and fixed before next test cycle (when not fixed before the next test cycle, the test is removed from the automated script).
Record of Issues: Defect is logged as new defect PBI
Metric Reported: Manual – on the Deploy Product Increment (DPI) Closure report. Automated – on 4UP Status Report with sprint metrics: number of regression test executions; number of regression test executions that failed; number of defects recorded.




Test Type: Product Increment (PI) (End 2 End) – used when the product increment is NOT being deployed to production; System Integration Test (SIT) / User Acceptance Test (UAT) – used when the product is being deployed to production
Definition: Verification (Test it was built right). End 2 End and/or System Integration Test (SIT) is the process of testing the complete application in a fully integrated testing environment that mimics real-world use, meeting business scenarios including the interaction with external systems. User Acceptance Test (UAT) is the process of software users testing the software to make sure it can handle required tasks in real-world scenarios, according to the User Story specification.
Entry Criteria: All sprints for the Product Increment are complete; all user stories for the Product Increment accepted by the PO; 100% of automated regression tests passed; Unit Test Code Coverage for the PI (> xx %)
Minimum Exit Criteria: All (PI End 2 End) test scenarios passed and no Critical/High defects are open. Additional SIT criteria: all integration test cases passed and no Critical/High defects are open. Optionally the PO may require all Medium/Low defects resolved before completion.
Frequency: At the end of a Product Increment (repeat testing if defects need remediating)
Test Creator: Product Owner or End User SME; may be assisted by Tester / Business Analyst
Test Executor: Product Owner or End User SME
Environment: PO Lab or SIT Environment
Associated Artifact: Business Test Scenarios for the Product Increment; Regression Tests; SIT – additional SIT Test Cases
Location of Artifacts: Testing Tool Repository, or Excel Test Case template, or SUITE Test Case Template
Strategy for Issues Found During Stage: Defects assessed for impact; Critical/High/Medium/Low (optionally) fixed during testing phase
Record of Issues: Defects logged as new defect PBIs
Metric Reported: Total Tests Executed; Total Tests Passed; Total Defects Recorded; Total Defects Corrected




Test Type: User Acceptance Test (UAT) – when deploying a PI to production. May be a final UAT phase of testing after the final PI is complete.
Definition: Verification (Test it was built right) and Validation (We built the right thing). User Acceptance Test (UAT) is the process of software users testing the software to make sure it can handle required tasks in real-world scenarios, according to the User Story specification.
Entry Criteria: All sprints for the Product Increment are complete; all End 2 End testing complete (if a separate phase of testing); 100% of Unit Test cases passed; 100% of automated regression tests passed; Unit Test Code Coverage for the Product (> xx %)
Minimum Exit Criteria: All SIT test scenarios passed and no Critical/High/Medium defects outstanding
Frequency: When deploying a Product Increment to production (repeat testing if critical or high defects need remediating)
Test Creator: Product Owner or Business SME; may be assisted by Tester / Business Analyst
Test Executor: Product Owner or SME End User
Environment: UAT Environment
Associated Artifact: UAT Test Cases; Automated Regression Test scripts; UAT Test Cases (optional)
Location of Artifacts: Testing Tool Repository, or SUITE templates, or Project's Excel Test Case template, or added to the Product Backlog as defect stories for resolution in future sprints
Strategy for Issues Found During Stage: Critical/High defects must be fixed during the testing period; Medium/Low may be fixed during the testing period
Record of Issues: Defects
Metric Reported: Total Tests Executed; Total Tests Passed; Total Defects Recorded; Total Defects Corrected

Test Type: Accessibility (ADA Compliance) – prior to deploy to production. May perform a preliminary review of ADA compliance by checking compliance during the build process.
Definition: Validation (Built the right thing). ADA Compliance states that electronic and information technology must be accessible to people with disabilities in accordance with Americans with Disabilities Act (ADA) Standards published by DOJ.
Entry Criteria: Product has on-line/mobile features; PI is being deployed to production; UAT complete and all prioritized defects remediated
Minimum Exit Criteria: All ADA compliance issues addressed, or written authorization from PO to deploy with known issues
Frequency: When deploying to production (repeat testing if critical or high defects need remediating)
Test Creator: e-Michigan Team (PM / BA coordinates)
Test Executor: e-Michigan
Environment: UAT Environment
Associated Artifact: ADA Compliance standards test results
Location of Artifacts: e-Michigan Test tools
Strategy for Issues Found During Stage: Out-of-compliance results must be remediated, or written authorization from Product Owner or Sponsor to deploy with known issues
Record of Issues: Defects logged as new defect PBIs
Metric Reported: ADA Compliance Testing: Total Tests Executed; Total Tests Passed; Total Defects Recorded; Total Defects Corrected




Test Type: Conversion
Definition: Verification (Test it was built right). The data conversion programs populate the new databases correctly.
Entry Criteria: All sprints building conversion features have completed; 100% of User Story Test cases passed
Minimum Exit Criteria: All conversion test scenarios passed
Frequency: Iterative until all conversion tests pass
Test Creator: Product Owner / End User SME; may be assisted by Tester / Business Analyst
Test Executor: Product Owner / End User SME; may be assisted by Tester / Business Analyst
Environment: Conversion or UAT Environment
Associated Artifact: Conversion User Stories / Test Cases / Scenarios
Location of Artifacts: Testing Tool Repository, or SUITE templates, or Project's Excel Test Case template
Strategy for Issues Found During Stage: All defects remediated or workaround documented
Record of Issues: Defects logged as new defect PBIs
Metric Reported: Conversion: Total Tests Executed; Total Tests Passed; Total Defects Recorded; Total Defects Corrected; others as defined in the Detail Test Plan


Test Type: Performance Test (note: only changes to address performance issues can be made to the environment during this test phase)
Definition: Validation (We built the right thing). A type of non-functional test whose purpose is to identify and fix system performance issues before the system goes live.
Entry Criteria: End 2 End / SIT / UAT has been started and the system code base is considered to be in a 'stable' state; load volumes have been documented
Minimum Exit Criteria: Performance meets or exceeds the documented non-functional requirement for response time
Frequency: Once before release to production (repeat until all performance goals are met)
Test Creator: Performance Test Specialist, or Technical Lead, or Automation Engineer
Test Executor: Performance Test Specialist, or Technical Lead, or Automation Engineer
Environment: UAT (mirror of Production)
Associated Artifact: Performance Test objectives and performance test scripts
Location of Artifacts: Automated Performance Testing Tool repository, or application code repository
Strategy for Issues Found During Stage: All non-conformances remediated, or written authorization from Product Owner to deploy with known issues
Record of Issues: Defects logged as new defect PBIs
Metric Reported: Performance: Total Tests Executed; Total Tests Passed; Total Defects Recorded; Total Defects Corrected; others as defined in the Detail Test Plan


Test Type: Post Deploy Validation
Definition: Verification & Validation that deployed functionality and Data Conversions (if planned) have been completed
Entry Criteria: RFC designated Deploy to Production
Minimum Exit Criteria: Product Owner and designated testers have concluded the system is ready for use by the intended audience
Frequency: Once
Test Creator: Product Owner
Environment: Production
Associated Artifact: Reuse selected UAT Test Scenarios and cases
Location of Artifacts: Test Case Repository
Strategy for Issues Found During Stage: If issues can be resolved immediately – correct issues; or execute back out procedures
Record of Issues: Update RFC as appropriate
Metric Reported: Deploy: Success or Backed out

Add others as needed



Phase-Based Testing (Waterfall)
Each test type entry below lists: Definition, Entry Criteria, Minimum Exit Criteria, When and How Testing Occurs, Test Creator, Test Executor, Environment, Associated Artifact, Location of Artifacts, Strategy for Issues Found During Stage, Record of Issues, and Metric Reported.
Test Type: Unit Test
Definition: Verification (Test it was built right). The developed component works correctly from the developer's perspective.
Entry Criteria: Technical design documents are completed. Component is developed.
Minimum Exit Criteria: 100% of Unit Test cases passed
When and How Testing Occurs: Testing occurs as components are completed.
Test Creator: Developer
Test Executor: Developer
Environment: Developer PC
Associated Artifact: Unit test case
Location of Artifacts: If automated, Testing Tool Repository
Strategy for Issues Found During Stage: Problem
Record of Issues: None
Metric Reported: None


Test Type: String Test
Definition: String Testing is conducted at the sub-process level to ensure the process tasks and transactions of the sub-process interact correctly. It is typically an exhaustive test focusing on the system functions within the context of a business process. Successful completion of String Test efforts establishes readiness for Integration Testing.
Entry Criteria: 100% of Unit Test cases passed for the component; String Test plans have been reviewed and approved.
Minimum Exit Criteria: All documented acceptance criteria met.
When and How Testing Occurs: String testing is conducted after all components comprising a complete business function have been developed and have passed unit testing.
Test Creator: Developer
Test Executor: Developer
Environment: DevTest
Associated Artifact: String Test Plans
Location of Artifacts: Testing Tool Repository, or SUITE templates, or Project's Excel Test Case template
Strategy for Issues Found During Stage: Problem
Record of Issues: None
Metric Reported: Percent Completed




Test Type: System Integration Test (SIT) - End 2 End
Definition: System Integration Test (SIT), and/or End 2 End testing, is the process of testing the complete application in a fully integrated testing environment that mimics real-world use, meeting business scenarios including the interaction with external systems. The purpose of SIT is to verify that the solution works end to end, as designed.
Entry Criteria: 100% of String Test plans executed. Test Planning is complete. Environment is ready for testing. Integrating systems are ready for testing.
Minimum Exit Criteria: All SIT test cases executed and no Critical/High defects exist. Product meets agreed-upon quality standards.
When and How Testing Occurs: System Integration Testing occurs after all string testing has been completed. Typically, all release code is migrated to the QA environment simultaneously to mimic how code will be deployed to production.
Test Creator: Test Lead, Test Analyst, or Business Analysts
Test Executor: Test Analyst
Environment: QA Environment
Associated Artifact: SIT Test Cases; Automated and Manual Regression Test scripts; UAT Test Cases (optional)
Location of Artifacts: Testing Tool Repository, or SUITE templates, or Project's Excel Test Case template
Strategy for Issues Found During Stage: Critical/High defects must be fixed during the testing period; Medium/Low may be fixed during the testing period or deferred to a future release
Record of Issues: Defect Logs
Metric Reported: Total Test Cases Executed; Total Test Cases Passed/Failed; Total Defects (often broken down by criticality, business function, etc.); Defect Detection Trend


Test Type: User Acceptance Test (UAT)
Definition: The last phase of the software testing process. During UAT, the software users test the software to make sure it can handle required tasks in real-world scenarios, according to specifications.
Entry Criteria: 100% of System Integration Test Cases executed. UAT Test Planning completed. UAT Test Environment ready for testing.
Minimum Exit Criteria: All UAT Test Cases passed and product meets agreed-upon quality standards
When and How Testing Occurs: UAT should be the last phase of testing within the release cycle.
Test Creator: Business SMEs, End Users – often assisted by Test Analysts
Test Executor: Business SMEs, End Users
Environment: UAT Environment
Associated Artifact: UAT Test Cases; Automated and/or manual Regression Test scripts
Location of Artifacts: Testing Tool Repository, or SUITE templates, or Project's Excel Test Case template
Strategy for Issues Found During Stage: Critical/High defects must be fixed during the testing period; Medium/Low may be fixed during the testing period or deferred to a future release
Record of Issues: Defect Logs
Metric Reported: Total Test Cases Executed; Total Test Cases Passed/Failed; Total Defects (often broken down by criticality, business function, etc.); Defect Detection Trend


Test Type: Accessibility (ADA Compliance) – prior to deploy to production. May build a preliminary review of ADA compliance into the Business Requirements / Technical Design phases.
Definition: Validation (Built the right thing). The developed screens have been coded correctly to support the State of Michigan's direction on ADA compliance.
Entry Criteria: Product has on-line/mobile features; SIT complete and all prioritized defects remediated
Minimum Exit Criteria: All ADA compliance issues addressed, or written authorization from Sponsor/Release Owner to deploy with known issues
When and How Testing Occurs: Prior to deploying to production (repeat testing if critical or high defects need remediating)
Test Creator: e-Michigan Team (PM / BA coordinates)
Test Executor: e-Michigan
Environment: UAT Environment
Associated Artifact: ADA Compliance standards test results
Location of Artifacts: e-Michigan Test tools
Strategy for Issues Found During Stage: Out-of-compliance results must be remediated, or written authorization from Sponsor/Release Owner to deploy with known issues
Record of Issues: Defect Log
Metric Reported: ADA Compliance Testing: Total Tests Executed; Total Tests Passed; Total Defects Recorded; Total Defects Corrected

Test Type: Conversion
Definition: Conversion Testing validates that data conversion programs populate the new databases correctly and completely.
Entry Criteria: Conversion test plan has been completed. System Integration Testing has started.
Minimum Exit Criteria: All conversion test scenarios passed
When and How Testing Occurs: Iterative until all conversion tests pass
Test Creator: Test Analyst / End User SME / Business Analyst
Test Executor: Test Analyst / End User SME / Business Analyst
Environment: Conversion, or QA, or UAT Environment
Associated Artifact: Conversion Test Cases
Location of Artifacts: Testing Tool Repository, or SUITE templates, or Project's Excel Test Case template
Strategy for Issues Found During Stage: All defects remediated, or workaround documented and approved
Record of Issues: Defect Log
Metric Reported: Conversion: Total Tests Executed; Total Tests Passed; Total Defects Recorded; Total Defects Corrected


Test Type: Performance Test (note: only changes to address performance issues can be made to the environment during this test phase)
Definition: The system performs to the documented non-functional requirement for response time.
Entry Criteria: SIT / UAT has been started and the system code base is considered to be in a 'stable' state; load volumes have been documented
Minimum Exit Criteria: Performance meets or exceeds the documented non-functional requirement for response time
When and How Testing Occurs: Once before release to production (repeat until all performance goals are met)
Test Creator: Performance Test Specialist, or Technical Lead
Test Executor: Performance Test Specialist, or Technical Lead
Environment: Any environment that mirrors production in size and configuration
Associated Artifact: Performance Test objectives and performance test scripts
Location of Artifacts: Automated Performance Testing Tool repository, or application code repository
Strategy for Issues Found During Stage: All non-conformances remediated, or written authorization from Sponsor/Release Owner to deploy with known issues
Record of Issues: Defect Log
Metric Reported: Performance: Total Tests Executed; Total Tests Passed; Total Defects Recorded; Total Defects Corrected; others as defined in the Detail Test Plan

Test Type: Regression Test
Definition: Regression Tests detect whether any previously working code is no longer working due to emerging code changes.
Entry Criteria: Manual – SIT build complete. Automated – Automation Environment is ready for testing.
Minimum Exit Criteria: 100% of expected results achieved
When and How Testing Occurs: Manual – once per System Integration and/or User Acceptance test phase. Automated – iteratively as planned.
Test Creator: Test Analyst / End User SME / Automation Engineer
Test Executor: Test Analyst / End User SME / Automation Engineer
Environment: QA Environment or Automated Testing Environment
Associated Artifact: Manual or Automated Test Cases
Location of Artifacts: Testing Tool Repository, or SUITE templates, or Project's Excel Test Case template
Strategy for Issues Found During Stage: Manual – agreed-upon defects resolved before deploying code. Automated – defect logged as defect PBI and fixed before next test cycle (when not fixed before the next test cycle, the test is removed from the automated script).
Record of Issues: Defect is logged as new defect PBI
Metric Reported: Manual – on the Deploy Product Increment (DPI) Closure report. Automated – on 4UP Status Report with sprint metrics: number of regression test executions; number of regression test executions that failed; number of defects recorded.


Test Type: Post Deploy Validation
Definition: Verification & Validation that deployed functionality and Data Conversions (if planned) have been completed
Entry Criteria: RFC designated Deploy to Production. Operational Readiness criteria have been met.
Minimum Exit Criteria: All operational readiness criteria have been verified and the system is ready for use by the intended audience
When and How Testing Occurs: Once after all code has been deployed to production and data conversions have happened.
Test Creator: Release Owner
Test Executor: Developers / Test Analyst / End User SME
Environment: Production
Associated Artifact: Operational Readiness Criteria and/or UAT Cases
Location of Artifacts: Test Case Repository
Strategy for Issues Found During Stage: If issues can be resolved immediately – correct issues; or execute back out procedures
Record of Issues: Update RFC as appropriate
Metric Reported: Deploy: Success or Backed out

Add others as needed



7. Approval Information
Approvers Signatures (must be authorized approvers/members of the project Change Control Board)

Role | Name/Title | Signature | Date
Business Owner (Authorized Approver) | | |
DTMB System Owner (if identified) | | |
Project Manager | | |
Project Test Manager | | |



8. Appendix
8.1. Test Type Definitions
Verification ensures that "you built it right."
Validation ensures that "you built the right thing."
ADA Compliance: ADA Compliance states that electronic and information technology must be accessible to people with disabilities in accordance with Americans with Disabilities Act (ADA) Standards published by DOJ.
Automated Test: Unattended testing – scripted tests that can be executed repeatedly with known input/output conditions. Automated testing tools can be utilized for functional, regression, and performance testing.
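As one hedged illustration of what an automated functional script might look like (Selenium is listed among the suggested tools in Section 5.2, but the project's chosen tool may differ), the sketch below drives a browser and checks a page title. The URL, element locators, and expected title are placeholders, not part of this template.

# Minimal Selenium WebDriver sketch (illustrative only; URL, locators, and
# expected values are placeholders, not project specifics).
from selenium import webdriver
from selenium.webdriver.common.by import By

driver = webdriver.Chrome()  # requires a local Chrome/ChromeDriver install
try:
    driver.get("https://example.org/login")                         # placeholder URL
    driver.find_element(By.ID, "username").send_keys("test.user")   # placeholder locator
    driver.find_element(By.ID, "password").send_keys("not-a-real-password")
    driver.find_element(By.ID, "submit").click()
    assert "Home" in driver.title, "expected landing page title not found"
finally:
    driver.quit()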
Functional: Functional testing is the process of testing each function of the software application in accordance with its respective Functional Specifications. In this testing, each and every functionality of the system is tested by providing appropriate input, verifying the output, and comparing the actual results with the expected results. This testing involves checking the User Interface, APIs, database, security, client/server applications, and functionality of the Application under Test. The testing can be done either manually or using automation.
Performance: A type of non-functional test whose purpose is to identify and fix system performance issues before the system goes live. This generally includes load testing, stress testing, stability or volume testing, throughput testing, and ongoing performance monitoring. Performance testing also characterizes the performance of the application and infrastructure under test, providing project stakeholders with data that will guide and drive related capacity planning activities.
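A full load test is normally run with one of the performance tools listed in Section 5.2; the standard-library sketch below only illustrates the idea of issuing concurrent requests and recording response times against a stated goal. The endpoint URL, request count, and response-time threshold are assumptions for illustration.

# Illustrative load-test sketch (standard library only): fires concurrent
# requests and reports response-time statistics against an assumed goal.
import statistics
import time
import urllib.request
from concurrent.futures import ThreadPoolExecutor

URL = "https://example.org/health"   # placeholder endpoint
REQUESTS = 50                        # assumed load for illustration
GOAL_SECONDS = 2.0                   # assumed response-time requirement

def timed_request(_):
    start = time.perf_counter()
    with urllib.request.urlopen(URL, timeout=10) as resp:
        resp.read()
    return time.perf_counter() - start

with ThreadPoolExecutor(max_workers=10) as pool:
    durations = list(pool.map(timed_request, range(REQUESTS)))

print(f"avg={statistics.mean(durations):.3f}s  max={max(durations):.3f}s")
print("goal met" if max(durations) <= GOAL_SECONDS else "goal NOT met")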

Regression Test: The purpose of Regression testing is to ensure that existing functionality is not broken with the introduction of new code. Both new and presumably unchanged functionality is executed and validated. Regression testing can be performed across all test stages and must be conducted in System Integration Test (SIT) and User Acceptance Test (UAT).
Given the frequency of code changes being introduced using an agile approach, daily regression testing is necessary and is usually automated.
The Regression Suite should be reviewed every cycle to validate whether automated scripts need to be removed or added. It is important for the regression test suite to be a solid representation of all the activities that go on in a true production environment.
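For projects whose automated regression suite can be run from the command line, the daily run described above is typically scheduled outside working hours. The sketch below is a generic illustration of such a driver script; the runner command, paths, and log location are assumptions, since this template does not prescribe a tool.

# Illustrative nightly regression driver (assumed paths and commands);
# intended to be invoked by the team's scheduler outside working hours.
import datetime
import pathlib
import subprocess

LOG_DIR = pathlib.Path("regression_logs")        # assumed log location
SUITE_CMD = ["pytest", "tests/regression", "-q"]  # assumed runner; substitute the project's tool

def run_nightly_regression() -> bool:
    LOG_DIR.mkdir(exist_ok=True)
    stamp = datetime.datetime.now().strftime("%Y-%m-%d")
    result = subprocess.run(SUITE_CMD, capture_output=True, text=True)
    (LOG_DIR / f"regression_{stamp}.log").write_text(result.stdout + result.stderr)
    # A non-zero exit code means failures: per the strategy above, log each
    # failure as a defect and fix (or pull the test) before the next cycle.
    return result.returncode == 0

if __name__ == "__main__":
    print("regression passed" if run_nightly_regression() else "regression FAILED - see log")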

Secure Coding (AppScan tool): The objective of security testing is to uncover vulnerabilities of a system and determine how protected the data and resources are from external and internal threats. This type of non-functional test is performed in accordance with the security specifications and is used to measure the effectiveness of the system's authorization mechanism, the strength of authentication, the integrity of the data, the availability of the system in the event of an attack, and the level of non-repudiation.
Smoke Test: Smoke testing is performed during SIT as well as during UAT immediately following a code migration and is designed to detect potential show-stopping defects which may impede tester activities in SIT or UAT. This is an initial testing effort to make sure the build and environment are stable enough to continue test execution. Prior to the start of test execution, a list of very important test scripts can be identified that will constitute the smoke test scripts.
String Testing: String Testing is a development-level test that is performed by developers after completing unit testing. It is a business process test conducted at the sub-process level to ensure the process tasks and transactions of the sub-process interact correctly. It is typically an exhaustive test focusing on the system functions within the context of a business process. Successful completion of String Test efforts establishes readiness for Integration Testing.

System Integration Test (SIT) (AKA End 2 End; previously used terms: Systems and Standards Testing, Integration Testing, Quality Assurance Testing, System Testing): End 2 End and/or System Integration Test (SIT) is the process of testing the complete application in a fully integrated testing environment that mimics real-world use, meeting business scenarios including the interaction with external systems. This involves utilizing network communications and simulating interactions with other applications or hardware in a manner matching that of the production environment. The purpose of SIT is to verify that the solution works end to end, as designed.
User Acceptance Test (UAT): The last phase of the software testing process. During UAT, the software users test the software to make sure it can handle required tasks in real-world scenarios, according to specifications.
Unit Test: Unit Test is the process of testing individual units of functionality. Unit tests verify that individual system components support the related functional, non-functional (technical), and interface requirements as represented in the system specifications. This testing is performed by developers and ensures that the smallest testable module of a solution successfully performs a specific task(s). Unit tests are "checked in" with the corresponding code, and may be automated to execute with each build.
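As a minimal, hedged illustration of the definition above (the function under test and its expected values are invented for the example, and the project may use a different framework or language), a unit test checked in with its code might look like:

# Minimal unit test sketch using Python's built-in unittest framework.
# The calculate_late_fee function is a hypothetical example, not part of this template.
import unittest

def calculate_late_fee(days_late: int, daily_rate: float = 0.50) -> float:
    """Hypothetical unit of functionality under test."""
    if days_late <= 0:
        return 0.0
    return round(days_late * daily_rate, 2)

class CalculateLateFeeTest(unittest.TestCase):
    def test_no_fee_when_not_late(self):
        self.assertEqual(calculate_late_fee(0), 0.0)

    def test_fee_accrues_per_day(self):
        self.assertEqual(calculate_late_fee(3), 1.50)

if __name__ == "__main__":
    unittest.main()  # typically wired into the build so unit tests run on each check-in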

8.2. Test Roles

Test Lead/Manager: The Test Manager provides supervision and guidance and is responsible for prioritizing and coordinating the testing activities of a test team. The Test Manager monitors resource allocation for staff and contractors to align with testing priorities and maintain the effectiveness of testing operations.



Test Analyst: The Tester provides expertise in the planning, construction, and execution of software quality checks, and is responsible for applying business and functional knowledge to meet the team's overall test objectives. The Tester has expertise in testing principles, processes, and methods, including agile methods.
The Tester is also responsible for ensuring that the testing standards, guidelines, and testing methodology are applied as specified in the project's test plan and that all testing results are easily accessible and understandable.
The Tester may perform defect coordination functions, ensuring that test defects are tracked to closure and that the defect repository is kept up to date with the current status. Also included in the defect coordination function is the creation and distribution of test status metrics as required by the project. The Tester interacts with the Product Owner and other domain Subject Matter Experts to coordinate testing efforts.
Automation Engineer: The Automation Engineer is a trained, experienced tester who has received additional training in tools used to design, build, and execute automated testing scripts.
SME: Subject Matter Expert - a user, generally agency personnel, with extensive knowledge of the business domain, expectations, and variations. The SME can test the system, identifying the normal and unusual business situations that occur in general use of the application.
Template Revision History

1. General Information
2. Privacy Information
3. Revision History
4. Overview



5. Testing Information
5.1. Test Environments
5.2. Testing Tools
5.3. Test Data Management
5.4. Conversion Test Data Management
5.5. Testing Resources (Human)
5.6. Testing Metrics
5.7. Test Automation
5.8. Test Critical Success Factors
6. Test Type Detail
7. Approvals
8. Appendix
8.1. Test Type Definitions
8.2. Test Role

