(Insert Project, Program or Organization Name Here) : State of Michigan Test Strategy
1. General Information
System or Project ID/Acronym:          Creation Date:
Client Agency:                         Modification Date:
Author(s): DTMB                        Authorized by:
2. Privacy Information
This document may contain information of a sensitive nature. This information should not be given to
persons other than those who are involved with this system/project or who will become involved during
its lifecycle.
3. Revision History
The Project Manager will maintain this information and provide updates as required. All updates to the
Project Management Plan and component plans should be documented in this section.
Revision Date Author Section(s) Summary
4. Overview
The purpose of the Test Strategy is to communicate the overall testing approach to all appropriate
stakeholders (project team, business owners, sponsors, and management). This document identifies the
types of tests that will be performed, any tools needed to execute them, when each test is performed,
who is responsible for each test, and how test results are to be recorded and communicated. Identifying
and defining these testing assets early in the process assists with budget and resource estimation and
allocation. This document may pertain to a single project, a program, or an organization.
Definitions for test types and test roles can be found in Appendix A.
5. Testing Strategy
5.1. Test Environments
This section describes the number, name, purpose, and location of each environment in the test
landscape. Include a diagram of the test environments showing their locations and relationships (how
and when code moves from one environment to the next). Replace the example with a diagram of the
project's or organization's environments.
<Other>
* Consult the Test Center of Excellence or Test Manager for an industry guideline on the ratio of
developers to testers.
Test: Sprint Demo (Review)
Purpose: Verification (test that it was built right). Final review by the Product Owner (PO) of the functionality of each user story in accordance with the user story acceptance criteria; validates that it meets the needs of the end users.
Entry Criteria: User story has successfully passed automated or reviewed Unit Test, Functional Test, and PO User Story Acceptance Test.
Exit Criteria: PO approves/accepts stories in the completed work; e-mail sent and acceptance received; all feedback items entered into the Product Backlog as new user stories.
When: Once per sprint.
Performed By: Scrum Team demonstrates the completed work, or the solution is presented for review by e-mail; observers invited.
Approved By: Product Owner.
Environment: Sandbox or Lab.
Test Assets / Tools: User story acceptance criteria stored in the Product Backlog tool as a Product Backlog Item (PBI), or project equivalent; project repository.
Defect Handling: Defects logged as new defect stories in the Product Backlog (new PBIs); PO updates the story status in the tool to "Accepted/Approved" or "Moved to the next sprint" (not complete, or defect encountered).
Reporting / Metrics: 4UP Status Report with PO metrics: planned vs. accepted stories (reported in story points or number of stories); number of defects recorded.
Test: Accessibility (ADA Compliance)
Purpose: Validation (built the right thing). ADA compliance verified prior to deploying to production. Electronic and information technology must be accessible to people with disabilities, in accordance with Americans with Disabilities Act (ADA) Standards published by the DOJ. A preliminary review of ADA compliance may be performed by checking compliance during the build process.
Entry Criteria: Product has on-line/mobile features; PBI is being deployed to production; UAT is complete and all prioritized defects remediated.
Exit Criteria: All ADA compliance issues addressed, or written authorization from the PO to deploy with known issues.
When: When deploying to production (repeat testing if critical or high defects need remediating).
Performed By: e-Michigan Team (PM/BA coordinates).
Environment: UAT Environment.
Test Assets / Tools: ADA compliance standards; e-Michigan test tools.
Defect Handling: Out-of-compliance test results must be remediated, or written authorization obtained from the Product Owner or Sponsor to deploy with known issues; defects logged as new defect PBIs.
Reporting / Metrics: ADA Compliance Testing: total tests executed, total tests passed, total defects recorded, total defects corrected.
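A preliminary ADA compliance review during the build process can be partially automated. The sketch below is illustrative only (it is not an e-Michigan tool, and the page content is hypothetical); it flags images that lack the text alternative required by accessibility standards:

```python
from html.parser import HTMLParser

class AltTextChecker(HTMLParser):
    """Flags <img> tags with no alt attribute at all.

    Accessibility standards require a text alternative for images;
    decorative images may use alt="" legitimately, so only a missing
    attribute is reported here.
    """

    def __init__(self):
        super().__init__()
        self.violations = []  # (line, column) of each offending <img>

    def handle_starttag(self, tag, attrs):
        if tag == "img" and "alt" not in dict(attrs):
            self.violations.append(self.getpos())

def check_alt_text(html: str) -> list:
    """Return positions of <img> tags missing an alt attribute."""
    checker = AltTextChecker()
    checker.feed(html)
    return checker.violations

# Hypothetical page fragment: the second image lacks alt text.
page = '<img src="logo.png" alt="Agency logo"><img src="chart.png">'
print(check_alt_text(page))
```

A check like this only catches a narrow slice of ADA requirements; it supplements, rather than replaces, review with dedicated compliance tools.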
Test: Performance Test
Purpose: Validation (we built the right thing). A type of non-functional test whose purpose is to identify and fix system performance issues before the system goes live. (Note: only changes that address performance issues can be made to the environment during this test phase.)
Entry Criteria: End-to-end testing/SIT/UAT has been started and the system code base is considered to be in a stable state; load volumes have been documented.
Exit Criteria: Performance meets or exceeds the documented non-functional requirement for response time, or written authorization from the Product Owner to deploy with known issues; others as defined in the Detail Test Plan.
When: Once before release to production (repeat until all performance goals are met).
Performed By: Performance Test Specialist, Technical Lead, or Automation Engineer.
Environment: UAT (mirror of Production).
Test Assets / Tools: Performance test objectives and performance test scripts; automated performance testing tool and repository; application code repository.
Defect Handling: All non-conformances remediated; defects logged as new defect PBIs.
Reporting / Metrics: Performance: total tests executed, total tests passed, total defects recorded, total defects corrected.
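A documented response-time requirement can be checked with a small measurement harness. The sketch below is a minimal illustration, not a replacement for a performance testing tool; the function names and the 500 ms target are assumptions, not documented requirements:

```python
import statistics
import time

def measure_response_times(request_fn, iterations=50):
    """Call request_fn repeatedly, returning wall-clock latencies in ms."""
    samples = []
    for _ in range(iterations):
        start = time.perf_counter()
        request_fn()
        samples.append((time.perf_counter() - start) * 1000.0)
    return samples

def meets_objective(samples, p95_target_ms):
    """True when the 95th-percentile latency is within the target."""
    p95 = statistics.quantiles(samples, n=20)[18]  # 19 cut points; [18] is 95%
    return p95 <= p95_target_ms

# Stand-in for a real call to the system under test.
samples = measure_response_times(lambda: time.sleep(0.001))
print(meets_objective(samples, p95_target_ms=500.0))
```

Using a percentile rather than a mean keeps a few slow outliers from masking (or exaggerating) the typical user experience.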
Test: Post Deploy Verification & Validation
Purpose: Verification and validation of deployed functionality and data conversions (if planned).
Entry Criteria: RFC-designated deploy to production; all code has been deployed to production and data conversions have completed.
Exit Criteria: Product Owner and designated testers have concluded the system is ready for use by the intended audience; all operational readiness criteria have been verified.
When: Once, after deployment.
Performed By: Release Owner, Developers/Test Analysts, End User SMEs; Product Owner selects the test scenarios and cases.
Environment: Production.
Test Assets / Tools: Reused UAT test cases and/or operational readiness criteria; test case repository.
Defect Handling: If issues can be resolved immediately, correct the issues; otherwise execute back-out procedures; update the RFC as appropriate.
Reporting / Metrics: Deploy: Success or Backed out.
Test: String Test
Purpose: String testing is conducted at the sub-process level to ensure the process tasks and transactions of the sub-process interact correctly. It is typically an exhaustive test focusing on the system functions within the context of a business process. Successful completion of string test efforts establishes readiness for Integration Testing.
Entry Criteria: 100% of unit test cases passed for the component; string test plans have been reviewed and approved.
Exit Criteria: All documented acceptance criteria met.
When: After all components comprising a complete business function have been developed and have passed unit testing.
Performed By: Developer.
Environment: DevTest.
Test Assets / Tools: String test plans; testing tool repository, SUITE templates, or the project's Excel test case template.
Defect Handling: Problem log. Reports: None.
Reporting / Metrics: Percent completed.
Test: User Acceptance Test (UAT)
Purpose: The last phase of the software testing process. During UAT, the software users test the software to make sure it can handle required tasks in real-world scenarios, according to specifications.
Entry Criteria: 100% of integration test cases passed; UAT test planning completed; UAT test environment ready for testing.
Exit Criteria: All UAT test cases executed and the product meets agreed-upon quality standards.
When: UAT should be the last phase of testing within the release cycle.
Performed By: Business SMEs and end users, often assisted by test analysts.
Environment: UAT Environment.
Test Assets / Tools: UAT test cases; automated and/or manual regression test scripts; testing tool repository, SUITE templates, or the project's Excel test case template.
Defect Handling: Critical/high defects fixed during the testing period; medium/low defects may be fixed during the testing period or deferred to a future release; defect logs maintained.
Reporting / Metrics: Total test cases executed; total test cases passed/failed; total defects (often broken down by criticality, business function, etc.); defect detection trend.
Regression Test
The purpose of regression testing is to ensure that existing functionality is not broken
by the introduction of new code. Both new and presumably unchanged functionality
is executed and validated. Regression testing can be performed across all test stages
and must be conducted in System Integration Test (SIT) and User Acceptance Test
(UAT).
Given the frequency of code changes introduced using an agile approach, daily
regression testing is necessary and is usually automated.
The regression suite should be reviewed every cycle to determine whether automated scripts
need to be removed or added. It is important for the regression test suite to be a solid
representation of all the activities that occur in a true production environment.
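An automated regression check can be as simple as re-running known inputs against outputs captured from the last known-good release. In the sketch below, calculate_fee and its baseline values are hypothetical stand-ins for real system functions and their recorded results:

```python
def calculate_fee(amount: float) -> float:
    """Hypothetical function under regression: 6% fee, capped at 50.00."""
    return min(round(amount * 0.06, 2), 50.00)

# Baseline outputs captured from the last known-good release.
# Re-run every cycle; any mismatch means new code broke existing behavior.
REGRESSION_BASELINE = {
    100.00: 6.00,
    833.33: 50.00,  # hits the cap
    0.00: 0.00,
}

def run_regression(baseline: dict) -> list:
    """Return the inputs whose current output no longer matches the baseline."""
    return [
        amount
        for amount, expected in baseline.items()
        if calculate_fee(amount) != expected
    ]

failures = run_regression(REGRESSION_BASELINE)
print("regression failures:", failures)
```

Reviewing and pruning the baseline each cycle, as the text recommends, keeps the suite representative of real production activity rather than of obsolete behavior.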
System Integration Test (SIT)
Quality Assurance Testing
System Testing
User Acceptance Test (UAT)
The last phase of the software testing process. During UAT, the software users test the
software to make sure it can handle required tasks in real-world scenarios, according to
specifications.
Unit Test
Unit testing is the process of testing individual units of functionality. Unit tests verify that
individual system components support the related functional, non-functional
(technical), and interface requirements as represented in the system specifications. This
testing is performed by developers and ensures that the smallest testable module of
a solution successfully performs its specific task(s). Unit tests are checked in with the
corresponding code, and may be automated to execute with each build.
Test Lead/Manager
The Test Manager provides supervision and guidance and is responsible for
prioritizing and coordinating the testing activities of a test team. The Test
Manager monitors resource allocation for staff and contractors to align with
testing priorities and maintain the effectiveness of testing operations.