Department or Program Name

[SYSTEM/APPLICATION NAME]
Quality Assurance
Test Plan
SYSTEM/APPLICATION QUALITY ASSURANCE TEST PLAN
Project QA/Test Plan
Template Guidelines
To aid in the creation of a successfully completed Project QA/Test Plan, please review the following guidelines. For additional instructions and information, please refer to the University Services Program Management Office. Remove these guidelines from the completed document.
Purpose The Project QA/Test Plan is an umbrella plan that encompasses the entire testing required for a project. It is the highest level testing plan that documents the testing strategy for the project, describes the general approach that will be adopted in testing, provides the overall structure and philosophy for any other required testing documents, and defines what will be tested to assure the requirements of the product, service, or system (i.e., the deliverables) are met.
Based on the size, complexity, and type of project, subsequent test plans may be created and used during the testing process to verify that the deliverable works correctly, is understandable, and is connected logically. These test plans include the following categories and types. Testing areas/coordinators would have templates specific to these plans.
Types of quality assurance planning include:
Process QA plans document how the project is meeting quality and compliance standards, ensuring the right documentation exists to guide project delivery and corrective actions; and
Requirements/application test plans document how the product, service, or system meets stated business and technical requirements, and how it will work within the defined operational/business processes and work flow.
Test plans typically used for development efforts include:
Unit Test Plans (UT) document what the programmer is coding for a particular screen or unit.
Integrated Systems Test Plans (IST) document how the system will be tested to assure major errors are fixed and how end-to-end testing between one-off applications will occur.
Integration Application Test Plans (IAT) document how applications will test end-to-end all new functionality across all applications.
User Acceptance Testing (UAT) documents how the users, who will be supporting the new and existing functionality in production, will test this new functionality prior to production installation.
Operations Interface Readiness Test Plan (OIR) documents how the operations user will test all new and existing functionality.
Testing approaches that may be used to test or validate projects that have high risk or impact include:
Regression Testing ensures all fixes identified during IAT and UAT were made to the system and did not impair key existing functionality (used mainly with new major software development projects).
Conversion Integration Application Testing is used to simulate the first couple of processing days following a conversion.
Prototype is a technique used in the design, build, or testing stage to construct a model of the product, service, or system to verify a function, a design, how a particular module or program works, etc.
Proof of Concept is used to test an idea in a controlled environment to test new features.
Alpha Testing tests a new product, service, or system before implementation, where it may have major impacts and the project team wants to identify major problems/bugs before implementation (or goes into production).
Beta Testing differs from Alpha Testing in the amount of testing and clean-up that needs to be performed. The project should be ready for implementation.
Pilot is a limited use of the product, service, or system by a remote site, subset of the customer base, etc. This testing technique is used to work out logistical problems, validate the deliverable, and minimize impact by limited rollout.
Parallel Testing occurs when the old and the new product, service, or system are running simultaneously to allow the project customer to check that the deliverable is working per specifications.
Stress/Load Testing ensures the system will perform reliably during full production and heavy workloads.
Operational/Business Readiness Testing walks through the operational/business processes and workflows to ensure that procedures, documentation, reconciliation, training, and work flows are complete and correct.
"&nership The Project !anager is responsi$le for ensuring that all testing plans are created and
identifies them under one um$rella plan 'i.e., the !aster Project QA/Test Plan(. The
project testing team lead's( develop the necessary su$se%uent test plans.
'hen
Phase- /esign
tage- Planning
The Project QA/Test Plan is completed during the /esign phase of the olution
/elivery 0ife 1ycle. &t should $e updated anytime additional information or project
changes affect its content.
&t is a re%uired delivera$le on 0arge and !edium projects, and a $est practice on Fast
Trac, projects. For additional guidance, the Project 1lassification 2or,sheet is
availa$le.
Template Completion
Note: Text within [ ] brackets needs to be replaced with project-specific information.
1. Do not include the Template Guidelines in your final document. Enter the project information in the page header and footer, title page, and document contributors and version control.
2. Complete the document utilizing suggested text where applicable and entering text/fields where shown within [blue text] brackets. Note that the blue text is NOT to be included in your final document. Its purpose is to either provide guidance for completing the document, or to show where text/fields are to be input.
3. Once changes are made to your document and you're ready to finalize, ensure that you update your Table of Contents (TOC) section. To update the TOC: if you are unsure how to do this, place your mouse arrow to the left of the first entry in the Table of Contents section and click the left button once. Once the entire section is highlighted, move the mouse arrow anywhere within the highlighted section and click the right button once. In the drop-down menu, choose Update Field and Page Numbers Only or Entire Field as needed. Note that you might need to repeat the aforementioned steps to change the font back to Tahoma 10 pt.
4. The Master Test Plan is to be retained with other project-related documentation and maintained in accordance with the business line's records retention policy.
Empowerment & Scalability
This template is provided as a guideline to follow in producing the minimum basic information needed to successfully complete a Project QA/Test Plan in meeting PMO guidelines and illustrates the art and science of project management. Project Managers are empowered to use this template as needed to address any specific requirements of the proposed project at hand. The amount of detail included in the template will depend on the size and complexity of the project.
Important Notice
As this template may change, it is highly recommended that you access a blank template from the Program Management Office website (http://www.uservices.umn.edu/pmo/) each time you need one for a new project and not merely use one from a previous project by changing the old text.
DOCUMENT IN"ORMATION AND APPRO#ALS
-*R!", .!T"R/
#er$on % Date Re&$e' () Reaon *or c+ange
1.0 | 11/01/10 | Aaron Demenge | PMO Review
0"#U1*,T APPR"-A%!
Appro&er Name Project Ro!e S$gnat3re/E!ectron$c Appro&a! Date
TA(LE O" CONTENTS
Intro'3ct$on----------------------------------------------------------------------------------------------------------------------,
Scope...................................................................................................................................................... 1
Test Objectives........................................................................................................................................ 1
Testing Goals.......................................................................................................................................... 1
Quality ................................................................................................................................................ 1
Reliability............................................................................................................................................ 2
Project Q3a!$t) A3rance----------------------------------------------------------------------------------------------0
Fast Track Project Required Documents.................................................................................................
Tet Met+o'o!og)------------------------------------------------------------------------------------------------------------0
!ntrance "riteria......................................................................................................................................
!#it "riteria.............................................................................................................................................. $
Test !#ecution......................................................................................................................................... $
Unit Testing......................................................................................................................................... 3
Functional Testing............................................................................................................................... 3
Regression Testing............................................................................................................................. 3
Integration Testing.............................................................................................................................. 3
Interface Testing................................................................................................................................. 3
Destructive Testing............................................................................................................................. 3
User acceptance testing..................................................................................................................... 4
Test Scenarios......................................................................................................................................... %
Test Script Development......................................................................................................................... %
De&ect Reporting...................................................................................................................................... '
Tet En&$ronment-------------------------------------------------------------------------------------------------------------4
Requirements.......................................................................................................................................... '
Testing Plat&orm...................................................................................................................................... '
Uer Acceptance Tet P!an---------------------------------------------------------------------------------------------4
De&inition................................................................................................................................................. '
Testing Requirements............................................................................................................................. '
Testers(Participants................................................................................................................................. )
Testing Sc*edule..................................................................................................................................... )
A3mpt$on an' R$5--------------------------------------------------------------------------------------------------6
+ssumptions............................................................................................................................................ )
Risks........................................................................................................................................................ ,
7o/No8go Meet$ng------------------------------------------------------------------------------------------------------------1
Ro!e an' Repon$9$!$t$e---------------------------------------------------------------------------------------------1
SYSTEM/APPLICATION QUALITY ASSURANCE TEST PLAN
S$gn8o** an' Ac5no2!e'gement-------------------------------------------------------------------------------------:
Tet D$rector ; De*ect Trac5$ng Proce--------------------------------------------------------------------,.
INTRODUCTION
Scope
The overall purpose of testing is to ensure the [name of application] application meets all of its technical, functional and business requirements. The purpose of this document is to describe the overall test plan and strategy for testing the [name of application] application. The approach described in this document provides the framework for all testing related to this application. Individual test cases will be written for each version of the application that is released. This document will also be updated as required for each release.
Test Objectives
The quality objectives of testing the [name of application] application are to ensure complete validation of the business and software requirements:
Verify software requirements are complete and accurate
Perform detailed test planning
Identify testing standards and procedures that will be used on the project
Prepare and document test scenarios and test cases
Perform regression testing to validate that unchanged functionality has not been affected by changes
Manage the defect tracking process
Provide test metrics/testing summary reports
Ensure the application is certified for release into the University of Minnesota production environment
Schedule the Go/No Go meeting
Require sign-offs from all stakeholders
Testing Goals
The goals in testing this application include validating the quality, usability, reliability and performance of the application. Testing will be performed from a black-box approach, not based on any knowledge of internal design or code. Tests will be designed around requirements and functionality.
Another goal is to make the tests repeatable for use in regression testing during the project lifecycle, and for future application upgrades. A part of the approach in testing will be to initially perform a 'Smoke Test' upon delivery of the application for testing. Smoke testing is typically an initial testing effort to determine if a new software version is performing well enough to accept it for a major testing effort. For example, if the new software is crashing frequently, or corrupting databases, the software is not in a stable enough condition to warrant further testing in its current state. This testing will be performed first. After acceptance of the build delivered for system testing, functions will be tested based upon the designated priority (critical, high, medium, low).
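As an illustration only (not part of the PMO template), a smoke test is often automated as a handful of quick checks run against each new build before the full test effort begins. The URL and the specific checks below are hypothetical placeholders.

import urllib.request

# Hypothetical smoke checks run against a fresh build before deeper testing begins.
BASE_URL = "https://test.example.edu/app"   # placeholder test-environment URL

def page_loads(path):
    # Return True if the page responds with HTTP 200, False on any error.
    try:
        with urllib.request.urlopen(f"{BASE_URL}{path}", timeout=10) as resp:
            return resp.status == 200
    except OSError:
        return False

def run_smoke_tests():
    checks = {
        "login page loads": lambda: page_loads("/login"),
        "home page loads": lambda: page_loads("/home"),
    }
    failures = [name for name, check in checks.items() if not check()]
    if failures:
        raise SystemExit(f"Smoke test failed: {failures} -- build not accepted for further testing")
    print("Smoke test passed -- build accepted for the full test effort")

if __name__ == "__main__":
    run_smoke_tests()

If the smoke test fails, the build is returned to the developers rather than entering the functional test cycle described below.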
Quality
Quality software is reasonably bug-free, meets requirements and/or expectations, and is maintainable. Testing the quality of the application will be a two-step process of independent verification and validation.
First, a verification process will be undertaken involving reviews and meetings to evaluate documents, plans, requirements, and specifications to ensure that the end result of the application is testable, and that requirements are covered. The overall goal is to ensure that the requirements are clear, complete, detailed, cohesive, attainable, and testable. In addition, this helps to ensure that requirements are agreed to by all stakeholders.
Second, actual testing will be performed to ensure that the requirements are met. The standard by which the application meets quality expectations will be based upon the requirements test matrix, use cases and
test cases to ensure test case coverage of the requirements. This testing process will also help to ensure the utility of the application – i.e., the design's functionality and "does the application do what the users need?"
Reliability
Reliability is both the consistency and repeatability of the application. A large part of testing an application involves validating its reliability in its functions, data, and system availability. To ensure reliability, the test approach will include positive and negative (break-it) functional tests. In addition, to ensure reliability throughout the iterative software development cycle, regression tests will be performed on all iterations of the application.
PROJECT QUALITY ASSURANCE
All project artifacts are posted to the EPM project SharePoint site, located at: [enter SharePoint site URL]
"at Trac5 Project Re=3$re' Doc3ment
Project Artifacts #omplete
Project Proposal in 6P!
&nitiation Phase 1hec,list
Project 2)
Project 1harter
)usiness #e%uirements /ocument
Project Plan #eview 1hec,list
Analysis Phase 1hec,list
/esign #eview 1hec,list
1onceptual &T Architecture #eview 1hec,list
Application Architecture /esign
ystem Architecture /esign
1ode #eview 1hec,list
&mplementation Plan 1hec,list
QA Test Plan
Test Planning 1hec,list
/eployment #eadiness Assessment 1hec,list
User Acceptance ign "ff
ervice 0evel Agreement and 1hec,list
0essons 0earned
1lose out #eport
TEST METHODOLOGY
Entrance Criteria
All business requirements are documented and approved by the business users.
All design specifications have been reviewed and approved.
Unit testing has been completed by the development team, including vendors.
All hardware needed for the test environment is available.
The application delivered to the test environment is of reliable quality.
Initial smoke test of the delivered functionality is approved by the testing team.
Code changes made to the test site will go through a change control process.
Exit Criteria
All test scenarios have been completed successfully.
All issues prioritized and priority 1 issues resolved.
All outstanding defects are documented in a test summary with a priority and severity status.
Go/No-go meeting is held to determine acceptability of product.
Test Execution
The test execution phase is the process of running test cases against the software build to verify that the actual results meet the expected results. Defects discovered during the testing cycle shall be entered into the project SharePoint Team Site Defect list or Quality Center (offered by OIT). Once a defect is fixed by a developer, the fixed code shall be incorporated into the application and regression tested.
The following testing phases shall be completed (if applicable):
Unit Testing
Unit testing is performed by the report developers at U Services IT and OIT in their development environment. The developers know and will be testing the internal logical structure of each software component. A description of the unit testing should be provided to the project team.
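For illustration only (the function and all figures are hypothetical, not part of this template), a developer-level unit test exercises the internal logic of a single component in isolation:

import unittest

def net_pay(gross, tax_rate):
    # Hypothetical unit under test: compute net pay from gross pay and a tax rate.
    if not 0 <= tax_rate < 1:
        raise ValueError("tax_rate must be between 0 and 1")
    return round(gross * (1 - tax_rate), 2)

class NetPayTests(unittest.TestCase):
    def test_typical_values(self):
        self.assertEqual(net_pay(1000.00, 0.25), 750.00)

    def test_zero_tax(self):
        self.assertEqual(net_pay(1000.00, 0.0), 1000.00)

    def test_invalid_rate_rejected(self):
        with self.assertRaises(ValueError):
            net_pay(1000.00, 1.5)

if __name__ == "__main__":
    unittest.main()

A short description of checks like these, and their results, is what the development team would hand to the project team.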
Functional Testing
Functional testing focuses on the functional requirements of the software and is performed to confirm that the application operates accurately according to the documented specifications and requirements, and to ensure that interfaces to external systems are properly working.
Regression Testing
Regression testing shall be performed to verify that previously tested features and functions do not have any new defects introduced while correcting other problems or adding and modifying other features.
Integration Testing
Integration testing is the phase of software testing in which individual software modules are combined and tested as a group. In its simplest form, two units that have already been tested are combined into a component and the interface between them is tested. In a realistic scenario, many units are combined into components, which are in turn aggregated into even larger parts of the program. The idea is to test combinations of pieces and eventually expand the process to test your modules with those of other groups. Eventually all the modules making up a process are tested together.
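Purely as a sketch (both units and the data are hypothetical), an integration test in this sense exercises the interface between two units that have each already passed their own unit tests:

import unittest

def parse_employee_record(line):
    # Unit A: split a "last,first,department" CSV line into a dict.
    last, first, dept = line.strip().split(",")
    return {"first": first, "last": last, "department": dept}

def display_name(record):
    # Unit B: build the display name shown in the application header.
    return f"{record['first']} {record['last']} ({record['department']})"

class DirectoryIntegrationTests(unittest.TestCase):
    # Integration tests: the output of unit A is fed directly into unit B.

    def test_parsed_record_feeds_display_name(self):
        record = parse_employee_record("Smith,Pat,Payroll\n")
        self.assertEqual(display_name(record), "Pat Smith (Payroll)")

    def test_malformed_line_is_reported(self):
        with self.assertRaises(ValueError):
            parse_employee_record("Smith,Pat")  # missing department

if __name__ == "__main__":
    unittest.main()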
Interface Testing
This testing follows a transaction through all of the product processes that interact with it and tests the product in its entirety. Interface testing shall be performed to ensure that the product actually works in the way a typical user would interact with it.
Destructive Testing
Destructive testing focuses on the error detection and error prevention areas of the product. This testing is exercised in an attempt to anticipate conditions where a user may encounter errors. Destructive testing is less structured than other testing phases and is determined by individual testers.
User Acceptance Testing
User acceptance testing activities will be performed by the business users. The purpose of this testing will be to ensure the application meets the users' expectations. This testing also focuses on usability and will include: appearance, consistency of controls, consistency of field naming, accuracy of drop-down field information lists, spelling of all field names/data values, accuracy of default field values, tab sequence, and error/help messaging.
Test Scenarios
Below are the high-level scenarios that will be tested. These scenarios are derived from the Requirements Matrix and Use Cases. From these, detailed test scripts will be created.

Number | Test Scenario Description | Test Script Reference | Testing Complete
ALL STATED REQUIREMENTS EXIST AND FUNCTION
001 | [A test scenario is almost like a story: "a user enters into the application from the login window by entering a valid user name and password. After entering, he clicks on the Payslip module and clicks on the latest payslip feature to view his latest payslip." Any test scenario will contain a specific goal.] | |
002 | | |
SECURITY
003 | [Add description of requirements.] | |
004 | | |
DATA VALIDATION
005 | [Add description of requirements.] | |
006 | | |
ENVIRONMENT
007 | [Add description of requirements.] | |
008 | | |
INTERFACES
009 | [Add description of requirements.] | |
010 | | |
Test Script Development
Test script design is the central focus of a software quality assurance process. A test script is defined as a written specification describing how a single or group of business or system requirement(s) will be tested. The test script consists of a set of actions to be performed, data to be used, and the expected results of the test. The actual results of the test are recorded during test execution. Test scripts will also be updated as testing proceeds.
Test scripts written for this project include the following (a brief illustrative sketch follows this list):
Test Script ID
Test cases verified
Requirements verified
Purpose of test
Any dependencies and/or special set-up instructions required for performing the test
Test description and steps
Expected results
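Purely as a sketch (the field names mirror the list above; nothing here is prescribed by the PMO or by Test Director), a test script can be recorded as a simple structured object so that results are captured consistently during execution:

from dataclasses import dataclass

@dataclass
class TestScript:
    # One test script, mirroring the fields listed above (illustrative only).
    script_id: str
    test_cases: list           # test cases verified
    requirements: list         # requirement IDs verified
    purpose: str
    setup: str                 # dependencies / special set-up instructions
    steps: list                # ordered test description and steps
    expected_result: str
    actual_result: str = ""    # recorded during test execution
    status: str = "Not Run"    # e.g. Not Run / Passed / Failed

# Example entry for a hypothetical login requirement:
script_001 = TestScript(
    script_id="TS-001",
    test_cases=["TC-001"],
    requirements=["REQ-SEC-01"],
    purpose="Verify a valid user can log in",
    setup="Test account exists in the test environment",
    steps=["Open the login window", "Enter a valid user name and password", "Click Login"],
    expected_result="User is logged in and the home page is displayed",
)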
Defect Reporting
Issues/defects are tracked for resolution with the following guidelines:
Issues will be reported based upon documented requirements.
Issues will be tracked by the testing team, reported and entered into Quality Center.
Issues will be fixed by the development team based on the priority/severity assigned by the test lead.
All critical/priority 1 defects will be fixed before release to production.
See the Defect Tracking Process at the end of this document for detailed instructions on how to log and track defects in Quality Center.
TEST ENVIRONMENT
Requirements
Client Server Technical Requirements:
Mixed browsers supported (Internet Explorer, Firefox, Mozilla)
Oracle Database
Client Platform: PC and Macintosh
Production server location:
Testing Platform
Desktop PC – the application supports all A-Grade browsers for Windows and Mac operating systems, as defined by Yahoo!'s Graded Browser Support standards: http://developer.yahoo.com/yui/articles/gbs/. Windows 2000/IE6 may be excluded.
Test server location:
USER ACCEPTANCE TEST PLAN
Definition
The overall purpose of testing is to ensure the [name of application] application performs at an acceptable level for the customer. This section outlines the detailed plan for user acceptance testing of this application.
This test plan will be used to record the customer's sign-off of the documented scenarios. Detailed test scripts/cases have been developed and will be used to record the results of user testing. This document is a high-level guide, and is not intended as a replacement for any specific user acceptance testing procedures that individual areas might have.
Testing Requirements
Testing will take place in [insert location]. Some testers may choose to perform some testing from their regular workstations where it is possible. Test results must still be coordinated with others.
UAT will take place beginning on [insert date].
Identified testing participants will receive instructions prior to the start of testing.
Identified testing participants will perform the equivalent of their normal business function in the upgraded environment.
Test scripts/cases and scenarios will be prepared prior to the start of UAT.
Test participants will conduct the tests and document results.
Defects will be entered into Test Director and tracked by the Test Lead.
Testers/Participants
Testing participants should include representatives from all areas involved in the application. There are benefits to including representatives from across all areas to validate the system's functions before the upgrade goes live in production.
The best candidates for UAT are:
Staff directly impacted by the upcoming system and business process changes.
Frequent users of the application and functions planned in test scripts/cases.
Individuals with a sound understanding of business processes in the areas they represent.
Individuals with the necessary time to commit to this endeavor.
Individuals willing to experiment (to try various methods to see what works and what doesn't work).
Individuals who are patient and have a tolerance for ambiguity.

Tester Name | Department/Area Representing | Area of Testing Focus
Testing Schedule
All upgraded functionality and test data will be migrated to the test environment prior to the start of user acceptance testing.

Activity | Lead Responsibility | Date
Identify and select testers for UAT | |
Develop test scenarios and scripts/cases | |
Validate participants' availability for testing | |
Review scenarios/scripts for accuracy, completeness and sequence (confirm test data is correct) | |
Ensure UAT Lab desktops are configured for testing | |
UAT environment validation | |
Testing by UAT participants | |
ASSUMPTIONS AND RISKS
Assumptions
The Business team has reviewed and accepted the functionality identified in the business requirements and software requirements documents.
A project change control process is in place to manage requirements.
Code walkthroughs/reviews will be completed by the development team.
Unit testing will be completed by the development team prior to release to the test team.
Testers will test what is documented in the requirements.
The test team will have a separate test environment to perform testing.
All changes to requirements will be communicated to the test team.
Resources identified in this plan are available to test the application and to resolve defects and address issues as they are raised by the test team.
The delivery of the product to production contains all setup, etc., that is necessary for optimum performance in the production site.
Project sponsors, business and technical, will provide actionable guidance on defect prioritization and resolution.
The UAT environment will be available, and desktops will be available to perform testing.
Risks
Scope creep (last-minute addition of new requirements) impacts deadlines for the development team and test team.
An aggressive target date increases the risk of defects being migrated to production. If development timelines are not met, this will directly impact the testing timelines.
Key resources have competing priorities, making their availability less than scheduled.
Any downtime of the test system will significantly impact the testing cycle.
Load testing is not being completed on a consistent basis; true performance of the application may not be known until release to production.
GO/NO-GO MEETING
Once the test team has completed the test cycle, a Go/No-go meeting is scheduled as part of the implementation planning under launch readiness. This meeting is attended by the project manager, business team, test lead, technical lead, and any other stakeholders.
The test lead will provide a testing summary and list all outstanding unresolved defects and any associated risks with releasing the product to production. All outstanding issues are discussed at that time before a decision is made to push to production. A written sign-off form is signed by all team members as listed above. The list of outstanding issues is also attached to the sign-off form.
ROLES AND RESPONSIBILITIES
Resource Type | Responsibilities | Name
Sponsor | Provides Go/No Go authorization that the product is ready for release as part of implementation planning and the launch process; prioritizes issues and defects and manages technical resources; makes decisions on unresolved issues |
Project Manager | Provides guidance on the overall project; coordinates and develops the project schedule; liaison with the business to ensure participation and ownership; tracks all project activities and resources, ensuring the project remains within scope; facilitates identifying and bringing closure to open issues; communicates project status |
Subject Matter Experts | Define business requirements and expected results for business acceptance; execute user acceptance testing |
Dev Team Lead | Design application architecture; create technical design |
Database Administrator | |
Developers | Write application code; resolve defects; support testers |
Business Lead | Write business requirements, test plan and test cases; maintain requirements and defect reporting in Test Director; lead testing cycle |
QA Lead | Maintain project in Test Director; write test plan to include test scenarios and cases; facilitate testing; maintain and manage defects in Test Director |
Business Analyst | Write business requirements and build test scripts; maintain requirements in Test Director; lead testing cycle and coordinate test environment |
Testers | Perform user acceptance testing |
SI7N8O"" AND AC@NOALED7EMENT
& understand that $y agreeing to participate in this testing through the e+ecution of the testing plan, &
approve of the activities defined and authori*e my department to participate as documented for the
successful implementation of this application in our department.
_______________________________ Date: ___/___/___
Resource Name
Title or Responsibility

_______________________________ Date: ___/___/___
Resource Name
Title or Responsibility

_______________________________ Date: ___/___/___
Resource Name
Title or Responsibility

_______________________________ Date: ___/___/___
Resource Name
Title or Responsibility

_______________________________ Date: ___/___/___
Resource Name
Title or Responsibility
_______________________________ Date: ___/___/___
Resource Name
Title or Responsibility

_______________________________ Date: ___/___/___
Resource Name
Title or Responsibility

_______________________________ Date: ___/___/___
Resource Name
Title or Responsibility
TEST DIRECTOR – DEFECT TRACKING PROCESS
Summary: Screen name and short description of the defect being reported, usually providing key words with which to identify and/or search for the defect.
Detected By: Auto-populates with the User ID of the person logged in.
Detected on Date: Auto-populates with the current date.
Severity: Describes the degree of impact that a defect has on the operation of the application.
Assigned To: Individual being assigned the defect for fixing.
Detected in Build: Build ID in which the defect was found. Build ID is an identifier for the code release, assigned by Web Development.
Fixed in Build: Build ID in which the defect is fixed. Build ID is an identifier for the code release, assigned by Web Development.
Priority: This field describes the impact the defect has on the work in progress and the importance and order in which a bug should be fixed.
Status: Indicates the existing state of a defect; auto-populates with a default of "New".
Description: Enter a description of the defect. Add individual steps to reproduce; include all steps and screens that were accessed. Enter the exact words of the error message.
Email Defect: After entering the defect, right-click on it and select email to send it to the assigned developer.
Defect resolution process: When the defect is opened, it is assigned to the appropriate person and the status is changed to "Assigned".
Once the defect is fixed:
1. The developer to whom the defect is assigned will update the defect comments to document the fix that was made. User ID and Date are automatically added to the defect by clicking on "Add Comment".
2. The developer to whom the defect is assigned will change the status to "Fixed", and will change the "Assigned To" field to the tester or defect manager.
3. The tester will retest the submitted defect.
4. If the defect passes the retest, the tester or defect manager will change the Status to "Closed".
5. If the defect is not fixed, the tester will change the Status to "Assigned" and enter the User ID of the developer in the Assigned To field.
6. Once the defect has been verified as fixed, the project manager (or defect manager) will update the status to "Closed".
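As an illustration of the resolution flow above (the status names follow the steps just listed; the code is only a sketch and is not part of Test Director or Quality Center), the allowed status transitions can be expressed and checked with a small helper:

# Allowed defect status transitions, following the resolution process above.
ALLOWED_TRANSITIONS = {
    "New": {"Assigned"},
    "Assigned": {"Fixed"},
    "Fixed": {"Closed", "Assigned"},   # retest passes -> Closed; retest fails -> back to Assigned
    "Closed": set(),
}

def change_status(defect, new_status):
    # Update a defect's status only if the transition is allowed by the process.
    current = defect["status"]
    if new_status not in ALLOWED_TRANSITIONS.get(current, set()):
        raise ValueError(f"Cannot move defect from {current} to {new_status}")
    defect["status"] = new_status
    return defect

defect = {"id": 42, "summary": "Login button mislabeled", "status": "New"}
change_status(defect, "Assigned")   # developer picks up the defect
change_status(defect, "Fixed")      # fix delivered, reassigned to the tester
change_status(defect, "Closed")     # retest passed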
DE"INITIONS "OR DE"ECT PRIORITY AND SE#ERITY
PRIORITYB T*is &ield describes t*e impact t*e de&ect *as on t*e 5ork in progress and t*e importance
and order in 5*ic* a bug s*ould be &i#ed. T*is &ield is utili@ed b0 t*e developers and test engineers to
prioriti@e 5ork e&&ort on t*e de&ect resolution.
, ; Urgent (!oc5
Aor5
Furt*er development and(or testing cannot occur until t*e de&ect *as been
resolved.
0 ; Reo!&e ASAP
T*e de&ect must be resolved as soon as possible because it is impairing
development and(or testing activities.
C ; Norma! Q3e3e
T*e de&ect s*ould be resolved in t*e normal prioriti@ation and completion o&
de&ect resolution.
/ ; Lo2 Pr$or$t)
T*e de&ect is an anno0ance and s*ould be resolved2 but it can 5ait until a&ter
more serious de&ects *ave been &i#ed.
4 ; Tr$&$a! T*e de&ect *as little or no impact to development and(or testing 5ork.
SE#ERITYB T*is &ield describes t*e degree o& impact t*at a de&ect *as on t*e operation o& t*e
application.
, ; Cr$t$ca!
"ritical loss o& &unction. T*e de&ect results in s0stem cras*es2 t*e &ailure o& a
ke0 subs0stem or module2 a corruption or loss o& data2 or a severe memor0
leak.
0 ; Major
1ajor loss o& &unction. T*e de&ect results in a &ailure o& t*e s0stem2 subs0stem2
or module2 but t*e de&ect does not result in t*e corruption or loss o& signi&icant
data.
C ; Mo'erate
1oderate loss o& &unction. T*e de&ect does not result in a &ailure o& t*e s0stem2
subs0stem2 or module2 but t*e de&ect ma0 cause t*e s0stem to displa0 data
incorrectl02 incompletel02 or inconsistentl0.
/ ; M$nor
1inor loss o& &unction2 or anot*er problem 5*ere a 5orkaround is present.
T*ere are no data integrit0 issues.
4 ; Ua9$!$t)
T*e de&ect is related to t*e s0stem usabilit02 is t*e result o& non<con&ormance
to a standard2 or is related to t*e aest*etics o& t*e s0stem. T*ere is no loss o&
s0stem &unction.
6 ; En+ancement
T*e de&ect is a request &or an en*ancement2 i.e. it is not 5it*in t*e scope o& t*e
current project e&&ort.
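If these scales ever need to be encoded, for example in a reporting or triage script, they map naturally onto enumerations. The snippet below is purely illustrative and is not part of the template or of any defect-tracking tool.

from enum import IntEnum

class Priority(IntEnum):
    # Order in which a defect should be fixed (1 = most urgent).
    URGENT_BLOCKS_WORK = 1
    RESOLVE_ASAP = 2
    NORMAL_QUEUE = 3
    LOW_PRIORITY = 4
    TRIVIAL = 5

class Severity(IntEnum):
    # Degree of impact a defect has on the operation of the application.
    CRITICAL = 1
    MAJOR = 2
    MODERATE = 3
    MINOR = 4
    USABILITY = 5
    ENHANCEMENT = 6

def must_fix_before_release(priority, severity):
    # Per the exit criteria and defect reporting guidelines, all critical /
    # priority 1 defects are fixed before release to production.
    return priority == Priority.URGENT_BLOCKS_WORK or severity == Severity.CRITICAL

print(must_fix_before_release(Priority.NORMAL_QUEUE, Severity.CRITICAL))  # True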