SQE Assignment

SOFTWARE TESTING DOCUMENTATION

Assignment Objective

1. Survey the testing documentation practices in vogue at a major software firm.


2. Present any available sample templates for test documentation from different
organizations.

Abstract
1. Testing is an important part of the development process. In the waterfall model it is a
distinct phase, whereas in other process models it is embedded throughout the process. “The
basic idea of testing involves the execution of software and the observation of its behavior or
outcome. If a failure is observed, the execution record is analyzed to locate and fix the
fault(s) that caused the failure. Otherwise, it indicates that the software under testing is
more likely to fulfill its designated functions” [ CITATION Jef \p 67 \l 1033 ]. Effective testing
needs formal process support, and effective execution of testing requires sound planning
and documentation.
2. Pursuant to the assignment objective, several software firms, including Microsoft,
IBM, Sun Microsystems, Nokia, and Mozilla, were considered. The information available
through company websites and other indirect sources was collected and analyzed. These
sources provide only very broad information about the software testing tools and
techniques these firms use; specific information about the testing documentation they
maintain is not directly available.
Assumption
3. While details of the testing and testing documentation in vogue at various firms are
confidential, several indirect sources indicate that most software firms use custom
testing documentation standards. Moreover, these practices are greatly influenced by the
IEEE test documentation standard 829 [ CITATION Ala08 \l 1033 ][ CITATION ISO29119 \l
1033 ].

IEEE 829 (1998/2008) – Test Documentation Standard


4. The standard deals with the testing process and testing documentation. It describes a
set of basic test documents that are associated with the dynamic aspects of software
testing (i.e., the execution of procedures and code). It also defines the use and contents of
the related test documentation (Test Design, Test Case, Test Procedure, Anomaly Report, Test
Log, Level Test Report, Interim Test Report, and Master Test Report). While the documents
described in the standard focus on dynamic testing, several of them may be applicable to
other testing activities.
5. This standard applies to all software-based systems. It applies to systems and
software being developed, acquired, operated, maintained, and/or reused [e.g., legacy,
modified, Commercial-Off-the-Shelf (COTS), Government-Off-the-Shelf (GOTS), or Non-
Developmental Items (NDIs)]. When conducting the test process, it is important to examine
the software and its interactions with the other parts of the system. This standard identifies
the system considerations that test processes and tasks address in determining system and
software correctness and other attributes (e.g., completeness, accuracy, consistency, and
testability), and the applicable resultant test documentation. Clause 5 of the standard is an
inventory of the test documentation identified as needed on the basis of the testing tasks. The
process for selecting document contents is covered in Clauses 6 and 7, whereas Clauses 8-17
give the recommended test document contents. The purpose and outline structure of each
document are presented in the subsequent sections.
6. This standard uses the concept of integrity levels to determine the recommended
minimum testing tasks to be performed. The inputs and outputs of the indicated testing
tasks identify the test documentation needed. High integrity software requires a larger set of
test processes, a more rigorous application of testing tasks, and as a result, more test
documentation. The standard specifies four integrity levels (1-4).
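
To illustrate the idea, the sketch below maps an integrity level to a minimum document set; the mapping is invented for illustration and does not reproduce the normative tables of the standard:

```python
# Illustrative only: the document sets below are hypothetical examples,
# not the normative selection tables from IEEE 829-2008.
MIN_TEST_DOCS = {
    1: ["Level Test Plan", "Level Test Log"],
    2: ["Level Test Plan", "Level Test Case", "Level Test Log", "Anomaly Report"],
    3: ["Master Test Plan", "Level Test Plan", "Level Test Design",
        "Level Test Case", "Level Test Log", "Anomaly Report", "Level Test Report"],
    4: ["Master Test Plan", "Level Test Plan", "Level Test Design",
        "Level Test Case", "Level Test Procedure", "Level Test Log",
        "Anomaly Report", "Level Interim Test Status Report",
        "Level Test Report", "Master Test Report"],
}

def required_documents(integrity_level: int) -> list[str]:
    """Return the minimum document set for a given integrity level (1-4)."""
    if integrity_level not in MIN_TEST_DOCS:
        raise ValueError("IEEE 829-2008 defines integrity levels 1 through 4")
    return MIN_TEST_DOCS[integrity_level]

print(required_documents(4))
```

A higher level simply selects a larger document set, mirroring the standard's principle that testing rigor, and with it the amount of test documentation, grows with software integrity.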

Basic Software Test Documents [ CITATION IEEE829 \l 1033 ]


7. The documents outlined in this standard cover test planning, test specifications, and
test reporting.

Figure 2: Test Documentation Review (figure taken from IEEE 829-1998; still applicable in the context of the new standard IEEE 829-2008)

a. Master Test Plan (MTP). The purpose of the Master Test Plan (MTP) is to
provide an overall test planning and test management document for multiple
levels of test (either within one project or across multiple projects). A test plan
shall have the following structure:

MASTER TEST PLAN OUTLINE


1. Introduction
1.1. Document identifier
1.2. Scope
1.3. References
1.4. System overview and key features
1.5. Test overview
1.5.1 Organization
1.5.2 Master test schedule
1.5.3 Integrity level schema
1.5.4 Resources summary
1.5.5 Responsibilities
1.5.6 Tools, techniques, methods, and metrics
2. Details of the Master Test Plan
2.1. Test processes including definition of test levels
2.1.1 Process: Management
2.1.1.1 Activity: Management of test effort
2.1.2 Process: Acquisition
2.1.2.1 Activity: Acquisition support test
2.1.3 Process: Supply
2.1.3.1 Activity: Planning test
2.1.4 Process: Development
2.1.4.1 Activity: Concept
2.1.4.2 Activity: Requirements
2.1.4.3 Activity: Design
2.1.4.4 Activity: Implementation
2.1.4.5 Activity: Test
2.1.4.6 Activity: Installation/checkout
2.1.5 Process: Operation
2.1.5.1 Activity: Operational test
2.1.6 Process: Maintenance
2.1.6.1 Activity: Maintenance test
2.2. Test documentation requirements
2.3. Test administration requirements
2.4. Test reporting requirements
3. General
3.1. Glossary
3.2. Document change procedures and history

b. Level Test Plan(s). Each LTP specifies the scope, approach,
resources, and schedule of the testing activities for its specified level of
testing. It identifies the items being tested, the features to be tested, the testing
tasks to be performed, the personnel responsible for each task, and the
associated risk(s). In the title of the plan, the word “Level” is replaced by
the organization’s name for the particular level being documented by the
plan (e.g., Component Test Plan, Component Integration Test Plan, System
Test Plan, and Acceptance Test Plan). The outline is as follows:

LEVEL TEST PLAN OUTLINE


1. Introduction
1.1. Document identifier
1.2. Scope
1.3. References
1.4. Level in the overall sequence
1.5. Test classes and overall test conditions
2. Details for this level of test plan
2.1 Test items and their identifiers
2.2 Test Traceability Matrix
2.3 Features to be tested
2.4 Features not to be tested
2.5 Approach
2.6 Item pass/fail criteria
2.7 Suspension criteria and resumption requirements
2.8 Test deliverables
3. Test management
3.1 Planned activities and tasks; test progression
3.2 Environment/infrastructure
3.3 Responsibilities and authority
3.4 Interfaces among the parties involved
3.5 Resources and their allocation
3.6 Training
3.7 Schedules, estimates, and costs
3.8 Risk(s) and contingency(s)
4. General
4.1 Quality assurance procedures
4.2 Metrics
4.3 Test coverage
4.4 Glossary
4.5 Document change procedures and history

c. Level Test Design (LTD). The purpose of the LTD is to specify any refinements of the
test approach (LTP Section 2.5) and to identify the features to be tested by this
design and its associated tests. The outline is as follows:

LEVEL TEST DESIGN OUTLINE
1. Introduction
1.1. Document identifier
1.2. Scope
1.3. References
2. Details of the Level Test Design
2.1. Features to be tested
2.2. Approach refinements
2.3. Test identification
2.4. Feature pass/fail criteria
2.5 Test deliverables
3. General
3.1. Glossary
3.2. Document change procedures and history

d. Level Test Case (LTC). The purpose of the LTC is to define (at an appropriate level of
detail) the information needed as it pertains to inputs to and outputs from the
software or software-based system being tested. The LTC includes all test
case(s) identified by the associated segment of the LTD (if there is one). The
outline is as follows, with an illustrative record sketched after it:

LEVEL TEST CASE OUTLINE


1. Introduction (once per document)
1.1. Document identifier
1.2. Scope
1.3. References
1.4. Context
1.5. Notation for description
2. Details (once per test case)
2.1. Test case identifier
2.2. Objective
2.3. Inputs
2.4. Outcome(s)
2.5. Environmental needs
2.6. Special procedural requirements
2.7. Inter-case dependencies
3. Global (once per document)
3.1. Glossary
3.2. Document change procedures and history
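
For illustration, the sketch below renders one test case record with the Section 2 fields of this outline; the class name, the login scenario, and all values are invented:

```python
from dataclasses import dataclass, field

# A minimal, illustrative record mirroring Section 2 of the Level Test Case
# outline; field names follow the outline, the sample values are invented.
@dataclass
class LevelTestCase:
    identifier: str                     # 2.1 Test case identifier
    objective: str                      # 2.2 Objective
    inputs: dict                        # 2.3 Inputs
    expected_outcomes: dict             # 2.4 Outcome(s)
    environmental_needs: list = field(default_factory=list)   # 2.5
    special_procedures: list = field(default_factory=list)    # 2.6
    depends_on: list = field(default_factory=list)            # 2.7 Inter-case dependencies

tc = LevelTestCase(
    identifier="SYS-LOGIN-001",
    objective="Verify login rejects an invalid password",
    inputs={"username": "alice", "password": "wrong"},
    expected_outcomes={"authenticated": False, "error": "Invalid credentials"},
    environmental_needs=["test database seeded with user 'alice'"],
)
print(tc.identifier, "->", tc.objective)
```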

e. Level Test Procedure (LTPr). The purpose of an LTPr is to specify the steps for
executing a set of test cases or, more generally, the steps used to exercise a
software product or software-based system item in order to evaluate a set of
features. The outline is as follows, with a sketch of such an ordered procedure after it:

LEVEL TEST PROCEDURE OUTLINE


1. Introduction
1.1. Document identifier
1.2. Scope
1.3. References
1.4. Relationship to other procedures
2. Details
2.1. Inputs, outputs, and special requirements
2.2. Ordered description of the steps to be taken to execute the test cases
3. General
3.1. Glossary
3.2. Document change procedures and history
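
The sketch below illustrates Section 2.2, the ordered description of steps executed in sequence; the step names and the system under test are hypothetical:

```python
# Hypothetical ordered procedure: each step is a function, and the procedure
# is the sequence in which they run. A real LTPr would reference the inputs,
# outputs, and identifiers of the associated Level Test Cases.
def start_system_under_test():
    print("Step 1: start the system under test")

def submit_login_with_test_inputs():
    print("Step 2: submit the login form with the test-case inputs")

def compare_observed_with_expected():
    print("Step 3: compare the observed outcome with the expected outcome")

PROCEDURE_STEPS = [
    start_system_under_test,
    submit_login_with_test_inputs,
    compare_observed_with_expected,
]

def execute_procedure(steps):
    """Run the ordered steps in sequence."""
    for step in steps:
        step()

execute_procedure(PROCEDURE_STEPS)
```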

f. Level Test Log (LTL). The purpose of the LTL is to provide a chronological record of
relevant details about the execution of tests. An automated tool may capture all
or part of this information (a minimal sketch of such capture follows the outline).
The outline is as follows:

LEVEL TEST LOG OUTLINE


1. Introduction
1.1. Document identifier
1.2. Scope
1.3. References
2. Details
2.1. Description
2.2. Activity and event entries
3. General
3.1. Glossary
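
As a minimal sketch of automated capture, the example below uses Python's standard logging module to produce timestamped, chronological activity and event entries (Section 2.2); the test identifier and events are invented:

```python
import logging

# Timestamped, chronological "activity and event entries" written to a file,
# as an automated tool might capture them. The events below are invented.
logging.basicConfig(
    filename="level_test_log.txt",
    level=logging.INFO,
    format="%(asctime)s %(levelname)s %(message)s",
)

log = logging.getLogger("test-log")
log.info("Execution of SYS-LOGIN-001 started")
log.info("Input submitted: username=alice")
log.warning("Unexpected dialog observed; screenshot saved")
log.info("Execution of SYS-LOGIN-001 finished: FAIL")
```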

g. Anomaly Report (AR). The purpose of the AR is to document any event that
occurs during the testing process that requires investigation. Such a report may be
called a problem, test incident, defect, trouble, issue, anomaly, or error report. The
outline is as follows, with an illustrative record-building helper sketched after it:

ANOMALY REPORT OUTLINE


1. Introduction
1.1. Document identifier
1.2. Scope
1.3. References
2. Details
2.1. Summary
2.2. Date anomaly discovered
2.3. Context
2.4. Description of anomaly
2.5. Impact
2.6. Originator’s assessment of urgency (see IEEE 1044-1993 [B13])
2.7. Description of the corrective action
2.8. Status of the anomaly
2.9. Conclusions and recommendations
3. General
3.1 Document change procedures and history
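
The sketch below assembles an anomaly record with the Section 2 fields of this outline; the helper function, the urgency value, and the example anomaly are invented for illustration:

```python
from datetime import date

# Illustrative helper that assembles an anomaly record with the fields named
# in Section 2 of the outline; the urgency scale and values are hypothetical.
def anomaly_report(summary: str, context: str, description: str,
                   impact: str, urgency: str) -> dict:
    return {
        "summary": summary,                           # 2.1
        "date_discovered": date.today().isoformat(),  # 2.2
        "context": context,                           # 2.3
        "description": description,                   # 2.4
        "impact": impact,                             # 2.5
        "urgency": urgency,                           # 2.6 originator's assessment
        "corrective_action": None,                    # 2.7 filled after investigation
        "status": "open",                             # 2.8
    }

ar = anomaly_report(
    summary="Login accepts empty password",
    context="SYS-LOGIN-002 on build 1.4.2",
    description="Submitting an empty password authenticates the user",
    impact="Security: authentication bypass",
    urgency="high",
)
print(ar["summary"], "-", ar["status"])
```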

h. Level Interim Test Status Report (LITSR). The purpose of the LITSR is to
summarize the results of the designated testing activities and, optionally, to
provide evaluations and recommendations based on these results. It is
customary to replace the word “Level” in the title of the document with the
organization’s name for the particular test level, e.g., Acceptance Interim Test
Status Report. There is one defined format for the LITSR for each test level
identified by the organization, and the reports may vary greatly in their level of
detail. The outline is as follows, with a sketch of status-metric computation after it:

LEVEL INTERIM TEST STATUS REPORT OUTLINE


1. Introduction
1.1. Document identifier
1.2. Scope
1.3. References
2. Details
2.1. Test status summary
2.2. Changes from plans
2.3. Test status metrics
3. General
3.1. Document change procedures and history
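
Section 2.3 calls for test status metrics. A minimal sketch of deriving such metrics from execution results follows; the result values are invented, and in practice they would be drawn from the Level Test Logs:

```python
from collections import Counter

# Hypothetical execution results for the reporting period.
results = ["pass", "pass", "fail", "pass", "blocked", "pass", "fail"]

counts = Counter(results)
executed = counts["pass"] + counts["fail"]
pass_rate = counts["pass"] / executed if executed else 0.0

print(f"Executed: {executed}, Passed: {counts['pass']}, "
      f"Failed: {counts['fail']}, Blocked: {counts['blocked']}")
print(f"Pass rate: {pass_rate:.0%}")
```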

i. Level Test Report (LTR). The purpose of the LTR is to summarize the
results of the designated testing activities and to provide evaluations and
recommendations based on these results. It is customary to replace the word
“Level” in the title of the document with the organization’s name for the particular
test level, e.g., Acceptance Test Report. There is one defined
format for the LTR for each test level identified by the organization, and the reports
may vary greatly in their level of detail. The outline is as follows:

LEVEL TEST REPORT OUTLINE


1. Introduction
1.1. Document identifier
1.2. Scope
1.3. References
2. Details
2.1. Overview of test results
2.2. Detailed test results
2.3. Rationale for decisions
2.4. Conclusions and recommendations
3. General
3.1. Glossary
3.2. Document change procedures and history

j. Master Test Report (MTR). The purpose of the MTR is to
summarize the results of the designated testing activities across all test levels
and to provide evaluations and recommendations based on these results. There is
one MTR for the project; small projects may merge
reports for multiple levels into it. The constituent level reports may vary greatly in
their level of detail (e.g., a Unit Test Report may simply be a statement that the unit passed
or failed, whereas an Acceptance Test Report may be much more detailed). The
outline is as follows:

MASTER TEST REPORT OUTLINE


1. Introduction
1.1. Document identifier
1.2. Scope
1.3. References
2. Details of the Master Test Report
2.1. Overview of all aggregate test results
2.2. Rationale for decisions
2.3. Conclusions and recommendations
3. General
3.1. Glossary
3.2. Document change procedures and history

Software Testing at Microsoft [ CITATION Ala08 \l 1033 ]


8. Microsoft takes a unique approach to software testing compared to industry norms.
It has more test engineers than developers, and it emphasizes software engineering
skills for all testers. This approach extends all the way down to the title given to
testers: Software Development Engineer in Test (SDET). By design, this title is virtually the
same as the Software Development Engineer title used for developers. Microsoft recruits
500 test engineers every year and has a very comprehensive training program for test
engineers [ CITATION Ala08 \p 73 \l 1033 ].
9. Microsoft has not wholeheartedly adopted any of the process improvement programs
such as CMMI, Six Sigma, ISO 9000, or Lean. It nevertheless continues to take process
improvement programs seriously and often "tests" them to get a better understanding of how
a given process would work on Microsoft products. For example, Microsoft has piloted several
projects over the past few years using approaches based on Six Sigma and Lean. The
strategy in using these approaches to greatest advantage is to understand how best to
achieve a balance between the desire for quick results and the rigor of Lean and Six Sigma
[ CITATION Ala08 \p 47 \l 1033 ].
10. Microsoft claims that its development process, the documentation of steps along the
way, the support its management team gives to quality processes, and the
institutionalization of development in documented and repeatable processes (as
well as documented results) are all elements of the core ISO standards, and that, in most
cases, it meets or exceeds them.
11. Microsoft follows a custom standard for test documentation. Exact details are not
available in the open literature. However, it can be inferred that it uses a customized set of
standards for testing and test documentation that coincides with ISO/IEC 29119 (Part 3).
The latter has been derived from the IEEE 829-2008 test documentation standard [ CITATION
ISO29119 \l 1033 ].
12. Test Templates. As mentioned earlier, companies use custom test documentation
standards that are greatly influenced by the IEEE standards, so the templates for test
documents are in most cases the same as the IEEE templates presented in the earlier
sections. For the sake of completeness, templates from eXtreme Software Testing are
presented in Annex A to this document. Software test templates from NASA [ CITATION God11 \l 1033 ]
[ CITATION God111 \l 1033 ] (9) (10) and others [ CITATION Flo11 \l 1033 ][ CITATION eXt11 \l 1033 ]
are submitted in soft form and are available at the links given in the references.

Conclusion
13. Having presented glimpses of the testing process in vogue at Microsoft, and on the
basis of research into the testing processes of IBM and Sun Microsystems, it can be inferred
that these software giants use their own customized standards for testing documentation.
These standards are based on ISO/IEC 29119 (Part 3), which itself is derived from the IEEE
829-2008 standard. Most of the firms have customized the standards to achieve quick
results while maintaining the rigor of these standards.

References

1. Tian, Jeff. Software Quality Engineering: Testing, Quality Assurance, and Quantifiable Improvement.

2. Page, Alan. How We Test Software at Microsoft. s.l.: Microsoft Press, 2008.

3. ISO/IEC 29119 Software Testing. [Online] [Cited: April 24, 2011.] http://softwaretestingstandard.org/index.php.

4. Software Engineering Technical Committee of the IEEE Computer Society. IEEE 829-2008, IEEE Standard for Software and System Test Documentation. 2008.

5. Goddard Space Flight Center. Software Assurance. [Online] [Cited: April 25, 2011.] http://sw-assurance.gsfc.nasa.gov/disciplines/quality/checklists/pdf/software_test_plan.pdf.

6. Goddard Space Flight Center. Software Quality Assurance Plan. [Online] [Cited: April 25, 2011.] http://sw-assurance.gsfc.nasa.gov/disciplines/quality/documents/pdf/software_quality_assurance_plan.pdf.

7. Florida's Statewide Systems Engineering. [Online] [Cited: April 25, 2011.] http://www.floridaits.com/SEMP/Files/PDF_Report/ApxJ.pdf.

8. eXtreme Software Testing. [Online] [Cited: April 25, 2011.] http://extremesoftwaretesting.com/.

9. Software Test Plan - NASA. [Online] [Cited: April 25, 2011.] http://sw-assurance.gsfc.nasa.gov/disciplines/quality/checklists/pdf/software_test_plan.pdf.

10. Software Test Report - NASA. [Online] [Cited: April 25, 2011.] http://sw-assurance.gsfc.nasa.gov/disciplines/quality/checklists/pdf/software_test_report.pdf.

Annex A

TEST TEMPLATES

SAMPLE - A MASTER SOFTWARE TEST PLAN DOCUMENT[ CITATION eXt11 \l 1033 ]

CONTENTS

1. Introduction
1.1. Purpose
1.2. Background
1.3. Scope
1.4. Project Identification
2. Software Structure
2.1. Software Risk Issues
3. Test Requirements
3.1 Features Not to Test
3.2 Metrics
4. Test Strategy
4.1. Test Cycles
4.2. Planning Risks and Contingencies
4.3. Testing Types
4.3.1. Functional Testing
4.3.2. User Interface Testing
4.3.3. Configuration Testing
4.3.4. Installation Testing
4.3.5. Volume Testing
4.3.6. Performance Testing
4.4. Tools
6. Resources
6.1. Staffing
6.2 Training Needs
7. Project Milestones
8. Deliverables
8.1. Test Assets
8.2. Exit criteria
8.3. Test Logs and Defect Reporting
9. References

SAMPLE - TEST STRATEGY DOCUMENT
CONTENTS
A good test strategy is the most important test document and can in some cases replace
all other test plan documents.

1. INTRODUCTION
1.1 PURPOSE
1.2 FUNCTIONAL OVERVIEW
1.3 CRITICAL SUCCESS FACTOR
1.4 TESTING SCOPE (TBD)
Inclusions
Exclusions
1.5 TEST COMPLETION CRITERIA
2. TIMEFRAME
3. RESOURCES
3.1 TESTING TEAM SETUP
3.2 HARDWARE REQUIREMENTS
3.3 SOFTWARE REQUIREMENTS
5. APPLICATION TESTING RISKS PROFILE
6. TEST APPROACH
6.1 STRATEGIES
6.2 GENERAL TEST OBJECTIVES:
6.3 APPLICATION FUNCTIONALITY
6.4 APPLICATION INTERFACES
6.5 TESTING TYPES
6.5.1 Stability
6.5.2 System
6.5.3 Regression
6.5.4 Installation
6.5.5 Recovery

6.5.6 Configuration
6.5.7 Security
7. BUSINESS AREAS FOR SYSTEM TEST
8. TEST PREPARATION
8.1 TEST CASE DEVELOPMENT
8.2 TEST DATA SETUP
8.3 TEST ENVIRONMENT
8.3.1 Database Restoration Strategies.
9. TEST EXECUTION
9.1 TEST EXECUTION PLANNING
9.2 TEST EXECUTION DOCUMENTATION
9.3 PROBLEM REPORTING
10. STATUS REPORTING
10.1 TEST EXECUTION PROCESS
10.2 PROBLEM STATUS
11. HANDOVER FOR USER ACCEPTANCE TEST TEAM
12. DELIVERABLES
13. APPROVALS
14. APPENDIXES
14.1 APPENDIX A (BUSINESS PROCESS RISK ASSESSMENT)
14.2 APPENDIX B (TEST DATA SETUP)
14.3 APPENDIX C (TEST CASE TEMPLATE)
14.4 APPENDIX D (PROBLEM TRACKING PROCESS)

SAMPLE - TEST EVALUATION REPORT DOCUMENT
CONTENTS
1. Objectives
2. Scope
3. References
4. Introduction
5. Test Coverage
6. Code Coverage
7. Suggested Actions
8. Diagrams

SAMPLE - QA PLAN DOCUMENT


CONTENTS

1. INTRODUCTION
1.1. OVERVIEW OF PROJECT X
1.2. PURPOSE OF THIS DOCUMENT
1.3. FORMAL REVIEWING
1.4. OBJECTIVES OF SYSTEM TEST
1.4.1. QUALITY ASSURANCE INVOLVEMENT
2. SCOPE AND OBJECTIVES
2.1. SCOPE OF TEST APPROACH - SYSTEM FUNCTIONS
2.1.1. INCLUSIONS
2.1.2. EXCLUSIONS
2.2. TESTING PROCESS
2.3. TESTING SCOPE
2.3.1. FUNCTIONAL TESTING
2.3.2. INTEGRATION TESTING

2.3.3. PERFORMANCE TESTING
2.3.4. REGRESSION TESTING
2.3.5. LOAD/STRESS TESTING
2.3.6. BUSINESS (USER) ACCEPTANCE TEST
2.4. BUILD TESTING
2.4.1. ENTRANCE CRITERIA
2.4.2. EXIT CRITERIA
3. TEST PHASES AND CYCLES
UNIT TESTING (CONDUCTED BY DEVELOPMENT):
INTEGRATION/FUNCTIONALITY TESTING:
REGRESSION TESTING:
NEGATIVE / POSITIVE TESTING:
AD HOC TESTING:
PERFORMANCE TESTING:
3.1. ORGANIZATION OF SYSTEM TESTING CYCLES
3.2. SOFTWARE DELIVERY
3.3. FORMAL REVIEWING
4. SYSTEM TEST SCHEDULE
5. RESOURCES - TESTING TEAM
5.1. HUMAN
5.2. HARDWARE
HARDWARE COMPONENTS REQUIRED
5.3. SOFTWARE TEST ENVIRONMENT SOFTWARE
ERROR MEASUREMENT SYSTEM
6. ROLES AND RESPONSIBILITIES
6.1. MANAGEMENT TEAM
6.2. TESTING TEAM SETUP
6.3. BUSINESS TEAM
6.4. DEVELOPMENT TEAM

7. ERROR MANAGEMENT & CONFIGURATION MANAGEMENT
8. STATUS REPORTING
8.1. STATUS REPORTING
9. ISSUES, RISKS, AND ASSUMPTIONS
9.1. ISSUES/RISKS
9.2. ASSUMPTIONS
10. FORMAL SIGNOFF
11. ERROR REVIEW
11.1. PURPOSE OF ERROR REVIEW TEAM.
11.2. ERROR REVIEW TEAM MEETING AGENDA.
11.3. CLASSIFICATION OF BUGS
11.4. PROCEDURE FOR MAINTENANCE OF ERROR MANAGEMENT SYSTEM.
11.5. QUALITY ASSURANCE MEASURES
(I) DATES.
(II) EFFORT.
(III) VOLUME.
(IV) QUALITY.
(V) TURNAROUND.

SAMPLE - USER ACCEPTANCE TEST (UAT) PLAN TABLE OF
CONTENTS
1. INTRODUCTION
1.1 PURPOSE
1.2 FUNCTIONAL OVERVIEW
1.3 CRITICAL SUCCESS FACTORS
1.4 UAT SCOPE
1.5 TEST COMPLETION CRITERIA
2. TIMEFRAME
3. RESOURCES
3.1 TESTING TEAM
3.2 HARDWARE TESTING REQUIREMENTS
3.3 SOFTWARE TESTING REQUIREMENTS
4. TEST APPROACH
4.1 TEST STRATEGY
4.2 GENERAL TEST OBJECTIVES:
4.3 BUSINESS AREAS FOR SYSTEM TEST
4.4 APPLICATION INTERFACES
5. TEST PREPARATION
5.1 TEST CASE DEVELOPMENT
5.2 TEST DATA SETUP
5.3 TEST ENVIRONMENT
6. UAT EXECUTION
6.1 PLANNING UAT EXECUTION
6.2 TEST EXECUTION DOCUMENTATION
6.3 ISSUE REPORTING
7. HANDOVER FOR UAT ACCEPTANCE COMMITTEE
8. ACCEPTANCE COMMITTEE
9. DELIVERABLES

10. APPROVALS
11. APPENDIXES
11.1 APPENDIX A (TEST CASE TEMPLATE)
11.2 APPENDIX B (SEVERITY STRATEGY)
11.3 APPENDIX C (ISSUE LOG)

SAMPLE - SOFTWARE RISK MANAGEMENT DOCUMENT
CONTENTS

1. Introduction
2. Terminology.
3. Risk Sources
4. Understanding the Risk
5. Risk Management Process Flow
5.1 Identifying the Risk
5.2 Analyze Risk
5.3 Risk Planning
5.4 Risk Tracking
5.5 Risk Controlling
5.6 Retiring Risks
6. Status Strategy:
7. Process Dependencies
8. Process Summary
9. Approvals
10. Appendixes
10.1 Appendix A 'Top 10 List template'.

SAMPLE - SOFTWARE LOAD TEST PLAN TABLE OF
CONTENTS
1. Introduction
1.1 Purpose
1.2 Background
1.3 Scope
1.4 Project Identification
2. Test Requirements
3. Test Strategy
3.1 Test objective
3.2 Type of test and type of virtual user
3.3 Test Approaches and Strategies
3.3.1 System analysis
3.3.2 Define the detailed testing check list
3.3.3 Developing Test Scripts
3.3.4 Creating test scenario
3.3.5 Monitoring Performance
3.3.6 Analyzing test result
3.4 Other Considerations
4. Resources
4.1 Workers
4.2 System
5. Project Milestones
6. Deliverables
6.1 Test Configuration
6.2 Test Logs
6.3 Test Reports
