SQE Assignment
Assignment Objective
Abstract
1. Testing is an important part of the development process. In the waterfall model it is a
distinct phase, whereas in other processes it is embedded throughout. “The basic
idea of testing involves the execution of software and the observation of its behavior or
outcome. If a failure is observed, the execution record is analyzed to locate and fix the
fault(s) that caused the failure. Otherwise, it indicates that the software under testing is
more likely to fulfill its designated functions” [ CITATION Jef \p 67 \l 1033 ]. Effective testing
needs formal process support, and effective execution of testing requires sound planning
and documentation.
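To make the quoted idea concrete, the following minimal sketch shows execution of software and observation of its outcome using Python's standard unittest module. The divide function is a hypothetical item under test invented for illustration, not part of any firm's code.

```python
import unittest

# Hypothetical function under test: stands in for the software whose
# behavior we execute and observe.
def divide(a, b):
    if b == 0:
        raise ValueError("division by zero")
    return a / b

class DivideTest(unittest.TestCase):
    def test_normal_input(self):
        # Execute the software and observe its outcome.
        self.assertEqual(divide(10, 2), 5)

    def test_zero_divisor(self):
        # A failure here would leave an execution record (the traceback)
        # used to locate and fix the underlying fault.
        with self.assertRaises(ValueError):
            divide(10, 0)

# Run the tests and capture the observed outcome.
suite = unittest.defaultTestLoader.loadTestsFromTestCase(DivideTest)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

A passing run indicates only that the software is "more likely" to fulfill its functions, as the quotation notes; it is not a proof of correctness.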
2. Pursuant to the assignment objective, several software firms, including Microsoft,
IBM, Sun Microsystems, Nokia, and Mozilla, were considered. Information available
through company websites and other indirect sources was collected and analyzed. These
sources provide only very broad information about the software testing tools and
techniques used by these firms; specific information about the testing documentation
they maintain is not directly available.
Assumption
3. Since details of the testing and testing documentation in use at various firms are
confidential, this study relies on indirect sources, which indicate that most software firms
use custom testing documentation standards. Moreover, these practices are greatly
influenced by the IEEE testing documentation standard 829 [ CITATION Ala08 \l 1033 ][ CITATION ISO29119 \l 1033 ].
MCS-17 | Assignment 2 3
[Figure taken from IEEE-829-1998; still applicable in the context of the new standard IEEE-829-2008]
a. Master Test Plan (MTP). The purpose of the Master Test Plan (MTP) is to
provide an overall test planning and test management document for multiple
levels of test (either within one project or across multiple projects). The outline
is as follows:
b. Level Test Plan(s) (LTP). Each LTP specifies the scope, approach,
resources, and schedule of the testing activities for its specified level of
testing. It identifies the items being tested, the features to be tested, the testing
tasks to be performed, the personnel responsible for each task, and the
associated risk(s). In the title of the plan, the word “Level” is replaced by
the organization’s name for the particular level being documented by the
plan (e.g., Component Test Plan, Component Integration Test Plan, System
Test Plan, Acceptance Test Plan). The outline is as follows:
c. Level Test Design (LTD). The purpose of the LTD is to specify any refinements of the
test approach (LTP Section 2.5) and to identify the features to be tested by this
design and its associated tests. The outline is as follows:
LEVEL TEST DESIGN OUTLINE
1. Introduction
1.1. Document identifier
1.2. Scope
1.3. References
2. Details of the Level Test Design
2.1. Features to be tested
2.2. Approach refinements
2.3. Test identification
2.4. Feature pass/fail criteria
2.5. Test deliverables
3. General
3.1. Glossary
3.2. Document change procedures and history
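As an illustration of how such an outline can be put to work, the sketch below (an assumption of this write-up, not part of IEEE 829) captures the LTD outline as data and emits a fill-in-the-blanks skeleton document:

```python
# The Level Test Design outline above, captured as a data structure.
LTD_OUTLINE = {
    "1. Introduction": [
        "1.1. Document identifier",
        "1.2. Scope",
        "1.3. References",
    ],
    "2. Details of the Level Test Design": [
        "2.1. Features to be tested",
        "2.2. Approach refinements",
        "2.3. Test identification",
        "2.4. Feature pass/fail criteria",
        "2.5. Test deliverables",
    ],
    "3. General": [
        "3.1. Glossary",
        "3.2. Document change procedures and history",
    ],
}

def skeleton(outline):
    """Render the outline as a fill-in-the-blanks document."""
    lines = []
    for section, subsections in outline.items():
        lines.append(section)
        for sub in subsections:
            lines.append("  " + sub)
            lines.append("    TBD")  # placeholder for the author to fill in
    return "\n".join(lines)

print(skeleton(LTD_OUTLINE))
```

Generating skeletons this way keeps every LTD in a project structurally identical, which is one practical reason firms standardize on a fixed outline.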
d. Level Test Case (LTC). The purpose of the LTC is to define, at an appropriate level of
detail, the information needed concerning inputs to and outputs from the
software or software-based system being tested. The LTC includes all test
case(s) identified by the associated segment of the LTD (if there is one). The
outline is as follows:
e. Level Test Procedure. The purpose of a Level Test Procedure is to specify the steps for
executing a set of test cases or, more generally, the steps used to exercise a
software product or software-based system item in order to evaluate a set of
features. The outline is as follows:
f. Level Test Log (LTL). The purpose of the LTL is to provide a chronological record of
relevant details about the execution of tests. An automated tool may capture all
or part of this information. The outline is as follows:
g. Anomaly Report (AR). The purpose of the AR is to document any event that
occurs during the testing process that requires investigation. Such an event may be
called a problem, test incident, defect, trouble, issue, anomaly, or error report. The
outline is as follows:
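The AR's content can be sketched as a simple record type; the field names below are illustrative, loosely following the sections described above rather than quoting the standard:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

# Illustrative anomaly-report record; field names are assumptions of
# this sketch, not the exact headings of IEEE 829.
@dataclass
class AnomalyReport:
    identifier: str
    summary: str
    severity: str                # e.g. "critical", "major", "minor"
    steps_to_reproduce: list
    observed: str
    expected: str
    status: str = "open"         # investigation life cycle: open -> closed
    timestamp: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )

ar = AnomalyReport(
    identifier="AR-042",
    summary="Login rejects valid password",
    severity="major",
    steps_to_reproduce=["open login page", "enter valid credentials"],
    observed="error 500",
    expected="user logged in",
)
print(ar.identifier, ar.status)
```

Recording observed versus expected behavior separately is what later lets the investigator decide whether the anomaly is a software fault or a test fault.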
h. Level Interim Test Status Report (LITSR). The purpose of the LITSR is to
summarize the results of the designated testing activities and optionally to
provide evaluations and recommendations based on these results. It is
customary to replace the word “Level” in the title of the document with the
organization’s name for the particular test level (e.g., Acceptance Interim Test
Status Report). There is one defined LITSR format for each test level
identified by the organization; reports may vary greatly in their level of detail. The
outline is as follows:
i. Level Test Report (LTR). The purpose of the LTR is to summarize the
results of the designated testing activities and to provide evaluations and
recommendations based on these results. It is customary to replace the word
“Level” in the title of the document with the organization’s name for the particular
test level (e.g., Acceptance Test Report). There is one defined
LTR format for each test level identified by the organization; reports may
vary greatly in their level of detail. The outline is as follows:
j. Master Test Report (MTR). The purpose of the Master Test Report (MTR) is to
summarize the results of the designated testing activities across the test levels and
to provide evaluations and recommendations based on these results. There is one
LTR for each test level defined by the organization or project, and small projects
may merge reports for multiple levels into the MTR. Reports may vary greatly in
their level of detail (e.g., a Unit Test Report may simply be a statement that the unit
passed or failed, whereas an Acceptance Test Report may be much more detailed). The
outline is as follows:
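The summarizing step common to these status and test reports can be sketched as follows; the outcome data and the 90% pass-rate threshold are illustrative assumptions of this sketch, not values from the standard:

```python
from collections import Counter

# Raw per-case outcomes, as a report generator might receive them
# from the test logs (illustrative data).
outcomes = [
    ("TC-001", "pass"), ("TC-002", "pass"),
    ("TC-003", "fail"), ("TC-004", "pass"),
]

def summarize(results, pass_threshold=0.9):
    """Fold raw case outcomes into counts, a pass rate, and a recommendation."""
    counts = Counter(status for _, status in results)
    total = sum(counts.values())
    pass_rate = counts["pass"] / total if total else 0.0
    recommendation = (
        "proceed to next level" if pass_rate >= pass_threshold
        else "continue testing at this level"
    )
    return {"total": total, **counts, "pass_rate": pass_rate,
            "recommendation": recommendation}

report = summarize(outcomes)
print(report)
```

The same aggregation serves both an interim report (run mid-level, optionally with recommendations) and a final level report (run at exit, with the recommendation mandatory).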
Conclusion
13. Having presented glimpses of the testing process in use at Microsoft, and on the
basis of research into the testing processes of IBM and Sun Microsystems, it can be inferred
that these software giants use their own customized standards for testing documentation.
These standards are based on ISO/IEC 29119 (Part 3), which is itself derived from the IEEE
829-1998 standard. Most firms have customized the standards to achieve quick
results while maintaining the rigor of the originals.
1. Tian, Jeff. Software Quality Engineering: Testing, Quality Assurance, and Quantifiable Improvement. Wiley, 2005.
4. IEEE Computer Society, Software Engineering Technical Committee. IEEE Standard for Software and System Test Documentation (IEEE Std 829-2008). 2008.
5. Goddard Space Flight Center. Software Assurance: Software Test Plan Checklist. [Online] [Cited: Apr 25, 2011.] http://sw-assurance.gsfc.nasa.gov/disciplines/quality/checklists/pdf/software_test_plan.pdf.
6. Goddard Space Flight Center. Software Quality Assurance Plan. [Online] [Cited: Apr 25, 2011.] http://sw-assurance.gsfc.nasa.gov/disciplines/quality/documents/pdf/software_quality_assurance_plan.pdf.
7. Florida's Statewide Systems Engineering. [Online] [Cited: Apr 25, 2011.] http://www.floridaits.com/SEMP/Files/PDF_Report/ApxJ.pdf.
8. eXtreme Software Testing. [Online] [Cited: Apr 25, 2011.] http://extremesoftwaretesting.com/.
9. Software Test Plan - NASA. [Online] [Cited: Apr 25, 2011.] http://sw-assurance.gsfc.nasa.gov/disciplines/quality/checklists/pdf/software_test_plan.pdf.
10. Software Test Report - NASA. [Online] [Cited: Apr 25, 2011.] http://sw-assurance.gsfc.nasa.gov/disciplines/quality/checklists/pdf/software_test_report.pdf.
Annex A
TEST TEMPLATES
CONTENTS
1. Introduction
1.1. Purpose
1.2. Background
1.3. Scope
1.4. Project Identification
2. Software Structure
2.1. Software Risk Issues
3. Test Requirements
3.1. Features Not to Test
3.2. Metrics
4. Test Strategy
4.1. Test Cycles
4.2. Planning Risks and Contingencies
4.3. Testing Types
4.3.1. Functional Testing
4.3.2. User Interface Testing
4.3.3. Configuration Testing
4.3.4. Installation Testing
4.3.5. Volume Testing
4.3.6. Performance Testing
4.4. Tools
6. Resources
6.1. Staffing
6.2. Training Needs
7. Project Milestones
8. Deliverables
8.1. Test Assets
8.2. Exit criteria
8.3. Test Logs and Defect Reporting
9. References
SAMPLE - TEST STRATEGY DOCUMENT
CONTENTS
A good test strategy is the most important of these documents and can in some cases
replace all other test plan documents.
1. INTRODUCTION
1.1 PURPOSE
1.2 FUNCTIONAL OVERVIEW
1.3 CRITICAL SUCCESS FACTOR
1.4 TESTING SCOPE (TBD)
Inclusions
Exclusions
1.5 TEST COMPLETION CRITERIA
2. TIMEFRAME
3. RESOURCES
3.1 TESTING TEAM SETUP
3.2 HARDWARE REQUIREMENTS
3.3 SOFTWARE REQUIREMENTS
5. APPLICATION TESTING RISKS PROFILE
6. TEST APPROACH
6.1 STRATEGIES
6.2 GENERAL TEST OBJECTIVES:
6.3 APPLICATION FUNCTIONALITY
6.4 APPLICATION INTERFACES
6.5 TESTING TYPES
6.5.1 Stability
6.5.2 System
6.5.3 Regression
6.5.4 Installation
6.5.5 Recovery
6.5.6 Configuration
6.5.7 Security
7. BUSINESS AREAS FOR SYSTEM TEST
8. TEST PREPARATION
8.1 TEST CASE DEVELOPMENT
8.2 TEST DATA SETUP
8.3 TEST ENVIRONMENT
8.3.1 Database Restoration Strategies.
9. TEST EXECUTION
9.1 TEST EXECUTION PLANNING
9.2 TEST EXECUTION DOCUMENTATION
9.3 PROBLEM REPORTING
10. STATUS REPORTING
10.1 TEST EXECUTION PROCESS
10.2 PROBLEM STATUS
11. HANDOVER FOR USER ACCEPTANCE TEST TEAM
12. DELIVERABLES
13. APPROVALS
14. APPENDIXES
14.1 APPENDIX A (BUSINESS PROCESS RISK ASSESSMENT)
14.2 APPENDIX B (TEST DATA SETUP)
14.3 APPENDIX C (TEST CASE TEMPLATE)
14.4 APPENDIX D (PROBLEM TRACKING PROCESS)
SAMPLE - TEST EVALUATION REPORT DOCUMENT
CONTENTS
1. Objectives
2. Scope
3. References
4. Introduction
5. Test Coverage
6. Code Coverage
7. Suggested Actions
8. Diagrams
SAMPLE - SYSTEM TEST PLAN DOCUMENT
CONTENTS
1. INTRODUCTION
1.1. OVERVIEW OF PROJECT X
1.2. PURPOSE OF THIS DOCUMENT
1.3. FORMAL REVIEWING
1.4. OBJECTIVES OF SYSTEM TEST
1.4.1. QUALITY ASSURANCE INVOLVEMENT
2. SCOPE AND OBJECTIVES
2.1. SCOPE OF TEST APPROACH - SYSTEM FUNCTIONS
2.1.1. INCLUSIONS
2.1.2. EXCLUSIONS
2.2. TESTING PROCESS
2.3. TESTING SCOPE
2.3.1. FUNCTIONAL TESTING
2.3.2. INTEGRATION TESTING
2.3.3. PERFORMANCE TESTING
2.3.4. REGRESSION TESTING
2.3.5. LOAD/STRESS TESTING
2.3.6. BUSINESS (USER) ACCEPTANCE TEST
2.4. BUILD TESTING
2.4.1. ENTRANCE CRITERIA
2.4.2. EXIT CRITERIA
3. TEST PHASES AND CYCLES
UNIT TESTING (CONDUCTED BY DEVELOPMENT):
INTEGRATION/FUNCTIONALITY TESTING:
REGRESSION TESTING:
NEGATIVE / POSITIVE TESTING:
AD HOC TESTING:
PERFORMANCE TESTING:
3.1. ORGANIZATION OF SYSTEM TESTING CYCLES
3.2. SOFTWARE DELIVERY
3.3. FORMAL REVIEWING
4. SYSTEM TEST SCHEDULE
5. RESOURCES - TESTING TEAM
5.1. HUMAN
5.2. HARDWARE
HARDWARE COMPONENTS REQUIRED
5.3. SOFTWARE TEST ENVIRONMENT SOFTWARE
ERROR MEASUREMENT SYSTEM
6. ROLES AND RESPONSIBILITIES
6.1. MANAGEMENT TEAM
6.2. TESTING TEAM SETUP
6.3. BUSINESS TEAM
6.4. DEVELOPMENT TEAM
7. ERROR MANAGEMENT & CONFIGURATION MANAGEMENT
8. STATUS REPORTING
8.1. STATUS REPORTING
9. ISSUES, RISKS, AND ASSUMPTIONS
9.1. ISSUES/RISKS
9.2. ASSUMPTIONS
10. FORMAL SIGNOFF
11. ERROR REVIEW
11.1. PURPOSE OF ERROR REVIEW TEAM
11.2. ERROR REVIEW TEAM MEETING AGENDA
11.3. CLASSIFICATION OF BUGS
11.4. PROCEDURE FOR MAINTENANCE OF ERROR MANAGEMENT SYSTEM
11.5. QUALITY ASSURANCE MEASURES
(I) DATES.
(II) EFFORT.
(III) VOLUME.
(IV) QUALITY.
(V) TURNAROUND.
SAMPLE - USER ACCEPTANCE TEST (UAT) PLAN TABLE OF
CONTENTS
1. INTRODUCTION
1.1 PURPOSE
1.2 FUNCTIONAL OVERVIEW
1.3 CRITICAL SUCCESS FACTORS
1.4 UAT SCOPE
1.5 TEST COMPLETION CRITERIA
2. TIMEFRAME
3. RESOURCES
3.1 TESTING TEAM
3.2 HARDWARE TESTING REQUIREMENTS
3.3 SOFTWARE TESTING REQUIREMENTS
4. TEST APPROACH
4.1 TEST STRATEGY
4.2 GENERAL TEST OBJECTIVES:
4.3 BUSINESS AREAS FOR SYSTEM TEST
4.4 APPLICATION INTERFACES
5. TEST PREPARATION
5.1 TEST CASE DEVELOPMENT
5.2 TEST DATA SETUP
5.3 TEST ENVIRONMENT
6. UAT EXECUTION
6.1 PLANNING UAT EXECUTION
6.2 TEST EXECUTION DOCUMENTATION
6.3 ISSUE REPORTING
7. HANDOVER FOR UAT ACCEPTANCE COMMITTEE
8. ACCEPTANCE COMMITTEE
9. DELIVERABLES
10. APPROVALS
11. APPENDIXES
11.1 APPENDIX A (TEST CASE TEMPLATE)
11.2 APPENDIX B (SEVERITY STRATEGY)
11.3 APPENDIX C (ISSUE LOG)
SAMPLE - SOFTWARE RISK MANAGEMENT DOCUMENT
CONTENTS
1. Introduction
2. Terminology
3. Risk Sources
4. Understanding the Risk
5. Risk Management Process Flow
5.1 Identifying the Risk
5.2 Analyze Risk
5.3 Risk Planning
5.4 Risk Tracking
5.5 Risk Controlling
5.6 Retiring Risks
6. Status Strategy
7. Process Dependencies
8. Process Summary
9. Approvals
10. Appendixes
10.1 Appendix A 'Top 10 List template'.
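The analyze/track steps of the process flow above (5.2 through 5.4) can be sketched as follows; the risk items, the likelihood-times-impact scoring, and the top-N cut-off are illustrative assumptions of this sketch:

```python
# An illustrative risk register: each identified risk carries a
# likelihood and an impact, both on a 1-5 scale (assumed here).
risks = [
    {"id": "R1", "name": "key tester leaves", "likelihood": 2, "impact": 5},
    {"id": "R2", "name": "test environment unstable", "likelihood": 4, "impact": 3},
    {"id": "R3", "name": "late requirement changes", "likelihood": 3, "impact": 4},
]

def top_risks(register, n=10):
    """Score each risk (likelihood x impact) and return the top-n list,
    in the spirit of the 'Top 10 List' template in Appendix A."""
    scored = [dict(r, score=r["likelihood"] * r["impact"]) for r in register]
    return sorted(scored, key=lambda r: r["score"], reverse=True)[:n]

for r in top_risks(risks):
    print(r["id"], r["name"], r["score"])
```

Re-scoring the register on each tracking cycle is what drives the later steps: risks whose score falls to an acceptable level are retired (5.6), the rest stay on the list.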
SAMPLE - SOFTWARE LOAD TEST PLAN TABLE OF
CONTENTS
1. Introduction
1.1 Purpose
1.2 Background
1.3 Scope
1.4 Project Identification
2. Test Requirements
3. Test Strategy
3.1 Test objective
3.2 Type of test and type of virtual user
3.3 Test Approaches and Strategies
3.3.1 System analysis
3.3.2 Define the detailed testing check list
3.3.3 Developing Test Scripts
3.3.4 Creating test scenario
3.3.5 Monitoring Performance
3.3.6 Analyzing test result
3.4 Other Considerations
4. Resources
4.1 Workers
4.2 System
5. Project Milestones
6. Deliverables
6.1 Test Configuration
6.2 Test Logs
6.3 Test Reports
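The core of such a load test (concurrent virtual users exercising a transaction while response times are monitored for analysis) can be sketched as follows; the user count, iteration count, and the stand-in transaction are illustrative assumptions:

```python
import threading
import time

results = []            # collected response times, one per transaction
lock = threading.Lock() # protects the shared results list

def transaction():
    # Stand-in for a real request to the system under test.
    time.sleep(0.01)

def virtual_user(iterations=5):
    """One virtual user: run the transaction repeatedly, timing each run."""
    for _ in range(iterations):
        start = time.perf_counter()
        transaction()
        elapsed = time.perf_counter() - start
        with lock:
            results.append(elapsed)

# Launch several virtual users concurrently (the test scenario).
users = [threading.Thread(target=virtual_user) for _ in range(4)]
for u in users:
    u.start()
for u in users:
    u.join()

print(f"{len(results)} samples, max latency {max(results) * 1000:.1f} ms")
```

The collected samples feed the last two strategy steps above: monitoring performance during the run and analyzing the results (e.g., percentile latencies) afterwards.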