
Test Plan Document: Online Shopping System

TITLE:
RADO ONLINE SHOPPING SYSTEM

BY:
LUMUMBA ELDORADO O.

REG NO:
20/05357

EMAIL:
2005357@students.kcau.ac.ke / eldoradolumumba@gmail.com

A REPORT:
Submitted in partial fulfillment of the requirements for the degree

BACHELOR OF SCIENCE IN INFORMATION SECURITY AND FORENSICS

BISF 2208

Security and Forensics Project

FACULTY OF COMPUTING AND INFORMATION MANAGEMENT

SUPERVISED BY:
JUSTUS MUTIA

MAY-AUGUST 2023
Introduction

Purpose:

The purpose of this test plan is to outline the strategy and approach for testing the Online
Shopping System, hereinafter referred to as "the system." This document provides an overview of the
system, defines the scope of testing, identifies the responsibilities of the testing team, and details the
test scenarios and procedures used to ensure the system's functionality, reliability, and performance.

Scope:

The scope of this test plan covers all aspects of testing for the Online Shopping System, including
functional, non-functional, and performance testing. The system is designed to facilitate online
shopping and its supporting marketing activities, including campaign management, user
segmentation, and analytics.

Objectives:

The primary objectives of this test plan are as follows:

1. To verify that the Online Shopping System meets its specified requirements.
2. To identify and address any defects or issues in the system.
3. To ensure the system's functionality, usability, security, and performance meet user
expectations.
4. To establish a clear and standardized testing process for ongoing system improvements.

System Overview

The RADO Online Shopping System is an online platform that supports the shopping and marketing
activities outlined in the Scope section above, including campaign management, user segmentation,
and analytics. Its main features and components are described in the system requirements and user
stories referenced later in this plan.

Roles and Responsibilities:

The following roles and responsibilities are assigned for the testing of the Online Shopping System:

• Test Manager: Responsible for overall test planning, coordination, and reporting.
• Test Analysts: Responsible for creating test cases, executing tests, and reporting defects.
• Development Team: Responsible for addressing and resolving reported defects.
• Business Analysts: Available for clarifications on requirements and to provide input on test scenarios.
• Users and Stakeholders: Responsible for user acceptance testing and providing feedback.

Testing Environment:

The testing environment will replicate the production environment as closely as possible. This
includes:

• Hardware: Similar configurations to the production servers.
• Software: The same operating systems and software versions as in the production environment.
• Test Data: Representative and anonymized data sets for testing purposes (a brief data-anonymization sketch follows this list).
• Test Tools: Testing and automation tools as required for test execution and reporting.
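
As an illustration of how representative but anonymized test data might be prepared, the sketch below masks the personal fields of a sample customer record in Python. The field names and masking rules are assumptions made for this example only and would need to follow the system's actual data model and data-protection requirements.

import hashlib
import random

def anonymize_customer(record):
    """Return a copy of a customer record with personal data masked.

    The fields used here (name, email, phone) are illustrative assumptions,
    not the system's actual schema.
    """
    masked = dict(record)
    # Replace the real name with a stable pseudonym derived from a hash.
    masked["name"] = "Customer-" + hashlib.sha256(record["name"].encode()).hexdigest()[:8]
    # Substitute a throwaway address so no real email ends up in test data.
    masked["email"] = "user{}@example.com".format(random.randint(1000, 9999))
    # Keep only the last two digits of the phone number for realism.
    masked["phone"] = "07XXXXXX" + record["phone"][-2:]
    return masked

if __name__ == "__main__":
    sample = {"name": "Jane Doe", "email": "jane@example.org", "phone": "0712345678"}
    print(anonymize_customer(sample))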

Test Schedule:
Testing will be conducted in multiple phases, including unit testing, integration testing, system
testing, and user acceptance testing. The schedule for each phase will be outlined in a separate test
schedule document.

Risks and Assumptions:

• Assumption: The development team will provide builds of the system on schedule for testing.
• Risk: Delays in development or unavailability of necessary resources could impact the testing schedule. Mitigation measures will be in place to address these risks as they arise.

Approach:

The testing approach will encompass the following:

• Requirements Review: Careful review of system requirements and user stories to ensure test coverage.
• Test Case Design: Creation of detailed test cases, including positive and negative scenarios (an illustrative sketch follows this list).
• Test Execution: Systematic execution of test cases and tracking of test results.
• Defect Reporting: Prompt reporting and tracking of defects with clear descriptions and steps to reproduce.
• Regression Testing: Regular regression testing to ensure that new changes do not break existing functionality.
• User Acceptance Testing: Involvement of users and stakeholders in UAT to validate system functionality against business needs.
• Performance Testing: Assessing the system's performance under various load conditions.
• Security Testing: Evaluating the system's security measures to protect user data and prevent unauthorized access.
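
To make the test case design and security testing activities more concrete, the following is a minimal sketch of one positive and one negative (injection-style) test written with Python's pytest and requests libraries. The base URL, endpoint path, credentials, and expected status codes are assumptions for illustration only and do not describe the system's actual API.

import requests

BASE_URL = "http://localhost:8000"  # assumed test-environment address, not the real deployment

def test_login_with_valid_credentials():
    # Positive scenario: a registered test user can log in successfully.
    response = requests.post(
        BASE_URL + "/login",
        data={"username": "testuser", "password": "CorrectPassword1"},
    )
    assert response.status_code == 200

def test_login_rejects_sql_injection():
    # Negative/security scenario: injection-style input must not authenticate.
    response = requests.post(
        BASE_URL + "/login",
        data={"username": "admin' OR '1'='1", "password": "anything"},
    )
    assert response.status_code in (400, 401)

Tests such as these would be collected into the test scripts listed under Deliverables and re-run during regression testing.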

Deliverables:

The following deliverables will be produced as part of this test plan:

• Test cases and test scripts.
• Test execution logs.
• Defect reports (an illustrative defect record structure is sketched below).
• Test summary report.
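
As a sketch of how a defect report might be structured, the Python dataclass below lists typical fields. The field names, severity scale, and workflow states are assumptions; in practice they would follow whatever defect-tracking tool the project adopts.

from dataclasses import dataclass, field
from typing import List

@dataclass
class DefectReport:
    # Illustrative defect record; the fields are assumptions, not a mandated format.
    defect_id: str                      # e.g. "DEF-001"
    title: str                          # short summary of the problem
    severity: str                       # e.g. "High", "Medium", "Low"
    steps_to_reproduce: List[str] = field(default_factory=list)
    expected_result: str = ""
    actual_result: str = ""
    status: str = "Open"                # Open -> In Progress -> Resolved -> Retested/Closed

example = DefectReport(
    defect_id="DEF-001",
    title="Checkout returns a server error when the cart is empty",
    severity="High",
    steps_to_reproduce=["Log in as a test user", "Open the cart with no items", "Click Checkout"],
    expected_result="A clear 'cart is empty' message is shown",
    actual_result="HTTP 500 error page",
)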

Exit Criteria:

Testing will be considered complete when the following criteria are met:

• All test cases have been executed and passed.
• High-priority defects have been resolved and retested.
• User acceptance testing has been successfully completed.
• The system meets the defined acceptance criteria.

Approval:
This test plan requires approval from the project manager, development team lead, and relevant
stakeholders before testing commences.

Revision History:

Version   Week   Description
1.0       1-3    Initial draft
1.1       4-6    Revised (error corrections and refinements)

...
