
FPT SOFTWARE

FLORIST
Test Plan

Project Code: FOW


Document Code: FOW_TP – v1.1

FPT University, 02-11-2023



RECORD OF CHANGE
*A - Added, M - Modified, D - Deleted

Effective Date  Changed Items         A*, M, D  Change Description                                                     New Version
31/10           Introduction          A         Add information for Purpose                                            1.0
31/10           Introduction          A         Add information for Organization                                       1.0
31/10           Introduction          D         Delete guideline                                                       1.0
31/10           Introduction          A         Add information for Background Information                             1.0
31/10           Introduction          A, M      Add, Modify Introduction, Requirement For Test                         1.0
31/10           Introduction          A         Add content for Scope Of Testing                                       1.0
31/10           Introduction          A         Add information for Constraints                                        1.0
31/10           Introduction          A         Add information for Risk List                                          1.0
31/10           Requirement For Test  A         Add table of Test Items                                                1.0
01/11           Requirement For Test  A         Add information for Acceptance Test Criteria                           1.1
01/11           Test Strategy         A, M      Add, Modify Test Strategy: Test Types, Tools, Human Resource, System   1.1
01/11           Test Strategy         A         Add information for Test Stage                                         1.1
01/11           Test Strategy         A         Add information for Functional Testing                                 1.1
01/11           Test Strategy         A         Add information for Performance Testing                                1.1
01/11           Test Strategy         A         Add information for Security and Access Control Testing                1.1
01/11           Test Strategy         A         Add information for Regression Testing                                 1.1
01/11           Test Milestones       A, M      Add, Modify table of Test Milestones                                   1.1
01/11           Deliverables          A, M      Add, Modify table of Deliverables                                      1.1
01/11           Test Plan             A, M      Modify, Review                                                         1.1


SIGNATURE PAGE

ORIGINATOR: Nguyen Viet Hoai Nam October 31, 2023


Test Leader

REVIEWERS: Khuat Nhu Khoa November 1, 2023


Project Manager

Do Nhat Minh November 1, 2023


Quality Assurance

Pham Huong Ly November 1, 2023


Quality Assurance

Tran Thanh Huyen November 1, 2023


Quality Assurance

APPROVAL: Khuat Nhu Khoa November 1, 2023


Project Manager


TABLE OF CONTENTS
1. INTRODUCTION
1.1. Purpose
1.2. Background Information
1.3. Scope Of Testing
1.4. Constraints
1.5. Risk List
2. REQUIREMENTS FOR TEST
2.1. Test items
3. TEST STRATEGY
3.1. Test Types
3.2. Test Stage
3.3. Tools
3.4. Human Resource
3.5. System
4. TEST MILESTONES
5. DELIVERABLES


1. INTRODUCTION
1.1. Purpose
1.1.1. Purpose
● Describes scope and approach: The test plan outlines what will be tested, how it
will be tested, types of testing, and objectives. This provides direction for QA and
development teams.
● Identifies resources required: The plan lists roles, tools, test environments, and
other resources needed to execute testing activities. This helps teams coordinate
schedules, budgets, equipment, etc.
● Contains estimates: The plan provides estimates on testing effort, cycle time, and
defect rates. These metrics help teams plan capacity.
● Clarifies testing milestones: The plan defines milestones for different testing
phases and aligns them with overall development milestones.
● Allows tracking and reporting: With well-defined scope and estimates, the test
plan facilitates tracking progress and generating test reports.
● Reduces product risk: Following a structured test plan minimizes the risk of
missing critical defects or coverage gaps.
● Defines when testing is complete: Test exit criteria are defined upfront to establish
completeness.
● Record for further use: Important aspects such as test estimation, test scope, and test
strategy are documented in the Test Plan, so it can be reviewed by the Management Team
and re-used for other projects.

1.1.2. Organization

No  Section                Description
1   INTRODUCTION           Describes the purpose and organization of the document, including the total number of sections and the main content of each section.
2   REQUIREMENTS FOR TEST  Identifies the items (use cases, functional requirements, non-functional requirements) that have been identified as targets for testing, and what will be tested.
3   TEST STRATEGY          Presents the recommended approach to testing the target-of-test, stating clearly the types of test being implemented, the test objectives, and how the tests are conducted.
4   TEST MILESTONES        States significant points or events within the testing process that mark critical junctures in the project's timeline.
5   DELIVERABLES           Specifies the items, documents, or outcomes that are expected to be produced or completed during the testing process.
1.2. Background Information
1.2.1. Target-Of-Test
The Flower Shop website is the system to be tested. In this test plan, our group
details the targets of our testing as follows:

No  Test Stages                    Description
1   Component-Level Testing        Verifies the functionalities and behaviors of individual code units in isolation.
2   Integration Testing            Verifies the interaction between multiple components or subsystems, ensuring that all system components collaborate efficiently.
3   System Testing                 Evaluates the overall performance and integration of the system, ensuring that all of its components work together as specified.
4   User Acceptance Testing (UAT)  Delivered to the stakeholders and end-users of the system to verify that it meets their needs and expectations and is suitable for their use.


1.2.2. About Project


The system to be tested is the Flower Shop website, an online e-commerce store
that allows customers to purchase flower arrangements and bouquets. This will be
the first version of the Flower Shop website, built within the scope of the subject
PRJ301 - Summer 2023.
The major functions of the website include:
● Product catalog browsing and searching
● Shopping cart management
● Customer checkout
● Order tracking system
● Inventory management system
● Account registration and management
● Administration portal
The website has been developed using a decoupled architecture with a JSP
frontend and Java Servlet backend. The backend interfaces with a SQL Server
database for persistence.
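
For orientation, the sketch below shows what a backend servlet in this decoupled
architecture might look like. It is a minimal illustration only: the class name,
connection string, credentials, table, and JSP path are hypothetical, not taken from
the actual codebase. Tomcat 10 is assumed, hence the jakarta.servlet namespace.

    import java.io.IOException;
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;
    import java.sql.SQLException;
    import java.util.ArrayList;
    import java.util.List;
    import jakarta.servlet.ServletException;
    import jakarta.servlet.http.HttpServlet;
    import jakarta.servlet.http.HttpServletRequest;
    import jakarta.servlet.http.HttpServletResponse;

    // Hypothetical backend servlet: fetches product names from SQL Server
    // and forwards them to a JSP view for rendering.
    public class ProductListServlet extends HttpServlet {
        private static final String DB_URL =
                "jdbc:sqlserver://localhost:1433;databaseName=Florist;encrypt=false";

        @Override
        protected void doGet(HttpServletRequest req, HttpServletResponse resp)
                throws ServletException, IOException {
            List<String> products = new ArrayList<>();
            try (Connection con = DriverManager.getConnection(DB_URL, "florist_app", "change-me");
                 PreparedStatement ps = con.prepareStatement(
                         "SELECT name FROM product WHERE status = 1");
                 ResultSet rs = ps.executeQuery()) {
                while (rs.next()) {
                    products.add(rs.getString("name"));
                }
            } catch (SQLException e) {
                throw new ServletException("Database access failed", e);
            }
            // Decoupled rendering: the servlet hands data to the JSP view.
            req.setAttribute("products", products);
            req.getRequestDispatcher("/productList.jsp").forward(req, resp);
        }
    }
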
Core features of the website include attractive flower bouquet designs, a fast
checkout process, and smooth order tracking. The website aims to capture the
online floral gift market. Testing efforts will focus on ensuring all critical
website functionality is validated.

1.3. Scope Of Testing


1.3.1. Target Of Test’s Functionality
Functional items will be verified, validated, and approved by the Group 1
development team, covering the requirements of the following primary functions:

Code   Function Name
FN-01  Login
FN-02  Sign Up
FN-03  Delete Product
FN-04  Add Product
FN-05  Update Product
FN-06  Search Product
FN-07  View Product Detail
FN-08  Add Category
FN-09  Delete Category
FN-10  CheckOut
FN-11  View List Product
FN-12  Filter Product

1.3.2. Test Stage


No  Test Stages       Description
1   Unit Test         Unit Test will be performed by Group 1. Unit testing in the Florist project aims to verify the functionalities and behaviors of individual code units, such as functions, modules, and classes, in isolation. It ensures that each unit performs as expected, adheres to requirements, and works independently of other components.
2   Integration Test  The Integration Test will be performed by Group 1. This integration test ensures that all system components collaborate efficiently to deliver a seamless experience to end-users while maintaining the integrity and functionality of the flower-web project.
3   System Test       The System Test will be executed by the Group 1 team. Testers will perform complete, end-to-end system testing staged in the pre-production environment to validate that functions and system interfaces perform properly in the production environment.
4   Acceptance Test   The Acceptance Test will be executed by Lecturer Ta Dinh Tien. This acceptance test ensures that the product or system functions correctly and fulfills the requirements and expectations set by the end-users or stakeholders. Group 1's responsibilities during the Acceptance Test phase are:
                      ● Facilitate completion of the application deployment
                      ● Support fixing bugs
                      ● Support the Final User Acceptance Test

1.3.3. Test Assumption


The following assumptions are made for the test process:
● The Group 1 project team verifies test execution, documentation, and results.
● Group 1 validates and approves the final software product, test procedures, and
results.
● Tests will be executed on the specific hardware and software defined in Section
3.5.
● Requirements for the test are limited to the functional and non-functional
requirements specified in Section 2 of this document.

1.4. Constraints
● Unit Test Inspection Prerequisite: Test execution can be performed only after the
system passes Unit Test Inspection.
● Resource Limitation: Constraints on testing resources (hardware, software,
testing tools) may affect the extent or depth of test coverage.
● Scheduling Conflicts: Limitations related to strict project schedules could impact
the duration or frequency of testing cycles.
● Data Limitations: Access to a limited set of test data that does not represent the
entirety of potential real-world scenarios can be a constraint.
● Tool Availability: Constraints in accessing or using specific testing tools might
impact the efficiency and depth of the testing process.
● Time limitation: Testing time is restricted to one-week sprints, with only core
workflows being validated.


1.5. Risk List

1. Resource Constraints
   Description: Inadequate resources (personnel, tools, infrastructure) for testing.
   Mitigation: Allocate resources judiciously, consider outsourcing, or prioritize critical testing phases.
   Contingencies: If resource constraints occur, consider redistributing tasks and schedules.

2. Schedule Delays
   Description: Delays in the project schedule might compress testing time.
   Mitigation: Regular monitoring of the project timeline, early identification of delays, and re-adjustment of testing priorities.
   Contingencies: Adjust the testing phase by reducing the scope or prioritizing critical test cases.

3. Incomplete Test Data
   Description: Lack of diverse and comprehensive test data.
   Mitigation: Develop or acquire diverse test data sets to ensure thorough testing.
   Contingencies: If test data is insufficient, create replacement data.

4. Lack of Training
   Description: Inadequate knowledge or training on new testing tools or methodologies.
   Mitigation: Invest in appropriate training and skill development for our team.
   Contingencies: Group members help each other answer questions, ensuring there is a balance in the team's skill set to handle various situations.

5. Lack of Documentation
   Description: Inadequate or missing documentation of test cases, plans, or results.
   Mitigation: Enforce strict documentation standards and regular reviews to ensure comprehensive and accurate records.
   Contingencies: In case of insufficient documentation, review the existing documentation, identify areas that are unclear or missing, and fill the gaps. Regularly conduct audit checks for documentation compliance.

6. Team Conflicts
   Description: Members frequently engage in arguments and conflicts with one another, often over differing opinions, conflicting interests, or other reasons, leading to a stressful working environment.
   Mitigation: Define clear tasks for each member and agree on ideas before starting work.
   Contingencies: All members discuss to resolve the conflict.

2. REQUIREMENTS FOR TEST


2.1. Test Items
2.1.1. Functional Items
Code   Function Name
FN-01  Login
FN-02  Sign Up
FN-03  Delete Product
FN-04  Add Product
FN-05  Update Product
FN-06  Search Product
FN-07  View Product Detail
FN-08  Add Category
FN-09  Delete Category
FN-10  CheckOut
FN-11  View List Product
FN-12  Filter Product

2.1.2. Acceptance Test Criteria


No  Test Stages       Qualified Ratios
1   Unit Test         To pass this stage, all unit test cases must be tested and passed 100%. All defects should be fixed and re-tested. Average of 4 bugs/KLOC.
2   Integration Test  To pass this stage, all test cases must be tested and passed 100%. All defects should be fixed and re-tested. Average of 4 bugs/KLOC.
3   System Test       To pass this stage, all test cases must be tested and passed 100%. All defects should be fixed and re-tested. Average of 4 bugs/KLOC.
4   Acceptance Test   The Acceptance Test will be conducted and approved by the Lecturer of the subject PRJ301 (Mr. Ta Dinh Tien).

No  Test Item               Qualified Ratios
1   Functional requirement  A sufficient number of the requirements defined in Section 2.1.1 are covered, according to the PRJ301 subject requirements.

3. TEST STRATEGY
3.1. Test Types
3.1.1. Functional Testing
3.1.1.1. Function Testing
Test Objective: To ensure that 100% of user interactions on the Flower Web
application, including navigation, data entry, processing, and
retrieval, are successfully executed without any errors or
glitches.
Technique: - Testers will create test scenarios against the
requirements provided by the customer. Test scenarios
will be created based on the black-box testing technique.
- Testers execute tests based on test scenarios and create
reports. Common defects will be collected for an

improved checklist.
- Test procedure process:
1. Valid Data Testing:
- Execute each function using a range of valid
input data sets.
- Verify that the application performs the
expected operations accurately.
- Check if the system responds appropriately to
valid data inputs without any unexpected
behaviors.
2. Invalid Data Testing:
- Test the same functions with invalid data inputs,
ensuring variations in different types of invalid
data.
- Confirm that the system displays the relevant
error or warning messages for each specific
invalid input.
- Check that the application does not execute
unexpected operations with invalid data.
3. Error and Warning Message Verification:
- Inspect the system's behavior when
encountering errors or warnings.
- Verify that the appropriate error or warning
messages are displayed as defined in the
requirements.
- Confirm that the messages are clear, helpful,
and assist the user in understanding and
rectifying the errors.
4. Outcome Verification:
- Check whether the system produces the
expected outcomes for both valid and invalid
data inputs.
- Ensure that the system performs as specified in
functional requirements.
Completion Criteria:
1. Test Coverage: Ensure that 100% of identified functional and
non-functional requirements are tested thoroughly.
2. Bug Resolution: No critical bugs should be present in the
application. All major and minor issues must be logged,
addressed, and resolved.
3. User Experience: Assess the user interface and experience to
ensure usability, clarity, and ease of navigation.
4. Acceptance Criteria: Obtain formal acceptance from
stakeholders by meeting the predetermined acceptance criteria.
Special Considerations:
Functional testing will NOT be started if developers have not executed
unit tests before passing the application to testers.
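
The valid/invalid data procedure above can be illustrated with a small JUnit 5
sketch. The SignUpValidator class, its rules, and its error messages are
hypothetical stand-ins for the project's real sign-up logic, not the actual code.

    import static org.junit.jupiter.api.Assertions.assertEquals;
    import static org.junit.jupiter.api.Assertions.assertFalse;
    import static org.junit.jupiter.api.Assertions.assertTrue;

    import org.junit.jupiter.api.Test;

    // Hypothetical validator standing in for the real sign-up logic.
    class SignUpValidator {
        private String lastError = "";

        boolean isValid(String username, String password) {
            if (username == null || username.isEmpty()) {
                lastError = "Username is required";
                return false;
            }
            if (password == null || password.length() < 8) {
                lastError = "Password must be at least 8 characters";
                return false;
            }
            lastError = "";
            return true;
        }

        String lastError() {
            return lastError;
        }
    }

    class SignUpFunctionTest {
        @Test
        void validDataProducesExpectedOutcome() {
            // Valid data testing: well-formed input is accepted without errors.
            SignUpValidator validator = new SignUpValidator();
            assertTrue(validator.isValid("namnvh", "Str0ng!Pass"));
        }

        @Test
        void invalidDataTriggersErrorMessage() {
            // Invalid data testing: bad input is rejected with the message
            // defined in the requirements, and no unexpected operation runs.
            SignUpValidator validator = new SignUpValidator();
            assertFalse(validator.isValid("", "Str0ng!Pass"));
            assertEquals("Username is required", validator.lastError());
        }
    }
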
3.1.1.2. User Interface Testing
Test Objective: 1. Navigation Testing:
- Verify that the navigation flows between different
windows and fields are correct and meet the
requirements. The success criterion is 100% accurate
navigation.
- Assess the response time for switching between
windows. The response time should be within an
acceptable range, such as under 1 second.
2. Window Object Evaluation:
- Confirm that window objects, including menus, adhere
to design standards. The success criterion is 100%
compliance with design standards.
- Verify that windows open and close smoothly. The
success criterion is 100% accurate window behavior.
3. Consistency and Compliance:
- Ensure that the layout and styling of elements across
different screens are consistent, meeting branding and
design guidelines. The success criterion is 100%

consistency in appearance.
- Validate that the behavior of elements, such as buttons
and links, is uniform throughout the application. The
success criterion is 100% uniform behavior.
4. Field-to-Field Navigation:
- Test the transition between fields for accuracy and
speed. The success criterion is 100% accurate and fast
field transitions.
- Evaluate the visibility and clarity of field focus
indicators. The success criterion is 100% visibility and
clarity of focus indicators.
5. Accessibility and Usability:
- Test the responsiveness of the interface to different
methods of interaction, such as keyboard commands,
mouse inputs, and touch gestures. The success criterion
is 100% responsiveness to all interaction methods.
Technique: - Create or modify tests for each window to verify
proper navigation and object states for each application
window and its objects.
- Test procedure process:
1. Window Navigation Testing:
- Design test cases to ensure the correct flow of
navigation within the application. This includes
switching between various windows or screens,
testing tab sequences, and assessing the
navigation order.
2. Window Object States Testing:
- Evaluate the behavior of window objects when
interacting with them. This includes dropdown
menus, buttons, links, checkboxes, and other
elements to verify their states
- Validate the proper behavior of each window
object concerning user interactions. Ensure that

objects respond appropriately and as expected
to different actions (clicks, hovers, etc.).
3. State Consistency Testing:
- Confirm the consistency of object states across
different screens.
- Create test cases to verify that the same object
on different screens or locations within a
window maintains its state consistently.
4. Error State Testing:
- Test the system's behavior in scenarios where
there are errors in object states.
- Validate that appropriate error messages or
feedback are provided when users interact with
objects in invalid or unexpected ways.
5. Object State Transitions Testing:
- Test the transitions between different states of
the same object.
Completion Criteria:
Each window is successfully verified to remain consistent within
acceptable standards: 100% of windows adhere to the predefined UI
standards set for the application. This involves confirming that each
window and its elements comply with the standard guidelines.
Special Considerations:
N/A
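
Window navigation checks of this kind are commonly automated. The sketch below
assumes Selenium WebDriver driving the Edge browser listed in Section 3.5; the
plan itself does not prescribe a UI automation tool, and the URLs and element IDs
are illustrative only.

    import org.openqa.selenium.By;
    import org.openqa.selenium.WebDriver;
    import org.openqa.selenium.edge.EdgeDriver;

    public class NavigationSmokeTest {
        public static void main(String[] args) {
            WebDriver driver = new EdgeDriver(); // requires the Edge WebDriver binary
            try {
                // Window navigation testing: product list -> product detail.
                driver.get("http://localhost:8080/florist/products");
                driver.findElement(By.id("product-1")).click();
                if (!driver.getCurrentUrl().contains("productDetail")) {
                    throw new AssertionError("Navigation to product detail failed");
                }
                // Field-to-field navigation: enter data into successive fields.
                driver.get("http://localhost:8080/florist/login");
                driver.findElement(By.id("username")).sendKeys("user");
                driver.findElement(By.id("password")).sendKeys("secret");
                driver.findElement(By.id("loginButton")).click();
            } finally {
                driver.quit();
            }
        }
    }
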
3.1.1.3. Data and Database Integrity Testing
Test Objective: - Ensure database access methods and processes function
properly and without data corruption. For details:
1. Validate all CRUD operations for each database table -
100% coverage of CREATE, READ, UPDATE,
DELETE statements.
2. Verify referential integrity constraints are enforced on
all foreign key relationships - Test all cascading

update/delete scenarios.
3. Ensure indexes are utilized by executing explain plans
on sample queries - Confirm index usage for 95% of
critical queries.
4. Check for data leaks by ensuring query results contain
only the required fields - Run sample queries as
different user roles and validate no unauthorized data
access.
Technique:
1. Database Access Method and Process Validation:
- Utilize all available database access methods and processes (such as
SELECT, INSERT, UPDATE, DELETE).
- Invoke each method, feeding them with a combination of valid and
invalid data or requests. This could include boundary values, null
entries, or incorrect data types.
2. Database Inspection:
- After executing database processes, inspect the database: ensure the
data has been accurately populated as intended.
- Validate that all database events (INSERT, UPDATE, DELETE) have
occurred accurately without errors or discrepancies.
- Review the returned data to confirm that the correct data was
retrieved for the intended reasons and queries.
- Cross-verify that the correct operations took place and were saved
accurately, such as updated values, new entries, or deletions.
Completion Criteria: All database access methods and processes function
as designed and without any data corruption.

Special Considerations:
1. Testing may require a DBMS development environment or drivers to
enter or modify data directly in the databases.
2. Processes should be invoked manually.
3. Small or minimally sized databases (limited number of records)
should be used to increase the visibility of any non-acceptable events.
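
A minimal JDBC sketch of the inspection technique follows. The connection string,
credentials, and the category/product tables with their foreign key are
hypothetical stand-ins for the real schema.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;
    import java.sql.SQLException;

    public class DbIntegrityCheck {
        public static void main(String[] args) throws SQLException {
            String url = "jdbc:sqlserver://localhost:1433;databaseName=Florist;encrypt=false";
            try (Connection con = DriverManager.getConnection(url, "florist_app", "change-me")) {
                // CRUD validation: INSERT a row, then READ it back.
                try (PreparedStatement ins = con.prepareStatement(
                        "INSERT INTO category (id, name) VALUES (?, ?)")) {
                    ins.setInt(1, 999);
                    ins.setString(2, "Test Category");
                    ins.executeUpdate();
                }
                try (PreparedStatement sel = con.prepareStatement(
                        "SELECT name FROM category WHERE id = ?")) {
                    sel.setInt(1, 999);
                    try (ResultSet rs = sel.executeQuery()) {
                        if (!rs.next() || !"Test Category".equals(rs.getString(1))) {
                            throw new AssertionError("INSERT/READ mismatch");
                        }
                    }
                }
                // Referential integrity: a product pointing at a nonexistent
                // category must be rejected by the foreign key constraint.
                try (PreparedStatement bad = con.prepareStatement(
                        "INSERT INTO product (id, name, category_id) VALUES (?, ?, ?)")) {
                    bad.setInt(1, 999);
                    bad.setString(2, "Orphan Product");
                    bad.setInt(3, -1); // no such category exists
                    bad.executeUpdate();
                    throw new AssertionError("FK violation was not enforced");
                } catch (SQLException expected) {
                    // Expected path: the constraint rejected the orphan row.
                }
                // A real run would clean up the inserted test rows here.
            }
        }
    }
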
3.1.1.4. Business Cycle Testing
This test will not be implemented or executed. This test is not appropriate.
Test Objective:
Ensure proper target-of-test and background processes function
according to required business models and schedules. In detail:
1. Specific Scenario Coverage: Conduct end-to-end tests on at least 90%
of the primary business cycles within the application.
2. Cycle Duration and Volume: Test scalability by simulating business
cycles with different transaction volumes, from baseline to peak,
ensuring the system's ability to handle a 200% increase in transactions
over the expected peak load.
3. Error Handling: Assess the error handling capabilities during peak
loads. Validate that error rates stay below 2% and that the system
recovers gracefully from failures.
4. Batch Processing Testing: Verify that batch processing adheres to
the schedule and completes all jobs within the expected time, with 100%
accuracy in processing.
Technique: - Testing will simulate several business cycles by performing
the following:
1. The tests used for target-of-test’s function testing will
be modified or enhanced to increase the number of
times each function is executed to simulate several
different users over a specified period.
2. All time or date-sensitive functions will be executed
using valid and invalid dates or time periods.
3. All functions that occur on a periodic schedule will be
executed or launched at the appropriate time.
4. Testing will include using valid and invalid data to

verify the following:


+ The expected results occur when valid data is
used.
+ The appropriate error or warning messages are
displayed when invalid data is used.
5. Each business rule is properly applied.
Completion Criteria:
1. Execution of Planned Tests: All the predetermined test cases and
scenarios for business cycle testing have been successfully executed.
This includes the verification of basic, alternate, and exceptional
flow scenarios to ensure comprehensive coverage.
2. Defect Resolution: All identified defects, anomalies, and issues
discovered during the testing phase have been addressed, resolved, and
retested for validation.
Special Considerations:
1. System Dates and Events: Business cycle testing might require
specific support activities related to system dates and scheduled
events. This involves ensuring that the system correctly handles events
scheduled in the business cycles or any periodic activities related to
the business model.
2. Business Model Adherence: Business model input is crucial to
identify and define the appropriate test requirements and procedures.

3.1.2. Performance Testing

3.1.2.1. Performance Profiling
This test will not be implemented or executed. This test is not appropriate.
Test Objective: Verify performance behaviors for designated transactions or
business functions under the following conditions:
1. Normal Anticipated Workload:
- Scenario: Simulate and assess the performance of
10,000 concurrent user interactions.
- Metrics to Evaluate:
+ Transaction response time aiming for an

average of 2 seconds or less.


+ Server resource utilization below 70% to
maintain scalability.
2. Anticipated Worst Case Workload:
- Scenario: Stress test the system with a simulated load
of 20,000 concurrent user interactions.
- Metrics to Evaluate:
+ Monitor transaction response time aiming to
maintain a maximum of 5 seconds per
transaction.
+ Server resource utilization should remain below
90% to prevent performance degradation.
Technique:
1. Utilize Developed Test Procedures: Utilize test procedures designed
for Function or Business Cycle Testing to establish a benchmark for
single-user, single-transaction performance.
2. Data File and Script Modification: Modify data files to increase the
number of transactions, or modify scripts to increase the number of
iterations for each transaction, aiming to simulate increased workloads.
3. Single and Multiple Client Runs: Perform initial tests on a single
machine, executing scripts to benchmark single-user performance.
Subsequently, repeat these tests using multiple clients, either virtual
or actual, as per the considerations outlined below.
Completion Criteria:
1. Single Transaction or Single User: Successful completion of the test
scripts without encountering any failures, running within the expected
or required time allocation per transaction.
2. Multiple Transactions or Multiple Users: Successful completion of
the test scripts without any failures, running within an acceptable
time allocation for multiple transactions or multiple users.
Special Considerations:
Comprehensive performance testing includes having a background workload
on the server. There are several methods that can be used to perform
this, including:
- "Drive transactions" directly to the server, usually in the form of
Structured Query Language (SQL) calls.
- Create a "virtual" user load to simulate many clients, usually
several hundred. Remote Terminal Emulation tools are used to accomplish
this load. This technique can also be used to load the network with
"traffic".
- Use multiple physical clients, each running test scripts, to place a
load on the system.
Performance testing should be performed on a dedicated machine or at a
dedicated time. This permits full control and accurate measurement.
The databases used for Performance Testing should be either actual size
or scaled equally.

3.1.2.2. Load Testing


This test will not be implemented or executed. This test is not appropriate.
Test Objective:
Verify performance behavior and response time for designated
transactions or business cases under varying workload conditions. For
details:
1. Scenario 1 - Normal Workload:
+ Users: Simulating 500 concurrent users
+ Transactions: Executing 1000 transactions per minute
+ Response Time: Acceptable response time for critical transactions:
under 2 seconds
2. Scenario 2 - Peak Workload:
+ Users: Simulating 1000 concurrent users
+ Transactions: Executing 2000 transactions per minute
+ Response Time: Acceptable response time for critical transactions:
under 3 seconds
3. Scenario 3 - Stress Workload:
+ Users: Simulating 1500 concurrent users
+ Transactions: Executing 2500 transactions per minute
+ Response Time: Acceptable response time for critical transactions:
under 5 seconds
Technique:
1. Load Testing Data Preparation:
- Use existing test suites developed for Function or Business Cycle
Testing.
- Modify data files or generate additional test datasets to increase
the number of transactions. This could involve:
+ Scaling up the data to simulate real-time or anticipated production
data volume.
+ Increasing the frequency of transactions to simulate various workload
conditions.
2. Transaction Iterations:
- Execute test scripts with increased data loads to stress the system.
- Increase the number of iterations for each transaction to simulate a
higher load.
Completion Criteria:
Multiple Transactions or Multiple Users:
+ All load tests are successfully completed without any critical
failures.
+ The system performs within an acceptable time frame, maintaining
responsiveness and stability despite increased workload or user volume.
Special Considerations:
1. Dedicated Environment: Load testing should be performed on a
dedicated machine or during a specific time frame, ensuring control
over variables and accurate measurement of performance.
2. Database Consistency: Databases utilized during load testing should
reflect the actual size or be scaled appropriately to accurately
simulate real-world conditions.
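
Although this test is out of scope for the current release, the simulation
technique described above can be sketched with plain JDK facilities as follows.
The user count, URL, and the 2-second threshold from Scenario 1 are illustrative
assumptions, not measured requirements.

    import java.net.URI;
    import java.net.http.HttpClient;
    import java.net.http.HttpRequest;
    import java.net.http.HttpResponse;
    import java.time.Duration;
    import java.util.concurrent.CompletionService;
    import java.util.concurrent.ExecutorCompletionService;
    import java.util.concurrent.ExecutorService;
    import java.util.concurrent.Executors;

    public class LoadSketch {
        public static void main(String[] args) throws Exception {
            // Illustrative only: 500 simulated users against a hypothetical URL.
            int users = 500;
            HttpClient client = HttpClient.newHttpClient();
            HttpRequest request = HttpRequest.newBuilder()
                    .uri(URI.create("http://localhost:8080/florist/products"))
                    .timeout(Duration.ofSeconds(5))
                    .build();
            ExecutorService pool = Executors.newFixedThreadPool(users);
            CompletionService<Long> results = new ExecutorCompletionService<>(pool);
            for (int i = 0; i < users; i++) {
                results.submit(() -> {
                    long start = System.nanoTime();
                    client.send(request, HttpResponse.BodyHandlers.discarding());
                    return (System.nanoTime() - start) / 1_000_000; // milliseconds
                });
            }
            long worst = 0;
            for (int i = 0; i < users; i++) {
                worst = Math.max(worst, results.take().get());
            }
            pool.shutdown();
            // Scenario 1 criterion: critical transactions under 2 seconds.
            System.out.println("Worst response time: " + worst + " ms");
        }
    }
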


3.1.2.3. Stress Testing


This test will not be implemented or executed. This test is not appropriate.
Test Objective:
Verify that the target-of-test functions properly and without error
under the following stress conditions:
1. Memory Condition Testing: Evaluate the system under various memory
capacities; for example, assess the system's behavior at 2% RAM
capacity to determine its responsiveness and detect any signs of
failure.
2. Client Capacity Testing: Measure and record the system's response
under different client load scenarios; simulate 1000 concurrent client
connections to evaluate system performance under extreme simultaneous
usage.
3. Multiple Users & Transactions: Examine how the system responds to
multiple users performing simultaneous transactions; execute 10,000
transactions concurrently against the same dataset to assess the
system's behavior under data contention.
4. Worst-case Transaction Volume: Apply the highest volume of
transactions anticipated from prior Performance Testing.
Note: The goal of Stress Testing might also be stated as identifying
and documenting the conditions under which the system FAILS to continue
functioning properly.
Technique:
1. Utilize Developed Tests: Use the test scripts and data developed
during Performance Profiling or Load Testing to create stress testing
scenarios.
2. Resource Limitation Testing: Conduct tests on a single machine to
observe system behavior under constrained resources:
- RAM and DASD Reduction: Simulate low memory availability by
restricting RAM and DASD on the server.
3. Multi-Client Stress Tests: For stress tests involving multiple users
or peak load scenarios:
- Concurrent Test Execution: Utilize multiple clients running similar
or complementary tests to imitate the worst-case transaction volume or
mix.
Completion Criteria:
All planned tests are executed, and specified system limits are reached
or exceeded without the software failing, or the conditions under which
system failure occurs lie outside of the specified conditions.
Special Considerations:
1. Network Stress Tools: Tools may be necessary to deliberately
overload the network with an excessive amount of messages or packets to
stress the system adequately.
2. DASD Limitation: Reduce available Direct Access Storage Device
(DASD) space temporarily to limit the expansion capacity for the
database.
3. Data Synchronization: Manage synchronization among multiple clients
when accessing the same data records or accounts concurrently to mimic
real-world stress scenarios.

3.1.2.4. Volume Testing


This test will not be implemented or executed. This test is not appropriate.
Test Objective:
Verify that the target-of-test successfully functions under the
following high-volume scenarios:
1. Maximum Number of Concurrent Users: Simulate 5000 concurrent users
accessing the system simultaneously.
2. Database Size: If the system is expected to handle large volumes of
data, verify its performance when the database reaches 1 million
records.
3. Extended Duration: Test the system under stress conditions for an
extended duration (24 hours) to ensure consistent performance over
time.
Technique: 1. Use Pre-existing Tests: Employ tests designed for
Performance Profiling or Load Testing as a foundation for
volume testing.
2. Multiple Client Simulation: Simulate multiple clients
executing either the same or varied transactions to produce a
high volume of transactions, reflecting the worst-case scenario
for an extended period. Refer to Stress Testing methods to
generate these conditions.
3. Database Size and Transactions: Scale the database to its
maximum size, or populate it with an extensive set of
representative data. Conduct multiple queries and reporting
transactions simultaneously for an extended period using
multiple clients.
Completion Criteria: All planned tests have been executed, and
specified system limits are reached or exceeded without the software or
system failing.
Special Considerations: The definition of an acceptable time period
under the high-volume conditions outlined above is subject to the
application's specific performance and usage standards.

3.1.3. Security And Access Control Testing


Test Objective: 1. Application-level Security: Verify that users can
access only those functions or data for which they have
appropriate permissions based on their user type.
2. System-level Security: Ensure that only authorized
actors with access rights can enter the system and
applications.
Technique: 1. Application-level Security:
- Identification of User Types and Permissions:
+ Identify and list each distinct user type (e.g.,

admin, user, guest) and the specific functions or


data accessible to each user type.
+ Test Creation and Execution: Create test cases for
each user type's permissions and access levels.
Execute these tests to validate whether users have
appropriate access rights to various
functionalities and data.
+ Permission Adjustment Test: Modify the user
type of the same users and rerun tests to confirm
that the additional functions or data are correctly
accessible or denied based on the revised
permissions.
- System-level Access: <Include Special Considerations
Below>
Completion Criteria: For each known actor type, the appropriate
functions or data are available, and all transactions function as
expected, consistent with the prior Application Function tests.
Special Considerations: Access to the system must be reviewed or
discussed with the appropriate network or systems administrator. This
testing may not be required, as it may be a function of network or
systems administration.
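
As a sketch of the application-level technique, the snippet below probes a
protected URL with different session cookies and compares HTTP status codes. The
URL and cookie values are hypothetical; in a real run, the sessions would come
from logging in as each user type first.

    import java.net.URI;
    import java.net.http.HttpClient;
    import java.net.http.HttpRequest;
    import java.net.http.HttpResponse;

    public class AccessControlCheck {
        // Returns the status code seen when a given session requests a URL.
        static int statusFor(String url, String sessionCookie) throws Exception {
            HttpClient client = HttpClient.newBuilder()
                    .followRedirects(HttpClient.Redirect.NEVER)
                    .build();
            HttpRequest request = HttpRequest.newBuilder()
                    .uri(URI.create(url))
                    .header("Cookie", sessionCookie)
                    .build();
            return client.send(request, HttpResponse.BodyHandlers.discarding())
                         .statusCode();
        }

        public static void main(String[] args) throws Exception {
            // Hypothetical admin-only page and session cookies.
            String adminPage = "http://localhost:8080/florist/admin/addProduct";
            int asAdmin = statusFor(adminPage, "JSESSIONID=admin-session");
            int asGuest = statusFor(adminPage, "JSESSIONID=guest-session");
            System.out.println("admin -> " + asAdmin + " (expect 200)");
            System.out.println("guest -> " + asGuest + " (expect 403 or a redirect to login)");
        }
    }
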

3.1.4. Regression Testing


Test Objective: Regression testing validates modified parts of the
software, to make sure that the modifications do not cause errors in
other parts.
Technique: 1. Reuse of Test Cases:
- Utilize the existing set of test cases from the previous
test suite to conduct testing on a modified module or
software version.
2. Automated Testing with Rational Robot:
- Develop functional test scripts using Rational Robot for
the creation of automated tests.

- Define and establish an automated test execution schedule.
3. Random Selection of Test Cases:
- Randomly select 80% of the test cases from the suite to
ensure broad coverage while reducing test redundancy.
4. Program-Analysis Infrastructure:
- Construct a program-analysis framework to conduct an
evaluation of the software changes. Use the analysis
results to identify the scope and areas for the regression
test suite.

Completion Criteria:
1. All test cases are performed and passed.
2. All selected test cases are performed and passed.
Special Considerations: N/A
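
The random 80% selection in step 3 of the technique above can be sketched as
follows. The test case IDs are illustrative, and a fixed seed is used so that a
selection can be reproduced across runs.

    import java.util.ArrayList;
    import java.util.Collections;
    import java.util.List;
    import java.util.Random;

    public class RegressionSuiteSampler {
        public static void main(String[] args) {
            // Illustrative test case IDs standing in for the real suite.
            List<String> suite = new ArrayList<>(List.of(
                    "TC-LOGIN-01", "TC-SIGNUP-02", "TC-CART-03",
                    "TC-CHECKOUT-04", "TC-SEARCH-05", "TC-FILTER-06",
                    "TC-PRODUCT-07", "TC-CATEGORY-08", "TC-ORDER-09",
                    "TC-ADMIN-10"));
            Collections.shuffle(suite, new Random(42)); // fixed seed: repeatable
            int sampleSize = (int) Math.round(suite.size() * 0.8);
            List<String> selected = suite.subList(0, sampleSize);
            System.out.println("Regression run (" + sampleSize + "/"
                    + suite.size() + "): " + selected);
        }
    }
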

3.2. Test Stage


                                             Stage of Test
Type of Tests                          Unit   Integration   System   Acceptance
Functional Testing                      X          X           X          X
Performance Testing                                X           X
Security and Access Control Testing                X           X
Regression Testing                                 X           X

3.3. Tools
Purpose                              Tool                Vendor/In-house  Version
Data and Database Integrity Testing  SQL Server          Microsoft        2022
Regression Testing                   Rational Robot      IBM              7.0.3.5
Document                             Google Docs, Excel  Google           2023

3.4. Human Resource


This table shows the staffing assumptions for the project.
Worker/Doer Specific Responsibilities/Comments
Nguyen Viet Hoai Nam Manage test resources and assign test tasks
Create Test Plan, Test Cases (IT, ST), Test Scripts (IT, ST)
Review Test Data
Create Test Reports
Khuat Nhu Khoa Review Test Cases (IT, ST)
Create Test Data and Execute Test (IT, ST)
Report Test Results
Pham Huong Ly Review Test Cases (IT, ST)
Create Test Data and Execute Test (IT, ST)
Report Test Results
Do Nhat Minh Review Test Cases (IT, ST)
Create Test Data and Execute Test (IT, ST)
Report Test Results
Tran Thanh Huyen Review Test Cases (IT, ST)
Create Test Data and Execute Test (IT, ST)
Report Test Results

3.5. System
Hardware        OS          Version
Dell Gaming G3  Windows 11  23H2

Software              Version
Apache NetBeans       13
Edge                  118.0.2088.76
Microsoft SQL Server  2022
Apache Tomcat Server  10.0.27

4. TEST MILESTONES
Testing of v1.0 should incorporate test activities for each of the test efforts
identified in the previous sections. Separate project milestones should be
identified to communicate project status and accomplishments.
Milestone Task                                                  Effort  Start Date  End Date
Create Test Plan                                                5       10/26/23    11/01/23
Review & Update Test Plan                                       2       11/01/23    11/02/23
Create IT&ST Test Cases                                         13      11/02/23    11/10/23
Review & Update IT&ST Test Cases                                2       11/11/23    11/13/23
Create & Execute & Report UT for Software Package version 1.0  15      11/13/23    11/25/23
Create test data for Software Package version 1.0               1       11/26/23    11/28/23
UT Gate for Software Package version 1.0                        0.5     11/28/23    11/30/23
Execute IT for Software Package version 1.0                     16      11/30/23    12/16/23
Execute ST for Software Package version 1.0                     4       12/16/23    12/23/23
Create IT&ST Test Report for Software Package version 1.0       1       12/23/23    12/25/23
Create & Execute & Report UT for Software Package version 1.1  22      12/25/23    01/15/24
UT Gate for Software Package version 1.1                        0.5     01/15/24    01/16/24
Create test data for Software Package version 1.1               1       01/16/24    01/18/24
Execute IT for Software Package version 1.1                     17      01/18/24    02/01/24
Execute ST for Software Package version 1.1                     5       02/01/24    02/05/24
Create IT&ST Test Report for Software Package version 1.1       1       02/05/24    02/07/24
Create & Execute & Report UT for Software Package version 1.2  16      02/07/24    02/20/24
UT Gate for Software Package version 1.2                        0.5     02/20/24    02/22/24
Create test data for Software Package version 1.2               1       02/22/24    02/24/24
Execute IT for Software Package version 1.2                     20      02/24/24    03/09/24
Execute ST for Software Package version 1.2                     3       03/09/24    03/15/24
Create IT&ST Test Report for Software Package version 1.2       0.5     03/15/24    03/16/24
Execute System Test for Software Package final version          8       03/16/24    03/21/24
Create Test Report for Software Package final version           1       03/21/24    03/22/24

5. DELIVERABLES
No  Deliverables                                   Delivered Date  Delivered by  Delivered to
1   Test Plan                                      11/02/23        Group 1       FPT
2   Test Reports for Software Package version 1.0  12/25/23        Group 1       FPT
3   Test Reports for Software Package version 1.1  02/07/24        Group 1       FPT
4   Test Reports for Software Package version 1.2  03/16/24        Group 1       FPT