Lab 1
FLORIST
Test Plan
RECORD OF CHANGE
*A - Added, M - Modified, D - Deleted

Effective Date | Changed Items | A*, M, D | Change Description | New Version
31/10 | Introduction | A | Add information for Purpose | 1.0
31/10 | Introduction | A | Add information for Organization | 1.0
31/10 | Introduction | D | Delete guideline | 1.0
31/10 | Introduction | A | Add information for Background Information | 1.0
31/10 | Introduction | A, M | Add, modify Introduction, Requirement For Test | 1.0
31/10 | Introduction | A | Add content for Scope Of Testing | 1.0
31/10 | Introduction | A | Add information for Constraints | 1.0
31/10 | Introduction | A | Add information for Risk List | 1.0
31/10 | Requirement For Test | A | Add table of Test Items | 1.0
01/11 | Requirement For Test | A | Add information for Acceptance Test Criteria | 1.1
01/11 | Test Strategy | A, M | Add, modify Test Strategy: Test Types, Tools, Human Resource, System | 1.1
01/11 | Test Strategy | A | Add information for Test Stage | 1.1
01/11 | Test Strategy | A | Add information for Functional Testing | 1.1
01/11 | Test Strategy | A | Add information for Performance Testing | 1.1
01/11 | Test Strategy | A | Add information for Security and Access Control Testing | 1.1
01/11 | Test Strategy | A | Add information for Regression Testing | 1.1
01/11 | Test Milestones | A, M | Add, modify table of Test Milestones | 1.1
01/11 | Deliverables | A, M | Add, modify table of Deliverables | 1.1
01/11 | Test Plan | A, M | Modify, review | 1.1
SIGNATURE PAGE
TABLE OF CONTENTS
1. INTRODUCTION
1.1. Purpose
1.2. Background Information
1.3. Scope Of Testing
1.4. Constraints
1.5. Risk List
2. REQUIREMENTS FOR TEST
2.1. Test items
3. TEST STRATEGY
3.1. Test Types
3.2. Test Stage
3.3. Tools
4. RESOURCE
4.1. Human Resource
4.2. System
5. TEST MILESTONES
6. DELIVERABLES
Internal Use 5
FOW-Test plan v
1. INTRODUCTION
1.1. Purpose
1.1.1. Purpose
● Describes scope and approach: The test plan outlines what will be tested, how it
will be tested, types of testing, and objectives. This provides direction for QA and
development teams.
● Identifies resources required: The plan lists roles, tools, test environments, and
other resources needed to execute testing activities. This helps teams coordinate
schedules, budgets, equipment, etc.
● Contains estimates: The plan provides estimates on testing effort, cycle time, and
defect rates. These metrics help teams plan capacity.
● Clarifies testing milestones: The plan defines milestones for different testing
phases and aligns them with overall development milestones.
● Allows tracking and reporting: With well-defined scope and estimates, the test
plan facilitates tracking progress and generating test reports.
● Reduces product risk: Following a structured test plan minimizes the risk of
missing critical defects or coverage gaps.
● Defines when testing is complete: Test exit criteria are defined upfront to establish
completeness.
● Record for further use: Important aspects such as test estimation, test scope, and test strategy are documented in the Test Plan, so it can be reviewed by the Management Team and re-used for other projects.
1.1.2. Organization
No | Section | Description
1 | INTRODUCTION | Describes the purpose and organization of the document, including the total number of sections and the main content of each section.
2 | REQUIREMENT FOR TEST | Identifies the items (use cases, functional requirements, non-functional requirements) that have been selected as targets for testing, and what will be tested.
3 | TEST STRATEGY | Presents the recommended approach to testing the target-of-test, in which the types of tests to be performed are clearly stated.
FN-01 | Login
FN-02 | Sign Up
FN-10 | CheckOut
1.4. Constraints
● Unit Test Inspection Prerequisite: Test execution can begin only after the system passes Unit Test Inspection.
● Resource Limitation: Constraints on testing resources (hardware, software,
testing tools) may affect the extent or depth of test coverage.
● Scheduling Conflicts: Limitations related to strict project schedules could impact
the duration or frequency of testing cycles.
● Data Limitations: Access to a limited set of test data that does not represent the
entirety of potential real-world scenarios can be a constraint.
● Tool Availability: Constraints in accessing or using specific testing tools might
impact the efficiency and depth of the testing process.
● Time Limitation: Testing time is restricted to 1-week sprints, with only core workflows being validated.
No | Risk | Description | Mitigation | Contingency
2 | Schedule Delays | Delays in the project schedule might compress testing time. | Regular monitoring of the project timeline, early identification of delays, and re-adjustment of testing priorities. | Adjust the testing phase by reducing the scope or prioritizing critical test cases.
3 | Incomplete Test Data | Lack of diverse and comprehensive test data. | Develop or acquire diverse test data sets to ensure thorough testing. | If test data is insufficient, create replacement data.
documentation
compliance.
6 | Frequent conflicts among members lead to a stressful working environment | In some cases, members frequently engage in arguments and conflicts with one another. These disputes often revolve around differing opinions, conflicting interests, or various other reasons. | Define clear tasks for each member and agree on ideas before starting work. | All members discuss to resolve the conflict.
FN-01 | Login
FN-02 | Sign Up
FN-10 | CheckOut
3. TEST STRATEGY
3.1. Test Types
3.1.1. Functional Testing
3.1.1.1. Function Testing
Test Objective: To ensure that 100% of user interactions on the Flower Web application, including navigation, data entry, processing, and retrieval, are executed successfully without any errors or glitches.
Technique:
- Testers will create test scenarios against the requirements provided by the customer. Test scenarios will be created using black-box test techniques.
- Testers execute tests based on the test scenarios and create reports. Common defects will be collected into an improved checklist.
- Test procedure process:
1. Valid Data Testing:
- Execute each function using a range of valid
input data sets.
- Verify that the application performs the
expected operations accurately.
- Check if the system responds appropriately to
valid data inputs without any unexpected
behaviors.
2. Invalid Data Testing:
- Test the same functions with invalid data inputs,
ensuring variations in different types of invalid
data.
- Confirm that the system displays the relevant
error or warning messages for each specific
invalid input.
- Check that the application does not execute
unexpected operations with invalid data.
3. Error and Warning Message Verification:
- Inspect the system's behavior when
encountering errors or warnings.
- Verify that the appropriate error or warning
messages are displayed as defined in the
requirements.
- Confirm that the messages are clear, helpful,
and assist the user in understanding and
rectifying the errors.
4. Outcome Verification:
- Check whether the system produces the
expected outcomes for both valid and invalid
data inputs.
- Ensure that the system performs as specified in the functional requirements.
Completion Criteria:
1. Test Coverage: Ensure that 100% of identified functional and non-functional requirements are tested thoroughly.
2. Bug Resolution: No critical bugs should be present in the application. All major and minor issues must be logged, addressed, and resolved.
3. User Experience: Assess the user interface and experience to ensure usability, clarity, and ease of navigation.
4. Acceptance Criteria: Obtain formal acceptance from stakeholders by meeting the predetermined acceptance criteria.
Special Considerations: Functional testing will NOT be started if developers have not executed unit tests before passing the application to testers.
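The valid/invalid data procedure above can be sketched as a small parameterized check. This is a minimal illustration, not part of the Flower Web application: `validate_quantity` is a hypothetical stand-in for any one input handler (for example, a checkout quantity field).

```python
# Sketch of steps 1-2 of the function-testing procedure, assuming a
# hypothetical input handler; real Flower Web handlers would replace it.

def validate_quantity(raw):
    """Hypothetical stand-in: return (ok, message) for one input field."""
    try:
        qty = int(raw)
    except (TypeError, ValueError):
        return False, "Quantity must be a whole number."
    if qty < 1:
        return False, "Quantity must be at least 1."
    return True, ""

# 1. Valid data testing: the operation succeeds with no error message.
for raw in ["1", "5", "99"]:
    ok, msg = validate_quantity(raw)
    assert ok and msg == ""

# 2. Invalid data testing: a relevant error message is shown for each
#    variation of invalid input, and no unexpected operation runs.
for raw in ["0", "-3", "abc", None]:
    ok, msg = validate_quantity(raw)
    assert not ok and msg  # message must be present and non-empty
```

Each invalid case asserts both outcomes from the procedure: the operation is rejected and a specific message is produced, which covers step 3 (message verification) for this field as well.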
3.1.1.2. User Interface Testing
Test Objective: 1. Navigation Testing:
- Verify that the navigation flows between different
windows and fields are correct and meet the
requirements. The success criterion is 100% accurate
navigation.
- Assess the response time for switching between
windows. The response time should be within an
acceptable range, such as under 1 second.
2. Window Object Evaluation:
- Confirm that window objects, including menus, adhere
to design standards. The success criterion is 100%
compliance with design standards.
- Verify that windows open and close smoothly. The
success criterion is 100% accurate window behavior.
3. Consistency and Compliance:
- Ensure that the layout and styling of elements across
different screens are consistent, meeting branding and
design guidelines. The success criterion is 100%
consistency in appearance.
- Validate that the behavior of elements, such as buttons
and links, is uniform throughout the application. The
success criterion is 100% uniform behavior.
4. Field-to-Field Navigation:
- Test the transition between fields for accuracy and
speed. The success criterion is 100% accurate and fast
field transitions.
- Evaluate the visibility and clarity of field focus
indicators. The success criterion is 100% visibility and
clarity of focus indicators.
5. Accessibility and Usability:
- Test the responsiveness of the interface to different
methods of interaction, such as keyboard commands,
mouse inputs, and touch gestures. The success criterion
is 100% responsiveness to all interaction methods.
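The 1-second response-time criterion from the navigation objective can be checked with a simple timer harness. This is a sketch only: `load_window` is a hypothetical placeholder for opening one application window, not a real Flower Web call.

```python
import time

# Sketch of the response-time check from the navigation objective.
RESPONSE_BUDGET_SECONDS = 1.0  # acceptable range stated in the objective

def load_window(name):
    """Hypothetical stand-in for switching to an application window."""
    time.sleep(0.01)  # placeholder for the real window-switch work
    return f"{name} opened"

def within_budget(name):
    """Measure one window switch and compare it against the budget."""
    start = time.perf_counter()
    load_window(name)
    elapsed = time.perf_counter() - start
    return elapsed < RESPONSE_BUDGET_SECONDS

assert within_budget("Checkout")
```

In practice the placeholder would be replaced by a UI-automation call, and the measurement repeated across windows so that every transition is held to the same budget.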
Technique: - Create or modify tests for each window to verify
proper navigation and object states for each application
window and objects.
- Test procedure process:
1. Window Navigation Testing:
- Design test cases to ensure the correct flow of
navigation within the application. This includes
switching between various windows or screens,
testing tab sequences, and assessing the
navigation order.
2. Window Object States Testing:
- Evaluate the behavior of window objects when
interacting with them. This includes dropdown
menus, buttons, links, checkboxes, and other
elements, to verify their states.
- Validate the proper behavior of each window object concerning user interactions.
update/delete scenarios.
3. Ensure indexes are utilized by executing explain plans
on sample queries - Confirm index usage for 95% of
critical queries.
4. Check for data leaks by ensuring query results contain
only the required fields - Run sample queries as
different user roles and validate no unauthorized data
access.
Technique:
1. Database Access Method and Process Validation:
- Utilize all available database access methods and processes (such as SELECT, INSERT, UPDATE, DELETE).
- Invoke each method, feeding it a combination of valid and invalid data or requests. This could include boundary values, null entries, or incorrect data types.
2. Database Inspection:
- After executing database processes, inspect the database: ensure the data has been populated accurately, as intended.
- Validate that all database events (INSERT, UPDATE, DELETE) have occurred accurately, without errors or discrepancies.
- Review the returned data to confirm that the correct data was retrieved for the intended reasons and queries.
- Cross-verify that the correct operations took place and were saved accurately, such as updated values, new entries, or deletions.
Completion Criteria: All database access methods and processes function as designed and without any data corruption.
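The inspection steps above can be sketched as an execute-then-inspect loop. This uses an in-memory SQLite database and a hypothetical `orders` table purely for self-containment; the plan's actual target is SQL Server 2022, where the same pattern would run through its own driver.

```python
import sqlite3

# Sketch of the database-inspection technique on an illustrative schema;
# the table and column names here are assumptions, not the real database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, status TEXT)")

# INSERT: populate, then inspect that the row landed as intended.
conn.execute("INSERT INTO orders (id, status) VALUES (1, 'new')")
assert conn.execute("SELECT status FROM orders WHERE id = 1").fetchone() == ("new",)

# UPDATE: change the row, then cross-verify the saved value.
conn.execute("UPDATE orders SET status = 'paid' WHERE id = 1")
assert conn.execute("SELECT status FROM orders WHERE id = 1").fetchone() == ("paid",)

# DELETE: remove the row, then confirm no residue remains.
conn.execute("DELETE FROM orders WHERE id = 1")
assert conn.execute("SELECT COUNT(*) FROM orders").fetchone() == (0,)
```

Each database event is followed immediately by a SELECT that verifies the stored state, which is exactly the "execute processes, then inspect the database" sequence the technique describes.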
Functional Testing X X X X
Performance Testing X X
Regression Testing X X
3.3. Tools
Purpose | Tool | Vendor/In-house | Version
Data and Database Integrity Testing | SQL Server | Microsoft | 2022
4.2. System

Hardware | OS | Version
Dell Gaming G3 | Windows 11 | 23H2

Software | Version
Apache NetBeans | 13
Edge | 118.0.2088.76
Microsoft SQL Server | 2022
Apache Tomcat Server | 10.0.27
5. TEST MILESTONES

Testing of v1.0 should incorporate test activities for each of the test efforts identified in the previous sections. Separate project milestones should be identified to communicate project status and accomplishments.
Milestone Task | Effort | Start Date | End Date
Create Test Plan | 5 | 10/26/23 | 11/01/23
Review & Update Test Plan | 2 | 11/01/23 | 11/02/23
Create IT&ST Test Cases | 13 | 11/02/23 | 11/10/23
Review & Update IT&ST Test Cases | 2 | 11/11/23 | 11/13/23
Create & Execute & Report UT for Software Package version 1.0 | 15 | 11/13/23 | 11/25/23
Create test data for Software Package version 1.0 | 1 | 11/26/23 | 11/28/23
UT Gate for Software Package version 1.0 | 0.5 | 11/28/23 | 11/30/23
Execute IT for Software Package version 1.0 | 16 | 11/30/23 | 12/16/23
Execute ST for Software Package version 1.0 | 4 | 12/16/23 | 12/23/23
Create IT&ST Test Report for Software Package version 1.0 | 1 | 12/23/23 | 12/25/23
Create & Execute & Report UT for Software Package version 1.1 | 22 | 12/25/23 | 01/15/24
UT Gate for Software Package version 1.1 | 0.5 | 01/15/24 | 01/16/24
6. DELIVERABLES

No | Deliverables | Delivered Date | Delivered by | Delivered to
1 | Test Plan | 11/02/23 | Group 1 | FPT
2 | Test Reports for Software Package version 1.0 | 12/25/23 | Group 1 | FPT
3 | Test Reports for Software Package version 1.1 | 02/07/24 | Group 1 | FPT