
ESL MATCH SERVICE

Performance Test Plan


Version No 1.0

1. Revision History
Add the latest changes at the beginning of the table.

Date        Version  Description    Author
04/10/2020  1.0      Draft Version  Srikanth Kota (SK08506)


Contents


1. Revision History
2. Executive Summary
3. ESL MATCH SERVICE – An Introduction
3.1. System overview
4. Objectives
5. Testing Scope
5.1. In-Scope
5.2. Out-of-Scope
6. Performance Test Cases/Business Process List
7. Service Level Agreements (SLAs)
8. Assumptions
9. Risks and Dependencies
10. Environment Details
11. Performance Test Methodology
11.1. Performance Test Strategy
11.2. Test Tools
11.3. Load Test 1
11.4. Stress Test
11.5. Endurance Test
11.6. Validation and Execution
11.7. Pass/Fail Criteria
11.8. Reporting
12. Test Entry/Exit Criteria
13. Suspension/Resumption Criteria
14. Test Schedule
15. Test Deliverables
16. Roles and Responsibilities
17. Approvals


Abbreviations

Term           Definition
NFRs           Non-Functional Requirements
SLA            Service Level Agreement
CI CC TESTING  Colruyt India Competency Center TESTING


2. Executive Summary
The purpose of this performance strategy document is to define the test
approach, methodology, directions, risks and assumptions of the performance
test activities for ESL Match Service.

The performance testing strategy covers the delivery of the project and
support services to Colruyt through to completion of the performance testing
project.

This document covers the following vital attributes required for the
performance testing of ESL Match Service:
 Performance testing scope, objectives, milestones, and critical
success factors
 Project policies, risks, and assumptions
 Acceptance criteria and change management
 Overall strategy for conducting the performance testing activities
 Project schedule and deliverables
 Roles and responsibilities

3. ESL MATCH SERVICE – An Introduction


The service will store and provide all FIC data as supplied by the information
provider, in compliance with the FIC (Food Information for Consumers)
legislation and GS1 standards, and will improve the quality of PIM food data.
Because this information is provided to Colruyt's customers, the data model
needs to be realigned with the GS1 data model so that all data can be stored
and provided on all platforms (v2 services).

As part of this project, the CI CC Performance Test team will provide
performance testing services. This service focuses exclusively on application
performance benchmarking and high availability, and will be delivered through
simulated load tests and server monitoring, using automated
scripting/monitoring tools identified and approved by Colruyt.


3.1. System overview


The component-level diagram provides the overall view of the system and
defines the depth and breadth of the performance testing boundaries. In
addition, this snapshot gives a complete picture of the components on which
the tool agents are to be installed to probe server details during
performance and load testing.
The key components are:
 Interface details
 Other components
** Infrastructure High Level Architecture Design (yet to be updated)

ARCHITECTURE DIAGRAM


4. Objectives
The main objective of performance testing is to identify the bottlenecks of
ESL MATCH SERVICE under simulated load conditions, and to ensure that the
Non-Functional Requirements (NFRs) meet the Service Level Agreements (SLAs)
defined for the live ESL MATCH SERVICE environment.

Main objectives are to:


 Ensure the performance characteristics such as:
 Response Time
 Throughput (Web and App server)
 Error Rate
 Identify the system bottlenecks such as:
 CPU Time
 Memory Usage
 Disk I/O
 Determine system availability
 Determine system capacity
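As an illustration of how the server-side bottleneck metrics listed above (CPU
time, memory usage, disk I/O) can be sampled during a test window, the
following is a minimal Python sketch using the psutil library. It is not part
of the agreed tooling (Dynatrace is the approved monitoring tool, see Section
11.2); the sampling interval and duration are illustrative values only.

```python
# Minimal sketch: sample CPU, memory and disk I/O during a test window.
# Assumes the psutil package is installed; interval and duration are
# illustrative values, not agreed test parameters.
import time
import psutil

def sample_server_metrics(duration_sec=60, interval_sec=5):
    """Print CPU %, available memory (GB) and disk I/O deltas per interval."""
    prev_io = psutil.disk_io_counters()
    end = time.time() + duration_sec
    while time.time() < end:
        cpu = psutil.cpu_percent(interval=interval_sec)  # blocks for interval
        mem_gb = psutil.virtual_memory().available / 2**30
        io = psutil.disk_io_counters()
        read_mb = (io.read_bytes - prev_io.read_bytes) / 2**20
        write_mb = (io.write_bytes - prev_io.write_bytes) / 2**20
        prev_io = io
        print(f"cpu={cpu:5.1f}%  free_mem={mem_gb:5.2f} GB  "
              f"disk_read={read_mb:6.2f} MB  disk_write={write_mb:6.2f} MB")

if __name__ == "__main__":
    sample_server_metrics()
```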


5. Testing Scope
This section outlines the areas of the application as well as different aspects of
performance testing that are in-scope and out-of-scope for this project.

5.1. In-Scope

Scope Area               Included  Description
                         (Yes/No)
Application              Yes       ESL MATCH SERVICE will be performance tested.
Technical Architecture   Yes       The technical architecture, including the web server,
                                   application server and database, falls under performance
                                   testing while the signed-off business process scenarios
                                   are load tested and monitored.
API Interfaces           Yes       Services will be performance tested.
Business Process         Yes       4 signed-off business processes identified for
                                   performance testing; refer to Section 6 for the
                                   complete list.
Automated Tools          Yes       Performance tests will be conducted through the Web
                                   Performance Load Tester tool and monitored by the
                                   APM tool.
Execution of Test Cycle  Yes       Two test iterations are in scope for each type of test.
Analysis                 Yes       Analysis will be done on the load test; test results
                                   will be monitored and recommendations provided for
                                   fine-tuning any performance issues found.


5.2. Out-of-Scope
The following are out of scope for performance testing:

 Business processes other than those identified and listed in Section 6
will not be considered for any type of performance testing.

 Performance tuning for external systems will not be conducted.

 Batch services – these typically run once a day and no performance
impact has been observed.

6. Performance Test Cases/Business Process List


The following are the key business scenarios and their weightage for the
performance test. The business scenarios and weight mix were agreed by the
stakeholders during the preparation phase of performance testing. Note that
each business scenario is a separate test case within a single test script;
the scenarios are not clubbed together into one test case.
Sl No  Business Scenario                                          Percentage Allocation (Weightage)
1      Link-operation – linking an ESL (Electronic Shelf Label)   100
       to a Product

** Note: Scenarios and distribution have to be agreed with Bruno Caby (154R), SA.
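For illustration, the sketch below shows how a weight mix can drive
per-iteration scenario selection in a load script. With the current single
scenario at 100% the choice is trivial, but the same mechanism extends to a
multi-scenario mix; any scenario name other than the link-operation is
hypothetical.

```python
# Minimal sketch: choose a business scenario per iteration by weightage.
# Scenario names other than the agreed link-operation are hypothetical.
import random

WEIGHTS = {
    "link_operation": 100,    # agreed scenario (Section 6)
    # "unlink_operation": 0,  # hypothetical future scenario
}

def pick_scenario():
    scenarios = list(WEIGHTS)
    return random.choices(scenarios, weights=[WEIGHTS[s] for s in scenarios])[0]

# Each virtual-user iteration would call pick_scenario() and run the
# matching test case, keeping the cases separate within one script.
print(pick_scenario())
```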


7. Service Level Agreements (SLAs)


This section illustrates the performance requirements and the agreed
expectations.

The following key metrics will be measured during performance testing and
compared against the SLAs. Any SLA breach will be logged as a defect.

All stakeholders agreed on the metrics and NFRs during the design phase,
and these drive the performance testing.

Generally, specific metrics are provided that define success on the
infrastructure side. In cases where no metrics are provided, best practices
and experience will be used to gauge success and make recommendations.
Metrics                   SLA
CPU                       Average CPU utilization <= 60% for all servers
Memory                    Available memory >= 2 GB
Percentage of error rate  0%
Response time             <= 5 sec in 99% of the cases
                          <= 3 sec in 75% of the cases
                          <= 1 sec in 50% of the cases
Expected load             Expected average load = 80 links/minute
                          Expected peak load = 560 links/minute

SLAs are as per the Colruyt standards – PCLC (Project Content Life Cycle).

Service level engagement for ESL Match Service (Section 6.1):
https://extranet-sp.colruytgroup.com/projects/2015/doc466.976/Service
Specification/FIC Data Sheet Retrieval Service_v1.docx
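The percentile-style SLAs above lend themselves to an automated post-run
check. The following Python sketch is one assumed way to validate collected
results against the thresholds; the shape of the input (a flat list of
per-request response times in seconds) is an assumption, as the real results
would come from the jMeter output.

```python
# Minimal sketch: verify the response-time percentile SLAs and error rate
# from Section 7. The input data shape is an assumption; real results
# would be exported from the test tool (jMeter).

def pct_within(times_sec, limit_sec):
    """Percentage of requests completing within limit_sec."""
    return 100.0 * sum(t <= limit_sec for t in times_sec) / len(times_sec)

def check_slas(times_sec, error_count):
    checks = {
        "<= 5 sec in 99% of cases": pct_within(times_sec, 5.0) >= 99.0,
        "<= 3 sec in 75% of cases": pct_within(times_sec, 3.0) >= 75.0,
        "<= 1 sec in 50% of cases": pct_within(times_sec, 1.0) >= 50.0,
        "0% error rate":            error_count == 0,
    }
    for name, ok in checks.items():
        print(f"{name}: {'PASS' if ok else 'FAIL'}")
    return all(checks.values())

# Example with illustrative timings (seconds) and no errors:
check_slas([0.4, 0.8, 0.9, 2.1, 2.8, 4.6], error_count=0)
```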


8. Assumptions

This section describes the assumptions made throughout the performance
testing activities.

Assumption Area            Description
Environmental setup         On-time availability of the environment
                            All unplanned outages, system downtime or configuration
                             changes (app server, web server, tool, network, database,
                             etc.) should be communicated upfront to the performance
                             test team to avoid test execution schedule slippage
Tools setup                 On-time availability of tools – the performance test tool
                             and monitoring tools
                            Tools should be configured (agent installations, system
                             profile creation, server authentication) and accessible
                             to testers
Support groups              Tool vendor support available on a need basis
                            Support from the DBA, application manager, server admin
                             and Dev group for discussions of related performance issues
Test data setup             On-time availability of quality data
                            The current volume and data in the SYST environment are
                             assumed to be equivalent to production volumes
Troubleshooting within      All performance issues should be turned around quickly by
the testing window           the respective teams


9. Risks and Dependencies


The following risks and dependencies have been identified for the performance
testing of ESL MATCH SERVICE.

Risks

Risk                       Impact                           Mitigation
Shared environment for     Short test window and            Agree with the project team
performance testing        insufficient/incorrect test      on a specific time window
                           data for performance testing
Non-signed-off documents   Impacts quality and the          Inform the PTM or FA about the
                           testing schedule                 relevant documents and ensure
                                                            proper sign-off
Any increase in test       Impacts the project timeline     Ensure proper test runs to
cycles/test iterations                                      reduce test iterations

Dependencies

Dependency                 Impact
System testing sign-off    Impacts project timelines
Quality test data          Impacts test execution and results
Stakeholder availability   Impacts the overall project schedule and decision making

10. Environment Details


The ESL Match Service performance test will be carried out in the SYST
environment. This section lists the environment URLs and access details for
testing.

App Server -
Web Server -
DB Name -


11. Performance Test Methodology

11.1. Performance Test Strategy


This section illustrates the overall performance test strategy for the ESL
MATCH SERVICE performance test.


11.2. Test Tools


ESL MATCH SERVICE performance testing will be carried out using the automated
tools identified by Colruyt, as listed in the table below.

Performance Tool  Type             Version  Product        Purpose
jMeter            Test Tool        3.3      Apache         Used for generating load
Dynatrace         Monitoring Tool  6.7      Compuware APM  Used for monitoring network
                                                           response times

11.3. Load Test 1

Load Configuration

Total User Load      20
Total Test Duration  60 minutes
Think Time           4 secs
Ramp Up              4 Vusers/minute
Hold Duration        30 sec
Ramp Duration        120 sec
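For illustration only, the sketch below approximates this load configuration
with plain Python threads: virtual users start at the configured ramp-up rate
and iterate with the configured think time. The real test runs through jMeter
thread-group settings; the target URL here is a placeholder, not the actual
SYST endpoint.

```python
# Minimal sketch: approximate the load profile above with threads.
# TARGET_URL is a placeholder; the real test runs through jMeter.
import threading
import time
import urllib.request

TARGET_URL = "http://syst.example/esl-match"  # placeholder endpoint
TOTAL_USERS = 20
RAMP_RATE = 4             # vusers started per minute
THINK_TIME_SEC = 4
TEST_DURATION_SEC = 60 * 60

stop_at = time.time() + TEST_DURATION_SEC

def virtual_user():
    while time.time() < stop_at:
        try:
            urllib.request.urlopen(TARGET_URL, timeout=10).read()
        except OSError:
            pass  # a real harness would record this as a failed transaction
        time.sleep(THINK_TIME_SEC)  # think time between iterations

threads = []
for _ in range(TOTAL_USERS):
    t = threading.Thread(target=virtual_user, daemon=True)
    t.start()
    threads.append(t)
    time.sleep(60 / RAMP_RATE)  # 4 vusers per minute => one every 15 s

for t in threads:
    t.join()
```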

11.4. Stress Test


11.5. Validation and Execution


The key steps within this phase are:

 Developing test scripts

 Validating test scripts with the data provided

 Executing test scripts

 Collating and validating results

[Process flow: Develop Test Scripts → Validate and Execute Test Scripts → Validate Test Results]

11.6. Pass/Fail Criteria


This section defines the pass and fail criteria for the test cases identified
for performance testing. These criteria form the basis for passing and
failing test cases during test execution.

 A test case will be marked as passed if it meets the response time SLA
without exceeding the specified server utilization thresholds.

 If a test case does not meet the response time SLA or exceeds the
server utilization thresholds, it will be marked as failed and logged
as a defect.
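A minimal sketch of how these criteria could be applied mechanically to a
test case result is shown below; the threshold values repeat the SLAs from
Section 7, and the shape of the result record is an assumption.

```python
# Minimal sketch: pass/fail verdict for a test case against the response
# time SLA and server utilization thresholds (values from Section 7).

def verdict(p99_response_sec, avg_cpu_pct, min_free_mem_gb):
    sla_met = p99_response_sec <= 5.0          # 99th percentile SLA
    util_ok = avg_cpu_pct <= 60.0 and min_free_mem_gb >= 2.0
    return "PASS" if sla_met and util_ok else "FAIL - log as defect"

print(verdict(p99_response_sec=4.2, avg_cpu_pct=55.0, min_free_mem_gb=3.1))
# -> PASS
```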


11.7. Reporting
Analyse the captured data and compare the results against the accepted levels
for each metric to determine whether the performance of the application under
test trends toward or away from the performance objectives.

Analyse the measured metrics to diagnose potential bottlenecks. Based on the
analysis, capture additional metrics in subsequent test cycles if required.

The key steps within this phase are as follows:

Results     Description
Reporting    Consolidated test summary report published for all cycles and iterations
             Recommendations suggested and benefits realized
             Potential risks & recommendations to mitigate them
             Performance defect report
             General recommendations for maintenance phases
             Performance improvement report after defect fixes
             Executive summary presentation
             Communication to the respective teams & stakeholders


12. Test Entry/Exit Criteria


This section calls out the entry and exit criteria for performance testing.

Entry Criteria

 Strategy: The performance strategy/plan should be signed off by the
respective stakeholders.

 Environment: The test environment should be available and stable in
terms of the required/defined hardware and software.

 AUT: Application stability should be ensured by signed-off
Unit/System/Functional/Integration testing reports.

 BPs: The agreed-upon business process scenarios should be in place.

 Workloads and NFRs: Non-functional requirements and workloads should
be signed off before performance testing starts.

 Tools: Test tools should be evaluated and available on time, with the
required vendor support for test execution and monitoring.

 Network: Network connectivity should be established to access all
required remote systems (RDC in Colruyt).

 Test Data: The database used for performance testing must be populated
with production-level data.

 Associates: Resources should be knowledgeable and capable of analyzing
the performance test measurements for recommendations.

Exit Criteria

 Coverage: 100% of the performance tests should be completed as per plan.

 Reports: Any performance breaches will be logged as performance defects
and tracked in the ALM system.

 Out of Window: Testing strictly follows the project plan; any outstanding
performance defects beyond the performance testing window are left to the
Program Manager's discretion for a conditional sign-off or no-go decision.

 Summary Report: A complete performance summary report will be published
before exiting the test.


13. Suspension/Resumption Criteria


This section describes the conditions for suspending and resuming performance
testing activities.

Suspension Criteria

Test case execution in a test cycle shall be suspended under the following
circumstances:

 Any of the listed assumptions are not met.

 The transaction failure rate exceeds 5%; performance test execution
will be suspended, as this indicates a problem with the scripts or with
the performance test environment.

 Environment downtime/unavailability.

Resumption Criteria

A suspended test cycle execution shall resume only if the following conditions
are met:

 The suspended performance test scripts have been thoroughly examined to
filter out defects in the scripts. If there are no defects in the
scripts, the performance test environment will be scrutinized to correct
the problems.

 The transaction failure rate is below 5%, at which point performance
test execution will be resumed.
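The 5% failure-rate threshold can be monitored mechanically during execution.
A minimal sketch follows; in practice the counters would come from the test
tool's live statistics, which is an assumption here.

```python
# Minimal sketch: flag suspension when the transaction failure rate
# exceeds the 5% threshold defined in this section.

def should_suspend(failed_transactions, total_transactions):
    if total_transactions == 0:
        return False
    return failed_transactions / total_transactions > 0.05

print(should_suspend(12, 200))  # 6% failure rate -> True (suspend)
print(should_suspend(8, 200))   # 4% failure rate -> False (continue)
```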

14. Test Schedule


The test schedule is part of the project plan, which in turn is part of the
program milestones. The performance team has prepared a detailed performance
test execution schedule in Excel format.

The planned schedule is provided below. However, this schedule is subject to
change based on system availability and any change in the input criteria.


ESL Match Service performance testing plan

Area         Activity                                  Details         Owner       Timelines
Preparation  Requirements gathering                                                17/01/2021
             Tools availability                        jMeter, APM
             Performance test plan document                            PT          04/02/2021
             preparation
             Identify performance scenarios                            PT          12/02/2021
             Finalize weightage and scenario mix                       PT          12/02/2021
             Approve scenarios and mix                                 FA          13/02/2021
             Environment setup                                         PT          18/02/2021
             Load SYST – set up data volumes                           FA          18/02/2021
             Script preparation                                        PT
             Baseline testing                                          PT
Execution    Test cycle 1 – load test                  Execute         PT
                                                       Monitor         PT + Infra
                                                       Record results  PT
Report       Finalize and publish report                               PT
             Review and sign off performance testing                   Colruyt

15. Test Deliverables


This section lists the key deliverables to be produced during this performance
testing project, together with the responsible team.

These deliverables are bound to the expected timeline and captured in the
project plan.

S.No  Deliverable                                       Responsible
1     Performance test plan documents                   Performance Testing Team
2     Test scripts for the identified business          Performance Testing Team
      process scenarios
3     Test completion report                            Performance Testing Team

16. Roles and Responsibilities


The following table lists the key roles and responsibilities for this
engagement.

RACI Chart

Sl. No  Testing Tasks                        Dev  Perf Team  FA    AHS/Infra Team  Project Team
1       NFRs                                      A          R                     I
2       Performance Test Plan                I    R          I     I               I
3       Performance Test Scenario            I    A          R                     I
        identification
4       Application scenarios walkthrough    I    A          R, A                  I
5       Setting up testing user access                       A     R
6       Test environment readiness check     I    A                R               I
7       Test data preparation                     A, R       C
8       APM integration                           A                R
9       Test script preparation                   R, A       I                     I
10      Load/stress test execution                R, A       I                     I
11      Performance tuning                   R    A          I     R               I
12      Results analysis                     R    R          I     R               I
13      Status reporting                     I    A, R       I     I               I
14      Performance closure report           I    A, R       I     I               I

17. Approvals
To meet the SDLC Roles & Responsibilities minimum requirements, this document
must be approved by the respective stakeholders. Additional approvals not
listed in the following table may be collected at the discretion of the
project team.

A signature means that the Customer approves the scope and approach of the
technical performance testing.

Note: For this document, approval does not necessarily indicate that it is ready
to archive. The contents may be updated as often as is appropriate to the
circumstances of the project. When updating an approved version of this
document, clear the previous approval information before gathering new
approvals.

Group/Team/Role Approver Name Approved (Y/N) Approval Date


Project Manager

Infrastructure Manager

Service Manager
