
LEGRAND NORTH AMERICA

Software Center of Excellence

DLM DASHBOARD
Performance Test Plan
Version 1.3
10/20/2021
Performance Test Plan Sign-off
Table 1: Sign-off Detail
Name Role / Designation Signoff Date Signature
SHANA LONGO Product Manager
JASON BARTAN Project Manager
STEPHEN MACKAY Lead Developer
SATISH JAGADISAN QA Manager
JAKE SPARKS QA Lead

Record of Changes
This section records how the development and distribution of the performance test plan were carried out and tracked. The table below provides the version number, the date of the version, the author/owner of the version, and a brief description of the reason for creating the revised version.

Table 2: Record of Changes


Version Number   Date         Author/Owner     Description of Change
1.0 09/17/2021 Srishti Badola Draft version with available details
1.1 09/24/2021 Srishti Badola Updated after review
1.2 10/06/2021 Srishti Badola Updated for sprint 22
1.3 10/20/2021 Srishti Badola Updated for sprint 23



Table of Contents

Performance Test Plan Sign-off
Record of Changes
Table of Contents
1. Executive Summary
    1.1 Overview: Project Background and Scope
    1.2 Scope
2. Application Architecture
    2.1 Architecture Diagram
    2.2 Detailed information on each component
3. Performance Test Requirements
    3.1 Requirements
        3.1.1 Business NFR
    3.2 Detailed and Agreed NFR
    3.3 NFR and NFT Matrix
4. Performance Test Planning
    4.1 Performance Test Approach
        4.1.1 Performance Testing and Monitoring Tool Details
        4.1.2 Performance Test Script Steps
5. Performance Test Execution
    5.1 Performance Test Summary
    5.2 Test Scripting and Execution Plan
    5.3 Performance Test Details
        5.3.1 Stress Test
        5.3.2 Load Test
        5.3.3 Endurance Test
    5.4 Performance Test Monitoring Metrics
    5.5 Performance Test Environment
    5.6 Assumptions, Constraints, Risks and Dependencies
        5.6.1 Assumptions
        5.6.2 Constraints
        5.6.3 Risks
        5.6.4 Dependencies
6. Test Organization
Appendix A: Acronyms
Appendix B: Referenced Documents
1. Executive Summary

1.1 Overview: Project Background and Scope


The DLM Dashboard application makes the process of building management flexible, simple, and powerful. It provides our users with solutions that meet their needs and the growing demands of the building intelligence market by involving them continuously throughout the development process. It empowers our customers to view, create, and edit information relevant to the management of their lighting, occupancy, and energy systems, and makes this functionality available anytime. All information is subject to Legrand's defined security policy: users can view only the information for which they are authorized.

1.2 Scope
The scope of our performance testing is outlined below:

• End-to-end testing based on the SLA requirements outlined by Product Management.
  Note: a. The SLA requirements should be met.
        b. If an SLA is not met exactly, the Product Owner (Shana) can decide whether the result is close enough to proceed.
• Load test: the current user load for DLM Dashboard is 2 users, and 5 users are expected by the end of the year.
• Endurance test: to verify how long the system can survive under the anticipated user load.
• Stress test: to understand the threshold/breakpoint of the system.
• Monitoring and result analysis in JMeter; Application Insights monitoring is already in place.

2. Application Architecture

2.1 Architecture Diagram

2.2 Detailed information on each component
DLM Dashboard UI – developers use the Realm database, exposed through GraphQL endpoints, to populate the data in the UI.
GraphQL – used to fetch the required details from MongoDB by providing a query and variables; GraphQL queries work through Realm functions and custom resolvers.
Realm Functions – allow server-side logic to be defined and executed; functions are written based on the input and payload types defined in the custom resolvers.
Hot Chocolate – configured for the cluster and Realm app, which helps run all queries.
MongoDB – all Segman data will be available in the Segman collection, and all SiteTopology data will be available in the SiteTopology collection.
MongoSink Connector – used to sync data in real time from Pulsar to MongoDB for both Segman and SiteTopology.
Cloud analytics platform – acts as a live-stream data analytics platform and is used to route all messages from the various devices/sensors to Pulsar through the EMQ MQTT ingress.
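
For illustration only, the sketch below shows the general shape of a GraphQL fetch with a query and variables, as described above. The endpoint URL, field names, and bearer-token handling are hypothetical placeholders, not the actual DLM Dashboard schema.

    import requests  # assumes the third-party 'requests' package is installed

    # Hypothetical endpoint and query; the real DLM Dashboard schema,
    # field names, and authentication mechanism may differ.
    GRAPHQL_URL = "https://example.invalid/api/client/v2.0/graphql"
    QUERY = """
    query GetSiteTopology($siteId: String!) {
      sitetopology(query: { siteId: $siteId }) {
        siteId
        name
      }
    }
    """

    def fetch_site_topology(site_id: str, token: str) -> dict:
        """POST a GraphQL query with variables and return the JSON payload."""
        response = requests.post(
            GRAPHQL_URL,
            json={"query": QUERY, "variables": {"siteId": site_id}},
            headers={"Authorization": f"Bearer {token}"},
            timeout=30,
        )
        response.raise_for_status()
        return response.json()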

3. Performance Test Requirements

3.1 Requirements
These requirements support the goal of 99.9% uptime and execution of the SLA agreement we have made with our customers. Attached is the Performance Score Metrics sheet, or minutes of meeting (MOM), in which performance testing of specific or all components was agreed.
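
As a reference point (a calculation added here for clarity, not taken from the SLA document), 99.9% uptime allows roughly

\[
(1 - 0.999) \times 365.25 \times 24\ \text{h} \approx 8.77\ \text{h of downtime per year} \ (\approx 43.8\ \text{min per month}).
\]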

3.1.1 Business NFR


This section contains all the non-functional requirements which come from the project team.

Table 3: Business NFR

Each business transaction below shares the same user load, SLA, and throughput targets:

User Load: 2-5 users
SLA/response time: <5-10 sec
Transactions per hour: 1440 (2 users, 5 sec) – 720 (2 users, 10 sec); 3600 (5 users, 5 sec) – 1800 (5 users, 10 sec)

Business Transactions:
• Clean/fresh app load
• App Login
• Choosing a project and Landing on Homepage
• Navigating through all home page tabs and sub-tabs
• Switch between Projects
• Switch between Apps
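
For clarity, the transactions-per-hour figures in the table follow from the user count and the per-user request pacing:

\[
\text{transactions per hour} = N_{\text{users}} \times \frac{3600}{\text{pacing (sec)}},
\qquad \text{e.g.}\ 2 \times \frac{3600}{5} = 1440, \quad 5 \times \frac{3600}{10} = 1800.
\]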

3.2 Detailed and Agreed NFR (Non-Functional Requirements)

After analyzing the business NFRs, they are refined and converted into quantitative NFRs. The NFR document sheet is attached in this section: SLA - DLM-Dashboard

3.3 NFR and NFT Matrix
This section contains the non-functional test cases (scripts) and the applicable non-functional requirements.

Table 4: NFR-NFT Matrix

NFR: Clean/fresh app load (SLA: <5-10 sec)
NFT: Test objective: verify that the page is loaded within 5-10 sec from the time it was hit with, e.g., 2, 5, 100, 1000, 10000 users per sec.

NFR: App Login (SLA: <5-10 sec)
NFT: Test objective: verify that the login page is loaded within 5-10 sec from the time it was hit with, e.g., 2, 5, 100, 10000 users per sec.

NFR: Choosing a project and Landing on Homepage (SLA: <5-10 sec)
NFT: Test objective: verify that, after choosing a project, the Homepage is loaded within 5-10 sec from the time it was hit with, e.g., 2, 5, 100, 10000 users per sec.

NFR: Navigating through all home page tabs and sub-tabs (SLA: <5-10 sec)
NFT: Test objective: verify that navigating to each page takes approx. 5-10 sec from the time it was hit with, e.g., 2, 5, 100, 10000 users per sec.

NFR: Switch between Projects (SLA: <5-10 sec)
NFT: Test objective: verify that a project switch happens within 5-10 sec from the time it was hit with, e.g., 2, 5, 100, 10000 users per sec.

NFR: Switch between Apps (SLA: <5-10 sec)
NFT: Test objective: verify that app switching happens within 5-10 sec from the time it was hit with, e.g., 2, 5, 100, 10000 users per sec.

4. Performance Test Planning

4.1 Performance Test Approach


The production load model requirement as stated by the business is approximately 2 to 5 users maximum. The types of testing that will be performed are Load, Stress, and Endurance testing, to identify the application's stability under the anticipated load and over longer durations, and to test how the system behaves under extreme load conditions. We may update the number of cycle runs per our business needs. The key performance metric to monitor is the response time of the application.

4.1.1 Performance Testing and Monitoring Tool Details


Table 6: Description of Performance Testing Tools

Tool Name: Apache JMeter, Version 5.4.1
Description: Required protocol: Web HTTP/HTML. Plug-ins: Parallel Controller & Sampler, plus the WebSocket plugin for parallel execution.
Licensed / Open-Source: Open Source

Tool Name: Azure Application Insights
Description: For performance monitoring
Licensed / Open-Source: Licensed
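
JMeter drives the actual test runs; purely as an illustrative cross-check of a single transaction's response time against the 5-10 sec SLA (not part of the JMeter test plan), a minimal Python sketch could look like the one below. It assumes the third-party 'requests' package, network access to the preprod URL from Section 5.5, and that an unauthenticated GET of the app shell is meaningful.

    import time
    from concurrent.futures import ThreadPoolExecutor

    import requests

    URL = "https://dashboard.dlm.preprod.swcoe.legrand.us/"  # preprod URL, see Section 5.5
    USERS = 5  # anticipated peak concurrent user load

    def timed_get(_):
        """Fetch the app URL once and return the elapsed wall-clock seconds."""
        start = time.perf_counter()
        requests.get(URL, timeout=30)
        return time.perf_counter() - start

    # Fire USERS requests concurrently, one per simulated user.
    with ThreadPoolExecutor(max_workers=USERS) as pool:
        timings = list(pool.map(timed_get, range(USERS)))

    print(f"max response time: {max(timings):.2f}s (SLA: <5-10 sec)")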

4.1.2 Performance Test Script Steps


In this section, the performance test scripts that need to be developed are detailed step by user-action step, as shown in the tables below. For each key business process within the application under test that was agreed with the project, a performance test script needs to be developed.

The transaction flow and script details are given in the tables below. The performance test scripts simulate all of the actions in the business processes/transactions documented in the load model.

Table 7: Performance Test (Script 1 Steps)
Application Name: DLM Dashboard- Insite
Business Process Name: Clean/fresh app load
NFT Script Name: 01_DLMDashboard_ColdStart
https://lncasoftware.ontestpad.com/script/25498#//
Step # Action
1 Open Browser
2 Browse App Url
3 Splash Screen Appearing
4 Login Page load

Table 8: Performance Test (Script 2 Steps)


Application Name: DLM Dashboard- Insite
Business Process Name: App Login
NFT Script Name: 02_DLMDashboard_AppLogin
https://lncasoftware.ontestpad.com/script/25499#//
Step # Action
1 Open Browser
2 Browse App Url
3 Splash Screen Appearing
4 Login Page load
5 Login with valid credentials
6 Project List page loads

Table 9: Performance Test (Script 3 Steps)


Application Name: DLM Dashboard- RTX
Business Process Name: Choosing a project and Landing on Homepage
NFT Script Name: 03_DLMDashboard_ChooseProject
https://lncasoftware.ontestpad.com/script/22679#//
Step # Action
1 Open Browser
2 Browse App Url
3 Splash Screen Appearing
4 Login Page load
5 Login with valid credentials
6 Project List page loads
7 Choose project from the Project list
8 Landing on Home Page with selected project
9 Logout

Table 10: Performance Test (Script 4 Steps)
Application Name: DLM Dashboard- RTX
Business Process Name: Navigating through all home page tabs and sub-tabs (all
interactable elements)
NFT Script Name: 04_DLMDashboard_NavigateAllTabs
https://lncasoftware.ontestpad.com/project/54/folder/f365/
Step # Action
1 Open Browser
2 Browse App Url
3 Splash Screen Appearing
4 Login Page load
5 Login with valid credentials
6 Project List page loads
7 Choose project from the Project list
8 Landing on Home Page with selected project
9 Select Spaces tab
10 On the Overview tab
   a. Click the all-area dropdown and select any room
   b. Room Overview page loads
11 Navigate to the Occupancy tab
   a. Click on graph view
      i. Click on the room search dropdown and search for any room
   b. Click on list view
      i. Click on any column header to sort the column
      ii. On the calendar, select the previous-date button
      iii. On the calendar, select the forward-date button
   c. Click on floor view
12 Navigate to the Energy tab
   a. Click on graph view
      i. Click on the room search dropdown and search for any room
   b. Click on list view
      i. Click on any column header to sort the column
      ii. On the calendar, select the previous-date button
      iii. On the calendar, select the forward-date button
   c. Click on floor view
13 Navigate to the Report tab
   a. Click Message Log
   b. Message Log page loads
   c. Click on the refresh button
14 Navigate to the Settings tab
   a. Click on the Site Information menu item
   b. Site Information page loads
15 Navigate to the Building Overview page
   a. Building Overview page loads
16 Click on the Help button from the dropdown in the top right header
   a. User is navigated to the Login (force.com) site
17 Click on "What's New" from the dropdown in the top right header
   a. User is navigated to the "What's New" page
18 Logout

Table 12: Performance Test (Script 5 Steps)


Application Name: DLM Dashboard- RTX
Business Process Name: Switch between Projects
NFT Script Name: 05_DLMDashboard_SwitchProject
https://lncasoftware.ontestpad.com/script/23827#//
Step # Action
1 Open Browser
2 Browse App Url
3 Splash Screen Appearing
4 Login Page load
5 Login with valid credentials
6 Project List page loads
7 Choose project from the Project list
8 Landing on Home Page with selected project
9 Click on RTX Preprod
10 Select “Samsung” project from the project list
11 User navigated to the Samsung project
12 Logout

Table 13: Performance Test (Script 6 Steps)
Application Name: DLM Dashboard- RTX
Business Process Name: Switch between Apps
NFT Script Name: 06_DLMDashboard_SwitchApp
Step # Action
1 Open Browser
2 Browse App Url
3 Splash Screen Appearing
4 Login Page load
5 Login with valid credentials
6 Project List page loads
7 Choose project from the Project list
8 Landing on Home Page with selected project
9 Select DLM Dashboard icon on the top left corner
10 Select “Connected Service”
11 User will be navigated to “Connected Services” app login page
12 Close the browser

5. Performance Test Execution

5.1 Performance Test Summary


The table below provides a short summary of each of the Performance Test scenario runs.

Table 14: Performance Test Scenarios


Sprint 22

Test Run          Schedule     Test Scenario Summary
Cycle 1 - Run 1   2-3 days     Stress Test – 600, 300, 120, 90, 60, 45, 30, 24, 12 users (find the threshold)
Cycle 1 - Run 2   Every day    Load Test – load comprising the anticipated user load (3-4 hours)
Cycle 1 - Run 3   If required  Endurance Test – expected load for a very long duration

Sprint 23

Test Run          Schedule     Test Scenario Summary
Cycle 1 - Run 1   2-3 days     Stress Test – 100, 90, 60, 30, 24, 12 users (find the threshold)
Cycle 1 - Run 2   Every day    Load Test – load comprising the anticipated user load (3-4 hours)
Cycle 1 - Run 3   If required  Endurance Test – expected load for a very long duration

5.2 Test Scripting and Execution Plan

Table 15: Test Scripting and Execution Plan

Task                                     Schedule (can vary)  Owner               Remarks

Creating test platform                   1 day                Srishti Badola      Setting up the test platform: VM, installing software, plug-ins, etc.

Data parameterization, correlation,      1 day                Srishti Badola      Using different data for different users, getting dynamic variables
test scripting                                                                    from the server, writing test scripts

Designing the actions and transactions   2 days               Srishti/Maya        Designing the actions and transactions

Unit testing                             1 day                Srishti/Maya        Unit testing the JMeter script

Test script review                       1 day                Jake Sparks         Test script review

Dry run                                  2 days               Srishti             Dry run

JMeter script check-in in Git            1 day                Srishti/Maya        Script check-in in Git

Performance Test Summary and Results     1-2 days             Srishti/Maya/Jake   Analysis and creation of the Performance Test Summary and Results

5.3 Performance Test Details


5.3.1 Stress Test
Table 16: Stress Test Scenarios Detail
Test Details
Test ID NFT01 (Cycle 1-Run 1)

Purpose During a stress test, the load is gradually increased until one of the test stopping
criteria is reached. Stress testing shows how the application reacts to an increased
intensity of transactions and reveals the upper limit of the software's performance. This
test is designed to collect performance metrics on transaction throughput, response
times, and system resource utilization, in comparison to the performance requirements.

No. of Tests 1 Test per cycle

Duration Ramp-up: 2 sec


Steady State:
Ramp-down:
Scripts 1. 02_DLMDashboard_AppLogin
2. 03_DLMDashboard_ChooseProject
3. 04_DLMDashboard_NavigateAllTabs
4. 05_DLMDashboard_SwitchProject
5. 06_DLMDashboard_SwitchApp
Scenario Name Stress Test Scenario

Covered NFR NFR02, NFR03, NFR04, NFR05, NFR06

User Load / Volume Gradually decreasing the load through 100, 90, 60, 30, 24, 18, and 12 users for different
time durations (until one of the test stopping criteria is reached)
Entry Criteria 1. The code should be stable and functionally verified
2. Test Environment should be stable and ready to use
3. Test Data should be available
4. All the NFRs should be agreed with the project
5. Test scripts should be ready to use
Test Stopping 1. The response time exceeds the set value by several times.
Criteria      2. The critical level of hardware resource usage is reached (CPU > 80%,
                 memory > 90%).
              3. The number of HTTP errors exceeds 1% of the total request count.
              4. Responses with codes other than 200 exceed 3% of the total request count.
              5. Failure of the system software.
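
For clarity, the stopping criteria above can be reduced to a single check; the sketch below is illustrative only (the sampled values are hypothetical), not part of the JMeter test plan.

    def should_stop(cpu_pct, mem_pct, http_errors, non_200, total_requests):
        """Return True if any stress-test stopping criterion is met."""
        return (
            cpu_pct > 80                            # CPU above 80%
            or mem_pct > 90                         # memory above 90%
            or http_errors > 0.01 * total_requests  # HTTP errors exceed 1% of requests
            or non_200 > 0.03 * total_requests      # non-200 responses exceed 3% of requests
        )

    # Hypothetical sampled values: stops because CPU exceeds 80%.
    print(should_stop(cpu_pct=85, mem_pct=60, http_errors=3, non_200=5, total_requests=1000))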

5.3.2 Load Test


Table 17: Load Test Scenario Detail
Test Details
Test ID NFT02 (Cycle 1 - Run 2)

Purpose If the results of the stress test show that the system cannot cope with the
required load, the load test is executed at a load comprising 80% of the maximum
value reached during the stress test. This test is designed to collect performance
metrics on transaction throughput, response times, and system resource utilization, in
comparison to the performance requirements.
No. of Tests 1 Test per cycle

Duration Ramp-up: 2
Steady State:
Ramp-down:

Scripts 1. 01_DLMDashboard_ColdStart
2. 02_DLMDashboard_AppLogin
3. 03_DLMDashboard_ChooseProject
4. 04_DLMDashboard_NavigateAllTabs
5. 05_DLMDashboard_SwitchProject
6. 06_DLMDashboard_SwitchApp
Scenario Name Load Test Scenario

Covered NFR NFR01, NFR02, NFR03, NFR04, NFR05, NFR06

User Load / Volume 80% of the maximum value reached during the stress test for 3-4 hours

Entry Criteria 1. The code should be stable and functionally verified


2. Test Environment should be stable and ready to use
3. Test Data should be available
4. All the NFRs should be agreed with the project
5. Test scripts should be ready to use

Exit Criteria 1. All the NFR must be met


2. The error rate of transactions must not be more than 3% of total transaction
count
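
The load-test volume thus follows directly from the stress-test ceiling:

\[
N_{\text{load}} = \lfloor 0.8 \times N_{\text{max stress}} \rfloor
\]

For example, if the stress test were to top out at 90 users, the load test would run with approximately 72.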

5.3.3 Endurance Test


Table 18: Endurance Test Scenario Detail
Test Details
Test ID NFT03 (Cycle 1 run 3)

Purpose The endurance/stability test is conducted under the expected load for an extended period.
When the maximum number of users has been reached, the load remains the same
until the end of the run. The stability test can run for up to several days.
This test is designed to collect performance metrics on transaction throughput,
response times, and system resource utilization, in comparison to Performance
requirements.
No. of Tests 1 Test per cycle for 7 days

Duration Ramp-up: 2
Steady State:
Ramp-down:

Scripts 1. 01_DLMDashboard_ColdStart
2. 02_DLMDashboard_AppLogin
3. 03_DLMDashboard_ChooseProject
4. 04_DLMDashboard_NavigateAllTabs
5. 05_DLMDashboard_SwitchProject
6. 06_DLMDashboard_SwitchApp
Scenario Name Endurance Test Scenario

Covered NFR NFR01, NFR02 , NFR03, NFR04, NFR05,NFR06

User Load / Volume 80% of the maximum value reached during the stress test, run for a longer duration if
the application is able to sustain the load
Entry Criteria 1. The code should be stable and functionally verified
2. Test Environment should be stable and ready to use
3. Test Data should be available
4. All the NFRs should be agreed with the project
5. Test scripts should be ready to use
6. Stress and Load Test have been performed
Exit Criteria 1. All the NFR must be met
2. The error rate of transactions must not be more than 3% of total transaction
count
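
As a rough sizing note (derived from the Business NFR table, not stated in the plan): at the 2-user, 5-sec pacing rate of 1440 transactions per hour, a 7-day endurance run executes on the order of

\[
1440 \times 24 \times 7 \approx 242{,}000\ \text{transactions}.
\]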

5.4 Performance Test Monitoring Metrics


The table below describes examples of the performance metrics that can be captured
during the performance test stage to view resource-usage trends.

Azure Application Insights will be used to monitor these performance metrics.

Table 19: Application Server Tier


Metrics Value Measured
Response Time Response Time
CPU utilization CPU utilization
Physical Memory Percentage used Physical Memory Percentage used
Memory Memory utilization
Network Collisions (Collis), Output Packets (Opkts), Input Errors (Ierrs),
Input Packets (Ipkts)

5.5 Performance Test Environment
Pre-prod will not be exactly the same as prod because the data is different, but the UI should be the
same.

Test Environment URL


preprod https://dashboard.dlm.preprod.swcoe.legrand.us/

5.6 Assumptions, Constraints, Risks and Dependencies


5.6.1 Assumptions
Assumptions are documented concerning the available release software, test environment,
dependencies, tools, and test schedule associated with the performance test.

Table 21: Assumptions


No. Assumption
1 The code is stable and has passed functional testing before being deployed to the performance
  testing environment.
2 The fully deployed, installed, and configured web tier, middleware tier, and database servers must
  be operational for the performance testing shake-out to begin.
3 The test schedule is reviewed in case there are any obstacles to **testing activity.

**Testing Activities
Entry Criteria:
• Baselined test strategy
• Pre-production environment set up
• Test platform availability

Tasks:
• Test scripting
• Data parameterization
• Correlation
• Designing the actions and transactions
• Unit testing
• Test script review
• Dry run

Exit Criteria:
• Unit-tested and reviewed performance scripts
• JMeter scripting and check-in in Git
• Performance Test Summary and Results

5.6.2 Constraints

Constraints should be documented concerning the available release software, test


environment, dependencies, tools, test schedule, and other items pertaining to the
performance test.

Table 22: Constraints


No. 1
Constraint: To hit all tiers of the architecture, there must be an ecosystem. We need to explore how to achieve that through JMeter or find an alternative; e.g., components like Realm cannot be hit directly from JMeter, so this testing will be carried out as second-stage testing.
Impact: The time and cost associated with exploring different plugins to hit components from JMeter.

5.6.3 Risks
Risks should be documented concerning the test schedule, release software, dependencies,
tools, test approach test environment and other items pertaining to the performance test.

Table 23: Risks


No. 1
Risk: The QA team is new to performance testing
Impact: Medium
Action/Mitigation: The team will be mentored on performance testing tools
Assigned To: Performance Engineer

5.6.4 Dependencies
Dependencies should be documented concerning the latest build, test data, schedule, required
tools’ installation, test environment and other items pertaining to the performance test.

Table 24: Dependencies

No. 1
Dependency: The latest build should be available in the non-functional environment before the NFT start date
Impact: High
Action/Mitigation: The team will start performance test execution once the environment has the latest, functionally tested code in the pre-production environment
Assigned To: Developer

6. Test Organization
This section lists the test organization and any other departments that will support the Performance Test
phase.

Table 25: Test Organization


Name Functional Role Responsibilities
JASON BARTAN Project Manager Facilitating and coordinating all schedules related to SDLC
phases and infrastructure
SRISHTI BADOLA Performance Manages schedules and activities related to Performance
Engineering Lead Testing projects
Prepares for performance test execution, executes
performance tests, analyzes performance tests, and tracks
problem reports
MAYA THAKRE QA Engineer Prepares for performance test execution, executes
performance tests, analyzes performance tests, and tracks
problem reports
JAKE SPARKS QA Lead Prepares for performance test execution, executes
performance tests, analyzes performance tests, and tracks
problem reports.
ALL QAs Monitoring Support Monitors performance tests using Performance monitors
Developers (name) Application Support Supports performance test execution as configuration or
application issues are found
DevOps (name) Performance Test Environment Support Supports and maintains the Performance Test
environment (preprod)

Appendix A: Acronyms
This section lists all the acronyms and their associated literal translations used within the
document.

Table 26: Acronyms


Acronym   Literal Translation
NFR       Non-Functional Requirement
PT        Performance Testing
NFT       Non-Functional Test
SLA       Service Level Agreement

Appendix B: Referenced Documents
Documents that were referred to during the preparation of the performance test plan.

Table 27: Referenced Documents


Document Name Document Location and/or URL Issuance Date
SLA Document SLA Document

Meeting Notes Meeting Notes

