DLM DASHBOARD
Performance Test Plan
Version 1.1
09/24/2021
Performance Test Plan Sign-off
Table 1: Sign-off Detail

Name | Role / Designation | Signoff Date | Signature
SHANA LONGO | Product Manager | |
JASON BARTAN | Project Manager | |
STEPHEN MACKAY | Lead Developer | |
SATISH JAGADISAN | QA Manager | |
JAKE SPARKS | QA Lead | |
Record of Changes
This section tracks how the development and distribution of the performance test plan were carried out, with dates. Use the table below to provide the version number, the date of the version, the author/owner of the version, and a brief description of the reason for creating the revised version.
Table of Contents

Record of Changes ............................................................ iii
Table of Contents ............................................................. iv
1. Executive Summary ........................................................... 1
1.1 Overview: Project Background and Scope ..................................... 1
2. Application Architecture .................................................... 2
2.1 Overview: System Architecture .............................................. 2
2.2 Architecture Diagram ....................................................... 2
2.3 Detailed information on each component ..................................... 2
Appendix A: Acronyms .......................................................... 19
1. Executive Summary
1.2 Scope
The scope of our performance testing is outlined below:
2. Application Architecture
2.3 Detailed information on each component

DLM Dashboard UI – The UI is populated with data from the Realm database, which is exposed through GraphQL endpoints.

GraphQL – GraphQL is used to fetch the required details from MongoDB by providing a query and variables. GraphQL queries are resolved through Realm functions and custom resolvers.

Realm Functions – Realm functions allow server-side logic to be defined and executed. Functions are written against the input and payload types defined in the custom resolvers.

Hot Chocolate – Hot Chocolate is configured for the cluster and the Realm app, and executes all queries.

MongoDB – All Segman data is stored in the Segman collection, and all SiteTopology data is stored in the SiteTopology collection.

MongoSink Connector – Syncs data in real time from Pulsar to MongoDB for both Segman and SiteTopology.

Cloud analytics platform – Acts as the live-stream data analytics platform, routing all messages from the various devices/sensors to Pulsar through the EMQ MQTT ingress.
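To illustrate how a client of the Realm GraphQL endpoint supplies a query and variables, the sketch below builds such a request body. The endpoint URL, the `segmans` query, and its field and variable names are assumptions for illustration only, not the project's actual schema.

```python
import json

# Hypothetical Realm GraphQL endpoint; the real app ID is not part of this plan.
REALM_GRAPHQL_URL = "https://realm.mongodb.com/api/client/v2.0/app/<app-id>/graphql"

# Illustrative query: the segmans field and siteId variable are assumed names.
SEGMAN_QUERY = """
query GetSegmans($siteId: String!) {
  segmans(query: { siteId: $siteId }) {
    _id
    siteId
    status
  }
}
"""

def build_graphql_payload(query: str, variables: dict) -> str:
    """Serialize a GraphQL query and its variables into a JSON request body."""
    return json.dumps({"query": query, "variables": variables})

payload = build_graphql_payload(SEGMAN_QUERY, {"siteId": "site-001"})
print(payload)
```

In a real run this body would be POSTed to the Realm endpoint with an access token; the Realm functions and custom resolvers described above would then resolve the query against MongoDB.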
3. Performance Test Requirements
3.1 Requirements
Performance testing is required to achieve 99.9% uptime and to deliver on the SLA agreements we have made with our customers. Attached is the Performance Score Metrics sheet (or MOM) in which performance testing of specific components, or of all components, was agreed.
3.3 NFR and NFT Matrix
This section contains the non-functional test cases (scripts) and the applicable non-functional requirements.
4. Performance Test Planning
The transaction flow and script details must be documented as in the tables below. Develop performance test scripts that simulate all of the actions in the Business Processes/Transactions documented in the Load Model.
Table 7: Performance Test (Script 1 Steps)

Application Name: DLM Dashboard - Insite
Business Process Name: Clean/fresh app load
NFT Script Name: 01_DLMDashboard_ColdStart
https://lncasoftware.ontestpad.com/script/25498#//

Step # | Step
1 | Open browser
2 | Browse the app URL
3 | Splash screen appears
4 | Login page loads
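Per-step response times for a script like this can be captured with a small timing harness. The sketch below uses placeholder sleeps in place of the real browser actions; an actual run would drive the browser through a load-testing tool or automation framework.

```python
import time
from typing import Callable

def run_scripted_steps(steps: dict[str, Callable[[], None]]) -> dict[str, float]:
    """Run each named step in order and record its elapsed wall-clock time in seconds."""
    timings = {}
    for name, action in steps.items():
        start = time.perf_counter()
        action()
        timings[name] = time.perf_counter() - start
    return timings

# Placeholder actions standing in for the cold-start steps of Script 1.
steps = {
    "01_open_browser": lambda: time.sleep(0.01),
    "02_browse_app_url": lambda: time.sleep(0.01),
    "03_splash_screen_appears": lambda: time.sleep(0.01),
    "04_login_page_loads": lambda: time.sleep(0.01),
}
timings = run_scripted_steps(steps)
for name, secs in timings.items():
    print(f"{name}: {secs:.3f}s")
```

Recording a timing per named step, rather than one total, lets the results be compared step-by-step against the NFR response-time targets.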
Table 10: Performance Test (Script 4 Steps)

Application Name: DLM Dashboard - RTX
Business Process Name: Navigating through all home page tabs and sub-tabs (all interactable elements)
NFT Script Name: 04_DLMDashboard_NavigateAllTabs
https://lncasoftware.ontestpad.com/project/54/folder/f365/

Step # | Step
1 | Open browser
2 | Browse the app URL
3 | Splash screen appears
4 | Login page loads
5 | Login with valid credentials
6 | Project List page loads
7 | Choose a project from the Project List
8 | Land on the Home page with the selected project
9 | Select the Spaces tab
10 | On the Overview tab:
     a. Click the all-areas dropdown and select any room
     b. Room Overview page loads
11 | Navigate to the Occupancy tab:
     a. Click on Graph view
        i. Click the room search dropdown and search for any room
     b. Click on List view
        i. Click any column header to sort the column
        ii. On the calendar, select the previous-date button
        iii. On the calendar, select the forward-date button
     c. Click on Floor view
        i. Click the refresh button
12 | Navigate to the Settings tab:
     a. Click the Site Information menu item
     b. Site Information page loads
Table 13: Performance Test (Script 6 Steps)

Application Name: DLM Dashboard - RTX
Business Process Name: Switch between apps
NFT Script Name: 06_DLMDashboard_SwitchApp

Step # | Step
1 | Open browser
2 | Browse the app URL
3 | Splash screen appears
4 | Login page loads
5 | Login with valid credentials
6 | Project List page loads
7 | Choose a project from the Project List
8 | Land on the Home page with the selected project
9 | Select the DLM Dashboard icon in the top-left corner
10 | Select "Connected Service"
11 | The user is navigated to the "Connected Services" app login page
12 | Close the browser
5. Performance Test Execution
Test Run | Schedule | Test Scenario Summary

Sprint 23
Cycle 1 - Run 1 | 2-3 days | Stress Test – 100, 90, 60, 30, 24, 12 users (find the threshold)
Cycle 1 - Run 2 | Every day | Load Test – load comprising the anticipated user load (3-4 hours)
Cycle 1 - Run 3 | If required | Endurance Test – expected load for an extended duration
5.2 Test Scripting and Execution Plan

Table 15: Test Scripting and Execution Plan

Purpose: During a stress test, the load is gradually increased until one of the test-stopping criteria is reached. Stress testing shows how the application reacts to an increased intensity of transactions and reveals the upper limit of the software's performance. This test is designed to collect performance metrics on transaction throughput, response times, and system resource utilization, for comparison against the performance requirements.
Test Details

No. of Tests: 1 test per cycle
User Load / Volume: Gradually decreasing the load from 100 users through 90, 60, 30, 24, 18, and 12 for different time durations (until one of the test-stopping criteria is reached)

Entry Criteria:
1. The code should be stable and functionally verified.
2. The test environment should be stable and ready to use.
3. Test data should be available.
4. All the NFRs should be agreed with the project.
5. Test scripts should be ready to use.

Test Stopping Criteria:
1. The response time exceeds the set value by several times.
2. The critical level of hardware resource usage is reached (CPU > 80%, memory > 90%).
3. The number of HTTP errors exceeds 1% of the total request count.
4. Responses with codes other than 200 exceed 3% of the total request count.
5. Failure of the system software.
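The measurable stopping criteria above can be expressed as a simple check that a test controller evaluates between load steps. This is a minimal sketch assuming aggregate metrics are available; the thresholds mirror the plan (CPU > 80%, memory > 90%, HTTP errors > 1%, non-200 responses > 3%).

```python
def should_stop_test(cpu_pct: float, mem_pct: float,
                     http_errors: int, non_200: int,
                     total_requests: int) -> bool:
    """Return True if any measurable stopping criterion from the plan is met."""
    if cpu_pct > 80 or mem_pct > 90:
        return True  # critical level of hardware resource usage reached
    if total_requests > 0:
        if http_errors / total_requests > 0.01:
            return True  # HTTP errors exceed 1% of the total request count
        if non_200 / total_requests > 0.03:
            return True  # non-200 responses exceed 3% of the total request count
    return False

print(should_stop_test(cpu_pct=75, mem_pct=60, http_errors=5, non_200=10,
                       total_requests=1000))  # healthy run: False
print(should_stop_test(cpu_pct=85, mem_pct=60, http_errors=0, non_200=0,
                       total_requests=1000))  # CPU criterion hit: True
```

Criteria 1 and 5 (response time exceeding the set value, system software failure) need run-specific baselines and monitoring hooks, so they are left out of this sketch.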
Purpose: If the results of the stress test show that the system cannot cope with the required load, the load test is executed under a load comprising 80% of the maximum value reached during the stress test. This test is designed to collect performance metrics on transaction throughput, response times, and system resource utilization, for comparison against the performance requirements.

No. of Tests: 1 test per cycle
Duration: Ramp-up: 2; Steady State: ; Ramp-down:
Test Details

Scripts:
1. 01_DLMDashboard_ColdStart
2. 02_DLMDashboard_AppLogin
3. 03_DLMDashboard_ChooseProject
4. 04_DLMDashboard_NavigateAllTabs
5. 05_DLMDashboard_SwitchProject
6. 06_DLMDashboard_SwitchApp

Scenario Name: Load Test Scenario
User Load / Volume: 80% of the maximum value reached during the stress test, for 3-4 hours
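Deriving the load-test volume from the stress-test result is simple arithmetic; the sketch below rounds the 80% figure down to a whole number of virtual users, as fractional users cannot be simulated.

```python
import math

def load_test_users(stress_test_max_users: int, fraction: float = 0.8) -> int:
    """Derive the load-test user volume as a fraction of the stress-test maximum,
    rounded down to a whole number of virtual users."""
    return math.floor(stress_test_max_users * fraction)

# e.g. if the stress test topped out at 90 concurrent users:
print(load_test_users(90))  # 72
```

The same calculation applies to the endurance scenario, which also runs at 80% of the stress-test maximum.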
Purpose: The endurance/stability test is conducted under an expected load for an extended period. When the maximum number of users has been reached, the load remains the same until the end of the check. The stability test can run for up to several days. This test is designed to collect performance metrics on transaction throughput, response times, and system resource utilization, for comparison against the performance requirements.

No. of Tests: 1 test per cycle, for 7 days
Duration: Ramp-up: 2; Steady State: ; Ramp-down:
Test Details

Scripts:
1. 01_DLMDashboard_ColdStart
2. 02_DLMDashboard_AppLogin
3. 03_DLMDashboard_ChooseProject
4. 04_DLMDashboard_NavigateAllTabs
5. 05_DLMDashboard_SwitchProject
6. 06_DLMDashboard_SwitchApp

Scenario Name: Endurance Test Scenario
User Load / Volume: 80% of the maximum value reached during the stress test, run for a longer duration if the application is able to sustain the load for that long

Entry Criteria:
1. The code should be stable and functionally verified.
2. The test environment should be stable and ready to use.
3. Test data should be available.
4. All the NFRs should be agreed with the project.
5. Test scripts should be ready to use.
6. The stress and load tests have been performed.

Exit Criteria:
1. All the NFRs must be met.
2. The transaction error rate must not exceed 3% of the total transaction count.
5.5 Performance Test Environment
Pre-prod will not be exactly the same as prod because the data is different, but the UI should be the same.

Testing Activities

Entry Criteria | Tasks | Exit Criteria
5.6.2 Constraints
5.6.3 Risks
Risks should be documented concerning the test schedule, release software, dependencies, tools, test approach, test environment, and other items pertaining to the performance test.
5.6.4 Dependencies
Dependencies should be documented concerning the latest build, test data, schedule, installation of required tools, test environment, and other items pertaining to the performance test.
6. Test Organization
This section describes the test organization and any other departments that will support the Performance Test Phase.
Appendix A: Acronyms
This section lists all the acronyms used within the document, together with their expansions.
Appendix B: Referenced Documents
Documents that were referenced during the preparation of this Performance Test Plan.