ApplicationName_PerformanceTestDesign
Table of Contents
INTRODUCTION
1.1 PURPOSE
1.2 DOCUMENT REFERENCES
TEST SPECIFICATIONS
1.3 TEST SETUP PROCEDURE
1.4 DATA SETUP
1.5 ERROR HANDLING
1.6 WORKLOAD CRITERIA
1.7 CLIENT SIDE STATISTICS
TEST EXECUTION
1.8 RUN DETAILS
1.9 PRE-TEST PROCEDURES
1.10 TEST RUN
1.11 POST-TEST PROCEDURES
INTRODUCTION
1.1 Purpose
This document presents the test design for the performance test to be
conducted for <Organization Name>'s <Application Name>, developed by the
Development Team. The design currently covers the details for the first run of the
performance test.
TEST SPECIFICATIONS
#   IP Address        Location
1   10.236.156.219    <Location1>
2   10.224.60.81      <Location2>
3   10.236.227.43     <Location3>
4   10.238.184.162    <Location4>
5   10.238.82.35      <Location5>
Run-Time Settings:
A complete backup of the database should be taken prior to running any of the
tests.
During the tests, data in the data stores will change. To ensure comparability of
the results across iterations, the database needs to be refreshed from the backup
before every test run.
A database equivalent to production will be provided for the last round of
execution.
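The pre-run refresh could be scripted along the lines of the sketch below. This is an illustration only, assuming a command-line restore utility; the restore command, backup path, and database name are placeholders and must match the actual database platform used for <Application Name>.

# refresh_db.py - illustrative sketch only; command name, path, and database are placeholders.
import subprocess
import sys

BACKUP_FILE = "/backups/appdb_baseline.bak"   # assumed location of the pre-test backup
DATABASE = "appdb"                            # assumed database name

def refresh_database() -> None:
    """Restore the baseline backup so every test run starts from identical data."""
    # Placeholder restore command; replace with the DBA-approved restore procedure.
    result = subprocess.run(
        ["db_restore", "--database", DATABASE, "--from", BACKUP_FILE],
        capture_output=True, text=True,
    )
    if result.returncode != 0:
        print("Database refresh failed:", result.stderr, file=sys.stderr)
        sys.exit(1)
    print("Database refreshed from", BACKUP_FILE)

if __name__ == "__main__":
    refresh_database()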
The final workload will be a mixture of all the scripts listed below, with 3000
users running in parallel, to observe the behavior of the different types of users
over the test period.
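As an illustration of how such a mix could be translated into per-script Vuser counts, the sketch below splits the 3000 users across scripts by percentage. The script names and percentages shown are assumptions for illustration only; the actual mix comes from the workload criteria in section 1.6.

# workload_mix.py - illustrative only; script names and percentages are assumed.
TOTAL_VUSERS = 3000

# Hypothetical mix of user types (fractions must sum to 1.0).
SCRIPT_MIX = {
    "Login_Browse":      0.40,
    "Expense_Report":    0.25,
    "Resource_Transfer": 0.20,
    "Utilization_File":  0.15,
}

def vusers_per_script(total: int, mix: dict) -> dict:
    """Split the total Vuser count across scripts according to the mix."""
    allocation = {name: int(total * share) for name, share in mix.items()}
    # Assign any rounding remainder to the largest script so the total stays exact.
    remainder = total - sum(allocation.values())
    largest = max(mix, key=mix.get)
    allocation[largest] += remainder
    return allocation

if __name__ == "__main__":
    for script, vusers in vusers_per_script(TOTAL_VUSERS, SCRIPT_MIX).items():
        print(f"{script}: {vusers} Vusers")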
2. Hits per Second
The Hits per Second graph shows the number of hits on the Web server (y-axis) as a
function of the elapsed time in the scenario (x-axis).
3. Throughput
The Throughput graph shows the amount of throughput on the Web server (y-axis)
during each second of the scenario run (x-axis). Throughput is measured in kilobytes
and represents the amount of data that the Vusers received from the server at any
given second.
4. HTTP Responses per Second
The HTTP Responses per Second graph shows the number of HTTP status codes
(which indicate the status of HTTP requests, for example "the request was
successful" or "the page was not found") returned from the Web server during each
second of the scenario run (x-axis), grouped by status code.
5. Pages Downloaded per Second
The Pages Downloaded per Second graph shows the number of Web pages
downloaded from the server during each second of the scenario run. This graph helps
you evaluate the amount of load Vusers generate, in terms of the number of pages
downloaded. Like throughput, pages downloaded per second is a representation of
the amount of data that the Vusers received from the server at any given second.
6. Transaction Response Time
The Transaction Response Time graph shows the response time of transactions in
seconds (y-axis) as a function of the elapsed time in the scenario (x-axis). A sketch
of how these client-side statistics relate to the raw results data appears after this list.
7. Error Statistics
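The LoadRunner Analysis tool produces the above graphs itself; the sketch below is only an illustration of how the same statistics (hits per second, throughput, transaction response times) relate to raw data. It assumes a results export in CSV form with columns timestamp, bytes, transaction, and response_time, which is an assumption and not an actual tool output format.

# client_stats.py - illustrative sketch; the CSV layout is an assumed export format.
import csv
from collections import defaultdict

def summarize(results_csv: str) -> None:
    hits_per_sec = defaultdict(int)        # hits on the Web server per elapsed second
    kbytes_per_sec = defaultdict(float)    # throughput in kilobytes per elapsed second
    response_times = defaultdict(list)     # response times per transaction

    with open(results_csv, newline="") as f:
        for row in csv.DictReader(f):
            second = int(float(row["timestamp"]))
            hits_per_sec[second] += 1
            kbytes_per_sec[second] += float(row["bytes"]) / 1024.0
            response_times[row["transaction"]].append(float(row["response_time"]))

    print(f"Peak hits/sec: {max(hits_per_sec.values())}")
    print(f"Peak throughput: {max(kbytes_per_sec.values()):.1f} KB/sec")
    for txn, times in response_times.items():
        avg = sum(times) / len(times)
        print(f"{txn}: average response time {avg:.2f} sec over {len(times)} samples")

if __name__ == "__main__":
    summarize("scenario_results.csv")   # assumed export file name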
TEST EXECUTION
Incremental load will be applied for runs having more than 50 concurrent users:
[Flow diagram: run the script with 50 Vusers; if the result does not match the ideal
response time, tune and repeat with the same load; otherwise increment the users by
100 and rerun until the target load is reached.]
The run starts with 50 concurrent users to check whether the response time meets the
objectives.
If the objectives are not met, the problems will be identified. After resolving the
problems, the test will be repeated with the same number of users.
The load will then be increased in increments of 100 users until the specified load is
achieved along with the ideal response time, as sketched below.
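The following sketch expresses the ramp-up logic described above. The run_load_test and tune_environment helpers are hypothetical placeholders, not part of any tool's API; in practice the runs are driven through the LoadRunner Controller, and the ideal response time value comes from the NFRs.

# ramp_up.py - sketch of the incremental load approach; helpers are placeholders.
TARGET_VUSERS = 3000
IDEAL_RESPONSE_TIME = 3.0   # seconds; assumed value, actual limits come from the NFRs

def run_load_test(vusers: int) -> float:
    """Placeholder: execute a run with the given Vusers and return the average response time."""
    raise NotImplementedError

def tune_environment() -> None:
    """Placeholder: identify and resolve the bottleneck before repeating the run."""
    raise NotImplementedError

def incremental_ramp() -> None:
    vusers = 50
    while vusers <= TARGET_VUSERS:
        response_time = run_load_test(vusers)
        if response_time > IDEAL_RESPONSE_TIME:
            # Objectives not met: tune and repeat with the same number of users.
            tune_environment()
            continue
        # Objectives met: increase the load by 100 users and run again.
        vusers += 100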
All scripts will be checked against the Ideal response time and Acceptable response
time limits defined in the NFR documents.
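As a hedged illustration of that check, the sketch below classifies measured response times against the two limits. The transactions and limit values shown are placeholders; the real limits would be taken from the NFR documents for <Application Name>.

# nfr_check.py - illustrative only; the limits below are placeholder values.
NFR_LIMITS = {
    # transaction: (ideal seconds, acceptable seconds)
    "Login":             (2.0, 4.0),
    "Expense_Report":    (5.0, 8.0),
    "Resource_Transfer": (5.0, 8.0),
}

def classify(transaction: str, measured: float) -> str:
    """Compare a measured response time against the NFR limits for the transaction."""
    ideal, acceptable = NFR_LIMITS[transaction]
    if measured <= ideal:
        return "within ideal limit"
    if measured <= acceptable:
        return "within acceptable limit"
    return "exceeds acceptable limit"

if __name__ == "__main__":
    for txn, measured in [("Login", 1.8), ("Expense_Report", 6.5)]:
        print(txn, "->", classify(txn, measured))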
#    Script                                Test Type   Objective
7    Expense Report Creation/submission    Load Test   Need to find out the time taken by the batch process to process the time sheet data.
9    Resource Transfer Report              Load Test   Need to find out the time taken by the batch process to transfer the total number of resources.
25   Utilization File                      Load Test   Need to find out the time taken by the batch process to calculate the utilization of all resources.
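The elapsed time for these batch processes could be captured with a small wrapper such as the sketch below. The command invoked is a placeholder, not the actual <Application Name> batch job; the intent is only to show how the wall-clock time would be measured.

# batch_timing.py - sketch of capturing batch process elapsed time; command is a placeholder.
import subprocess
import time

def time_batch_job(command: list) -> float:
    """Run the batch job and return its elapsed wall-clock time in seconds."""
    start = time.monotonic()
    subprocess.run(command, check=True)
    return time.monotonic() - start

if __name__ == "__main__":
    elapsed = time_batch_job(["run_expense_batch"])   # placeholder command
    print(f"Batch process completed in {elapsed:.1f} seconds")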
The connectivity between the Controller and the Load Generators has to be checked
before starting the run.
The monitoring settings configured on the servers need to be checked before starting
the execution.
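A simple pre-run connectivity check could look like the sketch below. The agent port number is an assumption and should be confirmed against the environment; the Controller's own connection check remains the authoritative test.

# lg_connectivity_check.py - sketch of a pre-run reachability check to the load generators.
import socket

LOAD_GENERATORS = [
    "10.236.156.219", "10.224.60.81", "10.236.227.43",
    "10.238.184.162", "10.238.82.35",
]
AGENT_PORT = 54345   # assumed agent port; confirm against the actual configuration

def is_reachable(host: str, port: int, timeout: float = 5.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within the timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    for lg in LOAD_GENERATORS:
        status = "OK" if is_reachable(lg, AGENT_PORT) else "UNREACHABLE"
        print(f"{lg}: {status}")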
2. For the second round of execution, the user load will be applied from nine
different load generators (10.236.156.219, 10.224.60.81, 10.236.227.43,
10.238.184.162, 10.238.82.35, 10.226.48.48, 208.20.155.148,
10.236.208.35, 10.237.99.39); a sketch of how the users could be split across
these generators appears after this list.
3. The following is the pattern for applying the load from different geographical
areas.
4. For the third round of execution, the scripts will be run along with the FMS
scripts to check the impact of the FMS on the application.
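The sketch below illustrates splitting the 3000 users across the nine load generators listed in item 2. An even split is assumed purely for illustration; the actual distribution would follow the geographical load pattern referred to in item 3.

# lg_distribution.py - illustrative only; an even split across generators is assumed.
LOAD_GENERATORS = [
    "10.236.156.219", "10.224.60.81", "10.236.227.43",
    "10.238.184.162", "10.238.82.35", "10.226.48.48",
    "208.20.155.148", "10.236.208.35", "10.237.99.39",
]
TOTAL_VUSERS = 3000

def distribute(total: int, generators: list) -> dict:
    """Split the total Vusers as evenly as possible across the generators."""
    base, remainder = divmod(total, len(generators))
    return {
        lg: base + (1 if i < remainder else 0)
        for i, lg in enumerate(generators)
    }

if __name__ == "__main__":
    for lg, vusers in distribute(TOTAL_VUSERS, LOAD_GENERATORS).items():
        print(f"{lg}: {vusers} Vusers")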
Once the analysis is over, all the observations and graphs mentioned in
section 1.8 will be mailed to all the teams involved in <Application>.
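If that mail-out is to be automated, it could follow the minimal sketch below. The SMTP host, sender, distribution list, and report file are placeholder assumptions, not actual project values.

# mail_results.py - sketch only; host, addresses, and attachment path are placeholders.
import smtplib
from email.message import EmailMessage

SMTP_HOST = "smtp.example.org"               # assumed internal mail relay
SENDER = "perf-team@example.org"             # placeholder sender
RECIPIENTS = ["app-team@example.org"]        # placeholder distribution list
REPORT_PATH = "analysis_observations.pdf"    # placeholder report file

def mail_results() -> None:
    msg = EmailMessage()
    msg["Subject"] = "<Application> performance test - observations and graphs"
    msg["From"] = SENDER
    msg["To"] = ", ".join(RECIPIENTS)
    msg.set_content("Please find attached the observations and graphs from section 1.8.")
    with open(REPORT_PATH, "rb") as f:
        msg.add_attachment(f.read(), maintype="application",
                           subtype="pdf", filename=REPORT_PATH)
    with smtplib.SMTP(SMTP_HOST) as smtp:
        smtp.send_message(msg)

if __name__ == "__main__":
    mail_results()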