
<Application Name>

Performance Test Design


Version No. - 2.0

Version   Date   Prepared/Modified By   Reviewed By   Changes Done

Table of Contents

INTRODUCTION
1.1 PURPOSE
1.2 DOCUMENT REFERENCES
TEST SPECIFICATIONS
1.3 TEST SETUP PROCEDURE
1.4 DATA SETUP
1.5 ERROR HANDLING
1.6 WORKLOAD CRITERIA
1.7 CLIENT SIDE STATISTICS
TEST EXECUTION
1.8 RUN DETAILS
1.9 PRE-TEST PROCEDURES
1.10 TEST RUN
1.11 POST-TEST PROCEDURES


INTRODUCTION

1.1 Purpose
This document presents the test design for the Performance Test to be
conducted for the <Organization Name> <Application Name>, developed by the
Development Team. The design currently covers details for the first run of the
performance test.


1.2 Document References


• Performance Test Plan

• PS_DM_NFR Ver1.0 11022006

• PS_RM_NFR Ver1.0 11022006

• PS_TE_NFR Ver1.0 11022006

• Top20_Critical


TEST SPECIFICATIONS

1.3 Test Setup Procedure


Scripting for the identified business flows will be done at Pune. All the
assigned controllers are located at <Location>, but test execution will be
driven from Pune through remote control. All the scripts need to be
transferred to the <Location> controller machines to set up the scenario. Once
test execution is over, all the results will be collected on the <Location>
controller machine and then moved to Pune for further analysis.

Load Generators List:

#   IP Address        Location
1   10.236.156.219    <Location1>
2   10.224.60.81      <Location2>
3   10.236.227.43     <Location3>
4   10.238.184.162    <Location4>
5   10.238.82.35      <Location5>

Run-Time Settings:

Runtime Setting   Option
Log               Enable Logging
Pacing            As soon as the previous iteration ends
Think Time        Replay Think Time (5 or 10 Sec)
Proxy Settings    No Proxy
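Of these settings, only think time appears in the script body itself; logging, pacing, and proxy behavior are configured in the runtime settings of the Controller. A minimal sketch of how a recorded step replays under these settings, assuming a placeholder URL and transaction name (neither is taken from the actual scripts):

    Action()
    {
        lr_start_transaction("Sign_in");

        web_url("login",
                "URL=http://<application-host>/login",   // placeholder URL
                LAST);

        lr_end_transaction("Sign_in", LR_AUTO);

        // Recorded think time; with the "Replay Think Time" setting
        // above, this pause is replayed as 5 or 10 seconds.
        lr_think_time(10);

        return 0;
    }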

1.4 Data Setup

• Primary database will be provided with the required test data for performance testing.

• A complete backup of the database should be taken prior to running any of the tests.


• During the tests, data will undergo changes in the data stores. To ensure comparability of results across iterations, the database needs to be refreshed from the backups before every test run.

• A database equivalent to production will be provided for the last round of execution.


1.5 Error Handling


This section identifies, transaction-wise, all the error checking and
validations that are performed. Error tracking for all the exceptions
handled by developers at the database level and application level will be
covered.

Transaction Name   Error to be Handled               Action Taken
Sign-in            Authentication error              User will be terminated
Search Criteria    Step download timeout             Error message will be populated
Search Criteria    HTTP-request connection timeout   Error message will be populated
Search Criteria    HTTP-request receive timeout      Error message will be populated
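A minimal sketch of how such checks might be scripted, assuming hypothetical transaction names, check strings, and timeout values (none are taken from the actual scripts):

    Action()
    {
        // Timeouts corresponding to the handled errors above
        // (values in seconds are placeholders).
        web_set_timeout("CONNECT", "120");
        web_set_timeout("RECEIVE", "120");
        web_set_timeout("STEP",    "120");

        // Register a text check; the number of matches is saved
        // into the {result_count} parameter.
        web_reg_find("Text=Search Results",        // assumed check string
                     "SaveCount=result_count",
                     LAST);

        lr_start_transaction("Search_Criteria");

        web_url("search",
                "URL=http://<application-host>/search",   // placeholder URL
                LAST);

        if (atoi(lr_eval_string("{result_count}")) == 0) {
            // Populate an error message, as per the table above.
            lr_error_message("Search Criteria: expected results page not returned");
            lr_end_transaction("Search_Criteria", LR_FAIL);
        }
        else {
            lr_end_transaction("Search_Criteria", LR_AUTO);
        }

        return 0;
    }

For the Sign-in authentication error, the Vuser itself can be stopped with lr_exit(LR_EXIT_VUSER, LR_FAIL).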


1.6 Workload Criteria


These workload criteria will remain the same for all rounds of execution. Only for
some critical scripts will the workload be applied incrementally (starting with a
smaller number of users and ramping up to the desired load).

Script Name                                    Concurrent Vusers        Duration         Specific Runtime Settings   Ramp Up Pattern
Project Creation (Online)                      15                       2 Hrs            As mentioned in 1.3         Simultaneous
Time Sheet creation/submission (fortnightly)   Incremental up to 3000   2 Hrs            As mentioned in 1.3         10 Users/15 Sec/Rendezvous Point
Express Search                                 120                      2 Hrs            As mentioned in 1.3         10 Users/15 Sec
Express Resume                                 120                      2 Hrs            As mentioned in 1.3         10 Users/15 Sec
Staffing Workbench                             120                      2 Hrs            As mentioned in 1.3         10 Users/15 Sec
Time Sheet Approval (PM)                       75                       2 Hrs            As mentioned in 1.3         10 Users/15 Sec
Expense Report Creation/submission             1                        End of Process   As mentioned in 1.3         -
Upload Resume                                  Incremental up to 400    2 Hrs            As mentioned in 1.3         10 Users/15 Sec/Rendezvous Point
Resource Transfer                              1                        End of Process   As mentioned in 1.3         -
Create Service Order by PM                     Incremental up to 400    2 Hrs            As mentioned in 1.3         10 Users/15 Sec/Rendezvous Point
Resource Assignment                            40                       2 Hrs            As mentioned in 1.3         10 Users/15 Sec
Resource Approval by PM                        40                       2 Hrs            As mentioned in 1.3         10 Users/15 Sec
Job Spy                                        Incremental up to 1200   2 Hrs            As mentioned in 1.3         10 Users/15 Sec/Rendezvous Point
Pool Resources Report                          30                       2 Hrs            As mentioned in 1.3         10 Users/15 Sec
Resource Rotation Report                       30                       2 Hrs            As mentioned in 1.3         10 Users/15 Sec
Assignment Listing Report                      30                       2 Hrs            As mentioned in 1.3         10 Users/15 Sec
Open Service Order Listing Report              30                       2 Hrs            As mentioned in 1.3         10 Users/15 Sec
Travel Ready Resource Listing                  30                       2 Hrs            As mentioned in 1.3         10 Users/15 Sec
Resource Billability Report                    30                       2 Hrs            As mentioned in 1.3         10 Users/15 Sec
Proposal Effort Report                         20                       2 Hrs            As mentioned in 1.3         10 Users/15 Sec
Time Sheet - Entry NonCompl Report             60                       2 Hrs            As mentioned in 1.3         10 Users/15 Sec
Total Number of Hours Report                   60                       2 Hrs            As mentioned in 1.3         10 Users/15 Sec
Approval Non Compliance Report                 10                       2 Hrs            As mentioned in 1.3         10 Users/15 Sec
Projects Closure within 20 days Report         50                       2 Hrs            As mentioned in 1.3         10 Users/15 Sec
Utilization File                               1                        End of Process   As mentioned in 1.3         -
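The "Rendezvous Point" entries in the ramp-up column mean the ramped-up Vusers are held at a synchronization point and released together, so the load hits the server simultaneously. A minimal sketch, assuming a hypothetical rendezvous name and transaction (the actual rendezvous points are defined in the Controller scenario):

    Action()
    {
        // All Vusers block here until the Controller's rendezvous
        // policy releases them, so the submits arrive together.
        lr_rendezvous("submit_timesheet");        // assumed rendezvous name

        lr_start_transaction("Time_Sheet_Submit");
        web_url("submit",
                "URL=http://<application-host>/timesheet/submit",   // placeholder URL
                LAST);
        lr_end_transaction("Time_Sheet_Submit", LR_AUTO);

        return 0;
    }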


The final workload will be a mixture of all the scripts below, with 3000 users
running in parallel, to observe the behavior of the different types of users over a
period of time.

Script Name                                    Concurrent Vusers   Duration         Specific Runtime Settings   Ramp Up Pattern
Project Creation (Online)                      10                  2 Hrs            As mentioned in 1.3         Simultaneous
Time Sheet creation/submission (fortnightly)   1500                2 Hrs            As mentioned in 1.3         10 Users/15 Sec/Rendezvous Point
Express Search                                 65                  2 Hrs            As mentioned in 1.3         10 Users/15 Sec
Express Resume                                 65                  2 Hrs            As mentioned in 1.3         10 Users/15 Sec
Staffing Workbench                             65                  2 Hrs            As mentioned in 1.3         10 Users/15 Sec
Time Sheet Approval (PM)                       40                  2 Hrs            As mentioned in 1.3         10 Users/15 Sec
Expense Report Creation/submission             1                   End of Process   As mentioned in 1.3         -
Upload Resume                                  200                 2 Hrs            As mentioned in 1.3         10 Users/15 Sec/Rendezvous Point
Resource Transfer                              1                   End of Process   As mentioned in 1.3         -
Create Service Order by PM                     200                 2 Hrs            As mentioned in 1.3         10 Users/15 Sec/Rendezvous Point
Resource Assignment                            25                  2 Hrs            As mentioned in 1.3         10 Users/15 Sec
Resource Approval by PM                        25                  2 Hrs            As mentioned in 1.3         10 Users/15 Sec
Job Spy                                        600                 2 Hrs            As mentioned in 1.3         10 Users/15 Sec/Rendezvous Point
Pool Resources Report                          15                  2 Hrs            As mentioned in 1.3         10 Users/15 Sec
Resource Rotation Report                       15                  2 Hrs            As mentioned in 1.3         10 Users/15 Sec
Assignment Listing Report                      15                  2 Hrs            As mentioned in 1.3         10 Users/15 Sec
Open Service Order Listing Report              15                  2 Hrs            As mentioned in 1.3         10 Users/15 Sec
Travel Ready Resource Listing                  15                  2 Hrs            As mentioned in 1.3         10 Users/15 Sec
Resource Billability Report                    15                  2 Hrs            As mentioned in 1.3         10 Users/15 Sec
Proposal Effort Report                         15                  2 Hrs            As mentioned in 1.3         10 Users/15 Sec
Time Sheet - Entry NonCompl Report             30                  2 Hrs            As mentioned in 1.3         10 Users/15 Sec
Total Number of Hours Report                   30                  2 Hrs            As mentioned in 1.3         10 Users/15 Sec
Approval Non Compliance Report                 10                  2 Hrs            As mentioned in 1.3         10 Users/15 Sec
Projects Closure within 20 days Report         25                  2 Hrs            As mentioned in 1.3         10 Users/15 Sec
Utilization File                               1                   2 Hrs            As mentioned in 1.3         -


1.7 Client Side Statistics


1. Running Vusers

Shows the total number of Vusers running at any instant of the execution.

2. Hits per Second

The Hits per Second graph shows the number of hits on the Web server (y-axis) as a
function of the elapsed time in the scenario (x-axis).

3. Throughput

The Throughput graph shows the amount of throughput on the Web server (y-axis)
during each second of the scenario run (x-axis). Throughput is measured in kilobytes
and represents the amount of data that the Vusers received from the server at any
given second.

4. HTTP responses per Second

The HTTP Responses per Second graph shows the number of HTTP status codes
returned from the Web server during each second of the scenario run (x-axis),
grouped by status code. Status codes indicate the status of an HTTP request,
for example, “the request was successful” or “the page was not found.”

5. Pages downloaded per Second

The Pages Downloaded per Second graph shows the number of Web pages
downloaded from the server during each second of the scenario run. This graph helps
you evaluate the amount of load Vusers generate, in terms of the number of pages
downloaded. Like throughput, pages downloaded per second is a representation of
the amount of data that the Vusers received from the server at any given second.

6. Transaction response time

The Transaction Response time graph shows the response time of transactions in
seconds (y-axis) as a function of the elapsed time in the scenario (x-axis).

7. Error Statistics

Statistics for the different types of errors that occurred during the run.

8. Errors per Second

Shows the number of errors that occurred at each second of the execution.
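The transaction-level graphs above (Transaction Response Time, Error Statistics) are populated from the transaction markers in the Vuser scripts. A minimal sketch, with an illustrative transaction name and URL, also showing how a script can report the elapsed time of a still-open transaction while it runs:

    Action()
    {
        double elapsed;

        lr_start_transaction("Express_Search");

        web_url("express_search",
                "URL=http://<application-host>/search/express",   // placeholder URL
                LAST);

        // Duration of the still-open transaction, in seconds.
        elapsed = lr_get_transaction_duration("Express_Search");
        lr_output_message("Express_Search has taken %.2f sec so far", elapsed);

        // LR_AUTO passes or fails the transaction based on the
        // replay status of the enclosed steps.
        lr_end_transaction("Express_Search", LR_AUTO);

        return 0;
    }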


TEST EXECUTION

1.8 Run Details


Run details for the first round of execution, where each script will be run
separately. The concurrent user load for each run is the same as specified in
section 1.6 (Workload Criteria).

Incremental load will be applied for runs having more than 50 concurrent users:

[Figure: Incremental Type of Load Test - the run starts with 50 Vusers; the
measured response time is compared against the ideal response time; if it does
not match, the system is tuned and users are incremented by 100; the cycle
repeats until the result meets the objective.]
• The run starts with 50 concurrent users to see whether the response time matches the objectives.

• If the objectives are not met, the problems will be identified. After resolving the problems, the test will be repeated with the same number of users.

• The load will be increased until the specified load is achieved along with the ideal response time.


All scripts will be checked against the Ideal and Acceptable response time
limits taken from the NFR documents.

S.N   Run                                             Type of Test   Objectives - Ideal    Objectives - Acceptable
                                                                     Response Time (Sec)   Response Time (Sec)
1     Project Creation (Online)                       Load Test      2                     4
2     Time Sheet creation/submission (fortnightly)    Load Test      2                     4
3     Express Search                                  Load Test      10                    15
4     Express Resume                                  Load Test      10                    15
5     Staffing Workbench                              Load Test      10                    15
6     Time Sheet Approval (PM)                        Load Test      3                     4
7     Expense Report Creation/submission              Load Test      Need to find out the time taken by the batch
                                                                     process to process the time sheet data.
8     Upload Resume                                   Load Test      2                     4
9     Resource Transfer                               Load Test      Need to find out the time taken by the batch
                                                                     process to transfer the total number of resources.
10    Create Service Order by PM                      Load Test      2                     4
11    Resource Assignment                             Load Test      2                     4
12    Resource Approval by PM                         Load Test      2                     4
13    Job Spy                                         Load Test      10                    15
14    Pool Resources Report                           Load Test      2                     4
15    Resource Rotation Report                        Load Test      300                   300
16    Assignment Listing Report                       Load Test      300                   300
17    Open Service Order Listing Report               Load Test      300                   300
18    Travel Ready Resource Listing                   Load Test      300                   300
19    Resource Billability Report                     Load Test      300                   300
20    Proposal Effort Report                          Load Test      20                    20
21    Time Sheet - Entry NonCompl Report              Load Test      20                    20
22    Total Number of Hours Report                    Load Test      20                    20
23    Approval Non Compliance Report                  Load Test      20                    20
24    Projects Closure within 20 days Report          Load Test      20                    20
25    Utilization File                                Load Test      Need to find out the time taken by the batch
                                                                     process to calculate the utilization of all resources.


1.9 Pre-Test Procedures


• Before starting the execution, connectivity to the application needs to be checked. Verify that all the instances of the application and database servers are up and running (a scripted check is sketched after this list).

• Verify that all the scripts are running fine.

• Connectivity between the Controller and the Load Generators has to be checked before starting the run.

• The monitoring settings on the servers need to be checked before starting the execution.
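As a complement to the manual checks above, a minimal sketch of an automated connectivity check that could run in vuser_init before the scenario proper, assuming a hypothetical landing-page URL and check string:

    vuser_init()
    {
        // Save the number of matches of the expected page text
        // into the {app_up} parameter.
        web_reg_find("Text=<Application Name>",    // assumed check string
                     "SaveCount=app_up",
                     LAST);

        web_url("landing",
                "URL=http://<application-host>/",  // placeholder URL
                LAST);

        if (atoi(lr_eval_string("{app_up}")) == 0) {
            lr_error_message("Application is not reachable - aborting the Vuser");
            lr_abort();
        }

        return 0;
    }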


1.10 Test Run


1. For the first round of execution, the user load will be applied from one
load generator only (10.237.99.39), as each script is executed separately.
The user distribution across the scripts will be the same as mentioned in
section 1.6 (Workload Criteria).

2. For the second round of execution, the user load will be applied from nine
different load generators (10.236.156.219, 10.224.60.81, 10.236.227.43,
10.238.184.162, 10.238.82.35, 10.226.48.48, 208.20.155.148,
10.236.208.35, 10.237.99.39).

3. The load from the different geographical areas will be applied as per the
load generator locations listed in section 1.3 (Load Generators List).

4. For the third round of execution, the scripts will be run along with the
FMS scripts to check the impact of the FMS presence on the application.


1.11 Post-Test Procedures


All the assigned controllers are located at <Location>, while test execution
will be driven from Pune through remote control. Once test execution is over,
all the results will be collected on the <Location> controller machine and
then moved to the Pune lab for further analysis.

Once the analysis is over, all the observations and the graphs mentioned in
section 1.7 (Client Side Statistics) will be mailed to all the teams involved
in <Application>.
