Performance Testing Training - IVS - 2
Desired Outcome
[Diagram: a user requests "Give me red" but receives orange, an undesirable outcome from a functionally tested application. Annotations question possible causes: CPU? Network? Disk?]
How many of us like to click a button and see "Error: Page not found"?
None of us wants to face these situations, and neither do the owners of these sites want their users to face them.
This is where Performance Testing comes into action.
Performance Tests determine the runtime “behavior” of the application and its supporting
infrastructure, under operational conditions
Performance Benchmark
Defects could originate from any part of the application: the application program, database system, application server, web server, network, or operating system.
To simulate real-world issues, it is recommended to conduct performance tests in close-to-real test environments.
If not identical to production, test environments should be proportionately scaled down from the production environments.
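As a rough illustration of proportional scale-down (all figures below are hypothetical, not from the training material), a linear scale-down maps the load targets by the same ratio as the hardware:

```python
# Hypothetical example: if the test environment has half the production
# hardware capacity, load and throughput targets scale down linearly.
PROD_CPU_CORES = 16      # production server cores (assumed figure)
TEST_CPU_CORES = 8       # test server cores (assumed figure)
PROD_PEAK_USERS = 2000   # expected peak concurrent users in production
PROD_TARGET_TPS = 400    # production throughput goal

scale = TEST_CPU_CORES / PROD_CPU_CORES       # 0.5 for a half-size environment
test_peak_users = int(PROD_PEAK_USERS * scale)
test_target_tps = PROD_TARGET_TPS * scale

print(f"Scale factor: {scale}")               # 0.5
print(f"Test peak users: {test_peak_users}")  # 1000
print(f"Test target TPS: {test_target_tps}")  # 200.0
```

Because the scaling is linear, results measured in the test environment can be extrapolated back to production by dividing by the same scale factor.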
Performance Testing vs. Functional Testing
- Finds problematic areas in the SUT / Fixes problematic areas in the SUT
- Assesses only performance-critical scenarios / Assesses all possible functional scenarios
- Analytical and mathematical in nature, based on various theories / Either pass or fail is the test case's execution result
- Classified as partly art and science / Quite evolved, with a best-practices baseline for various methodologies
Stress Test -> Start with a low user load and increment it by a fixed number of users at regular intervals; goal: find the break point.
Endurance Test -> Exert a constant user load for a prolonged duration; goal: detect memory leaks.
Volume Test -> Exert a constant user load for multiple iterations, with a different database volume each time; goal: observe behaviour at various DB volumes.
Scalability Test -> Start with a low user load and increment it by a fixed number of users at regular intervals; goal: determine the maximum TPS.
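The ramp shape shared by the stress and scalability tests (start low, add a fixed user increment per step) can be sketched in Python; the transaction body and load numbers below are placeholders, not part of the training material:

```python
import threading
import time

def transaction():
    """Placeholder for one user action; a real test would hit the SUT here."""
    time.sleep(0.01)  # simulated work

def run_step(users):
    """Run one load step with the given number of concurrent virtual users."""
    threads = [threading.Thread(target=transaction) for _ in range(users)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()

# Ramp shape: start low and add a fixed increment of users per step.
# A real stress test would keep incrementing until the break point;
# here the ramp is simply capped at max_load.
start_load, increment, max_load = 10, 10, 50
load_profile = list(range(start_load, max_load + 1, increment))
for users in load_profile:
    run_step(users)
print(load_profile)  # [10, 20, 30, 40, 50]
```

A real break-point run would watch response times and error rates at each step and stop when they cross a threshold, rather than stopping at a fixed cap.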
[Process diagram: Requirement -> Design -> Iterative Analysis & Tuning, an iterative joint-activity process with a checkpoint at each phase, supported by Change Management and Milestone Recommendations.]
SLA -> Verify whether the SLAs are reasonable and have a concrete basis. Determine in-scope and out-of-scope activities.
Priority -> Given a set of applications, prioritize them to pick the critical ones.
Critical Scenarios -> Identify the critical scenarios to be performance tested: business-critical and resource-critical.
Type of Tests -> Identify the types of performance tests driven by the requirement, viz. speed, scalability, and stability.
Tools -> Choose the testing tool, monitors, and profilers as driven by the requirement. Application protocol and budget are key factors.
Stubs/Drivers -> Determine the need for stubs/drivers if the tools/monitors don't satisfy the need.
Performance Parameters -> Identify the monitoring parameters needed to critically analyze the SUT.
POC -> May be required to ascertain whether the desired output can be obtained, if the testing involves a new tool or technology.
Offshore-ability -> Not all applications can be performance tested offshore. Derive offshore-ability criteria and identify the applications that can be offshored.
Test Environment -> Validate that the installed software versions and patches reflect the production environment. The production hardware should also be linearly scaled down to the test-environment hardware to facilitate scalability assessment.
Design & Develop -> Develop the performance test scripts and stubs/drivers as decided. Validate the scripts.
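An SLA verification of the kind planned above often reduces to a percentile gate on measured response times; the threshold and samples below are invented for illustration:

```python
import math

# Assumed SLA: 95th percentile of transaction response times under 2000 ms.
sla_p95_ms = 2000
samples_ms = [850, 920, 1100, 1350, 1500, 1750, 1900, 2100, 980, 1200]

# Nearest-rank p95: the smallest sample with at least 95% of values at or below it.
samples_sorted = sorted(samples_ms)
idx = math.ceil(0.95 * len(samples_sorted)) - 1
p95 = samples_sorted[idx]
passed = p95 <= sla_p95_ms
print(p95, passed)  # 2100 False
```

Checking the SLA against a percentile rather than an average keeps one slow outlier from being hidden by many fast transactions, which is why SLAs are usually stated as percentiles.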
Monitor -> Monitor the counters during test execution. The requirements dictate which measurements are collected and later analyzed.
Measurements collected during this activity may include, but are not limited to:
Transaction response time
TPS (transactions per second)
Memory usage
CPU usage
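The TPS and response-time counters listed above are derived from the testing tool's raw transaction log roughly as follows (the sample data is made up):

```python
# Sketch: derive TPS and average response time from raw transaction records.
transactions = [
    # (start_time_s, response_time_ms) -- illustrative values only
    (0.0, 120), (0.4, 150), (0.9, 110),
    (1.2, 300), (1.6, 180), (2.1, 130),
]

test_duration_s = 3.0
tps = len(transactions) / test_duration_s
avg_response_ms = sum(rt for _, rt in transactions) / len(transactions)

print(f"TPS: {tps:.2f}")                          # 2.00
print(f"Avg response: {avg_response_ms:.0f} ms")  # 165 ms
```

Memory and CPU usage, by contrast, come from server-side monitors rather than the testing tool, which is why the correlation step below matters.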
Correlate -> Correlate the captured output from all the monitors (testing tool, server/DB/network monitors) to identify the problematic component.
Iterative Test Execution -> Isolate the problematic components (modules/scenarios/transactions) and re-run the tests to confirm the findings.
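One minimal way to sketch the correlation step: line up response-time spikes from the testing tool with per-component monitor samples taken at the same timestamps (all numbers below are illustrative):

```python
# Testing-tool output and server monitors, sampled at the same timestamps.
timestamps = [0, 1, 2, 3, 4]
response_ms = [120, 130, 900, 880, 125]  # transaction response times
cpu_app = [30, 35, 40, 38, 32]           # app-server CPU %
cpu_db = [25, 30, 95, 92, 28]            # DB-server CPU %

# Find when response times spiked, then flag components whose CPU was
# saturated at every one of those moments.
spikes = [t for t, rt in zip(timestamps, response_ms) if rt > 500]
suspects = []
for name, series in [("app_server", cpu_app), ("db_server", cpu_db)]:
    if spikes and all(series[t] > 80 for t in spikes):
        suspects.append(name)
print(suspects)  # ['db_server']
```

In practice this is done visually by overlaying monitor graphs, but the idea is the same: the component whose resource usage tracks the response-time degradation is the prime suspect for the iterative re-runs.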
Scope -> Tuning itself is not in scope; providing tuning recommendations is an in-scope function of Performance Testing.
Experience -> A common observation is that about 60% of performance issues are related to the database, including DB configuration, indexing, and SQL query structuring.
Scale -> In the case of a scaled-down test environment, it is recommended to scale down linearly from the production environment to facilitate scalability assessment.
Network -> An isolated test environment ensures the performance results are free from all kinds of bias.
Tool comparison
- Ability to change recording to a different protocol in the middle of a recording session: Tool A: Yes, for some protocols; Tool B: No, supports only one protocol (HTTP); Tool C: Yes
- Actions in a script can be iterated any specified number of times without programming: Tool A: Simple runtime-setting change; Tool B: No, programming effort is required; Tool C: Simple runtime-setting change
Thank you