Performance Testing
in a Nutshell
with an Apache JMeter Example

Tomas Veprek
Software Performance Tester
Tieto, Testing Services
tomas.veprek@tieto.com
Agenda

1. What is it?
2. Why do we bother?
3. How do we do it?
4. When do we do it?
5. Who benefits from it?
6. Where do we do it?

About Me …

• Graduated from Technical University of Ostrava in 2003

• Since 2005 employed at Tieto Czech:
• Functional software tester
• 3rd tier support of EMC Documentum applications
• Performance tester (2007 - present)

• Proponent of Context-Driven School of Testing

1. What is performance testing?

Software testing is …

The process of evaluating a product by
learning about it through exploration and
experimentation, which includes to some
degree: modelling, study, observation,
inference, etc.

-- James Bach

Performance testing is …

• Type of software testing that evaluates a software product
mainly for the following qualities:
• Performance
• Stability
• Scalability

Software Product / System

• Diagram capturing the entire system stack:
• Application (application code; web, application, and database servers)
• Operating system stack:
  • System libraries
  • Kernel (file system, memory management, task scheduling, network stack)
  • Drivers
• Hardware

Performance

• How fast are operations performed by the system under
a certain load?

• Operations:
• Submitting data after clicking a button on an HTML page
• REST API call
• Disk read / write operation

• Different people see performance differently

• Different tasks imply different performance expectations

Key Performance Metrics

• Response time
• Latency
• Throughput
• IOPS
• Utilisation
• Saturation

Response Time and Latency

• Response time = time for an operation to complete,
including the time spent waiting and being serviced

• Latency = time an operation spent waiting in a queue

• Response time = Latency + Service time

• Makes it easy to quantify degradation and improvement

• Interpretation depends on the purpose of the operation
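
A minimal sketch of the decomposition above, in Python (all numbers are invented for illustration):

    # Response time = Latency + Service time
    def response_time(latency_s: float, service_time_s: float) -> float:
        """Time spent waiting in a queue plus time being serviced."""
        return latency_s + service_time_s

    samples = [(0.120, 0.035), (0.480, 0.040), (0.010, 0.032)]  # (latency, service)
    for latency, service in samples:
        print(f"latency={latency:.3f}s  service={service:.3f}s  "
              f"response={response_time(latency, service):.3f}s")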

Latency - HTTP GET Request

• Assume that a GET request consists of:
• DNS resolution
• Establishing TCP connection
• Data transfer from the server to the client

• What do we mean by latency and response time?

Latency – HTTP GET Request (2)

Timeline: DNS resolution → establishing TCP connection → data transfer

• Latency covers the waiting phases: DNS resolution and connection establishment
• Response time covers the whole operation, including the data transfer

Latency – HTML Rendering

• User submits the data entered into a form on an HTML page
and waits until the page is rendered with the server
response.

• What do we mean by latency and response time?

Latency – HTML Rendering (2)

Timeline: DNS resolution → establishing TCP connection → data transfer → HTML page rendering

• Latency covers everything before the HTML page starts rendering
• Response time also includes the HTML page rendering

Throughput

• The rate at which work is completed

• The meaning depends on the target evaluated

• Examples:
• Bytes / bits per second
• SQL queries per second
• REST calls per second
• Transactions per second
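
A tiny worked sketch in Python (the numbers are hypothetical):

    # Throughput = completed work / elapsed time
    completed_queries = 36_000   # hypothetical: SQL queries finished during the test
    elapsed_s = 1_800            # 30-minute measurement window
    print(f"throughput = {completed_queries / elapsed_s:.1f} queries/s")  # 20.0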

IOPS

• Input / output operations completed per second
• Throughput-oriented metric

• The meaning depends on the target evaluated

• Examples:
• Network devices (TCP) – packets received / sent per second
• Block devices – reads / writes per second

Utilisation

• Time-based definition: How busy a resource was during
a period of time
• Queuing network theory: U = B / T * 100 [%], where B is the busy
time and T is the observation period
• CPU utilisation
• Disk utilisation

• Capacity-based definition: The extent to which the
capacity of a resource was used during a time period
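
A worked sketch of the time-based formula in Python (the values are made up):

    # U = B / T * 100 [%]
    busy_s = 42.0       # hypothetical: disk busy for 42 s ...
    interval_s = 60.0   # ... out of a 60 s observation window
    print(f"U = {busy_s / interval_s * 100:.0f} %")  # 70 %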

Saturation

• The extent to which a resource has queued work because
it cannot accept more work

• Capacity-based utilisation >= 100%

• Increases latency and response time

• Bottleneck = the resource that limits performance

Scalability

• How does the performance of the product change under
increasing load?

• Load
• Hourly number of users accessing the product
• API calls per second
• Amount of data uploaded to the product per second

• A system may perform well under light load and still not be scalable

Throughput vs. Load Graph

Response Time vs Load

Stability

• Does the system perform well over a long time?
• Does performance get back to normal after an
exceptional situation?

• Memory leaks
• Excessive logging on disk devices

Memory Leak

2. Why do performance testing?

It is all about risks and
consequences

Costs vs. benefits

Risks and Consequences

Risk:
System may crash after being released for
public use

Consequences:
Company will lose money
Company may lose its customers and its credibility

Risks and Consequences (2)

Risk:
System may be slower than our competitors’
systems

Consequences:
Users may complain
Company may lose money

Risks and Consequences (3)

Risk:
The new software component integrated
in our system may degrade its performance

Consequences:
User experience will be adversely affected
Additional costs for the company to fix the problem

Component as a Performance Issue

A web service sends its requests through a software load balancer,
which distributes them between database server 1 and database server 2.

Risks and Consequences (4)

Risk:
System may not be able to handle
occasional load peaks throughout the year

Consequences:
Customers will complain loudly
Company will lose money and credibility
Customers may leave for competitors’ systems

Risks and Consequences (5)

Risk:
System will not be able to run over a long time

Consequences:
System has to be restarted during the evening hours
Users may complain about system unavailability

3. How to do performance testing?

Common Performance Testing Activities

• Problem statement clarification

• Workload modelling

• Understanding performance requirements

• Performance test design

• Load generation tool selection

Common Performance Testing Activities (2)

• Performance test implementation

• Test data creation

• Monitoring setup

• Performance test execution

• Test results analysis

• Reporting to stakeholders

How does it all begin?

A conversation begins between the client and the performance tester.

Conversation with Client

• Client: “We are just about to release a new version of
a software application and we want to do performance
testing.”
• P.Tester: “When exactly are you releasing?”
• Client: “In one week.”
• P.Tester: “What do you want to learn from performance
testing?”
• Client: “How fast is the application compared to the
previous version.”

Conversation with Client (2)

• P.Tester: “Did you performance test the previous version?”
• Client: “No. We didn’t. I thought you would do it now.”
• P.Tester: “Do you realise there’s only one week left before
deploying the new version in production?”
• Client: “Yes, the schedule is very tight, but you can just do
simple performance tests.”
• P.Tester: “Uff!… What do you mean by simple tests?”
• ….

Activity #1: Clarify Problem Statement

• What is the system to be performance tested?

• What is the mission of performance testing (risks to be
reduced, information to be collected)?

• What context will performance testing be done in?
• Clients
• Time and budget
• System status
• Development team & lifecycle
• Experience and skills of performance testers

Problem Statement: Example

• Simple air ticket purchase system
• Web-based application
• Features:
• User registration
• Search for outbound and return flights
• Purchase an air ticket
• Browse the itinerary

• Mission: Test how the application performs under the
anticipated load in production.

System Architecture

All on one physical machine: web server (Xitami), CGI scripts
written in Perl, and the data store with flights and purchased tickets.

Air Ticket System: Login Page

Air Ticket System: Landing Page

Air Ticket System: Flights

Air Ticket System: Choose Flight

Air Ticket System: Make Payment

Air Ticket System: Invoice Summary

Air Ticket System: Itinerary

Activity #2: Workload Modelling

• Analyse how the system under test is or will be used
• Determine how much load will be applied

Input → system under test → resulting performance,
with disturbances acting on the system from outside
Workload Modelling: Input

• Web applications: User actions over web pages

• Web services: REST / SOAP requests

• Relational database: SQL queries

• Java component: Method calls

• SMTP server: Requests sending emails

Workload Modelling: Input (2)

• Which operations do we consider as input?

1. Classify operations using the following criteria:
• Frequency (popularity)
• Business criticality
• Data amount
• Execution time

Workload Modelling: Input (3)

2. Determine the distribution of operations (a sketch follows this list)
• Operations do not occur with the same frequency
• Distribution may significantly affect performance

3. Determine the range of anticipated load
• Focus on peak load (usually calculated per hour)

• Sources of usage information:
• Discussion with stakeholders
• Analysis of access log files (e.g. Google Analytics)
• Best guess based on similar software systems
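
A minimal sketch of turning an assumed operation mix into hourly counts (the percentages are invented; the 400 sessions/hour figure comes from the example later in the deck):

    # Hypothetical distribution of operations at a peak load of 400 sessions/hour
    peak_sessions_per_hour = 400
    mix = {"search flights": 0.45, "purchase ticket": 0.30,
           "view itinerary": 0.15, "register": 0.10}
    for operation, share in mix.items():
        print(f"{operation}: {share * peak_sessions_per_hour:.0f} sessions/hour")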

Workload Modelling: Disturbances

• Scheduled jobs (e.g. database backups, search engine
indexing)

• Multiple tenants on virtualised servers

• Distributed system architectures

Workload Modelling: Example

• What is the input and how do we model it?

• Identify who and what interacts with the web application
• View user actions as user scenarios
• Create a behavioural diagram
• Analyse the load applied over time

Workload Modelling: Example (2)

Workload Modelling: Example (3)

[Bar chart: daily load profile showing the number of sessions started in each hour of the day (hours 0 to 23)]

Activity #3: Understanding Performance Requirements

• Clients’ expectations of how the system should perform
• Reduce the subjectivity of performance
• Depends on the system under test

• Examples (a percentile-check sketch follows the list):
• E-shop must handle 1200 orders every hour and 98% of all orders
should be placed within 10 s.
• Web service must handle 10000 searches for available phone numbers
every hour, each in less than 5 s.
• The system must have enough capacity to accommodate the load
created by 500 more users without any effect on performance.
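
A sketch of checking the first example against measured data (the response times below are random stand-ins for real measurements):

    # Is the 98th percentile of order placement times within 10 s?
    import random

    order_times_s = sorted(random.uniform(2.0, 12.0) for _ in range(1200))
    p98 = order_times_s[int(0.98 * len(order_times_s)) - 1]
    print(f"98th percentile = {p98:.1f} s -> {'PASS' if p98 <= 10.0 else 'FAIL'}")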

Performance Requirements: Example

• Air ticket sales application must handle the peak load of 400
users arriving every hour with the following conditions met:
• Login must be completed in less than 3 s
• In 98% of cases, available flights should be found in less than 5 s
• Purchase of flight tickets should not take longer than 7 s

Activity #4: Performance Test Design

• What tests do we carry out in order to fulfil our mission?

• Load intensity and profile
• Test duration
• Ramp-up period

• Performance test types:
• Load test
• Scalability test
• Stability test
• High availability test

Performance Test Design: Example

• Load intensity: 400 sessions started every hour

• Think times: 10 s between page transitions (±25%)

• How many virtual users (threads) do we need? (see the sketch below)

• Ramp-up period: equal to the duration of the longest session

• Duration: 2 hours
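
One way to answer the virtual-user question is Little's Law: concurrency = arrival rate × session duration. A sketch, where the session length is an assumption:

    # Little's Law: concurrent users = arrival rate * session duration
    sessions_per_hour = 400                    # load intensity from the design
    arrival_rate = sessions_per_hour / 3600.0  # sessions per second

    pages_per_session = 7          # hypothetical longest path through the app
    think_time_s = 10.0            # think time between page transitions
    session_duration_s = pages_per_session * think_time_s

    print(f"~{arrival_rate * session_duration_s:.0f} concurrent virtual users")  # ~8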

Activity #5: Load Generation Tool Selection

• In the majority of cases, a software tool for load
generation is indispensable

• Whether to use a commercial or a free tool depends on the budget

• The software tool is only the tip of the iceberg => not the centre of
performance testing

• Choose whatever load generation tool does the job (a toy example follows)
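
To make the idea concrete, a toy load generator in Python (not a substitute for a real tool like JMeter; the URL, user count, and request count are invented):

    # Toy load generator: concurrent virtual users issue HTTP requests
    # and record response times. Purely illustrative.
    import time
    from concurrent.futures import ThreadPoolExecutor
    from urllib.request import urlopen

    URL = "http://localhost:8080/"   # hypothetical system under test

    def virtual_user(_):
        start = time.perf_counter()
        with urlopen(URL) as resp:
            resp.read()
        return time.perf_counter() - start

    with ThreadPoolExecutor(max_workers=8) as pool:      # 8 virtual users
        times = list(pool.map(virtual_user, range(80)))  # 80 requests total
    print(f"avg response time: {sum(times) / len(times):.3f} s")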

Load Generation Tool: JMeter

• Java-based, free and open source load generation tool
• Suitable for HTTP(S), FTP, SMTP, TCP, JDBC, MongoDB, shell
scripts, …
• Scripts stored as XML files
• Currently version 2.13 (version 3.0 coming soon)
• Out-of-the-box bundle lacks reporting functions
• JMeter Plugins enhance JMeter functionalities
• JMeter cloud solutions: Blazemeter, Octoperf, Flood IO
• Twitter: @ApacheJMeter, @jmeter_plugins

Load Generation Tool: HP LoadRunner

• Commercial
• Only the community edition (up to 50 virtual users) is free
• Great range of supported protocols
• Scripts in the C language
• Advanced reporting of performance test results
• Applications: VuGen, Controller, Analysis, Load Agent
• Licences determined based on the number of virtual users and
protocol bundles
• Licences are relatively costly

Activity #6: Performance Test Implementation

• Workload models transformed into test scripts (one-to-one
relationship between scripts and paths or one complex script
for web applications)

• Performance tests are composed of test scripts

• Supportive tools (e.g. HTTP proxies) used to implement test
scripts

Performance Test Implementation: Example

• Paths in the workload model (a path-selection sketch follows the list):

• Path #1: Login -> Flights -> Find a flight -> exit

• Path #2: Login -> Flights -> Find a flight -> Choose a flight -> Make a payment -> exit

• Path #3: Login -> View Itinerary -> exit

• Path #4: Login -> View Itinerary -> Cancel Flight -> exit

• Path #5: Signup -> Register -> exit
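
During a test run, each virtual-user session typically follows one of these paths, chosen according to the workload model. A sketch with invented weights:

    # Pick a path per session; the weights below are hypothetical.
    import random

    paths = ["find_flight", "purchase", "view_itinerary", "cancel_flight", "signup"]
    weights = [0.35, 0.30, 0.20, 0.05, 0.10]
    session_paths = random.choices(paths, weights=weights, k=10)
    print(session_paths)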

Activity #7: Test Data Creation

• Amount, diversity and accuracy of test data may significantly
affect performance test results

• Recyclable and non-recyclable data

• Static and dynamic data

Test Data: Example

• User names and passwords – CSV file

• Available flights – chosen randomly on the fly

• Seat preference and type – chosen randomly on the fly

• Number of passengers – kept constant during the test
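
A sketch generating the user-name/password file (the file name and format are assumptions; JMeter reads such files with its CSV Data Set Config element):

    # Generate a hypothetical users.csv with one credential pair per row.
    import csv

    with open("users.csv", "w", newline="") as f:
        writer = csv.writer(f)
        for i in range(1, 401):
            writer.writerow([f"user{i:03d}", f"secret{i:03d}"])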

Activity #8: Monitoring Setup

• Observing how the system performs

• System-level monitoring:
• CPU, memory, disk, network, IOPS
• Application-level monitoring:
• Heap memory utilisation
• Web server active connections
• Most time-consuming SQL queries

• Software tools: built-in OS tools, HP Sitescope, New Relic,
Nagios
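
A minimal system-level monitoring loop, sketched with the third-party psutil package (an assumption; the deck itself names other tools):

    # Sample CPU and memory utilisation once per second for five seconds.
    import psutil

    for _ in range(5):
        cpu = psutil.cpu_percent(interval=1)   # blocks ~1 s while sampling
        mem = psutil.virtual_memory().percent
        print(f"cpu={cpu:.0f}%  mem={mem:.0f}%")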

Activity #9: Performance Test Execution

• Ensure a sufficient amount of test data is created

• Activate the monitoring of the software system

• Launch the performance test manually or schedule its
execution

• If possible, actively monitor the software system under load
(e.g. log files, additional monitoring tools, profiling)

• The test may be stopped before its planned completion
as a result of a high number of errors

Activity #10: Performance Test Result Analysis

• Depends on the purpose of the performance test

• Was the mission of the performance test met? If not, why?

• Load tests: Was the course of transaction response times stable
during the test? If peaks occurred, what was the cause?

• In complex software systems, interpreting test results
and any suspicious behaviour is teamwork

Activity #11: Reporting to Stakeholders

• Summarising the observations made from the performance test
results in a way that is understandable and helpful for the
stakeholders we report to

• Stakeholders:
• Programmers
• Product owner
• Project manager
• Users

• Form: verbal or written (agreed beforehand with the
stakeholders)

Useful Links

• Apache JMeter plugins: http://jmeter-plugins.org/

• Scott Barber’s User Experience, not Metrics series:
http://www.perftestplus.com/pubs.htm

• Blazemeter JMeter Cloud: https://www.blazemeter.com/

• James Bach’s blog (Context-Driven School of Testing):
http://www.satisfice.com/blog/

Tomas Veprek
Software Performance Tester
Tieto, Testing Services
tomas.veprek@tieto.com
