3. Quote
• "The designer is concerned with what happens when 1 user presses a button and the architect is concerned with what happens when 10,000 users press a button."
• Sun Certified Enterprise Architect for J2EE Technology Study Guide, p. 6. Mark Cade, Simon Roberts.
• 2007 JavaOne Conference, Session TS-9235
5. Performance Testing
• Performance testing determines or validates the speed of the application (X per T).
• Used for finding bottlenecks and establishing a baseline for the system.
• In other words, its sole purpose is to determine the responsiveness and effectiveness of a system.
6. Load Testing
• Load testing identifies the maximum operating capacity of an application, as well as any bottlenecks that might interfere with that capacity.
(or: when does it blow up?)
7. Stress Testing
• Stress testing is focused on determining an application's robustness, availability, and reliability under extreme conditions:
  – Heavy loads
  – High concurrency
  – Limited computational resources
• An attempt to break the system by overwhelming its resources
8. Scalability
• Scalability testing determines or validates whether adding another unit of resource Y (database, memory, disk, CPU, etc.) increases the speed of X proportionally
Endurance Testing
This type of testing checks that the system can withstand the expected load for a long period of time, or for a large number of transactions.
9. Preparation for a Test
• Mission
• Network
• Hardware
• Software
• Metrics
10. Mission
• What is the testing intended to achieve?
• What are the basic assumptions, such as:
  – What is our anticipated average number of users (normal load)?
  – What is our anticipated peak number of users?
• When is a good time to load-test our application (e.g. off-hours or weekends), bearing in mind that the test may very well crash one or more of our servers?
11. Environment Prep - Network
• Performance testing is usually a network-intensive operation and can affect others in the organization
• Testing should be done on a separate / segregated network
• Amazon AWS: virtually unlimited in/out bandwidth
12. Environment Prep - Hardware
• Is your machine ready to receive the full load?
• Are multiple machines available (for distributed testing)?
• Do you have enough resources?
• Again, Amazon to the rescue: AWS gives you as many test machines as you need.
13. Metrics
• Performance testing is all about numbers and metrics
• Determine which metrics you are concerned about and how to get them.
• Some simple tests / benchmarks can be done using the Apache Bench "ab" command.
• Suppose we want to see how fast our site can handle 100 requests, with a maximum of 10 requests running concurrently:
• ab -n 100 -c 10 http://example.dev/
14. JMeter - Software of Choice
• Open source desktop / server application
• Designed for functional/load/performance/stress testing
• Extensible: write your own tests
• Simulates heavy load (application, server and network)
• Gives instant visual feedback
• Distributed testing
• Various protocols: HTTP, FTP, JDBC, JMS, LDAP, SOAP
• Multi-platform
• Full multithreading framework
• Caching and offline analysis/replaying of test results
• JMeter is not a web browser!
15. JMeter vs. Real World
Real World | JMeter World
One user browser request | HTTP Request Sampler
One HTML page displayed, with JavaScript execution | View Results Tree Listener with basic HTML display; no JavaScript execution
Multiple users requesting pages simultaneously | Thread Group configured for a number of users; it reuses the same HTTP Request Sampler to simulate multiple users
No equivalent / difficult to do | Measuring performance (min, max, and average processing time) using the Summary Report Listener
16. JMeter - Terminology
JMeter Term | Meaning
Test Plan | Holds your whole test; only one per JMeter window; save it for future use
Thread Group | Represents one set of actions (one scenario); you add the actions a single user will perform, and JMeter uses them to simulate multiple users
HTTP Request Sampler | Records the request to the web server; also receives the response from the web server; provides all data received for analysis
View Results Tree Listener | Shows the test data in detail for each item; for an HTTP Request it shows the Request, Response, and Status of the transaction
Summary Report Listener | Shows aggregated values for all users; useful when multiple users are simulated; provides performance information
17. JMeter Testing Tools
• Test Plan
• Thread Group
• Controllers:
  – Samplers
  – Logical Controllers
• Listeners
• Timers
• Assertions
• Configuration Elements
• Pre-Processor Elements
• Post-Processor Elements
18. Pre- and Post-Processor Elements
• Pre-Processor Elements
  – A Pre-Processor executes some action prior to a Sampler Request being made.
  – If a Pre-Processor is attached to a Sampler element, it will execute just prior to that sampler element running.
  – A Pre-Processor is most often used to modify the settings of a Sample Request just before it runs, or to update variables that aren't extracted from response text.
• Post-Processor Elements
  – A Post-Processor executes some action after a Sampler Request has been made.
  – If a Post-Processor is attached to a Sampler element, it will execute just after that sampler element runs.
  – A Post-Processor is most often used to process the response data, often to extract values from it. See the scoping rules for more details on when Post-Processors are executed.
19. Execution order
1. Configuration elements
2. Pre-Processors
3. Timers
4. Sampler
5. Post-Processors (unless SampleResult is null)
6. Assertions (unless SampleResult is null)
7. Listeners (unless SampleResult is null)
• Timers, Assertions, and Pre- and Post-Processors are only processed if there is a sampler to which they apply.
• Logic Controllers and Samplers are processed in the order in which they appear in the tree.
• Other test elements are processed according to the scope in which they are found, and the type of test element.
21. JMeter - Basic Elements
• "Number of threads": in other words, this is the number of users executing a "real life" use case on your system.
• This number is not the number of concurrent / parallel users executing that use case:
  – the concurrency of the users depends on both the duration of your scenario and the ramp-up time configured on the thread group.
• The "ramp-up time" in a thread group is the actual time taken by JMeter to spawn all the threads.
• A rough estimate of the throughput (number of requests per second) during the ramp-up period of your test plan is: number of threads / ramp-up time (in seconds).
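The rough ramp-up throughput estimate above can be sketched in a couple of lines of Python (the function name and the example numbers are mine, purely for illustration):

```python
def rampup_throughput(num_threads, rampup_seconds):
    """Rough estimate of requests/second during the ramp-up period:
    number of threads / ramp-up time (in seconds)."""
    return num_threads / rampup_seconds

# e.g. 100 threads started over a 20-second ramp-up:
print(rampup_throughput(100, 20))  # 5.0 requests/second
```

Remember this only estimates the load generated while threads are still starting; once all threads are up, the sustained throughput depends on the scenario duration and loop count.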
22. Tips & Tricks
• You should try to have a constant throughput during a run: it is often very difficult to "control" the throughput, particularly during the ramp-up period
• If your objective is to simulate a "peak": you should have a "high" number of threads and a "low" ramp-up time and number of loops
• If your objective is to simulate a "long run": you should have a "medium" number of threads, a "higher" ramp-up time and a "high" number of loops
• Note: the terms "high", "higher", "medium" and "low" are deliberately qualitative in the bullets above, as they depend on the system you are testing.
24. JMeter - Adding HTTP Requests
• The "HTTP Request Defaults" element does not tell JMeter to send an HTTP request. It simply defines the default values that the HTTP Request elements use.
• In our test plan we need to add at least one "HTTP Request" Sampler.
• JMeter sends requests in the order that they appear in the tree.
• Add an HTTP Request to the JMeter Users element (Add -> Sampler -> HTTP Request).
26. JMeter - Listener
• The final element you need to add is a Listener.
• This element is responsible for storing all of the results of your HTTP requests in a file and presenting a visual model of the data.
• Select the JMeter Users element and add a Graph Results Listener and a Summary Report Listener (Add -> Listener -> Graph Results).
29. The Cloud and Load Testing
• One of the things the Cloud is useful for is load testing; very large amounts of hardware can be used to generate load at minimal cost.
• An added benefit: if the application you are testing is external to your corporate network, your tests will be run from a realistic location, which prevents artificial bottlenecks occurring on your LAN.
• This type of testing, using externally located hosts, is increasingly common, and JMeter is a superb open-source solution for facilitating it.
30. Easy Amazon AWS & JMeter
• Thanks to Oliver from http503.com and his automated JMeter on EC2 script:
http://www.http503.com/2012/run-jmeter-on-amazon-ec2-cloud
• Run with up to 20 concurrent EC2 instances (the default maximum)
• Run when you want it and how you want it!
31. JMeter - http503.com script
• It does things like:
  – Using Amazon's API to launch instances
  – Installing Java & JMeter
  – Copying test files to each server
  – Adjusting thread counts to ensure the load is evenly distributed over each host
  – Editing the jmx for file paths to external data files
  – Displaying real-time aggregated results from the test as it is running
  – Downloading and collating all jtl files
  – Terminating instances
32. JMeter-EC2
• Prerequisites:
  – The test plan to be run has a Generate Summary Results listener.
  – Your JMeter test plan has a duration or loop count value set.
• Prerequisites specific to using Amazon:
  – You have an Amazon AWS account.
  – You have Amazon's API tools installed on your machine.
• Note: you can control execution from your local machine, but you will need to open a few ports for data return. This is difficult if you are behind a corporate firewall/proxy.
  – Solution: run the controller on Amazon AWS (another EC2 instance)
33. Installation Process
• Go to:
http://www.http503.com/2012/run-jmeter-on-amazon-ec2-cloud/#example
• Small tips:
  – Create a security group with port 22 open to the world (or to your IP address).
  – Also allow all machines inside the security group to access each other.
34. JMeter-EC2
• Set up your machine image, your key, and your secret key pair.
  – Then call it:
    project="myblogtestplan" count="3" owner="Vlad" ./jmeter-ec2.sh
36. Results Analysis
• Running a test plan is only 50% of the performance testing task.
• The most challenging part of the performance testing process is the analysis of test results and the identification of bottlenecks.
• Think of the load testing reports as the evidence that proves your innocence in court.
37. Just a moment please...
• Before going any further, we should spend some time on the measurable outcomes of a stress test. There are two main measures that you can record when you run a stress test on a web application:
• Throughput: the number of requests per unit of time (seconds, minutes, hours) that are sent to your server during the test.
• Response time: the elapsed time from the moment a given request is sent to the server until the moment the last bit of information has returned to the client.
38. ... moment please ...
• The throughput is the real load processed by your server during a run, but it does not tell you anything about the performance of your server during that same run.
• This is why you need both measures in order to get a real idea of your server's performance during a run. The response time tells you how fast your server is handling a given load.
39. Results Analysis - Interpreting Results
• What do we want to find inside our reports?
• Kinds of reports:
  – Summary Report*
  – Graph Results
  – View Results in Tree
  – View Results in Table
• Extra report types:
  – Response Times vs Threads**
  – Transaction Throughput vs Threads**
• * - must have the report listener!
• ** - Thread == user (in the JMeter world)
40. Response Times vs Threads
• This graph shows how the response time changes with the number of parallel threads.
• Naturally, the server takes longer to respond when many users request it simultaneously. This graph visualizes that dependency.
41. Transaction Throughput vs Threads
• This listener is very similar to Response Times vs Threads, except it shows the server's total transaction throughput for the active test threads.
• The formula for total server transaction throughput is: <active threads> * 1 second / <1 thread response time>
• So basically, it shows the statistical maximum possible number of transactions based on the number of users accessing the application.
44. Results Graph - Average Load Time
• Page load speeds in milliseconds.
• Lower is better
• On a stable system, it should go flat
45. Results Graph - Deviation
• The deviation (variability) of the load speed in milliseconds.
• Lower is better
• On a stable system, it should go flat
46. Results Graph - Throughput
• Throughput in pages per second.
• Higher is better
• On a stable system, it should go flat
48. Good sign
• When the values on the graph begin to flatten out, it shows that the system has become stable at that load.
  – Speed flattening
  – Throughput flattening
  – Deviation dropping
  – No exceptions ;-)
49. Connect Results With Logs
• Learn to relate problems on the server to their effect on the graph.
• Big spikes indicate that you have a problem
• You may see the effects of:
  – Exceptions
  – Garbage collection
51. Tips & Tricks
• Use a clear name for each performance test
• Non-GUI mode is more stable than GUI mode
• Do not use listeners if they are not needed
• Ramp-up is needed for heavy load
• Assertions are needed to simulate a virtual user
• Unstable tests: think about test data while a user runs their scenario
• If one user cannot log in, their later steps should not be counted
• Back up after every important change to your script by cloning the jmx file
• Speed up JMeter script editing with text editors that support regex
52. Simulate User Behavior in JMeter
• Only Once Controllers
• Cache Management
• Cookie Management
• Header Management
• Think Times
53. Gaussian Random Timer
• This timer pauses each thread request for a random amount of time, with most of the time intervals occurring near a particular value.
• The total delay is the sum of the Gaussian distributed value (with mean 0.0 and standard deviation 1.0) times the deviation value you specify, and the offset value.
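A minimal sketch of the delay calculation described above. The deviation (100 ms) and offset (300 ms) values are example numbers, not defaults:

```python
import random

deviation_ms = 100.0  # example "Deviation" field value
offset_ms = 300.0     # example "Constant Delay Offset" field value

def gaussian_timer_delay():
    # delay = gauss(mean 0.0, stddev 1.0) * deviation + offset
    return random.gauss(0.0, 1.0) * deviation_ms + offset_ms

# Sampling many delays shows them clustering around the offset value:
delays = [gaussian_timer_delay() for _ in range(10_000)]
print(sum(delays) / len(delays))  # close to 300 ms
```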
54. References
• Apache JMeter
http://jmeter.apache.org/
• jmeter-ec2 | Run JMeter on Amazon's ec2 Cloud
http://www.http503.com/2012/run-jmeter-on-amazon-ec2-cloud
• Some thoughts on stress testing web applications with JMeter
http://nico.vahlas.eu/2010/03/17/some-thoughts-on-stress-testing-web-applications-with-jmeter-part-1/
• JMeter tips
http://www.javaworld.com/javaworld/jw-07-2005/jw-0711-jmeter.html
• Response Times: The 3 Important Limits
http://www.nngroup.com/articles/response-times-3-important-limits/
• Apache JMeter Custom Plugins
http://jmeter-plugins.org
• JMeter Wiki
http://wiki.apache.org/jmeter/JMeterLinks
• Amazon AWS EC2 Command Line Toolkit
http://docs.aws.amazon.com/AWSEC2/latest/UserGuide/SettingUp_CommandLine.html
• Bayo Ernie - Performance Testing With JMeter 2.9 [Kindle Edition]
Reality check: the architect designs for 10,000 users, the developer programs for 1 user, and Murphy crashes it at 100 users.
Bottleneck: A bottleneck is a phenomenon where the performance or capacity of an entire system is limited by a single component or a small number of components or resources. Your web application can consist of several modules used to process a request; if one of them has a technical limitation, it limits the performance of the whole system. Bottlenecks in the application can be identified by performing a load test with a defined concurrent user load for various scenarios. In other words, a performance test is here to determine how fast the system is.
Stress Testing: Every system has a capacity limit. When the load goes beyond that limit, the web site starts responding very slowly and even produces errors. The purpose of stress testing is to find the capacity limit of the system. With it we can verify at which point the system degrades or fails. Usually done by increasing the user load on the system.
Endurance Testing: The test is performed with a defined set of concurrent users for a prolonged period of time, for example 5 to 10 hours or 2 to 3 days.
A sequential approach: functional test (low volume), then benchmark (the average number of users), then load test (the maximum number of users), then test destructively (what is our hard limit?).
JMeter does not execute JavaScript found in HTML pages, nor does it render HTML pages the way a browser does. It does give you the ability to view request responses as HTML through one of its many listeners.
Test Plan: A test plan describes a series of steps JMeter will execute when run. A complete test plan will consist of one or more Thread Groups, logic controllers, sample-generating controllers, listeners, timers, assertions, and configuration elements.
Thread Group: Thread group elements are the beginning points of any test plan. All controllers and samplers must be under a thread group. Other elements, e.g. Listeners, may be placed directly under the test plan, in which case they will apply to all the thread groups. Each thread will execute the test plan in its entirety and completely independently of other test threads. Multiple threads are used to simulate concurrent connections to your server application.
Samplers: Samplers tell JMeter to send requests to a server and wait for a response. They are processed in the order they appear in the tree. Controllers can be used to modify the number of repetitions of a sampler.
Listeners: Listeners provide access to the information JMeter gathers about the test cases while JMeter runs. The Graph Results listener plots the response times on a graph. The "View Results Tree" Listener shows details of sampler requests and responses, and can display basic HTML and XML representations of the response.
Controllers: JMeter has two types of Controllers: Samplers and Logical Controllers. These drive the processing of a test. Samplers tell JMeter to send requests to a server. For example, add an HTTP Request Sampler if you want JMeter to send an HTTP request. You can also customize a request by adding one or more Configuration Elements to a Sampler. Logical Controllers let you customize the logic that JMeter uses to decide when to send requests.
Timers: By default, a JMeter thread sends requests without pausing between each request. We recommend that you specify a delay by adding one of the available timers to your Thread Group. If you do not add a delay, JMeter could overwhelm your server by making too many requests in a very short amount of time. The timer will cause JMeter to delay a certain amount of time before each sampler in its scope.
Assertions: Assertions allow you to assert facts about responses received from the server being tested. Using an assertion, you can essentially "test" that your application is returning the results you expect it to.
Configuration Elements: A configuration element works closely with a Sampler. Although it does not send requests (except for the HTTP Proxy Server), it can add to or modify requests.
Pre-Processor Elements: A Pre-Processor executes some action prior to a Sampler Request being made. If a Pre-Processor is attached to a Sampler element, it will execute just prior to that sampler element running. A Pre-Processor is most often used to modify the settings of a Sample Request just before it runs, or to update variables that aren't extracted from response text.
Post-Processor Elements: A Post-Processor executes some action after a Sampler Request has been made. If a Post-Processor is attached to a Sampler element, it will execute just after that sampler element runs. A Post-Processor is most often used to process the response data, often to extract values from it. See the scoping rules for more details on when Post-Processors are executed.
The Thread Group tells JMeter the number of users we want to simulate, how often the users should send requests, and how many requests they should send. Threads == number of users. Ramp-Up Period: this property tells JMeter how long to delay between starting each user. For example, if you enter a Ramp-Up Period of 5 seconds, JMeter will finish starting all of your users by the end of the 5 seconds. If we have 25 users and a 5 second Ramp-Up Period, JMeter starts 5 users per second (25 users / 5 seconds), i.e. a new user every 0.2 seconds. The "number of loops" in a thread group is the actual number of times that the scenario will be executed by each thread.
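Assuming JMeter spreads thread starts evenly over the ramp-up period (the common interpretation of the Ramp-Up Period property), the delay between consecutive thread starts is easy to sketch:

```python
def start_stagger(num_threads, rampup_seconds):
    """Delay between starting consecutive threads, assuming thread
    starts are spread evenly over the ramp-up period."""
    return rampup_seconds / num_threads

# 25 users over a 5 second ramp-up: 5 users started per second,
# i.e. a new thread every 0.2 seconds
print(start_stagger(25, 5))  # 0.2
```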
Define the tasks that the users will be performing: HTTP requests. You will add HTTP Request elements which use some of the default settings you specify in an HTTP Request Defaults element. Begin by selecting the JMeter Users (Thread Group) element. Right-click to get the Add menu, then select Add -> Config Element -> HTTP Request Defaults. Nearly all web testing should use cookie support, unless your application specifically doesn't use cookies. To add cookie support, simply add an HTTP Cookie Manager to each Thread Group in your test plan. This will ensure that each thread gets its own cookies, shared across all HTTP Request objects. To add the HTTP Cookie Manager, simply select the Thread Group and choose Add -> Config Element -> HTTP Cookie Manager.
A single JMeter client machine may not be able to simulate enough users to stress the server. You can control multiple machines running JMeter without copying test samples to each machine. Configuration: copy the same version of JMeter to each computer. Add the remote node IPs to the jmeter.properties file. Run JMeter on each remote machine using JMETER_HOME/bin/jmeter-server (in a command prompt). Start the JMeter GUI on the host machine, select any test plan, and go to Run >> Remote Start >> Remote IP Address.
[* Without this, no results will be displayed to the screen, but the test will still run. No other listeners need to, nor should, be present.] [** Without this, the test will run forever, or until you press CTRL-C. All test plans should also employ some form of pacing as best practice: load tests should not be run without some way to control the throughput. One way this can be achieved in JMeter is with the Constant Throughput Timer.]
The View Results Tree is very handy when "debugging" a scenario, as it lets you monitor all the HTTP requests and responses exchanged with the server. The drawback is that it consumes too much memory to be used in a large stress test. The View Results in Table listener is also useful in the early stages of stress test implementation, as it gives a good and fast overview of the execution of a test plan. However, this listener also consumes too much memory to be used in a large stress test.
Example calculations: when you have one thread (user) sending requests to the server and the server responds after 100 ms, you have 1 thread * 1000 ms / 100 ms = 10 transactions per second. When you have 10 threads and the server responds after 100 ms, you have 10 threads * 1000 ms / 100 ms = 100 transactions per second. This means that your server hasn't reached its resource limits, and the more users work with it, the more transactions it processes. When you have 20 threads and the server responds after 200 ms, you have 20 threads * 1000 ms / 200 ms = 100 transactions per second. Throughput has stopped scaling with the number of users: at 20 parallel users the server responds more slowly, and there is some overhead to handle the parallelism, e.g. database locks.
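The formula from the Transaction Throughput vs Threads slide can be sketched directly (function name is mine, for illustration):

```python
def max_throughput(active_threads, response_time_ms):
    """Statistical maximum transactions/second:
    active_threads * 1000 ms / per-thread response time (ms)."""
    return active_threads * 1000.0 / response_time_ms

print(max_throughput(1, 100))   # 10.0 tps
print(max_throughput(10, 100))  # 100.0 tps
print(max_throughput(20, 200))  # 100.0 tps: throughput no longer scales
```

Note that doubling the thread count from 10 to 20 while the response time doubles leaves throughput flat, which is the plateau the listener is designed to reveal.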
The [90% line] of response time is more reliable than the [average response time].
Response time: variation in the range [-5%, 5%] should be treated as normal; variation in [-10%, -5%) or (5%, 10%] should be considered a decline/improvement; variation outside [-10%, 10%] should be treated as a clear improvement/decline.
Throughput: variation in the range [-3%, 3%] should be treated as normal; variation in [-5%, -3%) or (3%, 5%] should be considered a decline/improvement; variation outside [-5%, 5%] should be treated as a clear improvement/decline.
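These thresholds can be expressed as a small helper. The function name and the labels are my own; only the percentage bands come from the note:

```python
def classify_variation(delta_pct, normal_band, significant_band):
    """Classify a signed % change of a metric between two runs.

    normal_band / significant_band are half-widths, e.g. 5 and 10
    for response time, or 3 and 5 for throughput.
    """
    magnitude = abs(delta_pct)
    if magnitude <= normal_band:
        return "noise"           # normal run-to-run variation
    if magnitude <= significant_band:
        return "likely change"   # possible decline/improvement
    return "clear change"        # definite decline/improvement

print(classify_variation(4, 5, 10))    # noise
print(classify_variation(-7, 5, 10))   # likely change
print(classify_variation(12, 5, 10))   # clear change
```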
The average value can be very misleading. We can have a sample where values go from 90 ms up to 900 ms and average ~500 ms. Another set can have all values between 490 ms and 510 ms. In both cases the average value is the same!
Deviation: Standard deviation measures the mean distance of the values from their average. In other words, it gives you a good idea of the dispersion or variability of the measures around their mean value.
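The two hypothetical samples above can be checked with the standard library; both have the same mean, but very different standard deviations (the exact numbers here are illustrative, chosen to match the note):

```python
import statistics

# Two hypothetical response-time samples, both averaging 500 ms:
spread = [90, 250, 500, 760, 900]   # values from ~90 ms up to ~900 ms
tight  = [490, 495, 500, 505, 510]  # all values within 490-510 ms

for name, sample in (("spread", spread), ("tight", tight)):
    mean = statistics.mean(sample)
    dev = statistics.pstdev(sample)  # population standard deviation
    print(name, mean, round(dev, 1))
# Both means are 500 ms, but the deviation of "spread" is about
# 40x larger than that of "tight": the average alone hides this.
```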
Uniform Random Timer: by contrast with the Gaussian timer, this timer pauses each thread request for a random amount of time, with each time interval having the same probability of occurring. The total delay is the sum of the random value and the offset value.