Slide 5 (#Dynatrace)
When to find them?
[Diagram: software lifecycle phases (Requirements / Specification / Design → Development → (Load) Test / QA / Acceptance → Deployment / Production / Maintenance), with the Dev → Test → Deploy cycle repeated per iteration]
Slide 7
The Challenge
» Performance is not a band-aid!
» Architecture has an enormous influence!
» You have to continuously ensure your performance requirements are met!
"I couldn't help but notice your pain."
"My pain?"
"It runs deep. Share it with me!"
(Star Trek V)
Slide 9
Software testing tells us that our system
» meets the requirements that guided its design and development,
» responds correctly to all kinds of inputs,
» performs its functions within an acceptable time,
» is sufficiently usable,
» can be installed and run in its intended environments, and
» achieves the general result its stakeholders desire.
Source: Wikipedia
What are we learning from our tests?
Slide 10
Let’s look at the tests we run
[Matrix: test types (Unit Tests, Integration Tests, Acceptance Tests, Load Tests) mapped against what they verify: meets requirements, responds correctly to input, performs in acceptable time, usability, deployment, achieves correct result]
Notes from the matrix: these tests are high effort (they have to be created and maintained), and some are only possible at a rather late development phase.
Slide 12
The Goal
[Matrix: the same test types (Unit Tests, Integration Tests, Acceptance Tests, Load Tests) now covering all qualities: meets requirements, responds correctly to input, performs in acceptable time, usability, deployment, achieves correct result]
Slide 13
What you usually get
Measuring Performance of
Unit and Integration Tests
[junit] Running com.dynatrace.sample.tests.FastUnitTest
[junit] Tests run: 15, Failures: 0, Errors: 0, Time elapsed: 34 sec
[junit] Running com.dynatrace.sample.tests.SlowUnitTest
[junit] Tests run: 17, Failures: 0, Errors: 1, Time elapsed: 2,457 sec
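All this tells us is pass/fail plus one elapsed time per test class. A first step toward treating time as a real metric is to measure and gate it in the build itself; a minimal plain-Java sketch (the class name and the 500 ms budget are illustrative assumptions, not from the deck):

```java
public class TimingGate {
    // Run a task and return the elapsed wall-clock time in milliseconds.
    static long timeMillis(Runnable task) {
        long start = System.nanoTime();
        task.run();
        return (System.nanoTime() - start) / 1_000_000;
    }

    public static void main(String[] args) {
        long elapsed = timeMillis(() -> {
            long sum = 0;
            for (int i = 0; i < 1_000_000; i++) sum += i;
        });
        // Treat the measurement as a gate, not just a log line:
        if (elapsed > 500) {
            throw new AssertionError("Performance budget exceeded: " + elapsed + " ms");
        }
        System.out.println("elapsed ms = " + elapsed);
    }
}
```

JUnit 4 ships a built-in variant of the same idea, `@Test(timeout = 500)`, which fails the test when the time budget is exceeded.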
Slide 23
Performance Metrics in your CI
What you currently measure:
» # Test Failures
» Overall Duration
What you could measure:
» # calls to API
» # executed SQL statements
» # Web Service Calls
» # JMS Messages
» # Objects Allocated
» # Exceptions
» # Log Messages
» Execution Time of Tests
» …
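Several of these counters (SQL statements, API calls) can be captured even without an APM tool, as long as the calls pass through one choke point that increments a counter. A minimal sketch under that assumption (all names and the budget of 5 are hypothetical):

```java
import java.util.concurrent.atomic.AtomicInteger;

public class CallCountGate {
    static final AtomicInteger sqlStatements = new AtomicInteger();

    // In real code this would be a JDBC proxy or interceptor; here the
    // data-access layer simply bumps the counter for every statement.
    static void executeSql(String statement) {
        sqlStatements.incrementAndGet();
        // ... run the statement ...
    }

    static void loadCustomer() {
        // An accidental N+1 query pattern shows up as a growing counter:
        executeSql("SELECT * FROM customer WHERE id = ?");
        executeSql("SELECT * FROM address WHERE customer_id = ?");
    }

    public static void main(String[] args) {
        sqlStatements.set(0);
        loadCustomer();
        int count = sqlStatements.get();
        // The gate: a functionally green test with too many statements still fails.
        if (count > 5) {
            throw new AssertionError("# SQL statements regressed: " + count);
        }
        System.out.println("sql statements = " + count);
    }
}
```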
Slide 25
Large Web Sites
17 (!) JS files, 1.7 MB in size
Useless information! It might even be a security risk!
Slide 26
Missing Resources Cause Delays
46 (!) HTTP 403 requests for images on the landing page
Lots of time “wasted” on roundtrips that just result in a 403
Metrics: # HTTP 4xx & 5xx, total number of resources
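Once the resource requests of a page have been captured (for example from a HAR export or a headless-browser run), counting 4xx/5xx responses is trivial and can fail the build. A sketch where hard-coded data stands in for the captured results (the URLs are assumptions):

```java
import java.util.Map;

public class HttpErrorGate {
    // Count resources whose response was a client or server error (>= 400).
    static long countErrors(Map<String, Integer> statusByUrl) {
        return statusByUrl.values().stream()
                .filter(status -> status >= 400)
                .count();
    }

    public static void main(String[] args) {
        Map<String, Integer> observed = Map.of(
                "/img/logo.png", 200,
                "/img/missing1.png", 403,
                "/img/missing2.png", 403);
        long errors = countErrors(observed);
        if (errors > 0) {
            System.out.println("HTTP 4xx/5xx on landing page: " + errors);
        }
    }
}
```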
Slide 28
Performance Metrics in your CI
What you currently measure:
» # Test Failures
» Overall Duration
What you could measure:
» # calls to API
» # executed SQL statements
» # Web Service Calls
» # JMS Messages
» # Objects Allocated
» # Exceptions
» # Log Messages
» Execution Time of Tests
» # HTTP 4xx/5xx
» Request/Response Size
» Page Load/Rendering Time
» …
Slide 29
Starting from…
[Pipeline: Developers → CI Server → Testing Environment → Production Environment → Release, with question marks between the later stages]
Slide 30
…or maybe…
[Pipeline: Developers → CI Server → Testing Environment → Production Environment → Release]
Slide 31
We get to…
[Pipeline: Developers → Commit Stage → Automated Acceptance Testing → Automated Capacity Testing → Release]
Slide 32
Performance as a Quality Gate
» Automated collection of performance metrics in test runs
» Comparison of performance metrics across builds
» Automated analysis of performance metrics to identify outliers
» Automated notifications on performance issues in tests
» Measurements accessible and shareable across teams
» Actionable data through deep transactional insight
» Integration with build automation tools and practices
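The cross-build comparison and outlier detection can start very simply, for example by flagging a build whose metric exceeds the average of the previous builds by a tolerance. A sketch of that idea (the 20% tolerance and all numbers are assumptions):

```java
import java.util.List;

public class BuildComparison {
    // Flag the current build if its metric exceeds the historical
    // average by more than the given relative tolerance.
    static boolean isOutlier(List<Double> previousBuilds, double current, double tolerance) {
        double mean = previousBuilds.stream()
                .mapToDouble(Double::doubleValue)
                .average()
                .orElse(current);
        return current > mean * (1 + tolerance);
    }

    public static void main(String[] args) {
        List<Double> responseTimesMs = List.of(120.0, 118.0, 125.0, 122.0);
        // 20% tolerance: 180 ms against a ~121 ms average is flagged.
        System.out.println(isOutlier(responseTimesMs, 180.0, 0.20)); // prints true
    }
}
```

A real setup would persist these metrics per build and feed the flag into the CI server's pass/fail status, turning the comparison into an actual quality gate.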
Slide 33
And finally make PERFORMANCE part of our Continuous Delivery Process
[Pipeline: Developers → Commit Stage → Automated Acceptance Testing → Automated Capacity Testing → Release]