The performance experienced by real-world users has a direct impact on an application's adoption rate: good performance increases retention and conversion rates. Our primary aim should be to deliver value to the user and give them the best possible experience, one that delights them and keeps them coming back for more. Monitoring user behaviour therefore becomes imperative, as it provides the key metrics for web application performance. While developing web applications, developers test in their local environments and also do user acceptance testing. But what happens once the application goes out into the real world? The case I'm making here is about analysing application performance once it's in the hands of real users.
This is where real user monitoring, a.k.a. R.U.M, comes into the picture. R.U.M captures performance metrics such as bandwidth, page/view load times, user location, device type, carrier speed, application errors, Ajax requests and application usage, along with custom performance metrics that provide actionable business intelligence.
With R.U.M we can then visualize the key metrics over time, aggregated as averages, geometric means, medians or percentiles. The performance data can be drilled down further by geography, device, error encountered, connection speed and so on.
Home Brewing R.U.M - Analyzing application performance with real user monitoring
1. Home Brewing R.U.M
Image source: http://en.wikipedia.org/wiki/Samuel_Adams_%28beer%29#mediaviewer/File:Samadams.jpg
2. Making developers happy since eternity
Image source: http://en.wikipedia.org/wiki/List_of_rum_producers#mediaviewer/File:Rum_display_in_liquor_store.jpg
3. What is Real User Monitoring?
Image source: https://www.flickr.com/photos/madmask/421679860/in/gallery-zrav-72157623823227993/
4. Real User Monitoring
It is a technology for collecting performance metrics directly from the browser of an end user and sending them to a collection point for analysis.
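In practice the "collection point" is just an HTTP endpoint the browser posts to. A minimal sketch of the reporting side, assuming a hypothetical /rum-collect endpoint (navigator.sendBeacon queues the request so it survives page unload; older browsers fall back to XMLHttpRequest):

window.addEventListener('load', function () {
  var payload = JSON.stringify({
    url: location.href,
    // full page load time from the Navigation Timing API
    loadTime: Date.now() - performance.timing.navigationStart
  });
  if (navigator.sendBeacon) {
    navigator.sendBeacon('/rum-collect', payload); // endpoint name is hypothetical
  } else {
    var xhr = new XMLHttpRequest();
    xhr.open('POST', '/rum-collect', true);
    xhr.send(payload);
  }
});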
15. Cookies
Network = Time between the beforeunload event and the first byte received.
DOM Processing = Time between the first byte received and the DOMContentLoaded event.
Page Rendering = Time between the DOMContentLoaded event and the load event.
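A minimal sketch of the cookie technique (variable and cookie names such as rum_unload are assumptions, not from the deck): the previous page stores a timestamp in its beforeunload handler, and the next page reads it back and combines it with its own DOM events to compute the three phases:

// On every page: record the moment the user navigates away.
window.addEventListener('beforeunload', function () {
  document.cookie = 'rum_unload=' + Date.now() + '; path=/';
});

// As early as possible in the <head> of the next page:
// approximates the time the first byte was received.
var firstByte = Date.now();
var match = document.cookie.match(/(?:^|; )rum_unload=(\d+)/);
var unloadTime = match ? parseInt(match[1], 10) : null;

document.addEventListener('DOMContentLoaded', function () {
  var domReady = Date.now();
  window.addEventListener('load', function () {
    var loaded = Date.now();
    if (unloadTime !== null) {
      console.log('Network:', firstByte - unloadTime, 'ms');
    }
    console.log('DOM Processing:', domReady - firstByte, 'ms');
    console.log('Page Rendering:', loaded - domReady, 'ms');
  });
});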
19. AJAX call performance
var xmlhttp = new XMLHttpRequest();
xmlhttp.open("GET", "remoteResource.txt", true); // true = asynchronous
xmlhttp.send();
20. Measuring AJAX Call Performance
Ready state values: 0 = Unsent, 1 = Opened, 2 = Headers Received, 3 = Loading, 4 = Done.
Time to first byte = Time between when the send method is called and when the ready state value first becomes 3 (Loading).
Downloading time = Time between when the ready state value first becomes 3 (Loading) and when it becomes 4 (Done).
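A minimal sketch of these two measurements, instrumenting the request from slide 19 with an onreadystatechange handler (using Date.now() as the clock is an assumption):

var start, firstByteAt;
var xmlhttp = new XMLHttpRequest();
xmlhttp.onreadystatechange = function () {
  if (xmlhttp.readyState === 3 && !firstByteAt) {
    firstByteAt = Date.now(); // first transition to Loading = first byte
  } else if (xmlhttp.readyState === 4) {
    var done = Date.now();
    console.log('Time to first byte:', firstByteAt - start, 'ms');
    console.log('Downloading time:', done - firstByteAt, 'ms');
  }
};
xmlhttp.open("GET", "remoteResource.txt", true);
start = Date.now();
xmlhttp.send();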
31. Wrapper for Reporting Errors
var wrappedFunction = wrapper(function(){
// function definition
});
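The deck does not show wrapper's body; a plausible minimal implementation (an assumption, not taken from the slides; reportError is a hypothetical reporting function) catches exceptions, reports them, and rethrows:

function wrapper(fn) {
  return function () {
    try {
      return fn.apply(this, arguments);
    } catch (err) {
      reportError(err); // hypothetical: send the error to the collection point
      throw err;        // rethrow so the error still surfaces normally
    }
  };
}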
32. Challenges in Error Tracking
Image source: https://www.flickr.com/photos/chrismatos/6784152671
33. Challenges in Error Tracking
● Different browsers provide different stack trace information.
● Capturing the steps to reproduce an error.
● Each function needs to be wrapped manually.
34. Challenges in Error Tracking
Different browsers provide different stack trace information.
TraceKit.js and Stacktrace.js are well-tested libraries that abstract this away.
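As a rough illustration of TraceKit's documented subscribe/report API (the handler body and riskyOperation are assumptions):

// Subscribe once; TraceKit normalizes stack traces across browsers.
TraceKit.report.subscribe(function (errorReport) {
  // errorReport carries a normalized message and stack frames;
  // send it to the collection point from here.
  console.log(errorReport.message, errorReport.stack);
});

try {
  riskyOperation(); // hypothetical function under instrumentation
} catch (e) {
  TraceKit.report(e); // routes the error through all subscribers
}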
35. Challenges in Error Tracking
Capturing the steps to reproduce an error.
Capture the last X events, such as input or click events, in an array and report them back along with the error information.
36. Challenges in Error Tracking
var lastEvents = []; // rolling buffer of recent user actions (name assumed)
document.querySelector('input').addEventListener('change', function (e) {
  // Adding it to the event array, keeping only the most recent entries
  lastEvents.push({ type: e.type, target: e.target.tagName, time: Date.now() });
  if (lastEvents.length > 10) lastEvents.shift();
});
37. Challenges in Error Tracking
Each function needs to be wrapped manually.
This can be automated by instrumenting the code at build time or at script-serving time.
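A common runtime variant of this idea (a sketch of the general technique, not necessarily the approach from the talk) is to monkey-patch async entry points such as setTimeout so every callback is wrapped automatically:

var originalSetTimeout = window.setTimeout;
window.setTimeout = function (fn, delay) {
  // wrapper is the error-reporting wrapper from slide 31
  return originalSetTimeout.call(window, wrapper(fn), delay);
};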