Client-side Performance Optimizations

Lecture: Ultra-large-scale Sites, Prof. Walter Kriha
Student: Jakob Schröter (jakob.schroeter@hdm-stuttgart.de)

January 2011 | WS 2010/2011

Computer Science and Media
Stuttgart Media University, Germany




This paper aims to show the importance of web performance and to sensitize web developers
to care about the end users' experience. Further, it provides an overview of basic client-side
optimization techniques, with real-world examples that show how powerful these often
simple optimizations are.
Ultra-large-scale Sites                                                                          Client-Side Performance Optimizations



Content

1      Why performance matters
2      The client-side
3      Analyze and measure
4      Basic optimization techniques
    4.1       HTTP requests are expensive
    4.2       Intelligent browser caching
    4.3       Shrink request size
    4.4       Image optimizations
    4.5       Loading resources & page rendering
    4.6       Domain sharding / CDN
    4.7       JS & CSS performance
5      Automation
6      Conclusion and a look into the future
References




Jakob Schröter | Stuttgart Media University, Germany                                                                                              2




1    Why performance matters
What counts towards the success of a website is that its users are satisfied and enjoy
browsing through the site. Of course this mostly has to do with the quality of the
content and the usefulness of the service. But the user experience is also an important factor
for successful web projects – and that includes fast response times.

In an experiment, Amazon slowed down its website by 100 ms, resulting in a 1% drop in
sales. Yahoo ran similar experiments with 400 ms, resulting in a 5-9% drop in requests.
Google increased the number of results per page from 10 to 30, which added 500 ms to the
response time; after this change they measured a 20% drop in search requests, and it took
a few days until the old number of requests was reached again. Shopzilla
shaved about 5 seconds off their total page loading time with performance optimizations,
resulting in a 25% increase in requests, a 7-12% increase in revenue and a 50% reduction in
hardware.

Amazon        +100 ms      1% drop in sales
Yahoo         +400 ms      5-9% drop in requests
Google        +500 ms      20% drop in requests
Bing          +2000 ms     4.3% drop in revenue/user [1]
Shopzilla     -5000 ms     25% increase in requests,
                           7-12% increase in revenue,
                           50% reduction in hardware [2]
Mozilla       -2200 ms     15.4% increase in downloads [3]


These numbers show that even minimal differences in the response time can have significant
effects on the business. [4]

A look at the three response-time limits defined by usability guru Jakob Nielsen confirms
this: users sense an interruption in what they are doing at response times above
0.1 seconds. The response time should be less than 1 second to allow fluid navigation
and let users feel that they are in control. 10 seconds is an unacceptable response
time – the user experience is interrupted at a disturbingly high rate and the user will
probably leave the site. By the way, these response-time limits haven't changed in 40 years.
[5]



But it’s not all about time – the user experience matters
To provide a good user experience it’s worthwhile to make sure the user doesn’t notice that
he’s actually waiting for a page to load. Even if there is no way to avoid the
loading time of e.g. a huge background image, there are plenty of ways to enhance the user
experience, since the human sense of time is relative. For example, the background image can
be loaded as the last resource, so the website in general is already usable before the image is
loaded. The same works for JavaScript-enhanced elements: tests at Facebook showed that it’s best
to avoid white screens and immediately show content even if the functionality behind it isn’t
loaded yet [6]. This doesn’t change the total page loading time, but the perceived
loading time will decrease since the user can already read the content and maybe jump to




another page via the navigation even if the current page isn’t completely loaded yet. The user
perceives a faster response time. It also helps to show a notification like a progress bar if the
action takes a few seconds, so the user knows that the computer is still working and hasn’t
crashed. Or why not just give the user something to do while he’s waiting? For example, let
him add tags to the pictures he’s currently uploading.

So keep in mind that in the end the perceived page loading time is important.

By the way, Google announced in April 2010 that from now on page speed will be a factor
for ranking websites in its search results [7].








2    The client-side
A few years ago, talking about website performance meant optimizing the
server-side and reducing the generation time of the HTML output. But nowadays the server-
side is not the main problem: engineers at Yahoo found that on average
only 10-20% of the loading time is spent on the server-side; 80-90% is spent on the client-side,
that is, in the user's browser [8] [9].



[Pie chart: average loading time of a website – 10-20% is spent on the server-side, 80-90% on the client-side]

Waterfall chart of web.de generated with webpagetest.org

The example waterfall chart above shows that just loading the resources on the client-side
takes far more time than generating the main HTML page on the server. So performance is
not only the job of the backend developers – it’s also an important topic for frontend
engineers.






When working for the web one usually assumes that the client is thin. Nowadays this is only
partially true: modern web applications use a lot of JavaScript and CSS to create a rich
user interface. That means more and more logic lives on the client, and the server is sometimes
only used for persistently saving data. There no longer has to be an HTTP request
for each user interaction.



The browser
Browsers have developed into pretty complex applications. They request the first URL, follow
redirections, receive and parse the main HTML page and, after loading additional
resources like CSS, JavaScript and images, render the content. This means
rendering text in the defined CSS styles, rendering images, calculating the layout flow and so on.
With the introduction of CSS3, the Canvas element and SVG, browsers no longer simply
draw black text on a white screen – they come with support for rich graphic effects
like drop shadows, transformations such as rotations, and animations. So more and more
computing power is needed to display a website. Browsers are now also able to play video and
audio files by themselves. And the execution of JavaScript takes time too. With every release
the browser manufacturers work hard on their browsers’ performance, trying to beat the
competitors with compiling JavaScript engines, hardware-accelerated rendering and more.








3      Analyze and measure
Considering that 80-90% of the loading time is spent on the client-side, it makes sense to look
at how performance can be tuned here. But before implementing any
optimizations it’s important to analyze the bottlenecks and set up test cases to measure the
performance before and after the optimizations. Not long ago browsers behaved like a black
box – a web developer couldn’t easily see what was going on after the user hit the enter
button in the address bar. Today there are plenty of great tools available, just to name a few:



Firebug1 runs as a browser plugin for Firefox and, besides many handy debugging tools for
frontend developers, offers the ability to track all network requests or to profile
JavaScript function calls.




1 http://getfirebug.com/




Yahoo YSlow2 is a plugin for Firebug which can test a website against many basic optimizations.
It’s a great tool to get started. The tests are based on the best practices3 of the Yahoo
Exceptional Performance team.




Google Page Speed4 is also a plugin for Firebug, likewise based on performance rules. It
additionally includes minification of HTML, CSS and JavaScript files.




2 http://developer.yahoo.com/yslow/
3 http://developer.yahoo.com/performance/rules.html
4 http://code.google.com/speed/page-speed/docs/extension.html




Google Chrome Speed Tracer5 is an extension for Google Chrome and offers deep insight into
what the browser is doing – from the loading of resources, JavaScript execution, CSS
selector matching and paint processes to garbage collection.




Also commercial tools are available, e.g. HTTPWatch6 and dynaTrace7.



As explained in chapter 1, it’s not only the loading time that matters. So measuring only the
loading time or e.g. the time until the browser’s onload event fires isn’t enough [10]. More
important is the time it takes until the page is usable – that is, when the important content
and the navigation are visible to the user.

The speed limiter function in OWASP WebScarab8 can emulate a slow internet connection, to
see e.g. how the images get loaded one after another. In some cases it really helps to get a
live impression of how the site loads and in which order the content is shown
to the user.




5 https://chrome.google.com/extensions/detail/ognampngfcbddbfemdapefohjiobgbdl
6 http://www.httpwatch.com/
7 http://www.dynatrace.com/
8 http://www.owasp.org/index.php/Category:OWASP_WebScarab_Project




It’s also recommended to do remote testing from clients around the globe. For example
WebPagetest9 and Zoompf10 offer great services which combine tools like Google
PageSpeed. WebPagetest also allows defining a DOM element (which contains
important content, e.g. the DIV of the main content) and tracking the time until that element is
available in the DOM. Also, “slow-motion” videos can be generated to get an inside look at
the rendering process of the site.



Further, it’s possible to track the page load times of real visitors, for example
with the event tracking feature of Google Analytics11 or the Episodes12 framework.



As so often, it’s hard to get really precise test results since many external factors interfere
with the measured numbers, including server load, network latency, the workload of the client
machine, differences between browsers and so on. Therefore the tests should always be run
multiple times.
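Aggregating repeated runs with a robust statistic such as the median helps smooth out these external factors. A minimal sketch (the timing values are made-up examples):

```javascript
// Return the median of an array of measurements (in ms).
// The median is more robust against outliers (e.g. one run hit
// by a garbage-collection pause) than the arithmetic mean.
function median(values) {
  const sorted = values.slice().sort((a, b) => a - b);
  const mid = Math.floor(sorted.length / 2);
  return sorted.length % 2 !== 0
    ? sorted[mid]
    : (sorted[mid - 1] + sorted[mid]) / 2;
}

// Five load-time samples of the same page; the 4th run was
// distorted by an external factor.
const loadTimes = [823, 791, 834, 2110, 808];
console.log(median(loadTimes)); // 823
```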




9 http://www.webpagetest.org/
10 http://zoompf.com/free
11 http://blog.yottaa.com/2010/10/how-to-measure-page-load-time-with-google-analytics/
12 http://stevesouders.com/episodes/





4    Basic optimization techniques
Many of the techniques described below are actually not new; some of them have been
discussed for years. But unfortunately many developers and top websites
still don’t profit from them.



4.1 HTTP requests are expensive
Every HTTP request takes time and uses a request slot, even if the transferred data is only
a few bytes. Nowadays the transfer speed is not the biggest problem – once a connection is
established the transfer is fast, even on mobile networks. Latency is the major challenge to
deal with.

Keep in mind that every redirect causes a new request, and according to tests by Steve Souders
most modern browsers don’t cache redirects. So these additional requests also
slow down a website [11].



Avoid HTTP requests when possible
Because every request comes with latency, it’s important to reduce the number of
requests wherever possible. Many requests can be saved by smartly combining
resource files. For example, all JavaScript files which are loaded on every page, e.g. base.js,
dragndrop.js and animation.js, can be combined into one single file (this can be automated, see
chapter 5). If there is a huge JavaScript file used only on a few specific pages (e.g. uploader.js),
it should be kept separate, because the code is not needed on the other pages. The same
should be done with CSS files.

When additional requests are made while the user uses the website – e.g. a form field offers
a nice autosuggest function which calls the backend for JSON responses – it sometimes makes
sense to deliver more results than the user needs in the first step, but which he may need
later on.

Further, the number of requests can be dropped considerably by combining images into CSS
sprites as described in chapter 4.4.
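A minimal sketch of such a combination step (the file names are the examples from above; a real build, see chapter 5, would read the sources from disk):

```javascript
// Combine several JavaScript sources into one file to save
// HTTP requests. The separating semicolon guards against
// sources that don't terminate their last statement.
function combineScripts(sources) {
  return sources.map((src) => src.trim()).join(';\n');
}

// In a real build these strings would be read from
// base.js, dragndrop.js and animation.js.
const combined = combineScripts([
  'var base = {};',
  'base.dragndrop = function () {};',
  'base.animate = function () {};',
]);
console.log(combined.split('\n').length); // 3
```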



4.2 Intelligent browser caching
The HTTP protocol specifies great possibilities for caching files on the client, and many of them
are widely supported by modern browsers. This really isn’t anything new, but it often seems
to be forgotten. It’s not uncommon that developers or server
administrators just set all HTTP headers to “no-cache” to disable the browser cache, so they
don’t have any trouble with cached but outdated content. But if the right HTTP headers are
sent to the browser, client-side caching can boost the performance of a website just like
caching on the server-side.






Not modified header and the ETag
Webservers can be configured to send an ETag13 (entity tag) header in file responses. This
ETag can e.g. be an MD5 hash of the file, so after modifying the file the ETag will be different. If
the browser saved a file with an ETag in its cache, it will add the ETag to the next request for
the same file. The server then checks if the ETag is still valid (the file hasn’t changed). If it is
valid, the server only sends an HTTP/1.x 304 Not Modified response, without sending the
whole file content again, and the browser uses the cached file instead. If the file has changed,
the server sends the whole file as usual. With this technique a lot of traffic can be saved, but
there’s still an HTTP request to be made.




Expires
The Expires header14 tells the browser the exact date and time at which a file will expire.
Until this date the browser doesn’t make any new requests for this file. This is perfect for static
files which are known not to change during the time specified in the Expires
header.




If the project has e.g. a weekly release cycle every Wednesday, the static files can be set
to expire on that date. But mostly it’s safer to use cache busters as described in the
following section.
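The browser-side effect can be modeled as a simple date comparison (a sketch; real browsers implement this inside their cache layer):

```javascript
// A cached response may be reused without any request as long
// as the current time is before its Expires date.
function isFresh(expiresHeader, now) {
  return now < new Date(expiresHeader).getTime();
}

const expires = 'Wed, 19 Jan 2011 12:00:00 GMT';
console.log(isFresh(expires, Date.parse('2011-01-12'))); // true  – use cache
console.log(isFresh(expires, Date.parse('2011-02-01'))); // false – re-request
```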



Cache busters
After the release of a new version of the website it’s very important that all users get
the latest version of resource files like CSS and JS. Wrongly configured caching can become
a huge problem – imagine what happens when users get the latest HTML page but
are still using an outdated CSS or JavaScript file…

13 http://www.w3.org/Protocols/rfc2616/rfc2616-sec14.html#sec14.19
14 http://www.w3.org/Protocols/rfc2616/rfc2616-sec14.html#sec14.21




Cache busters force the browser to reload the file. This is done by appending e.g. a version
number, so the browser thinks it’s a completely different file. Mod_rewrite15 can be used so
that on the server-side it can still be the same filename. This can look for example like this:

/scripts/uploader-158846.js -> /scripts/uploader.js

or directory-based

/scripts/158846/uploader.js

When possible, cache busters as a GET parameter (uploader.js?158846) should be avoided,
since some proxies are configured not to cache files with GET parameters [12].

Using Expires headers together with cache busters should be preferred over using
ETags.
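The file-name scheme above can be sketched with two small helpers – one that builds the versioned URL for the HTML output, and one that mirrors the mod_rewrite rule by mapping it back to the real file (names follow the example above):

```javascript
// Build the versioned URL that is written into the HTML.
function bustedUrl(path, file, version) {
  const dot = file.lastIndexOf('.');
  return `${path}/${file.slice(0, dot)}-${version}${file.slice(dot)}`;
}

// Mirror the mod_rewrite rule: strip the version so the
// server can keep serving the same physical file.
function realFile(url) {
  return url.replace(/-\d+(\.\w+)$/, '$1');
}

const url = bustedUrl('/scripts', 'uploader.js', 158846);
console.log(url);           // /scripts/uploader-158846.js
console.log(realFile(url)); // /scripts/uploader.js
```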

But unfortunately web developers still can’t rely on caching as much as would be desirable.
According to Yahoo!'s Exceptional Performance Team, 40% to 60% of Yahoo!'s users have an
empty-cache experience and about 20% of all page views are done with an empty cache. This
surprising fact underlines the importance of keeping websites as lightweight as possible, as
described in the following chapter. [13]



4.3 Shrink request size
Transferring bytes always takes time; therefore the amount of data transferred between
server and client should be as small as possible. Just one example: Google
Maps increased their number of map requests by 30% after shrinking their total file size by
30% [14]. To achieve a minimal file size it helps to use lightweight data formats like JSON
instead of XML. Further great savings can be gained by minifying and compressing files:



Minifying
CSS and JavaScript files usually contain descriptive variable names, comments, whitespace and
line breaks to be easily readable by humans. But the browser doesn’t need all this information;
by removing these characters the file size can be shrunk drastically.

There are a handful of tools available for minifying JS and CSS files, like YUI Compressor16
and Dojo ShrinkSafe17. Even HTML files can be minified.
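A naive illustration of the idea (real minifiers like YUI Compressor are far more careful, e.g. with strings and CSS hacks):

```javascript
// Very naive CSS minifier: drop comments, collapse whitespace
// and remove spaces around punctuation. For illustration only –
// production minifiers handle many more edge cases.
function minifyCss(css) {
  return css
    .replace(/\/\*[\s\S]*?\*\//g, '')  // strip /* comments */
    .replace(/\s+/g, ' ')              // collapse whitespace
    .replace(/\s*([{}:;,])\s*/g, '$1') // no spaces around syntax chars
    .trim();
}

const css = `
/* main button */
a.button {
  color: red;
  margin: 0 auto;
}`;
console.log(minifyCss(css)); // a.button{color:red;margin:0 auto;}
```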




15 http://httpd.apache.org/docs/2.0/mod/mod_rewrite.html
16 http://developer.yahoo.com/yui/compressor/
17 http://shrinksafe.dojotoolkit.org/




Compressing
All modern browsers support compressed content18. This means all plain-text content like
HTML, CSS, JS, JSON, XML etc. can be compressed on the server-side and the browser will
decompress the content before using it. Of course binary files like images, PDF and
SWF shouldn’t be compressed again, since they are already compressed.

Compression can easily be enabled in the configuration of most webservers (e.g. by
enabling mod_deflate in Apache). There’s no need to change anything in the site’s code;
everything is done by the webserver. Dynamic content is compressed on the fly, while
static content is automatically served from a cached compressed version.

By the way, the additional computing power needed to compress and decompress the data
generally doesn’t cause any problems.



The following table shows, as an example, how much data can be saved by minifying and
compressing the HTML, CSS and JS files of hdm-stuttgart.de [15]:

             Original         Minified          Compressed    Minified + compressed
HTML         101 KB           97 KB             17 KB         16 KB
CSS          90 KB            68 KB             19 KB         14 KB
JS           243 KB           195 KB            73 KB         63 KB
Sum          434 KB           360 KB            109 KB        93 KB


So by minifying and compressing HTML, CSS and JS files, their transferred size can be
reduced by 341 KB – that is 79%!

Unfortunately, as of October 2010, 47% of the top 1000 websites still weren’t using compression,
although it’s very easy to enable and has amazing potential to speed up websites! [16] For
example Spiegel.de, the #9 top site in Germany19, could save a lot of traffic if they minified
and compressed their content. Just to provide a rough number: on every visit with an
empty cache, 436 KB of traffic could be saved. Projected onto their 380,000,000 page impressions
per month20, and considering the fact from the previous chapter that 20% of all page
impressions are done with an empty cache, this sums up to a saving of about 32 TB per
month.
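The back-of-the-envelope projection behind that number can be written out (a sketch; whether one counts in binary or decimal units shifts the result by a few TB):

```javascript
// Rough projection of the monthly traffic savings for Spiegel.de.
const savedPerVisitKB = 436;        // minify + compress savings per empty-cache view
const impressionsPerMonth = 380e6;  // page impressions per month
const emptyCacheRate = 0.2;         // share of views with an empty cache

const savedKB = savedPerVisitKB * impressionsPerMonth * emptyCacheRate;
const savedTB = savedKB / (1024 ** 3); // KB -> TB (binary units)

console.log(savedTB.toFixed(1)); // 30.9 – in the same ballpark as the ~32 TB above
```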




18 The community-driven website http://www.browserscope.org/ offers great comparisons of different browsers and versions.
19 according to http://www.alexa.com/topsites/countries/DE
20 http://www.spiegel.de/extra/0,1518,249387,00.html





4.4 Image optimizations

Use the right image format & dimensions
First, by choosing the optimal image format the file size can be reduced drastically while even
providing better image quality. In general the JPG format should be used for images with a
high number of colors. PNG is best for rendered text, e.g. headlines (which should better be
plain text anyway), and of course for images with alpha transparency. Background images
can usually be saved with a high compression rate.

It’s not uncommon that images are delivered in a higher resolution than they are displayed in
the browser. They are then downscaled by the browser, which consumes needless network
bandwidth and rendering time. Whenever possible, the width and height
attributes of <img> tags should be filled in [17].



CSS Sprites
Like JS and CSS files, image files can also be combined to save an enormous number of
requests. For this, one huge image is assembled from the single images. This image is
defined as the background image of HTML elements, and with the CSS background-position
property it is positioned so that only one single image is visible. [18] The best compression
rate is achieved when combining images with similar colors.

Google for example combined 53 images into one file. Take a look at the CSS listing to see how
a single image is selected out of the huge sprite:








a.button {
    width: 13px;
    height: 13px;
    background: url(sprite.png) no-repeat;
    background-position: -19px -193px;
}

a.button:hover {
    background-position: -35px -193px;
}
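When a sprite is laid out on a regular grid, the background-position offsets can be computed instead of being maintained by hand. A small sketch (the single-row layout and the 16 px grid size are made-up examples):

```javascript
// Compute the CSS background-position for the n-th icon in a
// sprite laid out as a single row on a fixed grid. Offsets are
// negative because the sprite is shifted left behind the element.
function spritePosition(index, gridSize) {
  return `background-position: ${-(index * gridSize)}px 0;`;
}

console.log(spritePosition(3, 16)); // background-position: -48px 0;
```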




Remove meta data
Further, with the removal of useless meta information like EXIF data, quite a few bytes can
be saved. Depending on the use case some meta information like copyright hints may be
important, but in general this meta data is not needed for images on a webpage. [19]

With Smush.it21, Yahoo provides a great service for automatically optimizing images without
quality loss. Google PageSpeed also includes image optimizations like removing meta data:




For example, 124 KB or 67% of the total image size could be saved by removing meta data on
hdm-stuttgart.de, without any loss in image quality.




21 http://www.smushit.com/




4.5 Loading resources & page rendering
The general aim should be to show the most important content first and as fast as possible –
for example the main article and the page navigation. The order in which site resources
like JavaScript, CSS and image files are loaded has a wide influence on the rendering process in
the browser. The network waterfall charts of e.g. Firebug can give helpful insights into how the
browser loads the resources of a site.



Correct order of HTTP requests
A best practice is to put the CSS file (hopefully all files are combined into one) as the very first
resource in the <head> tag. This helps to avoid strange content jumping while the page is
loading. Otherwise the browser may display the content, for example, with the
default font and later have to redraw it with the defined font – which is not only confusing
for the user (the so-called “flash of unstyled content”), it also consumes needless computing
power and slows down the rendering.



Non-blocking JavaScript
The browser waits until all JavaScript files defined in the <head> section are loaded before it
begins to render the page. <script> elements in the <body> section likewise block the
browser from rendering all following HTML elements until the JavaScript file is loaded and
executed. Further, all other resource downloads are paused while JavaScript files are loading,
since the loaded script could modify the DOM again; luckily newer browsers are trying to fix
this. But not only the loading of JavaScript blocks the page rendering – all rendering of the
page content is paused and new resource downloads are blocked while JavaScript is
executing [10].

It’s well known that this behavior isn’t optimal, so the HTML5 specification offers async and
defer attributes22 for <script> elements to define when a script should be loaded and
executed. The async attribute tells the browser to load the script asynchronously and execute
it as soon as it’s loaded. That means the scripts will be executed in the order they finish
loading, not in the order of the <script> elements; also, the DOM may not be complete when
such a script executes. With the defer attribute the script will be loaded after all other site
resources and executed when the DOM is ready; the order of execution is kept.
Unfortunately these attributes are not yet well supported by the major browsers.

Nevertheless, common JavaScript libraries provide utilities for asynchronous JavaScript and CSS
loading, e.g. YUI3 Get23. And with a few hacks the loading as well as the parsing and execution
of JavaScript can be controlled separately. It’s worth taking a look at the JavaScript module
ControlJS24 from Steve Souders which allows this, even though it’s not yet recommended for
use in a production environment. [20]
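The three loading behaviors can be illustrated with a minimal markup sketch (the file names are made-up examples):

```html
<!-- Blocks parsing until the script is loaded and executed: -->
<script src="base.js"></script>

<!-- Loaded in parallel, executed as soon as it arrives
     (execution order not guaranteed): -->
<script async src="tracking.js"></script>

<!-- Loaded in parallel, executed in document order after
     the page has been parsed: -->
<script defer src="ui.js"></script>
```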




22 http://www.whatwg.org/specs/web-apps/current-work/multipage/scripting-1.html
23 http://developer.yahoo.com/yui/3/get/
24 http://stevesouders.com/controljs/




The following figure shows how JavaScript files block the browser from fetching other resource
files like images and block the page rendering. The green line indicates when the rendering
process starts.




Waterfall chart without ControlJS (IE8) [20]

The next figure shows the same page using ControlJS. The user sees the page right after the
HTML has been loaded and doesn’t have to wait about 4 seconds until all JavaScript files are
loaded. The images are also loaded first, in parallel to the JavaScript files.




Waterfall chart with ControlJS (IE8) [20]




In general, an eye should be kept on which files really need to be loaded to display the first
state of the website, and the principle of progressive enhancement should be followed: allow
the browser to render the plain HTML page as fast as possible and enhance it with JavaScript
after the page has been rendered.



Frontend single points of failure
Just think about the following case: an external JavaScript file, e.g. from an advertising
company, is loaded in the <head> section of a site. What happens to the site when the external
server is down? The whole site won’t show up until the browser decides to time out the request!
That’s a single point of failure. [21] So it’s wise not only from the performance point of view to
load resources in a non-blocking way. This especially applies to external resources, so 3rd
parties can’t slow down the own site.







Intelligent pre/lazy-loading
It's possible to preload resources which will be used later on. For example, it can help to
preload huge JavaScript files while the user enters his login data on the login page. After he has
logged in, all needed JavaScript files are already in the browser's cache and can be used
immediately. Of course, it is important to start the preloading only after the current page has
been rendered, so that e.g. the login page is already usable before the preloading starts.
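
The login-page scenario can be sketched as follows. Creating detached image elements is one classic way to warm the cache; whether a script fetched this way is re-parsed when used later varies by browser, so treat this as a sketch, not a guarantee:

```javascript
// Sketch: warm the browser cache with resources needed on the next page.
// The elements are never attached to the DOM; only the HTTP fetch matters.
function preload(doc, urls) {
  var created = [];
  for (var i = 0; i < urls.length; i++) {
    var img = doc.createElement('img');
    img.src = urls[i]; // setting src triggers the download
    created.push(img);
  }
  return created;
}
```

Calling this from an onload handler of the login page ensures the preloading never delays the current page.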

Lazy-loading is also possible for content which is initially not visible to the user. A common
use case is images which are currently outside the viewport and only become visible after the
user scrolls down. YouTube, for example, lazy-loads the thumbnails of the suggested clips only
when the user scrolls down. Many JavaScript libraries like YUI provide convenient functions25
to implement this behavior. Lazy-loading of other resources can also result in a huge
performance boost.
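
At its core, lazy-loading comes down to a visibility test run from a scroll handler; a pure-function sketch:

```javascript
// Sketch: test whether an element overlaps the current viewport.
// elementTop and elementHeight are in pixels from the top of the document,
// scrollTop is the current scroll offset, viewportHeight the window height.
function isInViewport(elementTop, elementHeight, scrollTop, viewportHeight) {
  return elementTop < scrollTop + viewportHeight &&
         elementTop + elementHeight > scrollTop;
}
```

A scroll handler would iterate over placeholder images and, once this returns true for one of them, copy the real URL into its src attribute.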



Progressive rendering
An interesting, but depending of the application architecture sometimes difficult to implement
technique is to send the generated HTML code as early as possible to the client, even if it isn’t
completely generated. For example, in PHP this can be done with the flush() method as
shown in the example below. Browsers can already parse the first code lines and e.g. start
loading CSS and JavaScript files during the rest of the document is being generated on the
server. [22] [19]


<html>
<head>
  <title>the page</title>
  <link rel="stylesheet" href="my.css" />
  <script src="my.js"></script>
</head>
<?php flush(); ?>
<body>
<div>site navigation</div>
<div>main content</div>
<?php flush(); ?>
<div>some user comments</div>
<div>some ads</div>
...




25 e.g. http://developer.yahoo.com/yui/3/imageloader/




4.6 Domain sharding / CDN
Browsers only allow a specific number (2-6) of parallel HTTP connections to the same host
name. If all resource files are hosted under the same host name, it takes a while until all files
are loaded, since only e.g. 2 download slots are available. To improve concurrency, it makes
sense to split the resources across e.g. 2 additional (sub-)domains.
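
When sharding, each file should map deterministically to the same host; otherwise the same resource would be fetched from different hosts on different page views, defeating the browser cache. A sketch of such a mapping (the host names are made up):

```javascript
// Sketch: deterministically assign a resource path to one of N asset hosts.
// The same path always yields the same host, so cached copies stay valid.
function assetHost(path, hosts) {
  var sum = 0;
  for (var i = 0; i < path.length; i++) {
    sum = (sum + path.charCodeAt(i)) % hosts.length;
  }
  return hosts[sum];
}
```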

A lightweight webserver26 can also be used for delivering static files like JavaScript, CSS and
images. Such servers have faster response times and relieve the main application servers.
Furthermore, the domain should be cookie-free, so that the browser does not send cookies
with every request for static files; this saves traffic and computing power.

In addition, content delivery networks with servers around the globe provide better response
times, since they choose the nearest server based on the location of the user.




4.7 JS & CSS performance
More and more websites are extensively using JavaScript – especially web 2.0 sites. Some of
them put the whole creation of the DOM in the hand of JavaScript. From this it follows that
the performance of JavaScript is getting more importance. All browser manufacturers are
working hard to optimize their engines and speed up the JavaScript execution. But also web
developers can do a lot to optimize their code. If a website is heavily using JavaScript, it’s
worth to follow JavaScript best practices. [23]

The same applies to CSS – it might sound a bit beside the point, but for example some CSS
selectors have considerably better performance than others. The most misunderstood fact is
that browsers interpret CSS selectors from right to left, not, as many people would guess, from
left to right. One example:

#myElement li a {color: red;}

At first glance this selector seems very efficient: get the element with the id myElement,
search for descendants of type <li>, and then apply the font color to all descendants of type
<a>. Instead, the browser iterates over all <a> tags on the entire page, checks whether each
might be a descendant of a <li> element (at any level), and then checks whether it is also a
descendant of the element with the id myElement. [19] From a performance point of view, the
rule above would be faster when e.g. using only one class selector and applying this class name
to all <a> elements:

.myElement-li-a {color: red;}

Steve Souders created a test suite27 to compare the performance of (one's own) CSS selectors.

Furthermore, the new CSS3 shadow and transform effects should be used with care; in some
circumstances they can slow down a website considerably.


26 e.g. http://www.nginx.org/ or http://www.lighttpd.net/
27 http://stevesouders.com/efws/css-selectors/tests.php





5    Automation
It's important that performance optimizations don't break the development process.
Struggling with minified JavaScript and CSS files in the development environment is no fun at
all, and manually minifying and combining them before every release is a time-consuming and
error-prone job. Therefore the aim should be to integrate as much as possible into the
deployment process. This also has advantages when working on a huge project with dozens of
people, since it is hard to convince every developer to follow the optimization rules.
Optimizations like minifying and combining CSS and JavaScript files can be done
automatically during the deployment process (e.g. via Ant), so a clean modular file structure is
preserved in the development environment.
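
Such a deployment step might look like the following Ant target, a sketch in which the directory layout and the YUI Compressor jar location are assumptions; it concatenates the modular source files and minifies the result:

```xml
<!-- Sketch of an Ant target: combine modular JS files, then minify the bundle. -->
<target name="build-js">
  <!-- concatenate all source files into one bundle -->
  <concat destfile="build/all.js">
    <fileset dir="src/js" includes="*.js"/>
  </concat>
  <!-- minify with YUI Compressor (jar path is an assumption) -->
  <java jar="lib/yuicompressor.jar" fork="true" failonerror="true">
    <arg value="-o"/>
    <arg value="build/all.min.js"/>
    <arg value="build/all.js"/>
  </java>
</target>
```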

In addition, some companies such as Strangeloop28 or Blaze29 offer commercial out-of-the-box
optimization tools. The trend goes toward transformation-based performance optimization,
meaning these tools automatically modify the HTML output and optimize resources without
requiring far-reaching changes to the application. This is challenging e.g. for sites relying heavily
on JavaScript, Ajax and third-party content, but for simpler HTML sites, and particularly
smaller (e.g. private) sites, this approach can be useful. Google also recently released its
open-source Apache module mod_pagespeed30, which performs optimizations like
compression, minification, image optimization, combining of JS and CSS files and so on
automatically on the fly. The idea sounds very promising; the module even chooses the
optimal optimizations depending on the user's browser. It is worth giving it a try, but some
tests by Aaron Peters showed that the module can even slow down a website, since it
consumes computing power on the server [24].
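
Enabling the module comes down to a few Apache directives; a minimal sketch (the filter names follow the module's documentation at the time and should be treated as assumptions):

```apacheconf
# Sketch: enable mod_pagespeed with a few selected rewrite filters
ModPagespeed on
ModPagespeedEnableFilters combine_css,extend_cache
```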

Furthermore, automated performance tests could be set up to ensure that e.g. a new feature
doesn't slow down the website dramatically. The tool ShowSlow31, for example, allows tracking
YSlow, Page Speed and dynaTrace scores over time.




28 http://www.strangeloopnetworks.com/
29 http://www.blaze.io/
30 http://code.google.com/speed/page-speed/docs/module.html
31 http://www.showslow.com/





6    Conclusion and a look into the future
Not only the end-user profits from snappy websites – often the servers and networks are
relieved and bandwidth is saved. So performance optimizations are worthwhile from an
economical and ecological point of view as well. As mentioned in chapter 1, Shopzilla reduced
its hardware by 50% after performance optimizations [2]. A lot of money can thus be saved,
and also earned, e.g. by beating competitors' site speed and gaining more satisfied customers.
Fred Wilson, a New York based tech investor, said in March 2010 that he sees speed as the
most important feature of a web application [25].



It’s just the beginning

Comparable to Search Engine Optimization (SEO), a new industry specialized in performance
optimization has grown: Web Performance Optimization (WPO) [26]. The establishment of
the W3C Web Performance Working Group32 shows that there is an effort to standardize
performance metrics in browsers, e.g. with the Navigation Timing33 specification.

Also, just think about performance on mobile devices. Mobile client-side web performance is
already a big topic and will become as important as desktop web performance [27]. While there
is a bunch of well-working tools like Firebug available for measuring desktop performance,
there is still a lack of good tools for mobile browsers. And since browsers now support rich
graphic effects like drop-shadows, client-side performance will get even more attention in the
future. For example, all major browser manufacturers are already working on hardware-
accelerated website rendering.

Further interesting research is being done, such as the Diffable34 project by Google, which aims
to provide a tool that only downloads the deltas between cached static files and their updated
versions. So when e.g. a new version of Google Maps is released, the browser only needs to
download a diff file of maybe 20 KB instead of the full JavaScript file of 300 KB. [28] To further
reduce the number of HTTP connections, the idea of Resource Packages came up: all resource
files can be packed into one single ZIP file which is referenced in the document's <head>
section, so the transfer can be done in one single data stream. The good news is that single
files can be progressively accessed while the (huge) ZIP file is still loading, and the loading
order can be defined. The idea sounds really promising and could replace CSS sprites, which
are often difficult to maintain. [29] In the future, local storage could also be used to cache
application data and gain better control over cached files on the client-side.

But besides all the benefits, website performance may not be the highest-priority optimization
for every website. There is no one-click solution yet that perfectly boosts the performance of
any random website. The performance best practices are also constantly changing, so the best
optimization rule for browser X may hurt performance in browser Y, or even in a newer
version of browser X. Going into the details of website performance is a very complex, time-
consuming and somewhat endless task when digging into micro-optimizations. On the other hand,
32 http://www.w3.org/2010/webperf/
33 http://www.w3.org/TR/2010/WD-navigation-timing-20101207/
34 http://code.google.com/p/diffable/




hardware is getting faster and browsers are continuously improving their performance,
without the need to change anything on one's own website. So for smaller sites the ROI may
not be worthwhile. And depending e.g. on the CMS or shop system the website is based on, it
might be tricky to implement even simple optimizations.

Nevertheless, every web developer should have basic knowledge of performance
optimizations, and even small sites should at least adopt the basic optimizations like enabling
compression; they are very easy to apply and have a huge benefit. With basic knowledge and a
little attention to performance, enormous bottlenecks can be avoided right from the launch of
a site.



For further reading and the latest news, it is worth taking a look at the blog35 of client-side
performance guru Steve Souders.




35 http://www.stevesouders.com/blog/





References


[1] Eric Schurman and Jake Brutlag. (2009, June) Performance Related Changes and their
    User Impact. [Online]. http://www.slideshare.net/dyninc/the-user-and-business-impact-
    of-server-delays-additional-bytes-and-http-chunking-in-web-search-presentation

[2] Steve Souders. (2009, July) O'Reilly Radar: Velocity and the Bottom Line. [Online].
    http://radar.oreilly.com/2009/07/velocity-making-your-site-fast.html

[3] Blake Cutler. (2010, March) Blog of Metrics: Firefox & Page Load Speed. [Online].
    http://blog.mozilla.com/metrics/category/website-optimization/

[4] Website Optimization, LLC. (2008, May) The Psychology of Web Performance. [Online].
    http://www.websiteoptimization.com/speed/tweak/psychology-web-performance/

[5] Jakob Nielsen. (2010, June) Website Response Times. [Online].
    http://www.useit.com/alertbox/response-times.html

[6] Zizhuang Yang. (2009, August) Facebook: Every Millisecond Counts. [Online].
    http://www.facebook.com/note.php?note_id=122869103919

[7] Amit Singhal and Matt Cutts. (2010, April) Official Google Webmaster Central Blog: Using
    site speed in web search ranking. [Online].
    http://googlewebmastercentral.blogspot.com/2010/04/using-site-speed-in-web-search-
    ranking.html

[8] Tenni Theurer. (2006, November) Yahoo! User Interface Blog: Performance Research, Part
    1: What the 80/20 Rule Tells Us about Reducing HTTP Requests. [Online].
    http://www.yuiblog.com/blog/2006/11/28/performance-research-part-1/

[9] Steve Souders, High performance web sites: essential knowledge for frontend engineers,
    O'Reilly, Ed., 2007.

[10] Steve Souders. (2010, September) High Performance Web Sites blog. [Online].
     http://www.stevesouders.com/blog/2010/09/30/render-first-js-second/

[11] Steve Souders. (2010, July) High Performance Web Sites blog: Redirect caching deep dive.
     [Online]. http://www.stevesouders.com/blog/2010/07/23/redirect-caching-deep-dive/

[12] Steve Souders. (2008, August) High Performance Web Sites blog: Revving Filenames: don’t
     use querystring. [Online]. http://www.stevesouders.com/blog/2008/08/23/revving-
     filenames-dont-use-querystring/

[13] Tenni Theurer. (2007, January) Yahoo! User Interface Blog: Performance Research, Part 2:
     Browser Cache Usage – Exposed! [Online].
     http://yuiblog.com/blog/2007/01/04/performance-research-part-2/

[14] Stephen Shankland. (2008, May) CNET News: We're all guinea pigs in Google's search
     experiment. [Online]. http://news.cnet.com/8301-10784_3-9954972-7.html

[15] Jakob Schröter. (2010, January) Client-side Performance Optimizations. [Online].
     http://www.slideshare.net/jakob.schroeter/clientside-performance-optimizations

[16] Joshua Bixby. (2010, October) Almost half of the top 1000 retail sites don’t follow two easy
     performance best practices. Does yours? [Online].
     http://www.webperformancetoday.com/2010/10/22/alexa-1000-performance-best-
     practices/

[17] Website Optimization, LLC. (2004, September) Size Images with Width and Height
     Attributes. [Online]. http://www.websiteoptimization.com/speed/tweak/size/

[18] Sven Lennartz. (2009, April) Smashing Magazine: The Mystery Of CSS Sprites:
     Techniques, Tools And Tutorials. [Online].
     http://www.smashingmagazine.com/2009/04/27/the-mystery-of-css-sprites-techniques-
     tools-and-tutorials/

[19] Steve Souders, Even Faster Web Sites, O'Reilly, Ed., 2009.

[20] Steve Souders. (2010, December) High Performance Web Sites blog: ControlJS part 1: async
     loading. [Online]. http://www.stevesouders.com/blog/2010/12/15/controljs-part-1/

[21] Steve Souders. (2010, June) High Performance Web Sites blog: Frontend SPOF. [Online].
     http://www.stevesouders.com/blog/2010/06/01/frontend-spof/

[22] Stoyan Stefanov. (2009, December) Progressive rendering via multiple flushes. [Online].
     http://www.phpied.com/progressive-rendering-via-multiple-flushes/

[23] Nicholas C. Zakas, High Performance JavaScript.: O'Reilly, 2010.

[24] Aaron Peters. (2010, December) Performance Calendar: Mod_Pagespeed Performance
     Review. [Online]. http://calendar.perfplanet.com/2010/mod_pagespeed-performance-
     review/

[25] Keir Whitaker. (2010, March) Think Vitamin: Fred Wilson’s 10 Golden Principles of
     Successful Web Apps. [Online]. http://thinkvitamin.com/web-apps/fred-wilsons-10-
     golden-principles-of-successful-web-apps/

[26] Steve Souders. (2010, December) Performance Calendar: 2010 State of Performance.
     [Online]. http://calendar.perfplanet.com/2010/state-of-performance/

[27] Joshua Bixby. (2011, January) RCR Wireless News: Reader Forum: 2011 Web performance
     predictions for the mobile industry. [Online].
     http://www.rcrwireless.com/article/20110103/READERFORUM/101229979/reader-forum-2011-web-performance-predictions-for-the-mobile-industry

[28] Steve Souders. (2010, July) Diffable: only download the deltas. [Online].
     http://www.stevesouders.com/blog/2010/07/09/diffable-only-download-the-deltas/

[29] Alexander Limi. (2009, November) Making browsers faster: Resource Packages. [Online].
     http://limi.net/articles/resource-packages/

[30] Stoyan Stefanov. (2010, November) Progressive Downloads and Rendering. [Online].
     http://www.slideshare.net/stoyan/progressive-downloads-and-rendering





Frontend Performance @ Hochschule der Medien StuttgartJakob
 
SDC2011: Web Performance Optimization
SDC2011: Web Performance OptimizationSDC2011: Web Performance Optimization
SDC2011: Web Performance OptimizationJakob
 
Automatisierung von Client-seitigen Web-Performance-Optimierungen
Automatisierung von Client-seitigen Web-Performance-OptimierungenAutomatisierung von Client-seitigen Web-Performance-Optimierungen
Automatisierung von Client-seitigen Web-Performance-OptimierungenJakob
 
HTML5 Video vs. Flash Video [paper]
HTML5 Video vs. Flash Video [paper]HTML5 Video vs. Flash Video [paper]
HTML5 Video vs. Flash Video [paper]Jakob
 
Client-side Performance Optimizations
Client-side Performance OptimizationsClient-side Performance Optimizations
Client-side Performance OptimizationsJakob
 
Flash vs. Silverlight auf dem mobilen Endgerät
Flash vs. Silverlight auf dem mobilen EndgerätFlash vs. Silverlight auf dem mobilen Endgerät
Flash vs. Silverlight auf dem mobilen EndgerätJakob
 
Flash Video vs. HTML5 Video
Flash Video vs. HTML5 VideoFlash Video vs. HTML5 Video
Flash Video vs. HTML5 VideoJakob
 
Ruby On Rails - 1. Ruby Introduction
Ruby On Rails - 1. Ruby IntroductionRuby On Rails - 1. Ruby Introduction
Ruby On Rails - 1. Ruby IntroductionJakob
 
Ruby On Rails - 2. Rails Introduction
Ruby On Rails - 2. Rails IntroductionRuby On Rails - 2. Rails Introduction
Ruby On Rails - 2. Rails IntroductionJakob
 
Ruby On Rails - 3. Rails Addons
Ruby On Rails - 3. Rails AddonsRuby On Rails - 3. Rails Addons
Ruby On Rails - 3. Rails AddonsJakob
 

Mehr von Jakob (10)

Frontend Performance @ Hochschule der Medien Stuttgart
Frontend Performance @ Hochschule der Medien StuttgartFrontend Performance @ Hochschule der Medien Stuttgart
Frontend Performance @ Hochschule der Medien Stuttgart
 
SDC2011: Web Performance Optimization
SDC2011: Web Performance OptimizationSDC2011: Web Performance Optimization
SDC2011: Web Performance Optimization
 
Automatisierung von Client-seitigen Web-Performance-Optimierungen
Automatisierung von Client-seitigen Web-Performance-OptimierungenAutomatisierung von Client-seitigen Web-Performance-Optimierungen
Automatisierung von Client-seitigen Web-Performance-Optimierungen
 
HTML5 Video vs. Flash Video [paper]
HTML5 Video vs. Flash Video [paper]HTML5 Video vs. Flash Video [paper]
HTML5 Video vs. Flash Video [paper]
 
Client-side Performance Optimizations
Client-side Performance OptimizationsClient-side Performance Optimizations
Client-side Performance Optimizations
 
Flash vs. Silverlight auf dem mobilen Endgerät
Flash vs. Silverlight auf dem mobilen EndgerätFlash vs. Silverlight auf dem mobilen Endgerät
Flash vs. Silverlight auf dem mobilen Endgerät
 
Flash Video vs. HTML5 Video
Flash Video vs. HTML5 VideoFlash Video vs. HTML5 Video
Flash Video vs. HTML5 Video
 
Ruby On Rails - 1. Ruby Introduction
Ruby On Rails - 1. Ruby IntroductionRuby On Rails - 1. Ruby Introduction
Ruby On Rails - 1. Ruby Introduction
 
Ruby On Rails - 2. Rails Introduction
Ruby On Rails - 2. Rails IntroductionRuby On Rails - 2. Rails Introduction
Ruby On Rails - 2. Rails Introduction
 
Ruby On Rails - 3. Rails Addons
Ruby On Rails - 3. Rails AddonsRuby On Rails - 3. Rails Addons
Ruby On Rails - 3. Rails Addons
 

Kürzlich hochgeladen

Genislab builds better products and faster go-to-market with Lean project man...
Genislab builds better products and faster go-to-market with Lean project man...Genislab builds better products and faster go-to-market with Lean project man...
Genislab builds better products and faster go-to-market with Lean project man...Farhan Tariq
 
Connecting the Dots for Information Discovery.pdf
Connecting the Dots for Information Discovery.pdfConnecting the Dots for Information Discovery.pdf
Connecting the Dots for Information Discovery.pdfNeo4j
 
The State of Passkeys with FIDO Alliance.pptx
The State of Passkeys with FIDO Alliance.pptxThe State of Passkeys with FIDO Alliance.pptx
The State of Passkeys with FIDO Alliance.pptxLoriGlavin3
 
Design pattern talk by Kaya Weers - 2024 (v2)
Design pattern talk by Kaya Weers - 2024 (v2)Design pattern talk by Kaya Weers - 2024 (v2)
Design pattern talk by Kaya Weers - 2024 (v2)Kaya Weers
 
Emixa Mendix Meetup 11 April 2024 about Mendix Native development
Emixa Mendix Meetup 11 April 2024 about Mendix Native developmentEmixa Mendix Meetup 11 April 2024 about Mendix Native development
Emixa Mendix Meetup 11 April 2024 about Mendix Native developmentPim van der Noll
 
Generative Artificial Intelligence: How generative AI works.pdf
Generative Artificial Intelligence: How generative AI works.pdfGenerative Artificial Intelligence: How generative AI works.pdf
Generative Artificial Intelligence: How generative AI works.pdfIngrid Airi González
 
Generative AI - Gitex v1Generative AI - Gitex v1.pptx
Generative AI - Gitex v1Generative AI - Gitex v1.pptxGenerative AI - Gitex v1Generative AI - Gitex v1.pptx
Generative AI - Gitex v1Generative AI - Gitex v1.pptxfnnc6jmgwh
 
How to Effectively Monitor SD-WAN and SASE Environments with ThousandEyes
How to Effectively Monitor SD-WAN and SASE Environments with ThousandEyesHow to Effectively Monitor SD-WAN and SASE Environments with ThousandEyes
How to Effectively Monitor SD-WAN and SASE Environments with ThousandEyesThousandEyes
 
Modern Roaming for Notes and Nomad – Cheaper Faster Better Stronger
Modern Roaming for Notes and Nomad – Cheaper Faster Better StrongerModern Roaming for Notes and Nomad – Cheaper Faster Better Stronger
Modern Roaming for Notes and Nomad – Cheaper Faster Better Strongerpanagenda
 
The Ultimate Guide to Choosing WordPress Pros and Cons
The Ultimate Guide to Choosing WordPress Pros and ConsThe Ultimate Guide to Choosing WordPress Pros and Cons
The Ultimate Guide to Choosing WordPress Pros and ConsPixlogix Infotech
 
Arizona Broadband Policy Past, Present, and Future Presentation 3/25/24
Arizona Broadband Policy Past, Present, and Future Presentation 3/25/24Arizona Broadband Policy Past, Present, and Future Presentation 3/25/24
Arizona Broadband Policy Past, Present, and Future Presentation 3/25/24Mark Goldstein
 
TeamStation AI System Report LATAM IT Salaries 2024
TeamStation AI System Report LATAM IT Salaries 2024TeamStation AI System Report LATAM IT Salaries 2024
TeamStation AI System Report LATAM IT Salaries 2024Lonnie McRorey
 
The Role of FIDO in a Cyber Secure Netherlands: FIDO Paris Seminar.pptx
The Role of FIDO in a Cyber Secure Netherlands: FIDO Paris Seminar.pptxThe Role of FIDO in a Cyber Secure Netherlands: FIDO Paris Seminar.pptx
The Role of FIDO in a Cyber Secure Netherlands: FIDO Paris Seminar.pptxLoriGlavin3
 
Scale your database traffic with Read & Write split using MySQL Router
Scale your database traffic with Read & Write split using MySQL RouterScale your database traffic with Read & Write split using MySQL Router
Scale your database traffic with Read & Write split using MySQL RouterMydbops
 
Use of FIDO in the Payments and Identity Landscape: FIDO Paris Seminar.pptx
Use of FIDO in the Payments and Identity Landscape: FIDO Paris Seminar.pptxUse of FIDO in the Payments and Identity Landscape: FIDO Paris Seminar.pptx
Use of FIDO in the Payments and Identity Landscape: FIDO Paris Seminar.pptxLoriGlavin3
 
Varsha Sewlal- Cyber Attacks on Critical Critical Infrastructure
Varsha Sewlal- Cyber Attacks on Critical Critical InfrastructureVarsha Sewlal- Cyber Attacks on Critical Critical Infrastructure
Varsha Sewlal- Cyber Attacks on Critical Critical Infrastructureitnewsafrica
 
Testing tools and AI - ideas what to try with some tool examples
Testing tools and AI - ideas what to try with some tool examplesTesting tools and AI - ideas what to try with some tool examples
Testing tools and AI - ideas what to try with some tool examplesKari Kakkonen
 
Abdul Kader Baba- Managing Cybersecurity Risks and Compliance Requirements i...
Abdul Kader Baba- Managing Cybersecurity Risks  and Compliance Requirements i...Abdul Kader Baba- Managing Cybersecurity Risks  and Compliance Requirements i...
Abdul Kader Baba- Managing Cybersecurity Risks and Compliance Requirements i...itnewsafrica
 
A Deep Dive on Passkeys: FIDO Paris Seminar.pptx
A Deep Dive on Passkeys: FIDO Paris Seminar.pptxA Deep Dive on Passkeys: FIDO Paris Seminar.pptx
A Deep Dive on Passkeys: FIDO Paris Seminar.pptxLoriGlavin3
 
So einfach geht modernes Roaming fuer Notes und Nomad.pdf
So einfach geht modernes Roaming fuer Notes und Nomad.pdfSo einfach geht modernes Roaming fuer Notes und Nomad.pdf
So einfach geht modernes Roaming fuer Notes und Nomad.pdfpanagenda
 

Kürzlich hochgeladen (20)

Genislab builds better products and faster go-to-market with Lean project man...
Genislab builds better products and faster go-to-market with Lean project man...Genislab builds better products and faster go-to-market with Lean project man...
Genislab builds better products and faster go-to-market with Lean project man...
 
Connecting the Dots for Information Discovery.pdf
Connecting the Dots for Information Discovery.pdfConnecting the Dots for Information Discovery.pdf
Connecting the Dots for Information Discovery.pdf
 
The State of Passkeys with FIDO Alliance.pptx
The State of Passkeys with FIDO Alliance.pptxThe State of Passkeys with FIDO Alliance.pptx
The State of Passkeys with FIDO Alliance.pptx
 
Design pattern talk by Kaya Weers - 2024 (v2)
Design pattern talk by Kaya Weers - 2024 (v2)Design pattern talk by Kaya Weers - 2024 (v2)
Design pattern talk by Kaya Weers - 2024 (v2)
 
Emixa Mendix Meetup 11 April 2024 about Mendix Native development
Emixa Mendix Meetup 11 April 2024 about Mendix Native developmentEmixa Mendix Meetup 11 April 2024 about Mendix Native development
Emixa Mendix Meetup 11 April 2024 about Mendix Native development
 
Generative Artificial Intelligence: How generative AI works.pdf
Generative Artificial Intelligence: How generative AI works.pdfGenerative Artificial Intelligence: How generative AI works.pdf
Generative Artificial Intelligence: How generative AI works.pdf
 
Generative AI - Gitex v1Generative AI - Gitex v1.pptx
Generative AI - Gitex v1Generative AI - Gitex v1.pptxGenerative AI - Gitex v1Generative AI - Gitex v1.pptx
Generative AI - Gitex v1Generative AI - Gitex v1.pptx
 
How to Effectively Monitor SD-WAN and SASE Environments with ThousandEyes
How to Effectively Monitor SD-WAN and SASE Environments with ThousandEyesHow to Effectively Monitor SD-WAN and SASE Environments with ThousandEyes
How to Effectively Monitor SD-WAN and SASE Environments with ThousandEyes
 
Modern Roaming for Notes and Nomad – Cheaper Faster Better Stronger
Modern Roaming for Notes and Nomad – Cheaper Faster Better StrongerModern Roaming for Notes and Nomad – Cheaper Faster Better Stronger
Modern Roaming for Notes and Nomad – Cheaper Faster Better Stronger
 
The Ultimate Guide to Choosing WordPress Pros and Cons
The Ultimate Guide to Choosing WordPress Pros and ConsThe Ultimate Guide to Choosing WordPress Pros and Cons
The Ultimate Guide to Choosing WordPress Pros and Cons
 
Arizona Broadband Policy Past, Present, and Future Presentation 3/25/24
Arizona Broadband Policy Past, Present, and Future Presentation 3/25/24Arizona Broadband Policy Past, Present, and Future Presentation 3/25/24
Arizona Broadband Policy Past, Present, and Future Presentation 3/25/24
 
TeamStation AI System Report LATAM IT Salaries 2024
TeamStation AI System Report LATAM IT Salaries 2024TeamStation AI System Report LATAM IT Salaries 2024
TeamStation AI System Report LATAM IT Salaries 2024
 
The Role of FIDO in a Cyber Secure Netherlands: FIDO Paris Seminar.pptx
The Role of FIDO in a Cyber Secure Netherlands: FIDO Paris Seminar.pptxThe Role of FIDO in a Cyber Secure Netherlands: FIDO Paris Seminar.pptx
The Role of FIDO in a Cyber Secure Netherlands: FIDO Paris Seminar.pptx
 
Scale your database traffic with Read & Write split using MySQL Router
Scale your database traffic with Read & Write split using MySQL RouterScale your database traffic with Read & Write split using MySQL Router
Scale your database traffic with Read & Write split using MySQL Router
 
Use of FIDO in the Payments and Identity Landscape: FIDO Paris Seminar.pptx
Use of FIDO in the Payments and Identity Landscape: FIDO Paris Seminar.pptxUse of FIDO in the Payments and Identity Landscape: FIDO Paris Seminar.pptx
Use of FIDO in the Payments and Identity Landscape: FIDO Paris Seminar.pptx
 
Varsha Sewlal- Cyber Attacks on Critical Critical Infrastructure
Varsha Sewlal- Cyber Attacks on Critical Critical InfrastructureVarsha Sewlal- Cyber Attacks on Critical Critical Infrastructure
Varsha Sewlal- Cyber Attacks on Critical Critical Infrastructure
 
Testing tools and AI - ideas what to try with some tool examples
Testing tools and AI - ideas what to try with some tool examplesTesting tools and AI - ideas what to try with some tool examples
Testing tools and AI - ideas what to try with some tool examples
 
Abdul Kader Baba- Managing Cybersecurity Risks and Compliance Requirements i...
Abdul Kader Baba- Managing Cybersecurity Risks  and Compliance Requirements i...Abdul Kader Baba- Managing Cybersecurity Risks  and Compliance Requirements i...
Abdul Kader Baba- Managing Cybersecurity Risks and Compliance Requirements i...
 
A Deep Dive on Passkeys: FIDO Paris Seminar.pptx
A Deep Dive on Passkeys: FIDO Paris Seminar.pptxA Deep Dive on Passkeys: FIDO Paris Seminar.pptx
A Deep Dive on Passkeys: FIDO Paris Seminar.pptx
 
So einfach geht modernes Roaming fuer Notes und Nomad.pdf
So einfach geht modernes Roaming fuer Notes und Nomad.pdfSo einfach geht modernes Roaming fuer Notes und Nomad.pdf
So einfach geht modernes Roaming fuer Notes und Nomad.pdf
 

Client-side Web Performance Optimization [paper]

  • 1. Client-side Performance Optimizations
Lecture: Ultra-large-scale Sites, Prof. Walter Kriha
Student: Jakob Schröter (jakob.schroeter@hdm-stuttgart.de)
January 2011 | WS 2010/2011, Computer Science and Media, Stuttgart Media University, Germany

This paper aims to show the importance of web performance and to make web developers aware of the end-user experience. Furthermore, it provides an overview of basic client-side optimization techniques, with real-world examples that show how powerful these often simple optimizations are.
  • 2. Ultra-large-scale Sites – Client-Side Performance Optimizations
Content
1 Why performance matters ... 3
2 The client-side ... 5
3 Analyze and measure ... 7
4 Basic optimization techniques ... 11
  4.1 HTTP requests are expensive ... 11
  4.2 Intelligent browser caching ... 11
  4.3 Shrink request size ... 13
  4.4 Image optimizations ... 15
  4.5 Loading resources & page rendering ... 17
  4.6 Domain sharding / CDN ... 20
  4.7 JS & CSS performance ... 20
5 Automation ... 21
6 Conclusion and a look into the future ... 22
References ... 24
Jakob Schröter | Stuttgart Media University, Germany 2
  • 3. 1 Why performance matters
What counts toward the success of a website is that its users are satisfied and enjoy browsing through the site. Of course this mostly has to do with the quality of the content and the usefulness of the service. But the user experience is also an important factor for successful web projects – which includes fast response times.
In a study, Amazon slowed down their website by 100 ms, with the result of a 1% drop in sales. Yahoo ran similar experiments with 400 ms, resulting in a 5-9% drop in requests. Google increased the number of results per page from 10 to 30, which added 500 ms to the response time; after this change they measured a 20% drop in search requests. Further, they realized that it took a few days until they reached their old number of requests again. Shopzilla saved about 5 seconds of their total page loading time with performance optimizations, resulting in a 25% increase in requests, a 7-12% increase in revenue and a 50% reduction in hardware.

Amazon     +100 ms    1% drop in sales
Yahoo      +400 ms    5-9% drop in requests
Google     +500 ms    20% drop in requests
Bing       +2000 ms   4.3% drop in revenue/user [1]
Shopzilla  -5000 ms   25% increase in requests, 7-12% increase in revenue, 50% reduction in hardware [2]
Mozilla    -2200 ms   15.4% increase in downloads [3]

These numbers show that even minimal differences in response time can have significant effects on the business. [4] A look at the three response-time limits by usability guru Jakob Nielsen confirms this: users sense an interruption in what they are doing at response times above 0.1 seconds. The response time should be less than 1 second to allow good navigation and let users feel that they are in control. 10 seconds is an unacceptable response time: the user experience would be interrupted at a disturbingly high rate, and the user would probably leave the site.
By the way, these response-time limits haven't changed in 40 years. [5]

But it's not all about time – the user experience matters
To provide a good user experience, it's worthwhile to ensure that the user doesn't notice that he's actually waiting for a page to load. Even if there is no way to avoid the loading time of e.g. a huge background image, there are plenty of ways to enhance the user experience, since the human sense of time is relative. For example, the background image can be loaded as the last resource, so the website in general is already usable before the image has loaded. The same works for JavaScript-enhanced elements: tests at Facebook showed that it's best to avoid white screens and immediately show content, even if the functionality behind it isn't loaded yet [6]. This doesn't change the total page loading time, but the perceived loading time decreases, since the user can already read the content and maybe jump to
  • 4. another page via the navigation, even if the current page isn't completely loaded yet. The user perceives a faster response time.
It also helps to show a notification like a progress bar if an action takes a few seconds, so the user knows that the computer is still working and hasn't crashed. Or why not just give the user something to do while he's waiting? For example, let him already add tags to the pictures he's currently uploading. So keep in mind that, in the end, the perceived page loading time is what matters.
By the way, in April 2010 Google announced that page speed will from now on be a factor for ranking websites in the search results [7].
  • 5. 2 The client-side
A few years ago, talking about website performance meant optimizing the server side and reducing the generation time of the HTML output. But nowadays the server side is usually not the main problem: engineers at Yahoo found that on average only 10-20% of the loading time is spent on the server side; 80-90% is spent on the client side, that is, in the user's browser [8] [9].

[Figure: Average loading time of a website – 10-20% server-side, 80-90% client-side. Waterfall chart of web.de, generated with webpagetest.org]

The example waterfall chart above shows that just loading the resources on the client side takes far more time than generating the main HTML page on the server. So performance is not only the job of the people working on the backend – it's also an important topic for frontend engineers.
  • 6. When working for the web, one usually assumes that the client is thin. Nowadays this is only partially true: modern web applications use a lot of JavaScript and CSS to create a rich user interface. That means more and more logic lies on the client, and the server is sometimes only used for persistently saving the data. There no longer has to be an HTTP request for each user interaction.

The browser
Browsers have developed into pretty complex applications. They request the first URL, follow redirects, receive and parse the main HTML page and, after loading additional resources like CSS, JavaScript and images, render the content. This means rendering text in the defined CSS styles, rendering images, calculating the flow and so on. With the introduction of CSS3, the canvas element and SVG, browsers no longer simply draw black text on a white screen – they come with support for rich graphic effects like drop shadows, transformations such as rotations, and animations. So more and more computing power is needed to display a website. Browsers are now also able to play video and audio files by themselves. And the execution of JavaScript takes time, too. With every release, the browser manufacturers work hard on their browser's performance, trying to beat the competitors with compiling JavaScript engines, hardware-accelerated rendering, …
  • 7. 3 Analyze and measure
Considering that 80-90% of the loading time is spent on the client side, it makes sense to have a look at how performance can be tuned here. But before implementing any optimizations, it's important to analyze the bottlenecks and set up test cases to measure the performance before and after the optimizations.
Not long ago, browsers behaved like a black box – a web developer couldn't easily see what was going on after the user hit the enter key in the address bar. Today there are plenty of great tools available, just to name a few:
Firebug1 runs as a browser plugin for Firefox and, besides many handy debug tools for frontend developers, offers the ability to track all network requests or to profile JavaScript function calls.
1 http://getfirebug.com/
  • 8. Yahoo YSlow2 is a plugin for Firebug which can test a website against many basic optimizations. It's a great tool to start with. The tests are based on the best practices3 from the Yahoo Exceptional Performance team.
Google Page Speed4 is also a plugin for Firebug, likewise based on performance rules. It additionally includes minification of HTML, CSS and JavaScript files.
2 http://developer.yahoo.com/yslow/
3 http://developer.yahoo.com/performance/rules.html
4 http://code.google.com/speed/page-speed/docs/extension.html
  • 9. Google Chrome Speed Tracer5 is an extension for Google Chrome and offers deep insight into what the browser is doing – from loading resources, executing JavaScript, CSS selector matching and paint processes to garbage collection.
Commercial tools are also available, e.g. HTTPWatch6 and dynaTrace7.
As explained in chapter 1, it's not only the loading time that matters. So measuring only the loading time, or e.g. the time until the browser's onload event fires, isn't enough [10]. More important is the time it takes until the page is usable – that is, when the important content and the navigation are visible to the user.
The speed-limiter function in OWASP WebScarab8 can emulate a slow internet connection to see e.g. how all the images get loaded one after another. In some cases it really helps to get a live impression of how the site is loaded and in which order the content is shown to the user.
5 https://chrome.google.com/extensions/detail/ognampngfcbddbfemdapefohjiobgbdl
6 http://www.httpwatch.com/
7 http://www.dynatrace.com/
8 http://www.owasp.org/index.php/Category:OWASP_WebScarab_Project
  • 10. It's also recommended to do remote testing from clients around the globe. For example, WebPagetest9 and Zoompf10 offer great services which combine tools like Google Page Speed. WebPagetest also allows defining a DOM element (which contains important content, e.g. the DIV of the main content) and tracking the time until that element is available in the DOM. "Slow-motion" videos can also be generated to get an inside look at the rendering process of the site.
Further, it's possible to track the page load times of real visitors. This can be done, for example, with the event-tracking feature of Google Analytics11 or the Episodes12 framework.
As is often the case, it's hard to get really precise test results, since many external factors interfere with the measured numbers: server load, network latency, the workload of the client machine, differences between browsers and so on. Therefore the tests should always be run multiple times.
9 http://www.webpagetest.org/
10 http://zoompf.com/free
11 http://blog.yottaa.com/2010/10/how-to-measure-page-load-time-with-google-analytics/
12 http://stevesouders.com/episodes/
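Because single samples are so noisy, it helps to script repeated runs and compare summary statistics instead of individual numbers. A minimal, self-contained sketch in Python (the `action` callable and the `measure` helper are illustrative stand-ins for an actual page fetch, not part of any of the tools above):

```python
import statistics
import time

def measure(action, runs=5):
    """Time an action several times and summarise the samples in ms,
    since a single sample is distorted by server load, network latency
    and the client machine's workload."""
    samples = []
    for _ in range(runs):
        start = time.perf_counter()
        action()
        samples.append((time.perf_counter() - start) * 1000.0)
    return {"median": statistics.median(samples),
            "min": min(samples),
            "max": max(samples)}

# Stand-in for a page load; a real test would issue the HTTP request here
result = measure(lambda: time.sleep(0.01), runs=3)
```

Reporting the median rather than a single run smooths out outliers caused by a momentarily busy machine or network.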
  • 11. 4 Basic optimization techniques
Many of the techniques described in the following are actually not new; some people were already discussing them a few years ago. But unfortunately, many developers and top websites still don't profit from them.

4.1 HTTP requests are expensive
Every HTTP request takes time and uses a request slot, even if only a few bytes are transferred. Nowadays the transfer speed is not the biggest problem; once a connection is established, the transfer runs fast, even on mobile networks. Latency is the major challenge to deal with. Keep in mind that every redirect causes a new request, and according to tests by Steve Souders most modern browsers don't cache redirects. So these additional requests also slow down a website [11].

Avoid HTTP requests when possible
Because every request always comes with latency, it's important to reduce the number of requests wherever possible. Many requests can be saved by smartly combining resource files. For example, all JavaScript files which are loaded on every page, e.g. base.js, dragndrop.js and animation.js, can be combined into one single file (this can be automated, see chapter 5). If there is a huge JavaScript file for only a few specific pages (e.g. uploader.js), it should be kept separate, because the code is not needed on the other pages. The same should be done with CSS files. When further requests are made while the user interacts with the website – e.g. a form field offering an autosuggest function which calls the backend for JSON responses – it sometimes makes sense to deliver more results than the user needs in the first step, but which he may need later on. Further, the number of requests can be reduced considerably by combining images using CSS sprites, as described in chapter 4.4.
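The combining step can be a simple part of the build process. A minimal sketch in Python, reusing the example filenames from above (the `combine_files` helper is hypothetical; a real build would also minify the result):

```python
import tempfile
from pathlib import Path

def combine_files(sources, target):
    """Concatenate several script files into one, saving one HTTP
    request per source file. A ';' between the parts guards against
    scripts that lack a trailing semicolon."""
    parts = [Path(src).read_text() for src in sources]
    Path(target).write_text("\n;\n".join(parts) + "\n")

# Demo with throwaway files standing in for the site-wide scripts
workdir = Path(tempfile.mkdtemp())
for name in ("base.js", "dragndrop.js", "animation.js"):
    (workdir / name).write_text(f"/* {name} */ console.log('{name}');")

combine_files(sorted(workdir.glob("*.js")), workdir / "combined.js")
bundle = (workdir / "combined.js").read_text()
```

One page then references only `combined.js`, turning three requests into one.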
4.2 Intelligent browser caching
The HTTP protocol specifies great possibilities for caching files on the client, and many of them are widely supported by modern browsers. This really isn't anything new, but it often seems to be forgotten by developers. It's not uncommon for developers or server administrators to just set all HTTP headers to "no-cache", disabling the browser cache so they don't have any trouble with cached but outdated content. But if the right HTTP headers are sent to the browser, client-side caching can boost the performance of a website, just as it does on the server side.
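At its core, this caching is a handshake of request and response headers. A rough server-side sketch in Python of the ETag-based revalidation described in the next section (the `respond` helper is hypothetical, standing in for a web server's conditional-GET handling):

```python
import hashlib

def etag_for(body):
    # The ETag can simply be an MD5 hash of the file content
    return '"%s"' % hashlib.md5(body).hexdigest()

def respond(body, if_none_match=None):
    """Hypothetical server-side handling of a conditional GET:
    return (status, payload) depending on the client's cached ETag."""
    tag = etag_for(body)
    if if_none_match == tag:
        return 304, b""    # Not Modified: the body is not sent again
    return 200, body       # full response; the client caches body + ETag

css = b"body { color: #333; }"
first_status, payload = respond(css)                       # empty cache: 200
cached_tag = etag_for(payload)
second_status, _ = respond(css, if_none_match=cached_tag)  # revalidation: 304
```

The 304 response still costs a round trip, but saves re-transferring the whole file.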
  • 12. Not Modified header and the ETag
Webservers can be configured to send an ETag13 (entity tag) header in file responses. This ETag can e.g. be an MD5 hash of the file, so after modifying the file the ETag will be different. If the browser has saved a file with an ETag in its cache, it will add the ETag to the next request for the same file. The server then checks whether the ETag is still valid (i.e. the file hasn't changed). If it is valid, the server only sends an HTTP/1.x 304 Not Modified response, without sending the whole file content again; the browser uses the cached file instead. If the file has changed, the server sends the whole file as usual. With this technique a lot of traffic can be saved, but there is still an HTTP request to be made.

Expires
The Expires header14 tells the browser the exact date and time when a file expires. Until this date, the browser doesn't make any new requests for this file. This is perfect for static files which are known not to change during the time specified in the Expires header. If the project has e.g. a weekly release cycle every Wednesday, it's possible to set the static files to expire on this date. But mostly it's safer to use cache busters, as described in the following chapter.

Cache busters
After the release of a new version of the website it's very important that all users get the latest version of the resource files, like CSS and JS. Wrongly configured caching can become a huge problem – imagine what happens when users get the latest HTML page but are still using outdated CSS and JavaScript files…
13 http://www.w3.org/Protocols/rfc2616/rfc2616-sec14.html#sec14.19
14 http://www.w3.org/Protocols/rfc2616/rfc2616-sec14.html#sec14.21
Cache busters force the browser to reload a file. This is done by appending, for example, a version number, so that the browser treats it as a completely different file. With mod_rewrite15 the server can still map such URLs to the same physical filename. It can look like this:

/scripts/uploader-158846.js -> /scripts/uploader.js

or directory-based:

/scripts/158846/uploader.js

Where possible, cache busters passed as a GET parameter (uploader.js?158846) should be avoided, since some proxies are configured not to cache files with GET parameters [12].

Using Expires headers together with cache busters should be preferred over relying on ETags. Unfortunately, web developers still cannot rely on caching as much as would be desirable: according to Yahoo!'s Exceptional Performance Team, 40% to 60% of Yahoo!'s users have an empty-cache experience, and about 20% of all page views are made with an empty cache. This surprising fact underlines the importance of keeping websites as lightweight as possible, as described in the following chapter. [13]

4.3 Shrink request size

Transferring bytes always takes time; therefore the amount of data transferred between server and client should be as small as possible. Just one example: Google Maps increased its number of map requests by 30% after shrinking its total file size by 30% [14]. To achieve a minimal file size it helps to use lightweight data formats such as JSON instead of XML. Further substantial savings can be gained by minifying and compressing files.

Minifying

CSS and JavaScript files usually contain descriptive variable names, comments, whitespace and line breaks to make them easily readable for humans. The browser does not need any of this, and by removing these characters the file size can be reduced drastically. A handful of tools are available for minifying JS and CSS files, such as YUI Compressor16 and Dojo ShrinkSafe17.
Even HTML files can be minified.

15 http://httpd.apache.org/docs/2.0/mod/mod_rewrite.html
16 http://developer.yahoo.com/yui/compressor/
17 http://shrinksafe.dojotoolkit.org/
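The filename-based cache-buster mapping described above can be implemented with mod_rewrite rules along these lines. This is a sketch: the paths and the numeric version pattern are illustrative, and depending on whether the rules live in the server config or a .htaccess file, the leading slash in the pattern may differ.

```apache
RewriteEngine On

# Filename-based: /scripts/uploader-158846.js is served from /scripts/uploader.js
RewriteRule ^scripts/(.+)-[0-9]+\.js$ /scripts/$1.js [L]

# Directory-based: /scripts/158846/uploader.js is served from /scripts/uploader.js
RewriteRule ^scripts/[0-9]+/(.+)$ /scripts/$1 [L]
```

The build process then only has to stamp the current version number into the HTML; the files on disk keep their plain names.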
Compressing

All modern browsers support compressed content18. All plain-text content (HTML, CSS, JS, JSON, XML etc.) can be compressed on the server side, and the browser decompresses it transparently before use. Binary files such as images, PDF and SWF should of course not be compressed again, since they are already compressed. Compression can easily be enabled in the configuration of most web servers (e.g. by enabling mod_deflate in Apache). Nothing in the site's code needs to change; everything is done by the web server. Dynamic content is compressed on the fly, while static content is automatically served from a cached compressed version. The additional computing power needed to compress and decompress the data generally does not cause any problems.

The following table shows, as an example, how much data can be saved by minifying and compressing the HTML, CSS and JS files on hdm-stuttgart.de [15]:

       Original   Minified   Compressed   Minified + compressed
HTML   101 KB     97 KB      17 KB        16 KB
CSS    90 KB      68 KB      19 KB        14 KB
JS     243 KB     195 KB     73 KB        63 KB
Sum    434 KB     360 KB     109 KB      93 KB

So by minifying and compressing the HTML, CSS and JS files, their transferred size can be reduced by 341 KB, i.e. by 79%!

Unfortunately, as of October 2010, 47% of the top 1000 retail sites were still not using compression, although it is very easy to enable and has great potential to speed up websites [16]. For example the #9 top site in Germany19, Spiegel.de, could save a lot of traffic by minifying and compressing its content. To give a rough number: on every visit with an empty cache, 436 KB of traffic could be saved. Projected onto its 380,000,000 page impressions per month20, and considering the fact from the previous chapter that 20% of all page impressions happen with an empty cache, this adds up to a saving of about 32 TB per month.
18 The community-driven website http://www.browserscope.org/ offers detailed comparisons of different browsers and versions.
19 according to http://www.alexa.com/topsites/countries/DE
20 http://www.spiegel.de/extra/0,1518,249387,00.html
4.4 Image optimizations

Use the right image format & dimensions

First, choosing the optimal image format can reduce the file size drastically while even improving image quality. In general, the JPG format should be used for images with many colors. PNG is best for rendered text such as headlines (which should better be plain text anyway) and, of course, for images with alpha transparency. Background images can usually be saved with a high compression rate.

It is not uncommon for images to be delivered in a higher resolution than they are displayed in the browser. They are then downscaled by the browser, which wastes network bandwidth and browser rendering time. Whenever possible, the width and height attributes of <img> tags should be set [17].

CSS sprites

Like JS and CSS files, image files can be combined to save an enormous number of requests. A single large image is assembled from the individual images; this image is defined as the background image of HTML elements, and with the CSS background-position property it is positioned so that only one of the contained images is visible [18]. The best compression rate is achieved when combining images with similar colors. Google, for example, combined 53 images into one file. The CSS listing below shows how a single image is selected out of the large sprite:
a.button {
  width: 13px;
  height: 13px;
  background: url(sprite.png) no-repeat;
  background-position: -19px -193px;
}
a.button:hover {
  background-position: -35px -193px;
}

Remove meta data

Removing useless meta information such as EXIF data can save quite a few bytes. Depending on the use case, some metadata such as copyright notes may be important, but in general it is not needed for images on a webpage. [19]

With Smush.it21, Yahoo provides a great service for automatically optimizing images without quality loss. Google Page Speed also includes image optimizations such as removing metadata: on hdm-stuttgart.de, for example, 124 KB or 67% of the total image size could be saved by removing metadata, without any loss in image quality.

21 http://www.smushit.com/
4.5 Loading resources & page rendering

The general aim should be to show the most important content first and as fast as possible; for example, the main article and the page navigation. The order in which site resources such as JavaScript, CSS and image files are loaded has a strong influence on the rendering process in the browser. The network waterfall charts of e.g. Firebug give helpful insights into how the browser loads the resources of a site.

Correct order of HTTP requests

A best practice is to reference the CSS file (hopefully all files are combined into one) as the very first resource in the <head> tag. This helps avoid strange content jumping while the page loads. Otherwise the browser may display the content with, for example, the default font and later has to redraw it with the defined font. This is not only confusing for the user (the so-called "flash of unstyled content"), it also consumes needless computing power and slows down the rendering of the website.

Non-blocking JavaScript

The browser waits until all JavaScript files defined in the <head> section are loaded before it begins to render the page. <script> elements in the <body> section likewise block the browser from rendering all following HTML elements until the JavaScript file is loaded and executed. Furthermore, all other resource downloads are paused while a JavaScript file is loading, since the loaded script could modify the DOM again; luckily, newer browsers are trying to fix this. And not only the loading of JavaScript blocks page rendering: while JavaScript is executing, all rendering of the page content is paused and new resource downloads are blocked [10]. It is well known that this behavior is not optimal, so the HTML5 specification offers the async and defer attributes22 for <script> elements to define when a script should be loaded and executed.
The async attribute tells the browser to load the script asynchronously and execute it as soon as it is loaded. That means scripts are executed in the order they finish loading, not in the order of their <script> elements; the DOM may also not be complete when such a script executes. With the defer attribute the script is downloaded without blocking and executed once the document has been parsed; the order of execution is preserved. Unfortunately these attributes are not yet well supported by the major browsers. Nevertheless, common JavaScript libraries provide utilities for asynchronous JavaScript and CSS loading, e.g. YUI 3 Get23. And with a few hacks, the loading and also the parsing and execution of JavaScript can be controlled separately. It is worth taking a look at the JavaScript module ControlJS24 by Steve Souders, which allows exactly this, even though it is not yet recommended for use in a production environment. [20]

22 http://www.whatwg.org/specs/web-apps/current-work/multipage/scripting-1.html
23 http://developer.yahoo.com/yui/3/get/
24 http://stevesouders.com/controljs/
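In markup, the two attributes look like this (the filenames are placeholders):

```html
<!-- async: executes as soon as it arrives; order is not guaranteed -->
<script src="tracking.js" async></script>

<!-- defer: executes after the document is parsed, in document order -->
<script src="app.js" defer></script>
```

A rule of thumb that follows from the semantics above: async suits independent scripts such as trackers, while defer suits scripts that depend on the DOM or on each other.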
The following figure shows how JavaScript files block the browser from fetching other resource files such as images and block the page rendering. The green line indicates when the rendering process starts.

Waterfall chart without ControlJS (IE8) [20]

The next figure shows the same page using ControlJS. The user sees the page right after the HTML has been loaded and does not have to wait about 4 seconds until all JavaScript files were loaded. The images are also loaded first, in parallel with the JavaScript files.

Waterfall chart with ControlJS (IE8) [20]

In general, an eye should be kept on which files really need to be loaded to display the first state of the website, and the principle of progressive enhancement should be followed: let the browser render the plain HTML page as fast as possible and enhance it with JavaScript after the page has been rendered.

Frontend single points of failure

Consider the following case: an external JavaScript file, e.g. from an advertising company, is loaded in the <head> section of a site. What happens to the site when the external server is down? The whole site will not show up until the browser decides to time out the request! That is a single point of failure. [21] So loading resources in a non-blocking way is wise not only from the performance point of view. This especially applies to external resources, so that third parties cannot slow down one's own site.
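One common defense, popularized by analytics snippets of this era, is to inject the external script via the DOM so that it can never block parsing. A minimal sketch; the third-party URL is made up:

```html
<script>
(function () {
  // Create the external script element at runtime instead of in markup,
  // so a slow or dead 3rd-party server cannot block page rendering.
  var s = document.createElement('script');
  s.src = 'http://ads.example.com/ad.js'; // hypothetical 3rd-party script
  s.async = true;
  var first = document.getElementsByTagName('script')[0];
  first.parentNode.insertBefore(s, first);
}());
</script>
```

If the external server is down, the injected request simply times out in the background while the rest of the page renders normally.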
Intelligent pre-/lazy-loading

It is possible to preload resources that will be used later on. For example, it can help to preload large JavaScript files while the user enters his login data on the login page. Once he has logged in, all needed JavaScript files are already in the browser's cache and can be used immediately. Of course, it is important to start the preloading only after the current page has been rendered, so that e.g. the login page is already usable before the preloading starts.

Lazy-loading is the counterpart for content that is not initially visible to the user. A common use case is images that only become visible after the user scrolls down: YouTube, for example, lazy-loads the thumbnails of the suggested clips only when the user scrolls down. Many JavaScript libraries, such as YUI, provide convenient functions25 to implement this behavior. Lazy-loading other resources can also result in a huge performance boost.

Progressive rendering

An interesting technique, though sometimes difficult to implement depending on the application architecture, is to send the generated HTML code to the client as early as possible, even before it is completely generated. In PHP, for example, this can be done with the flush() function as shown below. The browser can already parse the first lines and e.g. start loading CSS and JavaScript files while the rest of the document is still being generated on the server. [22] [19]

<html>
<head>
  <title>the page</title>
  <link rel="stylesheet" href="my.css" />
  <script src="my.js"></script>
</head>
<?php flush(); ?>
<body>
  <div>site navigation</div>
  <div>main content</div>
  <?php flush(); ?>
  <div>some user comments</div>
  <div>some ads</div>
  ...

25 e.g. http://developer.yahoo.com/yui/3/imageloader/
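A minimal preloading sketch using the classic Image object trick (the file path is made up). Hooking it to the load event ensures the preload cannot compete with resources of the current page:

```html
<script>
// After the current page has finished loading, warm the cache
// for an image the next page will need.
window.onload = function () {
  var img = new Image();
  img.src = '/img/dashboard-sprite.png'; // fetched now, served from cache later
};
</script>
```

The same idea applies to preloading scripts and stylesheets, for which libraries such as YUI 3 Get provide dedicated loaders.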
4.6 Domain sharding / CDN

Browsers only allow a limited number (2 to 6) of parallel HTTP connections to the same host name. If all resource files are hosted under the same host name, it takes a while until the files are loaded, since there are only e.g. 2 download slots available. To improve concurrency it makes sense to split the resources across e.g. two additional (sub)domains. A lightweight web server26 can also be used to deliver static files such as JavaScript, CSS and images; such servers have faster response times and offload the main application servers. Furthermore, the static domain should be cookie-free, so the browser does not send a cookie with every request for static files; this saves traffic and computing power. In addition, content delivery networks with servers around the globe provide better response times, since they choose the nearest server based on the location of the user.

4.7 JS & CSS performance

More and more websites make extensive use of JavaScript, especially Web 2.0 sites; some of them put the whole creation of the DOM into the hands of JavaScript. It follows that JavaScript performance is becoming more important. All browser manufacturers are working hard to optimize their engines and speed up JavaScript execution, but web developers can also do a lot to optimize their own code. If a website uses JavaScript heavily, it is worth following JavaScript best practices. [23]

The same applies to CSS. It might sound beside the point, but some CSS selectors perform considerably better than others. The most misunderstood fact is that browsers interpret CSS selectors from right to left, not, as many people would guess, from left to right.
One example:

#myElement li a {color: red;}

At first glance this selector seems very efficient: get the element with the id myElement, search its descendants of type <li>, then apply the font color to all descendants of type <a>. In reality, the browser iterates over all <a> tags on the entire page, checks whether each of them is (possibly several levels up) a descendant of a <li> element, and then checks whether it is also a descendant of the element with the id myElement. [19]

From a performance point of view, the rule above would be faster if, for example, a single class selector were used and that class name applied to all the <a> elements in question:

.myElement-li-a {color: red;}

Steve Souders created a test suite27 for comparing the performance of (one's own) CSS selectors. Furthermore, the new CSS3 shadow and transform effects should be used with care; in some circumstances they can slow down a website dramatically.

26 e.g. http://www.nginx.org/ or http://www.lighttpd.net/
27 http://stevesouders.com/efws/css-selectors/tests.php
5 Automation

It is important that performance optimizations do not break the development process. Struggling with minified JavaScript and CSS files in the development environment is no fun at all, and manually minifying and combining them before every release is a time-consuming and error-prone job. Therefore the aim should be to integrate as much as possible into the deployment process. This also has advantages when working on a huge project with dozens of people, since it is hard to convince every developer to follow the optimization rules. Optimizations such as minifying and combining CSS and JavaScript files can be done automatically during deployment (e.g. via Ant), so the development environment keeps a nice modular file structure.

In addition, some companies such as Strangeloop28 or Blaze29 offer commercial out-of-the-box optimization tools. The trend is towards transformation-based performance optimization: these tools automatically modify the HTML output and optimize resources without far-reaching changes to the application. This is a challenge for sites relying heavily on JavaScript, Ajax and third-party content, but for simpler HTML sites, and particularly smaller (e.g. private) sites, this approach can be useful.

Google also recently released its open-source Apache module mod_pagespeed30, which performs optimizations such as compressing, minifying, image optimization and combining of JS and CSS files automatically on the fly. The idea sounds very promising; the module even chooses the optimal optimizations depending on the user's browser, and it is worth giving it a try. However, tests by Aaron Peters showed that the module can even slow a website down, since it consumes computing power on the server [24].

Furthermore, automated performance tests could be set up to ensure that e.g. a new feature does not slow down the website dramatically.
The tool ShowSlow31, for example, allows tracking YSlow, Page Speed and dynaTrace rankings over time.

28 http://www.strangeloopnetworks.com/
29 http://www.blaze.io/
30 http://code.google.com/speed/page-speed/docs/module.html
31 http://www.showslow.com/
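Within such a build step, minification is typically a one-line tool invocation. An illustrative sketch with YUI Compressor; the jar version and the paths are made up:

```
# Minify a stylesheet and a script during the build (illustrative paths)
java -jar yuicompressor-2.4.2.jar --type css -o build/site.min.css src/site.css
java -jar yuicompressor-2.4.2.jar --type js  -o build/site.min.js  src/site.js
```

Wiring these calls into an Ant target (or any other build tool) keeps the readable source files in the repository while only the minified versions are deployed.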
6 Conclusion and a look into the future

Not only the end user profits from snappy websites: the servers and networks are often relieved as well, and bandwidth is saved. So performance optimizations are also worthwhile from an economical and ecological point of view. As mentioned in chapter 1, Shopzilla reduced its hardware by 50% after performance optimizations [2]. With performance optimizations a lot of money can be saved, and also earned, e.g. by beating competitors' site speed and gaining more satisfied customers. Fred Wilson, a New York based tech investor, said in March 2010 that he sees speed as the most important feature of a web application [25].

It's just the beginning

Comparable to Search Engine Optimization (SEO), a new industry specializing in performance optimization has grown up: Web Performance Optimization (WPO) [26]. The establishment of the W3C Web Performance Working Group32 shows the effort being made to standardize performance metrics in browsers, e.g. with the Navigation Timing33 specification.

Consider also performance on mobile devices: mobile client-side web performance is already a big topic and will become as important as desktop web performance [27]. While there is a range of well-functioning tools such as Firebug for measuring desktop performance, good tools for mobile browsers are still lacking. And since browsers now support rich graphic effects such as drop shadows, client-side performance will receive even more attention in the future; all major browser manufacturers are already working on hardware-accelerated website rendering.

Further interesting research is being done, such as the Diffable34 project by Google, which aims to provide a tool that downloads only the deltas between cached static files and their updated versions. So when e.g.
a new version of Google Maps is released, the browser only needs to download a diff file of maybe 20 KB instead of the full JavaScript file of 300 KB. [28]

To further reduce the number of HTTP connections, the idea of Resource Packages came up: all resource files can be packed into one single ZIP file which is referenced in the document's <head> section, so the transfer happens in one single data stream. The good news is that single files can be accessed progressively while the (large) ZIP file is still loading, and the loading order can be defined. The idea sounds really promising and could replace CSS sprites, which are often difficult to maintain. [29]

In the future, local storage could also be used to cache application data and gain better control over cached files on the client side.

Besides all the benefits, website performance may not be the highest-priority optimization for every website. There is no one-click solution yet that perfectly boosts the performance of any random website. The performance best practices are also constantly changing, so the best optimization rule for browser X may hurt performance in browser Y, or even in a newer version of browser X. Going into the details of website performance is a complex, time-consuming and somewhat endless task when digging into micro-optimizations. On the other hand,

32 http://www.w3.org/2010/webperf/
33 http://www.w3.org/TR/2010/WD-navigation-timing-20101207/
34 http://code.google.com/p/diffable/
hardware is getting faster and browsers are continuously improving their performance, without any changes needed on one's own website. So for smaller sites the ROI may not be worthwhile, and depending on e.g. the CMS or shop system the website is based on, it might be tricky to implement even simple optimizations.

Nevertheless, every web developer should have basic knowledge of performance optimization, and even small sites should at least adopt the basic optimizations such as enabling compression: they are very easy to adopt and have a huge benefit. With basic knowledge and a little attention to performance, enormous bottlenecks can be avoided right from the launch of a site. For further reading and the latest news, it is worth taking a look at the blog35 of client-side performance guru Steve Souders.

35 http://www.stevesouders.com/blog/
References

[1] Eric Schurman and Jake Brutlag. (2009, June) Performance Related Changes and their User Impact. [Online]. http://www.slideshare.net/dyninc/the-user-and-business-impact-of-server-delays-additional-bytes-and-http-chunking-in-web-search-presentation

[2] Steve Souders. (2009, July) O'Reilly Radar: Velocity and the Bottom Line. [Online]. http://radar.oreilly.com/2009/07/velocity-making-your-site-fast.html

[3] Blake Cutler. (2010, March) Blog of Metrics: Firefox & Page Load Speed. [Online]. http://blog.mozilla.com/metrics/category/website-optimization/

[4] Website Optimization, LLC. (2008, May) The Psychology of Web Performance. [Online]. http://www.websiteoptimization.com/speed/tweak/psychology-web-performance/

[5] Jakob Nielsen. (2010, June) Website Response Times. [Online]. http://www.useit.com/alertbox/response-times.html

[6] Zizhuang Yang. (2009, August) Facebook: Every Millisecond Counts. [Online]. http://www.facebook.com/note.php?note_id=122869103919

[7] Amit Singhal and Matt Cutts. (2010, April) Official Google Webmaster Central Blog: Using site speed in web search ranking. [Online]. http://googlewebmastercentral.blogspot.com/2010/04/using-site-speed-in-web-search-ranking.html

[8] Tenni Theurer. (2006, November) Yahoo! User Interface Blog: Performance Research, Part 1: What the 80/20 Rule Tells Us about Reducing HTTP Requests. [Online]. http://www.yuiblog.com/blog/2006/11/28/performance-research-part-1/

[9] Steve Souders, High Performance Web Sites: Essential Knowledge for Frontend Engineers. O'Reilly, 2007.

[10] Steve Souders. (2010, September) High Performance Web Sites blog. [Online]. http://www.stevesouders.com/blog/2010/09/30/render-first-js-second/

[11] Steve Souders. (2010, July) High Performance Web Sites blog: Redirect caching deep dive. [Online]. http://www.stevesouders.com/blog/2010/07/23/redirect-caching-deep-dive/

[12] Steve Souders. (2008, August) High Performance Web Sites blog: Revving Filenames: don't use querystring. [Online]. http://www.stevesouders.com/blog/2008/08/23/revving-filenames-dont-use-querystring/

[13] Tenni Theurer. (2007, January) Yahoo! User Interface Blog: Performance Research, Part 2: Browser Cache Usage – Exposed! [Online]. http://yuiblog.com/blog/2007/01/04/performance-research-part-2/

[14] Stephen Shankland. (2008, May) CNET News: We're all guinea pigs in Google's search experiment. [Online]. http://news.cnet.com/8301-10784_3-9954972-7.html

[15] Jakob Schröter. (2010, January) Client-side Performance Optimizations. [Online]. http://www.slideshare.net/jakob.schroeter/clientside-performance-optimizations

[16] Joshua Bixby. (2010, October) Almost half of the top 1000 retail sites don't follow two easy performance best practices. Does yours? [Online]. http://www.webperformancetoday.com/2010/10/22/alexa-1000-performance-best-practices/

[17] Website Optimization, LLC. (2004, September) Size Images with Width and Height Attributes. [Online]. http://www.websiteoptimization.com/speed/tweak/size/

[18] Sven Lennartz. (2009, April) Smashing Magazine: The Mystery Of CSS Sprites: Techniques, Tools And Tutorials. [Online]. http://www.smashingmagazine.com/2009/04/27/the-mystery-of-css-sprites-techniques-tools-and-tutorials/

[19] Steve Souders, Even Faster Web Sites. O'Reilly, 2009.

[20] Steve Souders. (2010, December) High Performance Web Sites blog: ControlJS part 1: async loading. [Online]. http://www.stevesouders.com/blog/2010/12/15/controljs-part-1/

[21] Steve Souders. (2010, June) High Performance Web Sites blog: Frontend SPOF. [Online]. http://www.stevesouders.com/blog/2010/06/01/frontend-spof/

[22] Stoyan Stefanov. (2009, December) Progressive rendering via multiple flushes. [Online]. http://www.phpied.com/progressive-rendering-via-multiple-flushes/

[23] Nicholas C. Zakas, High Performance JavaScript. O'Reilly, 2010.

[24] Aaron Peters. (2010, December) Performance Calendar: Mod_Pagespeed Performance Review. [Online]. http://calendar.perfplanet.com/2010/mod_pagespeed-performance-review/

[25] Keir Whitaker. (2010, March) Think Vitamin: Fred Wilson's 10 Golden Principles of Successful Web Apps. [Online]. http://thinkvitamin.com/web-apps/fred-wilsons-10-golden-principles-of-successful-web-apps/

[26] Steve Souders. (2010, December) Performance Calendar: 2010 State of Performance. [Online]. http://calendar.perfplanet.com/2010/state-of-performance/

[27] Joshua Bixby. (2011, January) RCR Wireless News: Reader Forum: 2011 Web performance predictions for the mobile industry. [Online]. http://www.rcrwireless.com/article/20110103/READERFORUM/101229979/reader-forum-2011-web-performance-predictions-for-the-mobile-industry

[28] Steve Souders. (2010, July) Diffable: only download the deltas. [Online]. http://www.stevesouders.com/blog/2010/07/09/diffable-only-download-the-deltas/

[29] Alexander Limi. (2009, November) Making browsers faster: Resource Packages. [Online]. http://limi.net/articles/resource-packages/

[30] Stoyan Stefanov. (2010, November) Progressive Downloads and Rendering. [Online]. http://www.slideshare.net/stoyan/progressive-downloads-and-rendering