Neither developers nor SEOs can “design” a website without JavaScript: JS makes a website so much better, and everybody loves to interact with a website!
However, JS presents challenges for SEOs, and the best way to overcome them is to work hand in hand with developers & designers.
The goal of this talk is to dispel some myths & identify what developers should keep in mind when developing a JS-based website.
2. Let me introduce myself
– I 💛 SEO
– I work at Liip
– I’m co-host of a Meetup called #SEOnerdSwitzerland
– I’m part of the Women in Digital Community & have supported the St. Gallen chapter for a month
– I think that the most inspiring community is Women in Tech SEO.
3. Agenda
– 1st Myth: “SEOs & Devs have different goals”
– 2nd Myth: “Google can’t index JS”
– 3rd Myth: “SEOs don’t care about performance”
– 4th Myth: “Google has no problem with SPA”
– 5th Myth: “Pre-rendering is always the best solution”
All these myths will help explain the importance of speed & rendering for SEOs.
5. Why SEO matters for Businesses,
Users & Devs
Businesses: SEO is a channel that allows them to make money.
Users: what’s good for users is good for SEO, e.g. responsive design.
Devs: SEOs & Devs share the same goal: helping businesses improve their results & ensuring that users are happy on the website.
How can we help both? By working on speed, for example.
6. Bad examples of websites with JS-generated content
https://www.onely.com/blog/ultimate-guide-javascript-seo/
7. Why SEOs are “obsessed” with JS
Because JS impacts:
1. Rendering
2. Performance
...& both are important for SEO.
10. Crawling
⚡Crawling is the process of discovering new or updated content.
⚡To find this content, search engines such as Google send robots (crawlers) across the web. To discover this content, Google needs hyperlinks.
⚡Content may vary – it could be a webpage, image, video, PDF, etc.
11. Rendering: the different steps
Source: https://developers.google.com/speed/pagespeed/insights/
12. Rendering
⚡To Render a website, Google uses the
web rendering service (WRS).
⚡Google tries to render a page as in the
Browser, so that it can see the page the
same way a user would.
⚡Rendering is delicate, as Google needs to render a page properly before indexing it. The more resources there are, the greater the risk that some will go wrong, as each resource needs to be fetched.
⚡Rendering a website requires a lot of Google’s resources, but resources are limited.
13. How devs can help the WRS
⚡ Make sure that DOM changes made with JavaScript don’t force Chrome to recreate the HTML layout (e.g. use SSR with rehydration)
⚡ Use tree shaking to remove unused code
⚡ Use simple and clean caching by adding content hashes to file names (see next slide)
⚡ Bundle your code, but split it into reasonable bundles
14. Bonus: Caching
👉 It can happen that old file versions are used in the rendering process.
Why? Because the WRS may ignore caching headers, which means that Google may not know when to refresh JS files.
Solution: use content fingerprinting in file names (a query parameter could also work).
Source: https://developers.google.com/search/docs/guides/fix-search-javascript
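The content-fingerprinting advice above can be sketched as a webpack config (a minimal sketch assuming webpack 5; the exact option values are illustrative):

```javascript
// Minimal sketch, assuming webpack 5: content hashes in file names mean a
// file's URL changes whenever its content changes, so stale cached copies
// (in browsers or in the WRS) are never reused. In a real project this
// object would be exported from webpack.config.js.
const config = {
  output: {
    // [contenthash] is replaced with a hash of each emitted file's content
    filename: '[name].[contenthash].js',
  },
  optimization: {
    // Splitting shared code into separate bundles keeps hashes stable for
    // the code that did not change between deploys
    splitChunks: { chunks: 'all' },
  },
};
```

With this setup, a deploy that only touches one module invalidates only that module’s bundle, while all other cached files stay valid.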
20. Largest Contentful Paint
Largest Contentful Paint (LCP) is a metric that reports the render time of the largest image or text block visible within the viewport. Loading performance is crucial for a good LCP.
Source: https://web.dev/lcp/
23. How to optimize for Largest
Contentful Paint
If you’ve ever optimized website speed, then you’ll know the most common fixes:
1. Improve render-blocking resources like JS & CSS:
a. Minify
b. Remove unused & non-critical scripts
c. Defer or async-load resources that aren’t essential for loading the core content.
2. Reduce file sizes, e.g. images:
a. Compress images
b. Specify image dimensions & use the srcset attribute
c. Implement lazy loading.
Source: https://web.dev/lcp/
For more tips, check out this video
24. First Input Delay
First Input Delay (FID) measures the time from when a user first interacts with a page (e.g. when they click a link) to the time when the browser is actually able to begin processing event handlers in response to that interaction.
Source: https://web.dev/fid/
26. First Input Delay in dev tool
Here is an example of a long task in the browser
27. How to optimize for First Input Delay
The main cause of a poor FID is heavy JavaScript execution. Optimizing how JavaScript parses, compiles, and executes on your web page will directly reduce FID.
1. Improve render-blocking resources like JS & CSS.
2. Code splitting and breaking up your long tasks are good solutions.
3. Minimize unused polyfills. For example, if you use Babel, use @babel/preset-env to include only the polyfills needed.
4. Remember: third-party scripts can also be a problem, so consider running JavaScript on a background thread (web worker).
Source: https://web.dev/optimize-fid/
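The idea of breaking up long tasks can be sketched like this (chunk and processInBatches are hypothetical helpers for illustration, not from any library):

```javascript
// Sketch: split one long task (processing many items) into small batches
// and yield to the main thread between batches, so queued user input can
// be handled promptly and FID stays low.
function chunk(items, size) {
  const batches = [];
  for (let i = 0; i < items.length; i += size) {
    batches.push(items.slice(i, i + size));
  }
  return batches;
}

function processInBatches(items, size, handleItem) {
  const batches = chunk(items, size);
  let i = 0;
  function runNext() {
    if (i >= batches.length) return;
    batches[i++].forEach(handleItem);
    // Yielding via setTimeout lets input events run between batches
    setTimeout(runNext, 0);
  }
  runNext();
}
```

The same pattern works with requestIdleCallback or a scheduler library; the key point is yielding, not the specific API.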
28. Cumulative Layout Shift (CLS)
CLS measures when a visible element within the viewport changes position. For instance, if you want to click a button & the button is no longer in the same place, that is a layout shift & will impact CLS.
CLS measures the instability of content by summing shift scores across layout shifts that don’t occur within 500 ms of user input.
Metric: the score is not OK if the sum is more than 0.1.
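For intuition, a single layout-shift score is the product of two fractions, as documented on web.dev/cls (the helper name is illustrative):

```javascript
// Per web.dev/cls, each layout shift scores:
//   score = impact fraction * distance fraction
// impactFraction: share of the viewport affected by the unstable elements
// distanceFraction: the largest move distance divided by the viewport's
// largest dimension. CLS sums these scores over a session window.
function layoutShiftScore(impactFraction, distanceFraction) {
  return impactFraction * distanceFraction;
}

// E.g. elements covering 75% of the viewport that move by 25% of its
// height score 0.75 * 0.25 = 0.1875, already above the 0.1 target.
```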
31. How to optimize for Cumulative Layout Shift
The main causes are images without dimensions, ads, dynamically inserted content & fonts (FOUT/FOIT). Each of these elements has various solutions.
1. Specify image and embed dimensions: you can specify these in the img HTML tag.
2. You can also use a CSS aspect-ratio box to block out the required space for images.
3. Reserve space for embeds: precompute sufficient space for embeds and style a placeholder or fallback, finding the height of your final embed by inspecting it with your browser developer tools.
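The classic CSS aspect-ratio box boils down to one small calculation (paddingTopFor is a hypothetical helper, shown only for illustration):

```javascript
// Sketch: an aspect-ratio box reserves space for an image with
// padding-top = (height / width) * 100% on its wrapper, so the browser
// lays out the space before the image loads and nothing shifts.
function paddingTopFor(width, height) {
  return (height / width) * 100;
}

// A 16:9 image needs padding-top: 56.25% on its wrapper element.
```

Modern browsers also support the CSS aspect-ratio property, which achieves the same reservation without the padding trick.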
32. Bonus 1: Why SEOs like that devs
use Webpack
Webpack helps with performance issues that websites have.
1. When a page size is too large it takes a long time to load. Webpack can minify
all site assets, and you can even use it to automatically reduce the resolution
and quality of the images, resulting in smaller page size.
2. As CSS is a blocking resource, you can use Webpack to implement critical
CSS loading
3. Too many requests to the server are a problem, & Webpack helps by merging assets into unified files. As a result, we have fewer requests.
33. Bonus 2: An incredibly useful tool
If you use Webpack, then you can use webpack-bundle-analyzer. It creates an interactive treemap visualization of the contents of all your bundles, so you can visualize your hotspots & decide where you need to optimize your JS.
35. First things first...
– Google won’t click
– Google won't scroll
– If the content is in the DOM it will be seen (even if it’s hidden)
– If the content is not loaded into the DOM until after a click, then the content
won’t be found
36. Links Golden rules
– To index a page, Google needs a unique URL
– Google needs semantic HTML markup with links pointing to proper URLs (see next slide)
– Google suggests using the History API to load different content based on the URL in a SPA
– Do not use fragments (#) to load different content in a SPA
– Ensure all links have an href defined with a target URL (see next slide)
...Remember the crawling phase: Google needs links to trigger the entire indexing
process.
37. Let’s look at some examples of Links
Good links:
– <a href="/page">very good link</a>
– <a href="/page" onclick="goTo('page')">acceptable</a>
Bad links:
– <a onclick="goTo('page')">There is no href</a>
– <a href="javascript:goTo('page')">There is no link</a>
– <a href="javascript:void(0)">same as before, there is no link</a>
– <span onclick="goTo('page')">Google needs the a element with href & not a span</span>
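A rough sanity check in the spirit of the rules above can be sketched like this (isCrawlableLink is an illustrative helper and nothing like Google’s real HTML parser):

```javascript
// Rough illustrative check: a link is crawlable only if it is an <a> tag
// whose href holds a real URL, not a javascript: pseudo-URL. This simple
// regex only handles double-quoted attributes; it is a sketch, not a parser.
function isCrawlableLink(html) {
  const match = html.match(/<a\s[^>]*href\s*=\s*"([^"]*)"/i);
  if (!match) return false; // no <a href="..."> at all
  return !match[1].startsWith('javascript:');
}
```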
38. How AJAX affects crawl budget
In the video JavaScript: SEO Mythbusting, Jamie Alberico asked how using AJAX affects crawl budget.
39. What’s crawl budget?
– The Web is big, really big. As a result, it is important to prioritize what to crawl, when, and how many resources the server hosting the site can allocate to crawling.
– Google has a "crawl rate limit" which limits the maximum fetching rate for a given site. The crawl limit can go up or down: if the site responds really quickly for a while, the limit goes up, meaning more connections can be used to crawl.
– According to John Mueller, crawl budget could be a problem for large-scale
websites (millions or billions of URLs)
40. How AJAX affects crawl budget
1. If Googlebot can only crawl a limited number of resources, you do not want to waste them.
2. Supplementing a page with AJAX requires many requests.
3. Now remember, you do not have just AJAX but AMP, hreflang, CSS, etc. All of these are extra requests, so you are consuming crawl budget.
41. How AJAX affects crawl budget
Let’s look at an example:
In the video JavaScript: SEO Mythbusting, Martin Splitt gives an excellent example. Say that you use AJAX to supplement pieces of content on a product page: you request 1 URL & you get 9 more, 10 in total. However, Google caches aggressively. As a result, you will have used less of your crawl budget.
42. Prevent Soft 404
A soft 404 is a page that tells the user the content does not exist but returns a 200 status code.
In a SPA, we have one request to the server & then everything happens in the browser: client-side routing takes care of the URLs. As a result, the server cannot return a 404 because we do not send any more server requests.
2 solutions:
– Use a JavaScript redirect to a URL for which the server responds with a 404
HTTP status code (for example /not-found).
– Add a <meta name="robots" content="noindex"> to error pages using
JavaScript.
Source: https://developers.google.com/search/docs/guides/fix-search-javascript
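The two solutions above can be sketched as a small decision helper (function names, strategy labels, and return values are illustrative):

```javascript
// Sketch of the two soft-404 fixes listed above, as a decision helper.
// In the browser, 'redirect' corresponds to
//   window.location.href = '/not-found';  // server answers with a real 404
// and 'noindex' to injecting <meta name="robots" content="noindex">
// into <head> on the error page.
function handleMissingContent(apiStatus, strategy) {
  if (apiStatus !== 404) return 'render';
  return strategy === 'redirect' ? 'redirect-to-not-found' : 'noindex';
}
```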
43. Testing your SPA
1. Google Search Console
2. Mobile Friendly Test
3. Site command, e.g.: site:https://www.meetup.com/SEOnerdSwitzerland/events/273404205/
4. Also test the content: site:https://www.meetup.com/SEOnerdSwitzerland/ "SEOnerdSwitzerland is a non profit organisation"
44. Do you know what the Google
Search Console is?
46. SEOs love Server-Side rendering
Why SEOs like server-side rendering:
1. We always hear horror stories about client-side rendering.
2. Because the rendering phase is complicated, we want to make it as easy as
possible for Google. So we want to serve users & bots populated HTML (no
AJAX request or wait time)
3. We love everything that gives us control. For instance, you know whether your server is fast.
4. Everything that makes a website faster is better & as server-side rendering can
speed up page load times, we like it.
Remember speed is important: Core Web Vitals & a fast website will have more
URLs crawled (Crawl Budget).
47. Pre-rendering
Pre-rendering is good, but then you should also figure out which content you should lazy-load. Ideally, elements that are critical for user intent should be in the initial parse.
Is pre-render always the best solution? (10:48)
48. Client-Side rendering
No SEO (or Google) is fond of CSR, but it’s very widespread, & to help Googlebot there is a workaround:
Dynamic rendering:
Dynamic rendering means switching between client-side rendered and pre-rendered content for specific user agents.
In this context, Google does not consider this cloaking!
Resource: https://developers.google.com/search/docs/guides/dynamic-rendering
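A minimal sketch of the dynamic-rendering decision (the bot list here is illustrative and deliberately incomplete):

```javascript
// Sketch: serve pre-rendered HTML to known crawlers, and the normal
// client-side app to everyone else. A real setup would use a maintained
// bot list; these patterns are just examples.
const BOT_PATTERNS = [/Googlebot/i, /bingbot/i, /DuckDuckBot/i];

function shouldServePrerendered(userAgent) {
  return BOT_PATTERNS.some((re) => re.test(userAgent || ''));
}
```

On the server, this check would pick between the pre-renderer’s output and the regular SPA shell for each incoming request.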
49. What Google considers a good
compromise
Resource: https://developers.google.com/web/updates/2019/02/rendering-on-the-web
51. Conclusion
A fast website is important not just for users but also for SEOs, and while JS is great because it makes websites interactive, it can also impact a website’s speed & rendering. So as SEOs, we would like to ask devs to make the website fast while keeping the interactivity.
53. Resources
Fix Search-related JavaScript problems:
https://developers.google.com/search/docs/guides/fix-search-javascript
Rendering on the Web:
https://developers.google.com/web/updates/2019/02/rendering-on-the-web
What Crawl Budget Means for Googlebot:
https://developers.google.com/search/blog/2017/01/what-crawl-budget-means-for-googlebot
Tips for authoring fast-loading HTML pages:
https://developer.mozilla.org/en-US/docs/Learn/HTML/Howto/Author_fast-loading_HTML_pages
If you want to learn more about SEO:
Join Isaline’s & my group #SEOnerdSwitzerland. We invite top experts in the
domain.
54. SEO French experts
During the Meetup there was a question regarding French-speaking SEO experts. Please find below 2 names:
– Andrieu Olivier
– Samuel Schmitt.
Moreover, remember that all Google documentation is also translated into French.