3. SEO Elements on Page (Keyword Based)
• Title Optimization (Meta Tag)
• Meta Description (Meta Tag)
• Headline Tags
• Anchor Tags
• No-Follow & Do-Follow Links
• URL Optimization
• Page Hierarchy
• Internal Linking
• Robots.txt
• Redirection Links
• Meta Optimization
• Image Optimization
• Site Structure
• XML Sitemap
• 404 Error Page
• Footer Optimization
• Canonical Links
• External Linking
• HTML Sitemap
• Breadcrumbs
• Mobile-Friendly Site
6. Redirection Links
• Redirection is the process of forwarding one URL to a
different URL.
• A redirect is a way to send both users and search engines
to a different URL from the one they originally requested.
• Types of Redirects
- 301, "Moved Permanently" (recommended for SEO)
- 302, "Found" or "Moved Temporarily"
- Meta Refresh
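As a sketch, the first two redirect types can be set up in an Apache `.htaccess` file (the paths and domain below are placeholders, not from the original deck):

```apache
# Permanent (301) redirect of a single moved page
Redirect 301 /old-page/ https://www.example.com/new-page/

# Temporary (302) redirect, e.g. while a page is under maintenance
Redirect 302 /sale/ https://www.example.com/coming-soon/
```

A meta refresh, by contrast, lives in the page's HTML rather than the server configuration, and is generally the slowest and least SEO-friendly of the three.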
7. 301 Redirection (Permanent)
• A 301 redirect is key to maintaining a website's domain
authority and search rankings when the site's URL is
changed for any reason.
•A 301 redirect is a permanent redirect from one URL to
another. 301 redirects send site visitors and search
engines to a different URL than the one they originally
typed into their browser or selected from a search engine
results page.
• It's essential to set this up so that the domain authority from inbound links to the old address (for example, the http:// version of blog.hubspot.com) is passed on to the new URL to improve its search rankings.
8. 302 Redirection (Temporary)
• A 302 redirect is a temporary redirect from one URL to another. It sends visitors to a different URL, but tells search engines that the move is not permanent.
• Because the original URL is expected to return, search engines keep it in their index and do not permanently transfer its link equity to the new URL.
• A 302 suits short-lived situations, such as a page that is under maintenance, being A/B tested, or temporarily out of stock.
• If the move is actually permanent, use a 301 instead, so that rankings and authority are consolidated on the new URL.
9. Why Set Up a 301 Redirect?
• To associate common web
conventions (http://, www., etc.)
with one URL to maximize domain
authority (hint: this is the same
situation as the scenario we
outlined above.)
• To rebrand or rename a website
with a different URL
• To direct traffic to a website from
other URLs owned by the same
organization
10. HTTP vs. HTTPS
•HTTP stands for Hypertext
Transfer Protocol.
•It’s most commonly used to
transfer data from a web
server to a browser in order to
allow users to view web pages.
•HTTPS stands for Hypertext
Transfer Protocol Secure.
11. HTTP vs. HTTPS
• SEO Advantages of Switching to HTTPS:
• The use of an HTTPS site makes Google Analytics more effective.
• Build trust with visitors.
• Be able to use AMP.
• Inform Google that you have switched from HTTP to HTTPS.
13. 404 Page
• 404 is the standard error code generated when a user attempts
to visit a page that does not exist due to a broken link or some
other type of error.
• Unedited, these types of pages display nothing but “404 – Page
Not Found” error messages when accessed.
• How to fix 404 errors
• The best practice is to track every individual 404 error and
redirect it to the correct page or similar page. This is what the
Redirection plugin allows us to do.
• The second way is to redirect all 404 errors to the homepage,
meaning visitors will be redirected to your site’s homepage if
they land on 404 pages.
14. 404 Page
•How to Find 404 Pages
•You can use any number of tools to identify 404 pages, such
as Screaming Frog, Link Sleuth, or Google Webmaster Tools.
•SEO professionals and website owners will use a 301
redirect to take visitors to the site homepage rather than
showing them a 404 error page.
•Though this is the safest option in terms of retaining the
strength of your incoming links, it isn’t always best from a
user’s perspective.
15. Creating 404 Page
•Include a customized 404 notification.
•Include a site search box
•Include at least a partial version of your navigation.
•Include a link to your home page.
•Include a list of popular links to visit.
•Add in the noindex and nofollow attributes to avoid any
issues with duplicate content or other SEO problems.
•Add Business Information
•Add Contact Information
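Tying the checklist above together: on an Apache server, a custom 404 template can be wired up with a single directive, and the noindex/nofollow rule goes in the template's head (the file name below is a placeholder):

```apache
# .htaccess – serve a custom template for 404 errors
ErrorDocument 404 /custom-404.html
```

```html
<!-- inside custom-404.html, to keep the error page out of the index -->
<meta name="robots" content="noindex, nofollow">
```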
20. Canonical Links
• The canonical tag is all about duplicate content and preferred content.
• It helps search engines identify which page is the original in relation to duplicated content.
• A "canonical URL" is the "preferred" version of a web page. Using it well improves a site's SEO.
• The SEO benefit of rel=canonical:
• Choosing a proper canonical URL for every set of similar URLs improves the SEO of your site.
• Setting a canonical is similar to doing a 301 redirect, but without actually redirecting.
• Once the search engine knows which version is canonical, it can count all the links towards the different versions as links to that single version.
21. The process of canonicalization
• How to set canonical URLs
• Let’s assume you have two versions of the same page. Exactly, 100% the
same content. For example’s sake, these are their URLs:
• http://example.com/wordpress/seo-plugin/
• http://example.com/wordpress/plugins/seo/
• You pick one of your two pages as the canonical version. It should be the
version you think is the most important one. If you don’t care, pick the one
with the most links or visitors. If all of that’s equal: flip a coin.
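Continuing the example above: if you pick the first URL as the canonical version, the duplicate page declares it in its `<head>` like this:

```html
<link rel="canonical" href="http://example.com/wordpress/seo-plugin/" />
```

Search engines will then consolidate ranking signals from both URLs onto the chosen canonical one.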
22. Setting the canonical in Yoast SEO
• For posts, pages and custom
post types, you can edit the
canonical in the advanced tab
of the Yoast SEO metabox
• For categories, tags and other
taxonomy terms, you can
change them in the Yoast SEO
metabox too, in the same
spot.
• you can always use the
wpseo_canonical filter to
change the Yoast SEO output.
23. When should you use canonical URLs?
• Use a canonical URL whenever the same (or nearly the same) content is reachable at more than one URL, for example through tracking parameters, session IDs, www vs. non-www, or http vs. https variants.
• Use it when content is syndicated or republished on another site, so that the original page receives the ranking credit.
• A self-referencing canonical on every page is also considered good practice, since it protects against accidental duplicates created by added URL parameters.
25. Duplicate Contents
• Search engines like Google have a problem. It’s called ‘duplicate
content.’ Duplicate content means that similar content is being
shown on multiple locations (URLs) on the web.
• As a result, search engines don’t know which URL to show in the
search results. This can hurt the ranking of a webpage.
• As a reader, you don’t mind: you get the content you came for. A
search engine has to pick which one to show in the search
results. It, of course, doesn’t want to show the same content
twice.
• While not technically a penalty, duplicate content can still
sometimes impact search engine rankings.
26. Why does duplicate content matter?
For search engines
• They don't know which version(s) to include/exclude from their indices.
• They don't know whether to direct the link metrics (trust, authority, anchor
text, link equity, etc.) to one page, or keep it separated between multiple
versions.
• They don't know which version(s) to rank for query results.
For site owners
• Link equity can be further diluted because other sites have to choose
between the duplicates as well. Since inbound links are a ranking factor,
this can impact the search visibility of a piece of content.
• To provide the best search experience, search engines will rarely show
multiple versions of the same content
28. How do duplicate content issues happen?
• Multiple URLs – particularly on eCommerce sites where URLs are created through filter options for price, colour, rating, etc.
29. How do duplicate content issues happen?
•Session ID URLs – automatically generated by your system.
The same applies to tracking URLs, breadcrumb links,
printer friendly versions, and permalinks in certain CMS.
•HTTP, HTTPS & WWW – search engines see
http://www.mydomain.com, http://mydomain.com and
https://www.mydomain.com as distinct pages, and will
crawl (and possibly index) them as such.
30. How do duplicate content issues happen?
• Case – users, and most browsers, treat upper and lower case the
same, with the two largely interchangeable. The same is not
necessarily true for search engines, so if your website mixes up case
in filenames and folder structure, you need to use the canonical tag.
• Mobile URL – when using a special URL (typically m.mydomain.com)
for the mobile version of your website.
• Country URL – when using multiple country specific URLs, the
content largely remains the same, with only a few minor differences.
This does not apply if the language is different, in which case you
want the search engines to return separate results.
31. Identifying duplicate contents issues
• Google Search Console : Google
Search Console is a great tool for
identifying duplicate content. If you go
into the Search Console for your site,
check under Search Appearance » HTML
Improvements
• Searching for titles or snippets:
site:example.com intitle:"Keyword X"
Let's say the full title of your article was 'Keyword X – why it is awesome':
site:example.com intitle:"Keyword X – why it is awesome"
32. Practical solutions for duplicate content
• Avoiding duplicate content
- Session ID’s in your URLs?
- Have duplicate printer friendly pages?
- Using comment pagination in WordPress?
- Parameters in a different order? (Refer URL Factory)
- Tracking links issues? (use hash tag based campaign tracking instead of parameter-based
campaign tracking.)
- WWW vs. non-WWW issues? (Pick one and stick with it by redirecting the one to the
other. )
• Redirecting duplicate content
•Using rel="canonical" links
•Linking back to the original content
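For instance, the WWW vs. non-WWW fix above can be implemented as a sitewide 301 rule with Apache's mod_rewrite (`mydomain.com` is a placeholder domain):

```apache
RewriteEngine On
# Redirect every non-www request to the www version, permanently
RewriteCond %{HTTP_HOST} ^mydomain\.com$ [NC]
RewriteRule ^(.*)$ https://www.mydomain.com/$1 [R=301,L]
```

Picking either variant works; the point is to choose one and redirect the other to it consistently.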
34. Page Speed
WHY IS PAGE SPEED IMPORTANT?
• It makes users happy and they spend more time on a website.
• It reduces operating costs.
• It increases sales (and transactions in general).
• Webpage loading speed is a critical element for your website’s success.
• It's one of the known ranking factors, and it's also an important usability metric.
• Nobody (people or bots) wants to spend their time waiting for your website
to load, especially when there are a dozen other websites on the same topic
that load faster.
37. HOW TO SPEED UP A WORDPRESS WEBSITE ?
#1 – DEACTIVATE UNNECESSARY PLUGINS
#2 – OPTIMIZE YOUR IMAGES AND VIDEOS
#3 – USE A CACHING PLUGIN
#4 – OPTIMIZE YOUR WORDPRESS DB
38. HOW TO SPEED UP A WORDPRESS WEBSITE ?
#5 – AVOID LOADING RESOURCES DIRECTLY FROM OTHER
WEBSITES IN YOUR WEBSITE HEADER.
#6 – CONSIDER CHANGING HOSTS
#7 - USE GOOGLE PAGE SPEED MODULE
40. Robots.txt
• Robots.txt is a text file webmasters create to instruct web robots (typically search engine
robots) how to crawl pages on their website.
• The REP also includes directives like meta robots, as well as page-, subdirectory-, or site-
wide instructions for how search engines should treat links (such as “follow” or
“nofollow”).
• robots.txt files indicate whether certain user agents (web-crawling software) can or
cannot crawl parts of a website. These crawl instructions are specified by “disallowing” or
“allowing” the behavior of certain (or all) user agents.
• In a robots.txt file, each set of user-agent directives appears as a discrete group, separated by a line break.
• You can generate a robots.txt file with a tool such as www.yellowpipe.com; for WordPress, paste it into the robots.txt box under WP-ADMIN > Settings, and for a normal website, upload it to the root directory.
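A minimal robots.txt illustrating the points above – each user-agent group is a discrete set separated by a blank line, and the sitemap location can be declared as well (the paths and domain are placeholders):

```
User-agent: Googlebot
Disallow: /private/

User-agent: *
Allow: /
Disallow: /wp-admin/

Sitemap: https://www.example.com/sitemap.xml
```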
42. Robot Meta Directives
• Robots meta directives (sometimes called “meta tags”) are pieces of code that
provide crawlers instructions for how to crawl or index web page content.
• There are two types of robots meta directives: the meta robots tag & the x-robots-tag.
• WHY ARE ROBOT META DIRECTIVES REQUIRED?
• Controlling the indexation of content not written in HTML (like Flash or video)
• Blocking indexation of a particular element of a page (like an image or video), but not of
the entire page itself
• Controlling indexation if you don’t have access to a page’s HTML (specifically, to the
<head> section) or if your site uses a global header that cannot be changed
• Adding rules to whether or not a page should be indexed (ex. If a user has commented
over 20 times, index their profile page)
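The two directive types mentioned above look like this in practice – the meta robots tag goes in a page's HTML `<head>`, while the x-robots-tag is sent as an HTTP response header, which is how you control indexing of non-HTML files (the PDF rule below is an illustrative Apache example):

```html
<!-- meta robots tag, placed in the page's <head> -->
<meta name="robots" content="noindex, nofollow">
```

```apache
# x-robots-tag, e.g. in Apache config, for all PDF files
<FilesMatch "\.pdf$">
  Header set X-Robots-Tag "noindex, noarchive"
</FilesMatch>
```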
43. Robot Meta Directives
• Indexation-controlling parameters:
• Noindex: Tells a search engine not to index a page.
• Index: Tells a search engine to index a page. Note that you don’t need
to add this meta tag; it’s the default.
• Follow: Even if the page isn’t indexed, the crawler should follow all the
links on a page and pass equity to the linked pages.
• Nofollow: Tells a crawler not to follow any links on a page or pass along
any link equity.
• Noimageindex: Tells a crawler not to index any images on a page.
44. Robot Meta Directives
• None: Equivalent to using both the noindex and nofollow tags
simultaneously.
• Noarchive: Search engines should not show a cached link to this page on a
SERP.
• Nocache: Same as noarchive, but only used by Internet Explorer and Firefox.
• Nosnippet: Tells a search engine not to show a snippet (i.e. meta description) of this page on a SERP.
• Noodp/noydir [OBSOLETE]: Prevents search engines from using a page's DMOZ description as the SERP snippet for this page. However, DMOZ was retired in early 2017, making this tag obsolete.
• Unavailable_after: Search engines should no longer index this page after a
particular date.
47. XML Sitemap
• In simple terms, an XML sitemap is a file that lists all important pages of
your website that search engine crawlers should know about.
• Even if you don't have a sitemap, search engines are still able to index
your website, but having a sitemap makes their job easier and gives you the
opportunity to let them know about pages and parts of your website they
may not discover easily.
• Different types of XML sitemaps:
• Posts
• Images
• Videos
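A minimal XML sitemap for posts might look like this (the URLs and dates are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/seo-guide/</loc>
    <lastmod>2020-01-15</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/about/</loc>
  </url>
</urlset>
```

Only `<loc>` is required per URL; `<lastmod>` helps crawlers spot updated pages.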
48. When is a sitemap really important?
• For websites with many pages, a sitemap helps search engines
discover pages that are new or updated.
• For websites that don’t use good internal linking practices a
sitemap is a great way to give information to crawlers about your
website pages.
• For new websites that don't have many incoming links, a sitemap
will serve as a discovery tool
50. Tools for XML Sitemap?
• Google Sitemap Generator: once activated, click the XML-SITEMAP option
under SETTINGS. You can leave the default settings for the rest of the options
except SITEMAP CONTENT.
52. RSS FEED /ATOM FEED
• RSS/Atom feeds are small, containing only the most recent updates
to your site.
• For optimal crawling, we recommend using both XML sitemaps and
RSS/Atom feeds.
• XML sitemaps will give Google information about all of the pages
on your site.
• RSS/Atom feeds will provide all updates on your site, helping
Google to keep your content fresher in its index.
• Note that submitting sitemaps or feeds does not guarantee the
indexing of those URLs.
54. When do you need to submit an RSS Feed to Google?
• Besides submitting an XML sitemap, there are cases where you can
submit an RSS feed to Google.
• When you have a large website with lots of pages that change frequently
(for example a News website), you can make use of an RSS feed to let
Google know of new additions to the website.
• The RSS feed will be smaller and will only include the new pages (thus it
will be processed faster), while your sitemap will include ALL your pages.
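As a sketch, the kind of small RSS feed described above carries only the latest items (the titles, links and dates below are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0">
  <channel>
    <title>Example News</title>
    <link>https://www.example.com/</link>
    <description>Latest articles</description>
    <item>
      <title>Newest article</title>
      <link>https://www.example.com/newest-article/</link>
      <pubDate>Wed, 15 Jan 2020 10:00:00 GMT</pubDate>
    </item>
  </channel>
</rss>
```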
55. Checklist
• Create an XML Sitemap
• Check which pages, post types are included in Sitemap
• Submit Sitemap to Google
• Add sitemap to robots.txt
• Check if sitemap has errors in Google Search Console
57. Anchor Text
• Anchor texts are a crucial part of SEO and strongly impact
your search engine ranking.
• Texts that link to another location or document on the Web
are referred to as anchor texts.
• In short, it’s clickable text in a hyperlink. For instance, in the
sentence “Wikipedia gives an overview of anchor texts,"
Wikipedia is the anchor text.
• As long as backlinks are an important SEO ranking factor,
anchor text is going to play a significant role. It's added for the
benefit of search engines, so that they can determine what the
“linked-to page” is about.
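In HTML, the anchor text is simply the visible text between the opening and closing `<a>` tags – using the Wikipedia sentence from above:

```html
<a href="https://en.wikipedia.org/wiki/Anchor_text">Wikipedia</a> gives an overview of anchor texts.
```

Here "Wikipedia" is the anchor text that search engines use as a clue to what the linked-to page is about.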
58. Types of Anchor Text
• Generic anchors
• Branded anchors
• Naked link anchors
• Brand + keyword anchors
• Image anchors
• LSI anchors
• Partial match anchor text
• Long tail anchors
• Exact match
59. Anchor Text
• Generic anchors are normal anchor texts like "click here" or "go here."
• A branded anchor is any anchor that uses the brand name as text.
Well-known brands and sites like Best Buy and Moz have the
highest number of branded anchor texts.
• Naked link anchors are anchors that link back to a site by simply using
the URL. For instance, https://moz.com/ and www.moz.com are
both examples of naked link anchors.
• Brand + keyword anchor– Using brand and keyword anchors is
another safe and effective method for building a strong anchor
profile. For instance, “SEO services by Moz” is an example of a
brand + keyword anchor.
60. Anchor Text
• Image anchors – To diversify your anchor profile even further,
you should consider using image anchors.
• LSI anchors – LSI stands for Latent Semantic Indexing. It basically
refers to the variations of your main keyword, or so-called
synonyms. For instance, if you’re targeting “Flower Shop” , LSI
keywords could include “Flower shop online”, “Florist Shop,”
“Florist Online," and so on.
• Partial match anchor text– The partial presence of keywords in
the anchor text is said to be partial match. Examples of partial
match anchors would be: “useful link building guide," “learn
more about link building," etc.
61. Anchor Text
• Long tail anchors– These are similar to partial match
anchors, except for the fact that they're longer. Some
examples could be: “anchor text is a crucial part of link
building" and “these rankings are a result of anchor text.“
• Exact match– Exact match anchors are easily the most
important type of anchor text. For instance, your target
keyword is “link building,” so your anchor is also “link
building."
62. Anchor Text Optimization
• Keep anchor texts relevant
• Distribute different types of anchors
wisely
• Don't link to toxic sites
• Avoid internal linking with keyword-rich anchors
• Write relevant guest posts
• Track your anchor texts: using monitoring tools
like SEMrush, Monitor Backlinks and Ahrefs could be helpful.
• Acquire links from the right sites