I think I am really an amateur in this field. I created this document after long research; maybe it will help
people like us who want to build their career in the SEO field.
Optimization Techniques in 2013
Keyword optimization:
Importance level: 1/5 (it doesn't affect ranking but helps in making pages unique)
Do's: Limit it to 3 to 10 keywords. It can make the pages unique and help the search engine
pick up the right page for a keyword.
Don'ts: Don't stuff it with keywords.
Quality check: Check the webmaster consoles and look for duplicate meta keywords. Also
make sure that the targeted pages have proper titles.
Progress check: See how many meta descriptions are fixed. Some of the pages may go
through many changes, so we can keep track of that too.
So how can you choose the right keyword for your site: Many tools are available today to choose
keywords, but I think the best tool is the Google Keyword Tool, https://adwords.google.com
Related Site:
http://tools.seobook.com/keyword-tools/seobook/
http://www.wordtracker.com/
Title Optimization: A title tag is the main text that describes an online document. It is the single
most important on-page SEO element (behind overall content).
Title tags, technically called title elements, define the title of a document and are required for all
HTML/XHTML documents.
Code Sample
<head>
<title>Example Title</title>
</head>
Optimal Format:
Primary Keyword - Secondary Keyword | Brand Name
or
Brand Name | Primary Keyword and Secondary Keyword
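Filled in with made-up keywords and a placeholder brand, the first optimal format would look like this in the page's head:

```html
<head>
  <title>Running Shoes - Trail Running Shoes | Acme Sports</title>
</head>
```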
Best Practices
Less than 70 characters, as this is the limit Google displays in search results
Title appears in three key places:
Browser: Title tags show up both at the top of a browser's chrome and in the applicable tabs.
Search Result Pages: Title tags also show up in search engine results.
External Websites: Many times, external websites (especially social media sites) will use the title of a
web page as its link anchor text.
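Since the 70-character display limit is easy to overshoot, here is a minimal Python sketch that flags titles likely to be truncated in search results (the sample titles are made up for illustration):

```python
# Flag title tags longer than Google's ~70-character display limit.
TITLE_LIMIT = 70

def title_too_long(title, limit=TITLE_LIMIT):
    """Return True when a title is likely to be truncated in search results."""
    return len(title) > limit

# Made-up example titles
titles = [
    "Example Title",
    "Primary Keyword - Secondary Keyword | Brand Name",
]
for t in titles:
    status = "TOO LONG" if title_too_long(t) else "ok"
    print(f"{len(t):3d} chars  {status}  {t}")
```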
Meta description Optimization: Meta Descriptions, which are HTML attributes that provide
concise explanations of the contents of web pages, are commonly used by search engines on search result
pages to display preview snippets for a given page.
Code Sample
<head>
<meta name="description" content="This is
an example of a meta description. This will often
show up in search results.">
</head>
Optimal Length for Search Engines: Roughly 155 Characters
Good meta descriptions can help us get better click-through rates on search engine result pages, which
helps with ranking. If a certain site gets clicked more often in search results, search engines give
more value to it. For example, when people see Wikipedia on the result pages they click on it even when it is
ranking very low, because people like Wikipedia; thus search engines promote Wikipedia above others. You
can put an email id or phone numbers in the meta description, and show your strengths there.
See the keywords that are bringing traffic to a page and search with each keyword to check whether the
description that appears in the search engine is proper.
Avoid duplicate meta description tags (as with title tags, it is important that the meta description on
each page be unique).
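To spot duplicate meta descriptions across a site, here is a minimal Python sketch, assuming you have already collected each page's description (e.g. from a crawl); the URLs and text below are made up:

```python
# Group page URLs by meta description and report descriptions shared
# by more than one page. The page data here is a made-up example.
from collections import defaultdict

def find_duplicate_descriptions(pages):
    """Return {description: [urls]} for descriptions used on 2+ pages."""
    groups = defaultdict(list)
    for url, description in pages.items():
        groups[description.strip().lower()].append(url)
    return {d: urls for d, urls in groups.items() if len(urls) > 1}

pages = {
    "/home": "Welcome to our site.",
    "/about": "Welcome to our site.",   # duplicate: should be unique
    "/contact": "Contact our team by phone or email.",
}
for description, urls in find_duplicate_descriptions(pages).items():
    print(f"Duplicate description on {urls}: {description!r}")
```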
4. Related Tools:
http://www.seomoz.org/seo-toolbar (The mozBar makes it easier to see relevant SEO metrics
as you surf the web)
http://www.w3schools.com/tags/tag_meta.asp (W3Schools documentation on meta
tags, including meta descriptions)
http://www.bing.com/community/site_blogs/b/webmaster/archive/2009/07/18/head-s-up-on-
lt-head-gt-tag-optimization-sem-101.aspx
http://www.seochat.com/seo-tools/meta-tag-generator?tool=3/?tool=3 (If you're new to web
development and search engine optimization, you may find this tool useful to ensure that your
meta tags are correctly formed)
Meta Tag Analyzer (http://www.seochat.com/seo-tools/meta-analyzer/): this tool will analyze a
website's Meta tags. Analyzing a competitor's keyword and description Meta values is a good
way to find ideas for key terms and more effective copy for your site.
http://tools.seobook.com/meta-medic/
Meta content-language tag: If you are directing your website’s contents toward a specific
language-speaking audience, you can specify the language of your content using the <meta> tag’s
content-language attribute. For example, for a target audience of American English speakers, you would
add the following tag to the <head> section of all your pages:
<meta http-equiv="content-language" content="en-us" />
Implementation of ISO Language Codes: According to the W3C recommendation you should declare
the primary language for each web page with the lang attribute inside the <html> tag, like this:
<html xmlns="http://www.w3.org/1999/xhtml">. After implementing the lang attribute it
should look like: <html lang="en" xmlns="http://www.w3.org/1999/xhtml">
For more information please visit: http://www.w3.org/International/articles/language-tags/
Noindex, or use robots.txt
Website owners use the /robots.txt file to give instructions about their site to web robots; this is
called the Robots Exclusion Protocol.
It works like this: a robot wants to visit a website URL, say
http://www.example.com/welcome.html. Before it does so, it first checks for
http://www.example.com/robots.txt, and finds:
User-agent: *
Disallow: /
The "User-agent: *" means this section applies to all robots. The
"Disallow: /" tells the robot that it should not visit any pages on the site.
So how can you create a robots.txt: You can use the tool at
http://tools.seobook.com/robots-txt/generator/ to create the robots.txt file.
The txt file looks like:
User-Agent: *
Disallow:
Disallow: /accountantsinlondon.com/partners
After creating this file you need to upload it to the root folder, called public_html.
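You can also sanity-check a robots.txt with Python's standard urllib.robotparser module, which answers whether a given crawler may fetch a given URL; the rules and URLs below are a made-up example:

```python
# Check which URLs a robots.txt would block, using the standard library.
import urllib.robotparser

# Made-up example rules (normally fetched from /robots.txt)
rules = [
    "User-agent: *",
    "Disallow: /partners",
]
parser = urllib.robotparser.RobotFileParser()
parser.parse(rules)

print(parser.can_fetch("*", "http://www.example.com/welcome.html"))  # True
print(parser.can_fetch("*", "http://www.example.com/partners"))      # False
```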
Keyword density: Keyword density is a very important factor. However, it's more important
to ensure that your target keywords appear in key places, such as:
Page title
H1
H2
H3
Main body text
Internal and external links pointing to your page
Image alt tags
etc
Also don't forget that content is for users as well as search engines, so forcing keywords into the content to
increase density could devalue it.
But remember that we write content for humans, not for Google, so you have to maintain a proper
density, otherwise it will look spammy. So it's very important to check the keyword density to
avoid an over-optimization penalty.
So what is the ideal percentage for keyword density?
Answer: one-word density 1% to 6%; two-word density 5% to 20%. Don't go above 20% density for
any keyword; ideally it should be between 3% and 10%.
So how can you calculate keyword density:
Keyword Density = (Nkr / Tkn) x 100
Where:
Density = your keyword density
Nkr = how many times you repeated a specific keyword
Tkn = total words in the analyzed text
Keyword density = (Nkr / Tkn) x 100
= (15 / 500) x 100
= 0.03 x 100
=3
Keyword density = 3%!!!
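The same calculation as a small Python function, using the numbers from the worked example:

```python
def keyword_density(keyword_count, total_words):
    """Keyword Density = (Nkr / Tkn) x 100."""
    return (keyword_count / total_words) * 100

# 15 repetitions of a keyword in a 500-word text, as in the example above
print(round(keyword_density(15, 500), 2))  # 3.0
```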
Key-phrases Density Check
Density = (Nkr x (Nwp / Tkn)) x 100
Where:
Density = your keyword density
Nkr = how many times you repeated a specific key-phrase
Nwp = number of words in your key-phrase
Tkn = total words in the analyzed text
So, again, if we take a "Waffles in Delaware" example: there are three words in the key-phrase and it
has been used three times amidst a total word count of 500 words.
Density = (Nkr x (Nwp / Tkn)) x 100
= (3 x (3 / 500)) x 100
= (3 x 0.006) x 100
= 0.018 x 100
Density = 1.8%
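And the key-phrase version as a Python function, with the "Waffles in Delaware" numbers plugged in:

```python
def phrase_density(phrase_count, words_in_phrase, total_words):
    """Density = (Nkr x (Nwp / Tkn)) x 100."""
    return (phrase_count * (words_in_phrase / total_words)) * 100

# 3 occurrences of a 3-word phrase in a 500-word text
print(round(phrase_density(3, 3, 500), 1))  # 1.8
```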
Useful Tools
http://www.keyworddensity.com/search_engine_optimization/keyword_density.cgi
http://www.live-keyword-analysis.com/
XML Sitemap:
If you have a website, you should be using an XML Sitemap to help improve your Internet visibility. What
is an XML Sitemap? It is a simple, effective way for you to give the search engines a list of all the URLs
you want them to crawl and index. If the search engines don’t find your site or specific pages on it, your
prospects won’t either!
So how will you create an XML sitemap for your site: It's very easy.
Use http://www.xml-sitemaps.com/ to get an XML sitemap for your site.
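For reference, a minimal sitemap in the sitemaps.org format looks like this (the URL and date below are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2013-01-01</lastmod>
  </url>
</urlset>
```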
How to Submit an XML Sitemap?
After creating the sitemap, upload your file to your main (root) folder: public_html on a Linux server, www on a
Windows server. Once you've created a sitemap in an accepted format, you can submit it to Google
using Google Webmaster Tools: https://www.google.com/webmasters/tools
XML sitemap for Blogger: http://ctrlq.org/blogger/
Useful Tools
http://www.xml-sitemaps.com/
http://ctrlq.org/blogger/
https://www.google.com/webmasters/tools
Broken links
Basically, broken links are of two types: internal broken links and external broken links. Internal broken links are quite
likely a quality issue; always fix those as soon as possible. External broken links may not be quite so
important, but it's a good idea to run a check every few months and fix what you can.
These two are easy to fix. Use a crawler like Xenu or Screaming Frog to find internal and outgoing link
issues, and fix them.
For incoming links (backlinks), register your website with Google Webmaster Tools and check
Diagnostics -> Crawl Errors. There you will see who has broken links pointing to your website.
Useful Tools
http://home.snafu.de/tilman/xenulink.html
http://www.brokenlinkcheck.com/
http://validator.w3.org/checklink
If you have WordPress: http://wordpress.org/extend/plugins/broken-link-checker/
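If you want to script a basic check yourself, the first step is collecting every link on a page. Here is a minimal sketch with Python's standard html.parser (the HTML below is a made-up example); each collected URL could then be fetched and its HTTP status inspected:

```python
# Collect every href on a page as the first step of a broken-link check.
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        # Record the href attribute of every anchor tag
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

# Made-up example HTML
html = '<p><a href="/about">About</a> <a href="http://example.com/gone">Old</a></p>'
collector = LinkCollector()
collector.feed(html)
print(collector.links)  # ['/about', 'http://example.com/gone']
```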
Canonical issues
It is important to represent the content with one URL rather than many URLs; multiple URLs divide the link
strength. Use .htaccess 301 redirects or other redirection methods to consolidate to one URL. You can also
use the canonical tag to make life easier.
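For example, the canonical tag goes in the head of each duplicate variant and points at the preferred URL, and on an Apache server a 301 redirect can be added in .htaccess (both URLs below are placeholders):

```html
<!-- In the <head> of each duplicate page -->
<link rel="canonical" href="http://www.example.com/preferred-page/" />
```

```apacheconf
# .htaccess: permanently redirect the old URL to the canonical one
Redirect 301 /old-page.html http://www.example.com/preferred-page/
```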
Sudip Nandy