Tips To Create Search Engine Friendly Webpages

Things you need to remember about search engines:

Search engines survive on advertising. They make money through ads, and in advertising one rule
dominates: the more users see your ads, the better known your brand becomes and the more money
you make. Showing ads to more users is therefore more profitable, and a search engine attracts
additional users by giving them high-quality results. If your site is the most relevant answer to a
user's query, the search engine has every reason to rank it well – that is how it earns.

Search engines are automated programs. Each runs a program called a spider, which downloads your
web pages, reads them and decides what to do with them based on the information they contain.
This process is known as crawling. In simple words, search engine spiders are automated programs
that crawl web pages.

If you have ever used a computer, you have probably noticed that it slows down, or even hangs,
when you overload it with too many tasks at once. The same goes for search engine spiders: if your
web pages are disorganized, badly structured or arranged in a confusing way, they are hard for
spiders to crawl, and pages that are not crawled properly and regularly hurt your website's
rankings. Do not forget that there are billions of web pages on the World Wide Web, so search
engines cannot afford to spend much time on any one of them. It is your job to keep your pages
simple and quick to crawl; doing so minimizes the risk of your website being ignored by search
engines.

Now we shall see how to build highly spider-friendly web pages. This will not instantly boost your
rankings, but it will keep you from making the mistakes that indirectly hurt them by making your
pages hard for search engines to find and crawl.

HTML must be well-formatted

All of your HTML should be well formatted and correct. Clean markup will not by itself raise your
rankings, but broken code is hard for search engines to crawl, so keep your source code tidy and be
careful to close all relevant tags. Remember that a search engine spider is pre-programmed
software: it follows a fixed set of rules and is limited in what it can handle, so design your web
pages with those rules in mind to avoid problems.


The primary rules for a search-engine-friendly page are:

The easier we make it for search engine spiders to read our code, the better our chances of
success. In practice, this means avoiding two mistakes in HTML code:

1. Do not use HTML that is obsolete, archaic or proprietary to a single browser.
2. Do not use HTML so new that most spiders cannot yet recognize it.

Testing your web page against the World Wide Web Consortium (W3C) standards will tell you whether
it is well formed. Simply run it through the HTML validator first:

http://www.validator.w3.org/
If your page passes without any hassle, spiders should have no problem crawling it. This does not
mean your pages will get instant rankings; it only means they are free of errors and can be easily
crawled by spiders.

A valid DOCTYPE lets web browsers render the page consistently, lets search engine spiders identify
the markup quickly, and makes checking with the W3C validator straightforward. If your pages carry
a valid DOCTYPE, visitors will see them correctly across all types of browsers, which can translate
into minor gains in your search engine rankings.

A few search engine spiders have had trouble with XHTML DOCTYPEs in the past, so our suggestion is
to use the HTML 4.01 Transitional DOCTYPE, for example:

<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN"
"http://www.w3.org/TR/html4/loose.dtd">
(Note: the declaration above may also be written on a single line.)

For more information on fixing your pages with a valid DOCTYPE, see:
http://www.alistapart.com/articles/doctype/

Keep URLs simple

A search engine spider finds the pages of your website by following links. It first downloads a
page, scans it for links and places those links in a queue. It then takes the first link from the
queue and repeats the process until the queue is empty. This is a simplified description, of
course, but it gives you an outline of how the work is done.



Many websites, especially ecommerce sites, use dynamically generated URLs. This means the URLs are
produced automatically from variables stored in a database in order to identify the product a
customer is looking at. A dynamically built URL usually contains a number of odd-looking characters
such as ?, &, = and +.

For example, a bookseller might have a dynamically generated URL such as:
http://yoursite.com/index.php?item=books&author=shakespeare&genre=play

A static-looking URL is much easier to read:
http://yoursite.com/books/shakespeare/play

Or

http://yoursite.com/b/s/p


Do you know what indexing is?
As search engine spiders crawl the web, they collect data and store it in a database called an
index. Gathering and storing a web page in this way is known as indexing.

Search engine spiders keep improving their ability to crawl long dynamic URLs. Even so, short,
static URLs are more likely to be crawled thoroughly, and to get more pages indexed, than long
dynamic ones.
Even if a website draws all of its content dynamically from a database to build its URLs, it is
still possible to present URLs that look static by using a tool such as mod_rewrite, as in the
sketch below.
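
As an illustration, here is a minimal mod_rewrite sketch for the bookseller example above. It
assumes an Apache server with mod_rewrite enabled and an index.php script that reads item, author
and genre from the query string – both assumptions, so adjust the pattern to your own setup:

# .htaccess – map the static-looking /books/shakespeare/play onto the dynamic script
RewriteEngine On
# Leave requests for real files alone
RewriteCond %{REQUEST_FILENAME} !-f
# /books/shakespeare/play -> index.php?item=books&author=shakespeare&genre=play
RewriteRule ^([^/]+)/([^/]+)/([^/]+)/?$ index.php?item=$1&author=$2&genre=$3 [L,QSA]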

Google Sitemaps offers an alternative to mod_rewrite: it lets you tell Google which pages you want
crawled, so the crawler can be pointed at the pages you mark as essential instead of the
full-length dynamic URLs.

Of course, a Google Sitemap does nothing to help other search engines crawl your pages, which is
why a tool like mod_rewrite remains the more versatile option.

Important: Do not change your URLs casually if your site is already ranked and indexed by search
engines. Changing your website's URLs carelessly can have serious consequences: if you do not tell
the search engines about the change and they can no longer find your pages, they will drop your
site from the rankings it holds.

If you do change your website's URLs, make sure you redirect the old URLs to their new location, so
that search engines and visitors are sent to the new URL automatically. This saves you from losing
both your rankings and your customers; a sketch of such a redirect follows.
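
On an Apache server, for instance, a permanent (301) redirect can be declared in the .htaccess
file. The paths below are only placeholders for your own old and new locations:

# .htaccess – permanently redirect an old page to its new address
Redirect 301 /old-books-page.html http://yoursite.com/books/shakespeare/play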

Stay watchful over session IDs

Session IDs let a website track a visitor from page to page; they are embedded in URLs and act as
unique identifiers. For example, when you visit an ecommerce site, a session ID keeps a record of
everything you add to your shopping cart. Because session IDs track every visitor separately, they
generate a huge number of distinct links, which makes them tedious for spiders to crawl. Worse, a
spider risks indexing the same page over and over and getting trapped in a loop of dynamically
generated URLs.

To make the concept clearer, here is an example of how session IDs turn one page into many links. A
session ID link looks like:

http://www.yoursite.com/name.cgi?id=dkom5678kle09i

Such a link is handed to the spider when it first downloads a page. By the time the spider returns
to the site to download more pages, it finds another URL that looks like:

http://www.yoursite.com/name.cgi?id=ok8787ijidj87k

This is the same page with a different tracking variable. The problem is that it looks like a
brand-new URL to the spider, so the same content gets downloaded again and again. For this reason,
spiders tend to ignore links that appear to contain session or tracking variables.

Google makes this explicit, stating that it does not crawl such pages:

“Don’t use “&id=” as a parameter in your URLs, as we don't include these pages in our index.”
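
If a particular page does not genuinely need the session or tracking parameter, one possible
workaround is to redirect any request carrying an id= parameter to the clean, parameter-free
version of the same address using mod_rewrite. The rule below is only a sketch – it drops the
entire query string, so do not apply it to pages that rely on the session ID:

# .htaccess – send URLs carrying an id= tracking parameter to the clean URL
RewriteEngine On
RewriteCond %{QUERY_STRING} (^|&)id=
# The trailing "?" strips the query string from the redirect target
RewriteRule ^(.*)$ /$1? [R=301,L]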

Make use of a flat directory structure

A flat directory structure is one with few levels of sub-directories. The deeper a page sits in the
directory tree, the lower its chances of being spidered. Let's look at an example:
http://yoursite.com/content/articles/2010/09/page.html

Web pages buried this many directory levels deep are at a disadvantage when it comes to ranking
highly on search engines, apart from exceptionally popular sites with plenty of incoming links.

Directories are commonly used to organize a site's layout, and the reason pages end up three or
four levels deep is that a completely flat structure can be tedious to arrange logically. A
well-placed site map becomes essential if your directory structure demands it.


Note: If your directory structure is already well established and your pages rank reasonably well
on search engines, there is no need to change anything at all. If you move pages without informing
the search engines, they will be dropped from the search engine listings. But if you do want to
move your pages into a flatter structure to rank higher, make sure you redirect them to their new
location so that both search engines and customers can find the new pages.


Semaphore is primarily a web design company that offers search engine friendly website design
services and SEO services. We are experts in website designing, portal development, web
re-designing and corporate branding.
