Submitted by: Malarkodi

malarkodiseo@gmail.com
What is a search engine and how does it work?
On the Internet, a search engine has three parts:
1. A spider (also called a "crawler" or a "bot"), which travels to every page or representative page on every searchable web site, reads it, and then, using the hypertext links on those pages, travels through the other pages linked by that web site.

2. A catalog or index, which is created by programs compiling the pages read from those web sites, and...

3. A program which receives your search request, compares it to the entries in the index, and returns the results to you.

An alternative to using a search engine is to explore a structured directory of topics. Yahoo, which also lets you use its search engine, is the most widely-used directory on the Web. A number of Web portal sites offer both the search engine and directory approaches to finding information.

Not all search engines are created equal, but all of them have a few basic components that are essential to their use. Some components are more visible than others to the average user, but all of them must work in tandem to create a high-performance search tool. The three basic actions that have to be performed for a search engine to be useful are: gather information, analyze information, and display information. The only major difference between the major search engines is how these tasks are performed and how often they are performed.

Gathering information
Spiders are the programs that search engines use to collect information about web sites on the Internet. These programs traverse the World Wide Web gathering the content of web sites and store that information for later processing.

There are two basic ways that spiders can find your web site. You can tell the search engine
about your web site, or let it find your site on its own. Typically search engines will have a place
on their web site which allows you to suggest a site to them. After a site has been suggested, the
search engine's spider will visit that web site to collect information about it. Spiders also follow
the links on each web site to find linked sites to visit. This is how a spider will find your site by
itself. The more web sites that link to your site, the more likely it is that a spider will find your
site without you telling it your site's URL.

Search engine spiders will usually revisit your site when you submit your URL again, when the
spider finds a new link to your site, or after a specified amount of time has passed since its last visit.
Depending on the number of web sites that the spider needs to visit and the resources that the
spider has at its disposal, it can take days or months for a spider to visit or revisit your web site.

Displaying information
Search engines take a search request from a user and display a list of web pages that relate to that
topic. These returned sites give clues to the algorithm used to analyze the web pages in the
search engine's index. When a search engine displays the file size of the web page or a percentage
next to the web site, that information can be used to help figure out how to better optimize your web pages for
that search engine. Some search engines return results in order of relevance; others mix up
the results to make sure the web sites returned come from different sites. No matter how a search
engine displays the information requested by a user, this result is typically the first impression of
your web site. It is important to follow any guidelines that search engines give and do research
on how each search engine analyzes web pages so that you not only get a good ranking for your
search, but the description of your site is accurate as well.

What is SEO?
SEO = Search Engine Optimization, i.e., getting your site ranked higher so more people show up at
your doorstep.
In theory we’re interested in all search engines. In practice SEO = Google.

What are SERPs?
SERPs is an acronym for Search Engine Results Pages. Basically they are the search results you
receive when doing a search at a search engine.

What is anchor text? Why is it important?
Anchor text is the visible hyperlinked text on the page. For example, let's examine this code:

<a href="http://www.sitepoint.com/forums/">Webmaster Forums</a>

The anchor text for this link is "Webmaster Forums". This is important in search engine rankings
because the search engines use anchor text to help determine the relevance of a page being linked
to for those keywords. By having this link pointing to their forum's web page, SitePoint Forums
will perform better in searches for the phrase "webmaster forums" (and other similar phrases as
well).

What are Meta tags?
Meta tags appear in your site's pages and, in some cases, in search results. Meta tags help describe
your site to the search engines. They also help the people browsing understand what your site's
pages are about and whether those pages satisfy their search needs.

The large majority of search engines do not use Meta Tags as part of their ranking algorithm.
Some will claim Google uses Meta tags in its algorithm. This is entirely untrue.

Google, however, will use a meta description tag if it is unable to discern a description for a
webpage on its own (if the page has no text and no description in the open directory [dmoz] it is
likely Google will use the meta description tag in its SERPs). Please note that it is only using this
description in its SERPs, not its algorithm.
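For reference, here is a minimal sketch of what the description and keywords meta tags look like in a page's <head> (the wording and keywords are made-up placeholders):
Code:
<head>
<meta name="description" content="Hand-sewn American flags in all sizes for your home or office.">
<meta name="keywords" content="american flags, flags, buy flags online">
</head>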

Should you use Meta Tags in your site?
Yes. They do have some effect in some search engines, and even though that effect is almost zero,
it is still more than zero, so it is worth the time.

How much time should I spend on my Meta Tags?
Ten minutes. Write a nice concise description of your page and throw in a sampling of keywords
(which you should have handy if you've optimized your pages properly). You should spend no
more time than this on them. Use your time to promote your site and get quality inbound links.

How many keywords should I use?
As many as you want. If you start to think you may have too many, you probably do. This means
you need to divide your page into subpages with each one taking its own topic.
Which are the most important areas to include your keywords?
The page title and body text are the most important areas in which to include keywords for SEO
purposes.

What is a Title?
The "Title" of a web site is probably the single most important element for natural search engine
positioning. The Title is placed within the "head" of the html, is generally 12-15 words long and
should be descriptive in nature.
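As a sketch, the Title is simply the <title> element inside the <head> of the page; the wording below is a made-up placeholder:
Code:
<head>
<title>Maine Boats - New and Used Boats for Sale in Maine</title>
</head>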

What is a keyword?
A "keyword" or "keyword Phrase" is the word or words a person types into the search box on a
search engine to look up subject matter on the Internet. If you are looking for a flag for your
home or office, you might type in "American Flags". The Search Engine screens its database for
the web sites it has obtained and looks for the words "American Flags". It then finds the web
sites it believes to be a match and displays them in order of relevancy. With proper design of a
web site, you should have a
keyword meta tag area within the head of your html to list the words or "keywords" which best
describe your web site. It is important to reflect carefully when choosing your keywords. If you
sell boats, but you are only licensed to do so in Maine, then your keywords might best be "boats
for sale in Maine" or "Maine Boats", etc.

What is a Description?
The "Description" of your web site also resides within the "head" of your html and is usually a
sentence or two containing approximately 15 words which best describe your web site.

What is "body content relevance"?
"Body content relevance" is the written "non-image" text on the page of the web site which is
descriptive in nature and relates to the title, description and keywords. It is not mandatory to
have relevant body content, but it most definitely will assist your ranking on the search engines.
What is an algorithm?
An algorithm (pronounced "AL-go-rith-um") is a procedure or formula for solving a
problem. The word derives from the name of the Persian mathematician, Al-Khowarizmi (825
AD). A computer program can be viewed as an elaborate algorithm. In mathematics and
computer science, an algorithm usually means a small procedure that solves a recurrent problem.

What is Google Webmaster Tools?
Google Webmaster Tools is a no-charge web service by Google for webmasters. It allows
webmasters to check indexing status and optimize their websites. It has tools that let the
webmasters submit and check sitemaps, generate and check robots.txt files, list internal and
external pages linking to the site, view statistics related to how Google crawls the site, and more.
In short, Webmaster Tools is a free service from Google that provides indexing data, backlink
information, crawl errors, search queries, CTR and website malware alerts, and lets you submit
an XML sitemap.

What is Google Analytics?
Google Analytics helps you analyze visitor behavior on your site. Analytics tools can
tell you how many visitors you had each day, what pages they viewed, how long they spent on
each page, and so on. Google Analytics is an invaluable tool in helping to augment your site's ability to
attract visitors.
What’s an XML sitemap?
An XML sitemap is a list of pages of a web site accessible to crawlers or users. It lists the pages
on a web site, typically organized in hierarchical fashion. This helps visitors and search engine
bots find pages on the site.
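As a sketch, a minimal XML sitemap with a single entry might look like this (the URL and values are placeholders):
Code:
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2012-01-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
</urlset>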

What is a robots.txt file?
A robots.txt file on a website will function as a request that specified robots ignore specified files
or directories in their search. This might be, for example, out of a preference for privacy from
search engine results, or the belief that the content of the selected directories might be misleading
or irrelevant to the categorization of the site as a whole, or out of a desire that an application only
operate on certain data.
If you do not wish to block any files from the search engines then you do not need to use a
robots.txt file. Having one will not improve your rankings by itself nor make your site more
attractive to the search engines. In fact, you should only use one if you absolutely need it as an
error in your robots.txt file may result in important pages not being crawled and indexed and you
will never know unless you check your file for errors at some point in the future.

If you do want to use a robots.txt file to prevent 404 errors in your logs make this the only
content in your file:
Code:
User-agent: *
Disallow:

What is keyword proximity?
Keyword proximity is a measure of how close together the keywords appear within the page
title, meta description and body text.

What is keyword prominence?
Keyword prominence is the location of the keywords in the page title, meta description and body
text.

Difference between exit rate and bounce rate?
Bounce rate is the percentage of people who leave a website after viewing only a single page on
it, while exit rate refers to the percentage of people who leave the site from a particular page.

What was caffeine update?
The Caffeine update was rolled out by Google in June 2010. Its main purpose was to provide
fresher results in the search index (Google claimed at least 50% fresher results).
What is 301 redirect?
A 301 redirect tells search engine spiders that a page of content has permanently been moved to
another location. This helps ensure that links pointing to the old location do not become 'dead' or
non-working links on pages within your site.

What is 302 redirect?
A 302 redirect is a temporary redirect. It tells search engines that a page has been moved
temporarily, so the original URL should be kept in their index.
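As a hedged sketch, one way to set this up in an Apache .htaccess file, in the same style as the 301 examples later in this document (the paths are placeholders):
Code:
Redirect 302 /old-page.html http://www.yourdomain.com/temporary-page.html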

What is 404?
It is an HTTP status code returned by the server when a particular webpage or file is
missing from the webhost server.

What is Page Rank and is it important?
Page Rank is a way for search engines to ‘grade’ a site and its associated pages. No one is
completely certain as to the ‘exact’ science of how page rank is formulated, but it is understood
that a number of elements such as age, number of backlinks, and amount of content, are used to
formulate it.

What is a Landing Page?
A landing page puts your customers close to the final sale. A good landing page offers intriguing
copy and an opportunity for your visitors to make a purchase or complete a desired conversion,
depending on the desired end result of your site's existence.


What is referrer spam?
Referrer spam is when a spammer sends fake referrers to your server. They do this because they
know most web stats programs list referrers as hyperlinks. They then submit your stats to the search
engines in the hopes that they will crawl your stats and find that link. They also hope you click
on that link yourself.

What is a doorway page?
Doorway pages are web pages that are created to rank high in search engine results for particular
phrases with the purpose of sending you to a different page. They are also known as landing
pages, bridge pages, portal pages, zebra pages, jump pages, gateway pages, entry pages and by
other names.

What is cloaking?
Cloaking is a search engine optimization technique in which the content presented to the search
engine spider is different from that presented to the user's browser; this is done by delivering
content based on the IP address or the User-Agent HTTP header of whatever is requesting the page. The only
legitimate uses for cloaking used to be for delivering content to users that search engines couldn't
parse, like Macromedia Flash. However, cloaking is often used to try to trick search engines into
giving the relevant site a higher ranking; it can also be used to trick search engine users into
visiting a site, based on the search engine description, that turns out to have substantially
different - or even pornographic - content. For this reason some search engines threaten to ban
sites using cloaking.

Hidden Text/Hidden DIVs
Hidden text/DIVs are only bad if you are using them to manipulate the SERPs. There are many
practical uses of hidden text/DIVs that enhance a web page without being malicious.
Good uses of hidden text/DIVs: Dynamic menus, dynamic page content
Bad uses of hidden text/DIVs: Text that is present on the page but cannot be viewed by human
beings at any time
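As a sketch of a legitimate use, a dropdown menu is often hidden until the visitor hovers or clicks; the class names and links below are made up for illustration:
Code:
<ul class="menu">
  <li>Products
    <!-- hidden by default, revealed by CSS :hover or JavaScript -->
    <ul class="submenu" style="display: none;">
      <li><a href="/american-flags.html">American Flags</a></li>
      <li><a href="/flag-poles.html">Flag Poles</a></li>
    </ul>
  </li>
</ul>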

What's on-page SEO?
On-page SEO refers to the things you do on your own site to enhance its ranking in the search
engines (a brief markup sketch follows the list). This includes but is not limited to:
• Creating content around specific keywords.
• Formatting/designing your site so that the most important keywords are emphasized and appear near the top of the page.
• Including the chosen keywords in meta tags.
• Including the keywords in the navigation menu and other links.
• Using your keywords in other parts of your site, such as the title of the page, the file name, etc.
• Using related keywords on the site (see the question on LSI for more information).
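A minimal sketch showing several of these on-page elements together on one page (the keyword "american flags" and the file names are placeholders, not recommendations):
Code:
<html>
<head>
<title>American Flags - Buy American Flags Online</title>
<meta name="description" content="Quality American flags for your home or office.">
</head>
<body>
<h1>American Flags</h1>
<p>Our American flags are hand-sewn and made to last...</p>
<a href="american-flag-history.html">The history of American flags</a>
</body>
</html>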

What's off-page SEO?
Off page SEO refers to those things you do outside of your own web pages to enhance their
rankings in the search engines.
This is a glorified way of saying, “get links” and did I mention, “more links”.

What's the difference between SEO and SEM?
While some people use SEO and SEM interchangeably, SEO (search engine optimization) is
actually a part of SEM (search engine marketing).
SEO refers to the process of using on and off page factors (typically free) to get your web pages
ranked for your chosen keywords in order to get more search engine traffic to your sites. SEM
takes it a step further to include using paid search engine listings and paid inclusion to get more
traffic to your websites.

What's the difference between paid and organic search listings?
Organic search engine listings are the main results users see when they do a Google search. The
websites appearing in the organic listings appear because those sites are most relevant to the
user’s keywords. Indeed, most of these sites appear in the top of the search engine results
because the webmasters of these sites have used SEO tactics to ensure top rankings.
The paid (or “sponsored”) listings usually appear on the top, bottom and to the right of the
regular organic listings. Usually these are pay per click (PPC) ads, which means the website
owner only pays when someone clicks on his ad (as opposed to paying for impressions).
This isn’t an either/or game. Just because you do SEO doesn’t mean you can’t/shouldn’t use PPC
and vice versa.
SEO is not free traffic; it takes time and/or money to get good organic rankings, but in the long
run it's usually cheaper than PPC.

Why do I need SEO services?
SEO services help your site rank better in the search engines. Better rankings drive more traffic
to your site, creating the ability for better exposure and revenue streams.

How does Google view my site?
Google crawls each site often and sporadically using ‘spider bots.’ The bots read your pages and
help Google catalog your site and its associated pages.

How long does it normally take to see SEO Results?
Many sites usually engage in an SEO program for at least six months in order to achieve good
results, yet ‘desired results’ will vary with each client. Patience is one of the best aids to an SEO
campaign. It may be best to understand that SEO tactics are ‘working’ all the time to achieve
your desired results; and, once your SEO results are achieved, they are long lasting.

What is link popularity?
Link popularity refers to the number of web pages on the Internet which are recognized by a
search engine to have a hyperlink reference to your site, or in other words are "pointing" to your
web site as a reference.

How do backlinks affect my rankings?
Backlinks help improve your rankings. Search engines see backlinks as positive ‘votes’ for your
site. Search engines strongly associate your site's backlinks with your site's ability to satisfy a
searcher's wishes.

What is a quality link?
A quality link is:
1) On topic (The page linking to your page is about the same main topic)
2) Ranked well for the keyphrase you are after (In the top 1,000)
3) Contains the keywords you wish to rank well for
4) Has high PR (PR 4 or higher)
I left out high traffic because that is irrelevant from an SEO point of view. But if you're looking
at the big picture that would be #5.

How many backlinks do I need?
There is no fixed, ‘golden’ number of backlinks. Ideally you want to acquire backlinks from
reputable sites in an ongoing fashion.

How do I get a lot of backlinks to point to my site?
A good place to start is to submit to directories. Start with the free ones and then decide whether
pay ones are worth it for you. Free directory lists are available that sort directories by PR, Alexa
rank (worthless), and more.

What is the best criterion to identify the value of a backlink?
The authority of the domain, quality of the content on the page where the backlink is provided
and then the page rank of the website.

I was thinking of doing <seo trick here> to my site but I'm afraid the search engines might
think it is spam. Should I do it?
No. Why? If you're not sure if it will get you in trouble with the search engines or not then it's
probably something you shouldn't be doing. Another good reason not to do it is accessibility.
Many webmasters employ hacks and tricks in an effort to increase their search engine ranking.
Oftentimes these tricks come at the expense of the usability of their website, not only for those who
have disabilities but for anyone who is trying to navigate the site.

What does the Submission Process Actually Do?
The submission programs send your web site address ("URL") to search engines and link
directories using what are referred to as add-a-URL strings. After receiving the URL, the engines use a
"spider" to parse through the HTML code looking for tags that begin with "<a href=". After the
entire page has been parsed, a small "web bot" travels the links it found, searching for more
links using the same procedure until all of the pages at that URL address have been found.

When will my Submissions appear on the engines?
Every engine and directory is different. In some cases, your submission will appear within a few
days. In other cases your submission may take much longer, and in some instances your web site
may never get listed from that submission. Because of this, the idea is that the more engines you
submit to, the better your visibility will be and if you submit regularly (every month), you have a
better chance of getting added to the engines that didn't add you the last time. Many engines and
directories put you in a queue. Some will manually add you when they get a chance. Some will
wait to check your site out for content.

What is the difference between submission and placement, and when will my first-page paid
placement appear on the search engines?
With search engine submission, we do not guarantee that a search engine will place your web
site. With search engine placement, we ask for you to allow ten days for placement on the search
engines. You will receive a ranking report at the email address you provided on your order form.

What happens if I use includes for my pages? Will the search engines see them?
The search engines don't care about what server side technology you use. All they see is the
(x)HTML your server side code generates. To see what they see simply load your page in your
favorite web browser and then view the source. What you see is exactly what they see.

Should I submit my website to the Search Engines by hand or use software?
Do it by hand. It will not take long to do and will ensure that you are successful in submitting
each form with the correct information. There is a constant debate about how search engines feel
about automated submission software. Since there is a reasonable chance these are frowned upon
by the search engines, and since you can do anything they can do on your own, you might as
well avoid them.
How often should I submit my website to the search engines?
Once. Resubmitting your URL does not get you indexed faster or improve your rankings. Nor
will resubmitting your site ever cause it to be banned. If it did, all you would need to
do is submit your competitors' sites repeatedly until they were banned.

Pages with a .php (or other dynamic) extension don't rank as well as .html pages
This is a very common myth that is 100% untrue. The file extension does not affect your
rankings in any way. After all, no matter what server side programming language you use, and
what extension you choose to use with it, they all just spit out HTML in the end. That's all a web
browser will see and that all a search engine will see.

Sites with .com rank higher than with <TLD here>
This is another common myth that is untrue. The only time a domain extension can affect your
ranking is if the search is based by country. The country-specific TLDs (e.g. .co.uk) will have
priority over non-country specific TLDs (e.g. .com or .net).

One observation many make is that .coms tend to rank higher than other domain extensions.
They assume it is because .coms are given preferential treatment. This is a poor assumption.
.coms seem to rank higher than other extensions because they are by far more popular than any
other domain extension (there are more .coms than .net, .org, .biz, .edu, .gov, and .info domains
combined), so they naturally have a greater chance of ranking higher versus other domain extensions
through sheer quantity alone. .coms also tend to be older sites so they have had a chance to
establish themselves whereas newer domain extensions have not. They have also used this time
to acquire more backlinks which is an important factor in search engine algorithms.

It is also commonly believed that .gov and .edu sites are given preferential treatment from search
engines. This is also untrue. Web pages on .edu and .gov domains tend to rank well because they
contain quality content and many webmasters will link to their content as a result. Both of these
are key elements in SEO. But the fact that they are .edu or .gov domains does not benefit them
directly in the SERPs.
Pages with query strings don't rank as well as without query strings
Another common myth that is untrue. The only way variables in a query string can affect a site in
the SERPs is if it has a sessionID or something that looks like a sessionID in it (e.g. id=123456).
These usually prevent indexing of those pages or limit the number of pages indexed. But query
strings themselves do not affect the page's ranking, either positively or negatively.

Should I use relative links or absolute links?
Absolute links. It is recommended by Google as it is possible for crawlers to miss some relative
links.

I just changed from .html to .php. How can I switch without losing my rankings?
There are two ways to do this:
1) Tell Apache to parse all .html files as PHP files. Using this method you do not have to change
any file extensions or worry about any redirects. To do this, place this code in your httpd.conf
file:
Code:
AddType application/x-httpd-php .php .html
2) Use a 301 redirect to redirect from the .html files to the .php files. You can do that by placing
this code in the root directory of your website:
Code:
RedirectMatch 301 ^/(.*)\.html$ http://www.yourdomain.com/$1.php

I just changed my domain name. How can I switch without losing my rankings?
You'll need to do a 301 redirect from the old domain to the new domain. Fortunately this is not
difficult to do. You'll need to add the following lines of code to a file called .htaccess and place it
in the root directory of the old domain:
Code:
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?old-domain\.com$ [NC]
RewriteRule ^(.*)$ http://www.new-domain.com/$1 [R=301,L]
Why aren't all my pages being indexed?
If your site is less than six months old, stop reading now. Your site is too new to be worrying
about getting all of your pages indexed. Be patient. It takes time to crawl through your whole
website and add your pages to the index. If you are sure your pages are search engine friendly
then you have nothing to worry about.

If your site is six months old or older you need to check your website to make sure all of your
pages can be found and indexed. Have you:

1) Made a human sitemap?
2) Made a Google or Yahoo sitemap?
3) Used search engine friendly URLs?
4) Used search engine friendly navigation?

An additional note: get incoming links. These are important for the search engines' algorithms
and may play an important part in how deep the search engines will crawl your website.

How do I check if my site is search engine friendly?
Turn off JavaScript, CSS, and cookies in your web browser and view your website. This is how
the search engines most likely see your website. If you can successfully view your content and
navigate your website your site is mostly search engine friendly. The only other thing to check is
your URLs. Not using a session ID or 'id=' in your query strings is also very helpful.

What does it mean to have your site indexed by the search engines?
To be indexed by the search engines means your webpages have been crawled and included in
the database of the search engines. Your pages are now available to be included in search results
of user queries. This doesn't mean your pages are guaranteed to be included. It just means they
are available. The pages will still need to be relevant to the search terms before they will be
included in the SERPs.
Which is better for domain name and/or url: hyphen (-), underscore(_), or plus sign(+)?
Hyphens and underscores are the best keyword delimiters you can use in your domain name or
URL. They are seen as equal by all of the major search engines.
Many say that separators are not necessary as search engines can find keywords in URLs without
assistance. They are smart and most likely can pick some keywords out of a URL. But they are
not that smart. Sometimes it is not obvious where one keyword ends and another begins. For
example: expertsexchange.com can be seen as "experts exchange" and "expert sex change".
These are obviously two very different topics. In this case a hyphen or underscore would clearly
separate the keywords and solve this problem.

Will too many hyphens in your domain name cause the search engines to label your site as
spam?
No. This is a myth caused by many spam sites using multiple hyphens in their domain names.
Many people have wrongly concluded that only spam sites would need to use more than one
hyphen. The truth of the matter is that having more than one hyphen in your domain name will
not result in your site being penalized. The more likely scenario is that having multiple hyphens
will result in a flag being set at the search engines and a manual review being done to see if the
site is spammy or legitimate.

One thing to keep in mind when choosing a domain name with hyphens in it: your users. When
using a domain with multiple hyphens you make it more difficult for your human visitors to
remember and type in your domain name. Domain names with more than one hyphen should
only be used if you are attempting to market your website through the search engines. If you plan
on doing offline advertising, including word of mouth, one hyphen or less is recommended.

Does the order of the keywords used in a search affect the search results?
Yes. Do a search and see for yourself.

Does the order of the keywords in a domain name/URL matter?
Yes. You will typically rank better in the SERPs for the phrases that use the words in the same
order as your domain and URL than for phrases where they are not in the same order.
Does using automated SEO software cause a website to be penalized?
No. This is a common myth that is untrue. If it were true you could get your competitor
penalized or banned by using automated SEO software to resubmit their website every 60
seconds. Naturally this does not happen (nor should it).
Some webmasters will try to say that Google says in their guidelines that you shouldn't use
automated software like Web Position Gold. The reason for this is that most of these tools scrape
Google's SERPs to find your site's ranking information. This is in violation of Google's terms of
service. Software that uses Google's API is acceptable for querying their servers. Also, if you
constantly use SEO software to query a search engine's servers you might find that they ban your
IP address to prevent you from using their resources any further. However, this has no effect on
your web pages' rankings.

Can search engines see password protected pages?
Search engines are not different from regular users in most ways. They cannot go anywhere that
a regular user cannot go. If you have a password protected area of your website that cannot be
accessed without a login and password then the search engines cannot see it.

Which is better for SEO: text links or graphical links?
Text links are better for SEO. Text links can contain the anchor text that your page wishes to
rank well for and that is an important factor in all three search engines, especially Google. Image
links are still valuable but have fewer benefits compared to text links. This is true despite image
tags having the ALT attribute available. The ALT attribute can contain keywords, but thanks to
keyword stuffing it is now virtually worthless. (You should be using your ALT attributes for
usability and accessibility and not SEO anyway).

Does validation help your ranking?
Short answer: No.
Longer answer: No. But having a webpage that validates is a good idea. A webpage that has been
validated to a W3C standard contains no errors and therefore can be easily parsed and
understood by the search engine crawlers. An invalid webpage runs the risk of being
misinterpreted or just not read at all.
Can the search engines read javascript?
Probably not. But we can't say no for sure because some JavaScript is so easy to read it is hard to
imagine that it does not get interpreted. An example of an easy to interpret snippet of JavaScript
would be:
Code:
<script type="text/javascript">
document.write('<a href="http://www.example.com">Trying to hide this link from search
engines</a>');
</script>
To ensure that the search engines don't read your JavaScript it should be inserted into a web page
using an external file and that directory should be blocked using robots.txt.
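For example, if your external JavaScript files live in a directory called /js/ (a made-up path for illustration), the robots.txt entries might look like this:
Code:
User-agent: *
Disallow: /js/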

Why should I not launch an incomplete website?
1) Users will remember that your site was incomplete and will be less willing to come back
2) Search engines may index incomplete pages and cache them and then not refresh their cache
for months or years
3) Other webmasters will not exchange links with incomplete sites
4) Directories won't accept submissions from incomplete sites
Keep in mind this generally covers your "under construction" kind of incomplete sites. You
certainly can launch a site and then continually add to it and grow it. Even adding whole new
sections. But a site that is obviously incomplete just shouldn't be set loose in the wild until it is
ready to go.

What is the best way to determine my online marketing budget?
Determine your potential return on investment. Online marketing tactics help bring more traffic
and business to your site, raising your revenue. There is always a need for online marketing; yet,
be sure your provider is presenting you with quantifiable results.

What are the best ways to optimize my site?
Search engine optimization involves a high number of tactics, which all help to optimize your
site. A combination of online marketing and search engine optimization is a good way to achieve
great optimization. Unlike short-term advertising, search engine optimization delivers long-lasting
results.

How quickly will I see results?
If you target long-tail keywords you can see results pretty quickly, but always remember SEO is a
long-term strategy, not a set-and-forget thing.
If you’re after more competitive keywords prepare to commit to it for at least three months of
consistent effort.

Should I rank my own content or articles on other sites?
Yes – but let’s qualify that.
Because you can’t control what third-party sites do, you should focus the vast majority of your
efforts on ranking content on your own sites.
However, you can leverage high-ranking third-party sites by posting SEO'ed content on them
and including a link back to your own site. Not only do you get the SEO benefits of the
backlinks, you’ll also get indirect search engine traffic from people clicking through to your
main site.

How many keywords should I put into my <title>, <a>, and <h1>..<h6> tags?
You should only put the few keywords that are most relevant to your pages. The more you put in
each tag, the more you dilute the value each keyword is given.
<h1>Advanced PHP Programming</h1>
is better than
<h1>Advanced PHP Programming Is Really Cool And Stuff Dude</h1>
What’s the difference between organic SEO and Paid results?
When a browser conducts a search, they will be confronted by both organic results and paid
results (those which are highlighted and usually placed on the very top or right-hand side of the
page). It is the quest of every business to achieve first-page, organic-SEO results because they
are long lasting and organic results are more respected by browsers.
What is the best way to maximize the frequency of crawling of your website by search
engines?
Frequently adding new, original and quality content on the website.

What is keyword density and how does it help?
Keyword density refers to the ratio of a particular keyword in your copy as compared to the rest
of the copy; for example, a 500-word page that uses a keyword 10 times has a keyword density of
2% for that keyword. Having good keyword density improves the likelihood that both search engines and
people browsing will associate your site's content with your chosen keywords.

Why do I need to write copy for my web site?
Content is king when it comes to the Web. Remember that the Web’s purpose is to provide
information to users; having fresh copy implemented regularly on your site is one of the top
ways to achieve good rankings and to intrigue visitors to visit your site.

How can Social Media be used for SEO?
Social media presents opportunities to acquire backlinks to your site’s pages, articles, press
releases, etc. Social media is a popular and ever-growing aspect of the Web. Engaging in social
media works well to generate good publicity for your site while helping SEO initiatives as well.

Why does my company need Reputation Management?
Having an online business means you are open all the time. Competition can be fierce in many
industries. Reputation management helps your business build and maintain a ‘good’ name within
your industry and with customers.

What other factors affect rankings besides backlinks?
Where you’re getting your links, the quality of these links, the relevancy of these links, how
many links you have and what keywords you’re using as the anchor text all affect your rankings.
But there are other factors that affect your ranking, including but not limited to:
• On-page optimization factors – this is how well you've optimized your tags, content, formatting, keyword proximity, site map, and links on your web page. This also includes whether you use your keywords at the top of your page and in your "alt" tags (both good things).
• Having a lot of outgoing or reciprocal links pointing to "bad" sites (like link farms) can negatively impact rankings.
• Whether you have unique content (which the SEs like).
• How frequently you update your site. Faster isn't necessarily better. Check what ranks well for your niche and aim to match it.
• Whether your domain includes your primary keywords.
• Your domain's age, reputation, IP address and whether it's a top level domain (e.g., a .com is better than a .info, although probably not by much).
• Shady practices such as keyword stuffing or using text that's the same color as the background can negatively affect your rankings. This is only an issue if your site gets manually inspected and you don't have a legitimate reason for it.
• Showing one page to the search engines and another page to visitors negatively affects your rankings. (Cloaking and doorway pages.)
• Frames negatively affect your rankings.
• Using content that the search engines can't read, like audio, Flash, videos, graphics (without alt tags), etc.
• Whether you have a robots.txt file that tells the search engine bots to stop crawling or indexing your site.

Does domain age help?
Yes – search engines view older domains as more trustworthy, which means older domains may
have a slight advantage. But this is only true if the older domain has a good reputation (e.g., it
hasn’t been blacklisted, penalized or banned from the search engines).

Why would I want to 301 redirect an aged domain?
Google passes link juice/authority/age/ranking strength (call it what you like) from one domain
to another if you do a 301 redirect on it.
For the less tech savvy out there the 301 code means “permanently moved” and is a way to
announce that your site that was once “here” is now “there”.
The upshot of this is that you can buy an aged domain and “301” it to the site you’re trying to
rank, instantly passing on all that lovely ranking power it has acquired just by sitting in some
domain squatter's account for 10 years.

Just make sure they do a domain push at the same registrar it was originally registered at or all
these effects are lost.

Also, you have to wait up to 2 weeks to see the benefits. They are not instant!

What is rel="canonical"?
If you have two or more pages with similar content, you can tell Google which is your preferred
page to show in the search engine results. This is referred to as your “canonical” page. If Google
agrees this designated page is the best version, it will show this preferred page in its index.

To tell Google which page you want listed as the canonical page, add the following bit of code
into the head section of the similar (non-canonical) pages:

<link rel="canonical" href="http://www.example.com/filename.html"/>

Naturally, you should replace the example.com/filename.html with your actual domain name and
file name.
For example…

Example.com/file1.html is your preferred canonical page, the one you want displayed in the
search engine results. You don't have to add any tags to this page.

Example.com/file2.html and Example.com/file3.html have similar content to
example.com/file1.html. As such, you'd place the canonical code within the <head> tag of those
two pages to tell Google that example.com/file1.html is the most important page.
The most common reason to do this is to tell Google that these pages are all the same –
• Example.com
• www.example.com
• www.example.com/index.html
• Example.com/index.html

Don’t go overboard with this, and certainly don’t use it on things like paginated comment pages:
although they are “similar” because they contain the same post, they have enough unique content
to be treated as unique pages, and Google will start to ignore your legitimate canonicals if it finds
too many instances of you misusing it.
Yes, Google thinks it’s smarter than you, deal with it and move on.

What's the truth about duplicate content?
There is no duplicate content penalty when it comes to multiple sites. Otherwise, your shady
competitors could just create near-clones of your site to make your site disappear. But that
doesn’t happen. Indeed, run a search for a PLR article and you’ll likely see many SE results for
that same article.
TIP: Nonetheless, it’s better if you have unique content, rather than competing with others for
the same keywords using similar content.

What about duplicate content on your OWN site? In other words, what happens if you have two
web pages with the same content but different file names? In that case, refer to the question on
rel-canonical for instructions on how to deal with this.

What is a doorway page/cloaking?
Cloaking refers to showing one page to a search engine and a different page to your human
visitors. Doorway pages are optimized pages that pull in SE traffic, but this traffic is immediately
redirected (either manually or automatically) to a different page.
Google and other search engines do NOT like these practices.
What are meta tags?
Meta tags are information that you put between the <head> tags of your web page’s source code.
These meta tags primarily tell search engines and other user agents about your site’s content
(description), keywords, formatting, title and whether you want the search engines to crawl (and
index) the page.

There are also some tags that are shown to the user, such as the title tag (which is the title that
appears at the top of your browser).

Note that the big search engines no longer take these tags into consideration when ranking your
web pages (with the exception of the title tags). Some smaller and more specialized search
engines still utilize the keywords and description tags when ranking and displaying your site.

What is the "freshness" factor?
Search engines such as Google prefer “fresh” (newly updated) web pages and content over stale
content. That’s why when you first add content to your site – such as a new blog post – this page
may sit high in the rankings for a while. Eventually it may sink to a more realistic ranking.
It’s this “freshness factor” that allows your pages to get those higher rankings, even if the
ranking is temporary. Thus updating your pages frequently can help push them to the top of the
rankings.
This is one of the primary reasons why you hear people talking about how “Google loves blogs”.
Google doesn’t love blogs, Google loves regularly updated sites.

What about links from the same IP address or C-Class block?
A computer’s IP address is its address on the Internet. A C-Class block of IPs is a range of
addresses that sit next to each other. Links from the same IP have very limited value. Links from
the same C-Class IP block have a little more value, but still not much. Links from different
C-Class IPs are worth the most.
This is not as important as it once was, especially when it comes to sites hosted on huge shared
server clusters like those at HostGator/ThePlanet, BlueHost and others. The shortage of available
IP addresses is driving this.

Most importantly tons of domains all on the same IP or C-Class that all interlink are the fastest
way to announce to Google that you’re trying to cheat the system. This may have worked a
couple of years ago, now it’s just a flashing neon sign telling Google to deindex you.

What is LSI?
LSI is short for latent semantic indexing. This refers to different words that have the same or
similar meanings (or words that are otherwise related). For example, “housebreaking a dog” and
“housetraining a puppy” are two entirely different phrases, but they mean about the same thing.

The reason this is important is because Google analyzes webpages using LSI to help it return the
most relevant results to the user.
For example, a page that has the keyword “housebreaking a dog” but NO other similar words
(like housetraining, paper training, potty training, puppy, dogs, puppies, etc) probably really isn’t
about housebreaking. End result: Google won’t rank it as high as a web page that does include a
lot of relevant, related terms.

What does this mean to you? When you create a web page around a keyword, be sure to also
include the keyword’s synonyms and other related words.

Pure LSI analysis isn't scalable enough to handle the volumes of data that Google processes.
Instead they use more streamlined and scalable content analysis algorithms that have some basis
in LSI and other related technologies. It also appears that this analysis is ongoing and not just a
one time run through the system.
Cliff Notes: Don’t write content that a drunk 4th grader would be ashamed of. Spend the extra
couple of minutes to write decent stuff and you’ll be fine.
Should I build links for human beings or the search engines?
Both but make sure you know which one you’re going for at any point.

If you want human beings to click the link, then make sure your content is high quality and worth
that click.
If it’s never going to be seen by a human, then don’t spend a week writing a beautifully crafted
piece of prose; use automation or anything you can lay your hands on to get links fast.

What is an XML Sitemap?
This is a listing of all the pages on a website, along with important information about those pages
(such as when they were last updated, how important they are to the site, etc). The reason to
create a sitemap is so that the search engines can easily find and crawl all your web pages.

This is really only important if you have a large and complex site that won't be crawled easily. A
10-20 page HTML mini-niche site doesn't really need one while a 20,000 page product catalog
might benefit from one. Also avoid automating this on WordPress autoblogs since sitemap
generation is a processor hog and can get you kicked off of shared hosting.

What's the sandbox?
The disappointment webmasters feel when Google's stupid algorithms don't appreciate their site.
It can't be them so it must be Google's fault.

What is robots.txt for?
This is a file placed in a website's root directory. Search engine robots (bots) look at this file to
see if they should crawl and index pages on your site, certain file types or even the entire site.
The absence of this file gives them the green light to crawl and index your site.
If you don’t want search engine bots to crawl your site, then create a robots.txt file in your root
directory that includes this bit of code:
User-agent: *
Disallow: /

You can also create a meta tag that keeps the search engines from indexing your site:

<meta name="robots" content="noindex">
Important: Only “well behaved” bots obey robots.txt, so use it to keep search engines from
indexing things, not to “protect” sensitive content on your site. Most importantly, be aware that
malicious bots will look for the pages you’re asking not to be indexed and visit them with priority to see why.

What's a spamblog?
A spamblog (or splog) is a blog used primarily to create backlinks to another site. Splogs tend to
be populated with fake articles, commercial links and other garbage content.

In other words, they provide little or no value to a human reader. As such, the search engines
tend to de-index these sites once they discover them.

What's an autoblog?
An autoblog uses an automation tool to pull in content from other sources and post it on the blog.
In other words, it’s an easy way to automatically and frequently update a blog.

They are a great way to build foundation sites to provide link juice to your higher ranking, more
competitive sites but a good way to get sites banned if you don’t know what you are doing.

Most importantly there is a lot of discussion about how legal they are due to reproducing content.
I’m definitely not going to get involved in that discussion and I ask you not to turn this thread
into a flame fest discussing it.
What's an "authority" site?
An authority site is one that is seen as influential and trustworthy by search engines, and thus it
tends to rank well. Authority sites tend to be well-established sites that have a lot of high-quality,
relevant content as well as links from other authority sites.
Obviously, getting your own website recognized as an “authority site” will boost your rankings.
However, it’s also beneficial to get backlinks from these authority sites.

What are "supplemental" results?
These are results that are displayed in Google’s index after the main results – especially if
Google’s trusted websites didn’t return many results. These supplemental results are no longer
labeled as “supplemental” results. However, this secondary database still exists to index pages
that have less importance, such as duplicate content on your site or orphaned pages.

For example, if you have multiple pages on your site with the exact same content, then Google
will index your most important page in the main index, and place the duplicate page in the
supplemental index.

Does changing hosting affect my ranking?
No. Your webhosting does not affect your rankings. You can change hosts without it affecting
your rankings. The only issue you may run into is if you fail to make a smooth transition to your
new webhost. Downtime will naturally prevent the search engine crawlers from crawling your
site properly. Extended downtime may cause indexing issues.

To switch hosts properly follow these easy steps:
1) Set up your website on your new webhost
2) Change your DNS to point your domain name to your new webhost
3) Leave the website up on the old server for at least one week to make sure DNS has propagated
completely. After one week you can safely take down the site from the old server.
The most common mistake users make when switching hosts is not leaving the old site up while
DNS propagates. Make sure you don't wait until the last minute when switching hosts or you
may run into trouble.

What is a "good crawlable design"?
• Don't use Flash - Flash is SEO suicide. Some say the search engines can crawl Flash, but even if they can, they certainly can't crawl it as well as HTML. (See below.)
• Don't use JavaScript to create page content - For the most part, search engines don't read JavaScript. If you use JavaScript to create your pages' content, it is as good as not being there when the search engines come around.
• Interlink your pages - Search engines find your pages by following other links in your pages. Be sure to link to your pages liberally, especially important pages.
• Use search engine friendly URLs - Although search engines can crawl query strings just fine, using a URL that appears static is a good thing. Errors can occur on long or complex query strings and this eliminates that possibility, plus it is a great chance to get keywords into your URL (see the rewrite sketch after this list).
• Use semantic markup - HTML is a powerful tool that search engines use to determine the context of a page (that's another reason why Flash sucks for SEO: no HTML). Use HTML properly to give keywords more weight within your pages. See the Search Engine Optimization FAQ for more on that.
• Use a sitemap - Sitemaps make sure your pages are easily found by the search engines (good for humans, too).
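As a hedged sketch of the "URLs that appear static" point, an Apache mod_rewrite rule in .htaccess can map a friendly URL onto a query string; the file and parameter names below are made up for illustration:
Code:
RewriteEngine On
# /products/blue-widget is served internally by /product.php?name=blue-widget
RewriteRule ^products/([a-z0-9-]+)$ /product.php?name=$1 [L]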

Flash and SEO
An all-Flash website is handicapped versus a semantic (HTML) website. Even optimizing the
non-content aspects of your pages will still leave an all-Flash website at a severe disadvantage.

The problems with using Flash include:
1) It's a one page site. How many one page sites do you know that rank well?
2) You lose the power of semantic markup. No HTML = no clues for the search engines as to the
importance of keywords.
3) Expanding on point 2, you don't have any anchor text since you don't have any internal links.
That just kills you in Google.
4) There isn't a whole lot of proof that the search engines can read flash as well as HTML.

You have only one available tool for trying to SEO the site and its effect is minimal. Put
alternative content between the <object> tags. This has the same effect as the <noscript> tags for
JavaScript.
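A minimal sketch of that alternative content, assuming a hypothetical movie.swf file:
Code:
<object type="application/x-shockwave-flash" data="movie.swf" width="800" height="600">
<!-- shown to search engines and to users without Flash -->
<h1>American Flags</h1>
<p>Browse our catalog of hand-sewn American flags for home and office.</p>
<a href="catalog.html">View the catalog</a>
</object>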

If you are making an all flash site, your only real hope is to try to be successful in a massive
incoming link campaign. Otherwise you have to target marginally competitive keywords or niche
keywords as you virtually don't have a prayer of ranking for anything even remotely competitive.

Your only other option is to create a second version of the site so it can be read by search
engines, users with accessibility issues, and users who don't have flash. Of course you've doubled
your development costs by doing this as you have two websites to maintain now.

Is Ajax bad for SEO?
Any content available via Ajax should also be available without Ajax if a site is designed
properly (i.e. accessible). That means search engines should still be able to access that data even
though they don't support Ajax/JavaScript. If you cannot then it isn't a flaw in using Ajax, it is a
flaw in the development of the site.

Do outbound links help my rankings?
No. This is a common myth that is untrue.
The whole outgoing-link thing has been mentioned by amateurs for going on 4 or 5 years now; no
one has ever proved it, and many have shown evidence of it not mattering. Newer people to SEO
tend to see Google's goal as one of policing the webmaster community and making sure everyone
plays fair. Google is not a referee; they are a search engine, and they care about serving relevant
results.
Newer people to SEO also fail to understand what PageRank is. PageRank is a measure of
perceived page quality; if an outgoing link adds to a page's quality, that page will get more incoming
links and thus rank better. If it doesn't add to the page's quality, no bonus will be had. There is
absolutely no reason for Google to second-guess themselves and add arbitrary blanket bonuses or
penalties to all sites because of a perceived notion of a certain attribute always making a site
better or worse. So, in short, because Google measures incoming links, they have no need to
measure outgoing links, or anything else that supposedly marks a site as having a higher
"quality." In the end, if it truly does have a higher quality, it'll get more incoming links naturally.

Then there is the fact that outgoing links are strictly under the control of the webmaster, like
meta tags, and so assigning them any weight leads to the same problems that brought around the
downfall of meta tags.

Finally, there are all the thousands or millions of sites and pages that rank perfectly well without
any outgoing links. Certain types of sites, such as blogs, normally have outgoing links and it
would look abnormal for them not to. However most other site types normally do not have
outgoing links and haven't traditionally had them, going back to the 90s, long before Google
came about. Most business, commercial, ecommerce, or service sites do not have outgoing links.
Not because they're hoarding PR, but because they're trying to sell something and do not want to
distract from the user experience or send users away.

You may not remember a time before incoming-link algorithms. In those times, to measure
quality, search engines had to guess based on on-page factors, and that was hard to impossible. With the
invention of incoming-link analysis, or PageRank, search engines had a perfect way to measure the
quality of a site, and so they then only had to discern topicality. Why would they take a step
backwards and again start using on-page factors to measure quality?

Google has a lot of smart people working for them; they realize that if external links truly do add
to the usefulness of a site, then that site is already receiving a bonus, because more useful sites
garner more incoming links. This is also true for anything else that supposedly adds usefulness.
They aren't going to say, "Hey, we have this really good algorithm here, but let's second-guess it
and make an assumption that pages without outgoing links need to be penalized for being less
useful." Why would you ever make an assumption about something that you can already
measure directly?

Also, do not forget, Google itself created the nofollow link attribute to give webmasters an easier
way to block links.

In the end, if Google did give value to external links, it'd be meaningless. As soon as it was
confirmed (which no one has been able to do) all the spammers and everyone else would just add
one or two links to their pages. It would do nothing to increase relevance.

Does a page's traffic affect its rank?
No and here's why:
1) The search engines don't have access to the data they would need to use this as a ranking
factor. They do not know how much traffic a web page gets, as it is not publicly available, and
thus cannot use it to determine its rank. (For those of you who want to say, "But there is Google
Analytics", that service is used only by a small percentage of websites and unless every web site
decided to use it on every web page the data is far too incomplete to be used this way).
2) It would be a self-fulfilling prophecy if a search engine used traffic from its own SERPs as a means of
determining its results. Obviously the number-one ranked page for a search is
going to get more traffic than a page not on the first page. If traffic were the indicator of where a
page belonged, there would be little or no way for a page to ever move up, because the pages
ranked higher would keep receiving more traffic from the search engine simply by virtue of
being ranked higher.

3) Traffic volume can be manipulated. Spammers and black hats could easily write bots to
artificially inflate their page views and thus their rankings. Plus you can purchase traffic from
traffic providers or buy expired domains and redirect them to your site. It would just be too easy
to do. (I can also see newbies hitting refresh for hours on end....)
4) Traffic is not an indicator of quality content. It is only an indicator of good marketing.
What about reciprocal links?
In general, reciprocal links are bad for SEO and should be avoided. Here's why:
1) They are a clear attempt to manipulate the search results, which is a big no-no. That's why
Google specifically calls them out in its webmaster guidelines: basically, Google sees them as vote
swapping. If you have an excessive number of reciprocal links you run the risk of incurring
penalties. (No one knows how many it takes to incur a penalty, so it isn't wise to push your luck.)
2) You risk being considered part of a link farm. If you link to a website that is considered a link
farm and they link back to you, you may be seen as being part of the link farm. Link farms
violate the search engine's TOS and are a quick way to get banned.
3) The links themselves carry virtually no value, or worse, cause you to lose strength from your
pages. The fact that the links sit on unrelated pages, or pages with little value for your niche (e.g.
the wrong context), prevents them from holding any value in the search engine's eyes. What little
value they may have gets lost when you send a link back to the other website, negating any value
that link may have had. Even worse, if your link is "worth" more than their link, you will actually
be hurting your site with that link exchange.
4) Many webmasters are dishonest and will remove your link or hide it from the search engines.
No incoming link means no gain for you.

Link exchanges should be saved for websites in your niche that are well established and ahead of
you in the rankings.

What keyword tools should I use and how do I use them?
Good tools to use for keywords research are Google's Adwords Keyword Suggestion Tool and
Google Trends.

You have to keep in mind that tools like Google's Adwords Keyword Suggestion Tool and
Wordtracker are not to be taken literally. You are supposed to look at keyword search volumes
relative to each other and to the major search terms. That will give you an idea of
how frequently a search term is being used. The exact number isn't important unless you're
conducting trend analysis over an extended period of time, and even then the exact number doesn't
really offer any useful information; a number within 5%-10% (or maybe more) of the exact
figure is just as useful. Those rough numbers will clearly expose which terms are popular and
which are not. If Google's Adwords Keyword Suggestion Tool or Wordtracker
shows no results for a keyword, that means its search volume is extremely low, which is all you
need to know. Whether it is 1 search or 100 searches doesn't matter: you now know what kind of
volume it has and what to expect in terms of competitiveness and traffic.

For example:
Let's use these fictitious results for 'stymieebot'
stymieebot 15000
stymieebot clone 6000
stymieebot repellent 5500
stymieebot stickers 5200
stymieebot t-shirts 4950
stymieebot hoolahoop 300
stymieebot mask 180
stymieebot uzi 15
stymieebot cologne 1

What we can tell is 'stymieebot' is clearly the most popular search term related to 'stymieebot'.
The number of searches could be 18,000 or 12,000 and it still would clearly be the primary
search term we would hope to rank well for and the most competitive (most likely).

'stymieebot clone', 'stymieebot repellent', 'stymieebot stickers', and 'stymieebot t-shirts' make up
the second tier of results. They're grouped relatively close together and their order really is
irrelevant. Their order will almost certainly change month-to-month but their average search
volume will most likely remain the same. They'll always be searched far less than just
'stymieebot' but will still get a decent number of searches each month. Their exact numbers don't matter
because we know how popular they are relative to 'stymieebot' and that they are searched often
enough to be worth targeting.
'stymieebot hoolahoop', 'stymieebot mask', 'stymieebot uzi', and 'stymieebot cologne' make up
the third tier of results. They're seldom searched for and will either be long-tail keywords or
ignored completely. The exact number of searches is irrelevant because, relative to the first two
tiers, we can see that traffic from these terms will be sporadic at best and can assume they will be easy
to target.
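If you put these numbers into a script (or a spreadsheet), the tiers fall out of the relative volumes automatically. Below is a minimal Python sketch that groups the fictitious 'stymieebot' figures above into rough tiers; the 75% and 25% cut-offs are illustrative assumptions, not a rule.

# Group keyword volumes into rough tiers relative to the most-searched term.
# The thresholds (75% / 25% of the top term's volume) are assumed for illustration.
volumes = {
    "stymieebot": 15000,
    "stymieebot clone": 6000,
    "stymieebot repellent": 5500,
    "stymieebot stickers": 5200,
    "stymieebot t-shirts": 4950,
    "stymieebot hoolahoop": 300,
    "stymieebot mask": 180,
    "stymieebot uzi": 15,
    "stymieebot cologne": 1,
}
top = max(volumes.values())
for keyword, count in sorted(volumes.items(), key=lambda kv: kv[1], reverse=True):
    share = count / top                      # volume relative to the primary term
    if share >= 0.75:
        tier = "primary term"
    elif share >= 0.25:
        tier = "second tier"
    else:
        tier = "third tier / long tail"
    print(f"{keyword:25} {count:6} {tier}")

Changing any of these figures by a few thousand searches does not move a keyword between tiers, which is exactly the point: only the relative volumes matter.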

How do I improve my rankings for country-specific search?
To rank better in country specific search you should:
1) Use the country specific TLD
2) Host the site in that country
3) Set the geographic location for the site in Google Webmaster Tools

What directories should I submit my site to?
There are four tiers of directories:
1) Truly quality directories - these directories are well known, actually used by some people
(although not really a whole lot), and their links have decent value (just decent value, not great
value). You can count the number of these directories on two hands and probably have fingers
left over. These directories include Dmoz and Yahoo. Links from these sites are the most
valuable types of links you can get from a directory. However, even then they are not that strong.
Links from related websites are much better.
2) Quality niche directories - These directories exclusively list sites in a certain niche: yours.
These directories don't carry the weight of the first tier but because they are in a related niche
they are better then general directories.
3) General directories with an editorial process - These are your run-of-the-mill, just-like-everyone-else
directories that litter the Internet. What separates these from the bottom tier of
directories is that they actually monitor their listings, trying to list only quality sites
and reject spam sites and Internet waste. Links from these directories are not worth very much;
basically, if you are seeking links from these kinds of directories you are going for volume as
opposed to quality. Over time these can be helpful to your rankings for long-tail and medium-competitiveness keywords.
4) General directories with no editorial process - These directories accept anyone. They are full
of crap sites and probably engage in a lot of link exchanges. These directories are worthless and
should be avoided.

What is the story with Alexa?
Alexa's rankings are generally considered to be inaccurate at best. Their rankings depend on a
user having their toolbar or their spyware installed in order to track their surfing habits. Plus their
software is limited to the Windows operating system further limiting the reach of their software
and accuracy of their results.

With the possible exceptions of selling/buying a website and applying to an ad service, Alexa
serves no useful purpose, and important decisions should not be made based on its results.

If you want to improve your ranking in Alexa just install the toolbar into your browser. Be sure
to visit your site daily. This will cause your site to jump in the rankings after a few weeks. Get
your friends to do it, too, and you can make a significant impact on your rankings.
What is Page Rank?
Page Rank (PR) is a numeric value from 0-10 that Google assigns to your individual web pages,
and it’s a measure of how important that page is. Google determines this importance by looking
at how many other high quality, relevant pages link to a particular page. The more links – and the
better quality those links are – the more “votes” a page gets in terms of importance. And the
more “votes” a site gets, generally the higher the PR.
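The idea behind those “votes” comes from the formula in the original PageRank paper: each page divides its score among its outgoing links, damped by a factor d (0.85 is the commonly cited value). The Python sketch below iterates that formula over a tiny, made-up three-page web purely as an illustration of the concept; it is not Google's actual implementation, and the 0-10 toolbar score is widely believed to be only a rough, logarithmic representation of the underlying value.

# Toy PageRank iteration: PR(A) = (1 - d) + d * sum(PR(T) / C(T)) over pages T
# linking to A, where C(T) is the number of outgoing links on T.
links = {            # hypothetical tiny web: page -> pages it links to
    "A": ["B", "C"],
    "B": ["C"],
    "C": ["A"],
}
d = 0.85
pr = {page: 1.0 for page in links}           # start every page with the same score
for _ in range(50):                          # iterate until the scores settle
    pr = {
        page: (1 - d) + d * sum(pr[other] / len(links[other])
                                for other in links if page in links[other])
        for page in links
    }
print(pr)                                    # pages with more/better inlinks score higher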

How often does Google update Page Rank?
It used to be every 3 months but it’s becoming more and more erratic.

Does PR matter?
Yes and no.
Originally PR was all that mattered in the search rankings, but today that’s just not true, since
there are a myriad of other factors that Google considers when weighing who should appear
where. That said, high PR is always worth having; just don’t obsess over it.

What is the "Google Dance"?
When “stuff” changes the SERPs fluctuate, sometimes wildly. One day your site could be
number 1 and the next nowhere to be seen. One of the main contributing factors to that is how
Google sees your backlinks (which you’re consistently building, right?). Don’t obsess over it,
just keep building and you’ll be fine.

How does Google personalize my results?
If you’re signed into Google, then Google keeps track of what search engine results you’ve
clicked on. And even if you’re not signed in, Google keeps track of what results people who use
your computer click on.
Over time, Google starts to detect a pattern. For example, if you seem to always click on
Wikipedia results, then Google will start showing you more Wikipedia results. If you always
click on health results from webmd.com, then you’ll get more webmd.com results when you run
a health-related search.
What is a backlink?
This is when a third-party website links to your website. For example, if you write and submit an
article to an article directory, then you’ll have an incoming link – a backlink -- from the
directory.

The search engines prefer one-way incoming backlinks from high-quality, relevant websites.

What is anchor text?
When you create a link, the anchor text is the clickable part of the link. For example, in the
phrase, “go to Google,” Google is the anchor text.

The reason this is important is because you want to use your keywords as your anchor text on
incoming links. So if you’re trying to rank for “gardening secrets,” then those two words should
make up the anchor text for several of your backlinks.

What is a do-follow/no-follow link?
There are two types of “nofollow” attribute. The robots meta tag version –

<meta name="robots" content="nofollow" />

Which tells (well behaved) bots/crawlers/spiders not to follow links on the page

And the link attribute
<a href="http://www.google.com" rel="nofollow">

Which tells search engines not to count the link in terms of ranking pages.
In theory these links are worthless for boosting your search engine rankings. In practice you’ll
often see some benefit, especially when mixed in with a load of dofollow links.
Links are automatically “dofollow” in the absence of the rel="nofollow" attribute. There is no
rel="dofollow" attribute.
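To see how this plays out on a real page, you can pull the links out of the HTML and split them by their rel attribute. The Python sketch below uses the standard library's html.parser on a made-up snippet; treating any link without rel="nofollow" as dofollow mirrors the rule just described.

from html.parser import HTMLParser

class LinkAudit(HTMLParser):
    """Collect links from a page, separating nofollow from (implicit) dofollow."""
    def __init__(self):
        super().__init__()
        self.dofollow, self.nofollow = [], []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        href = attrs.get("href")
        if not href:
            return
        if "nofollow" in (attrs.get("rel") or "").lower():
            self.nofollow.append(href)
        else:
            self.dofollow.append(href)       # dofollow unless marked otherwise

sample = '<a href="http://example.com/a">a</a> <a href="http://example.com/b" rel="nofollow">b</a>'
audit = LinkAudit()
audit.feed(sample)
print("dofollow:", audit.dofollow)
print("nofollow:", audit.nofollow)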

Types of backlinks?
TBD

Can paid links harm my ranking?
Google’s official stance is that buying links is an attempt to manipulate rankings – and Google
frowns on this practice.
In reality, however, it’s very hard for Google to penalize you for buying links (and they wouldn’t
be able to tell for sure anyway). Indeed, if there was a penalty, then you could destroy a
competitor simply by purchasing links to their site and then reporting them to Google. Poof,
competition gone.
Of course it doesn’t work that way. As such, if there’s any “penalty,” it may just be that Google
doesn’t “count” links from paid sources.
TIP: Google does penalize the sites that are selling these backlinks – so if you buy backlinks, be
sure that the backlinks aren’t coming directly from the penalized sites.

Are reciprocal links bad?
They’re not bad, per se, especially if they’re coming from relevant, high quality websites.
However, one-way incoming links tend to be more valuable in terms of SEO.

What is a one-way link?
This is a non-reciprocal link. That means that Site A links to Site B, but Site B does NOT link
back to Site A.

The search engines prefer to see one-way links from relevant, quality sites.
What is three-way linking?
Three-way linking is a way for two webmasters to exchange links so that each person’s website
gets a one-way link (rather than a reciprocal link).

In order to make this work, at least one of the webmasters has to have a second site in the same
niche. Here’s how it works:
Webmaster 1 links his Site A to Webmaster 2’s Site B. Webmaster 2 links his Site B to his own
Site C, and then links Site C back to Webmaster 1’s Site A.

Thus Sites A, B and C each receive a one-way incoming link, like this:

Site A -> Site B -> Site C -> Site A

What is a site wide link?
These are links that are found on every page of a website. For example, many people have a link
to their “home” page (the index page) on every other page of their web site. That’s a site wide
link.

What is pinging?
Pinging is informing web-crawling bots (such as search engines or directories) that you’ve
updated the content on your web page. The goal is to get these bots to crawl and index your new
content immediately.

For example, if you post a new article on your blog, you can use pingomatic.com or pingler.com
to inform multiple bots about this change.
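Under the hood these services accept the standard weblogUpdates XML-RPC ping. A minimal Python sketch is below; the rpc.pingomatic.com endpoint and the example site details are assumptions here, so check the service's current documentation before relying on it.

import xmlrpc.client

# Send a standard weblogUpdates.ping. Note: running this makes a live request.
endpoint = "http://rpc.pingomatic.com/"      # assumed Ping-O-Matic XML-RPC endpoint
server = xmlrpc.client.ServerProxy(endpoint)
response = server.weblogUpdates.ping(
    "My Blog Title",                         # the name of the site that changed
    "http://www.example.com/blog/",          # the URL that was just updated
)
print(response)
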
What is link velocity?
This refers to how quickly you gain backlinks. For best results, maintain a consistent link
velocity. Most importantly, don’t build a load of backlinks (especially with fast indexation
techniques) and then stop. Google treats that pattern like a news article that was interesting for a short
period of time but is no longer relevant, and stops ranking it. “Too many links” or “links built too
fast” are rarely a problem, but inconsistency is.
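A simple way to keep yourself honest about consistency is to bucket the dates you gained links by week and look for gaps. The Python sketch below assumes you already have a list of discovery dates from your own records or a backlink tool; the dates shown are placeholders.

from collections import Counter
from datetime import date

# Count new backlinks per ISO week to check whether link velocity is consistent.
link_dates = [date(2011, 3, 1), date(2011, 3, 3), date(2011, 3, 10),
              date(2011, 3, 17), date(2011, 3, 24), date(2011, 4, 21)]
per_week = Counter((d.isocalendar()[0], d.isocalendar()[1]) for d in link_dates)
for (year, week), count in sorted(per_week.items()):
    print(f"{year} week {week:02}: {count} new links")

A long run of empty weeks after a burst is exactly the pattern described above, and the one to avoid.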

Can I build links too fast?
Yes and no. If you’ve got a brand new domain name and you fire up some of the more powerful
link-spamming automation software, you’ll get your domain flagged quicker than you can say,
“help me, my site is gone”.
If you’re building links manually or controlling your usage of serious spam software you’ll be
hard pushed to build links too fast on any domain that’s already been aged a bit. Just be
consistent.
If you think you can build links too fast on any site here’s an experiment for you next time
you’re having a slow weekend. Go out and buy the fastest, spammiest link building software you
can lay your hands on and pick a Wikipedia article that currently ranks quite well. Go nuts. All
you will do is strengthen its position.

What is page rank sculpting?
There are various techniques available to channel link juice through the links you actually want
to receive it and thus rank them higher. In theory Google has corrected this but several
experiments have shown this isn’t the case, although the actual PR passed through the links no
longer gets affected.

What is a link wheel?
A link wheel refers to setting up multiple pages on multiple third-party websites (usually at least
five) as a means of getting backlinks to your main site.
You link these properties to each other, but not reciprocally. For example, you link your
EzineArticles article to your Squidoo page, then link your Squidoo page to HubPages… and so
on. Finally, you link each of these third-party pages to your main site.
By using sites with a ton of content (and other SEOs backlinking them) you’re naturally tapping
a bigger seam of link juice. Take advantage of this by writing high-quality content for them so
human beings follow the links as well, since these pages will rank alongside your money site.

What is a mininet?
This is like a link wheel, except that you own all the sites that you’re linking together. You may
link together a series of smaller niche sites, with each smaller site linking to your main site.

For example, you might link your dog housetraining site to your dog obedience site, and then
link your dog obedience site to a site about training dogs to do tricks. All of these smaller niche
sites would then link to your main dog training site.

What makes a good site for a link wheel?

Web 2.0 properties and other websites that have a high Page Rank. The best ones are sites where
the page you create is automatically linked to from all over the site. Article directories like
EzineArticles are perfect for this since you get tons of internal links to kick things off with.

What is link bait?
This means “baiting” others into linking to your site. Typically, this means posting unique,
controversial, extremely useful or otherwise entertaining content or tools so that others naturally
link to your web page.
In other words, you create and post viral content.

What is a link farm?
Link farms consist of large networks of sites whose sole purpose is to generate pages that can be
used to link out to other sites that are actually worth something.
They are pretty much essential to rank for more highly competitive keywords but don’t attempt
this unless you really know what you are doing. Google is smarter than you!

What is a footprint?
TBD

How do I search for footprints?
TBD

What is a proxy?
A proxy server is one that sits between your computer and the Internet, and using one allows you
to go online somewhat anonymously. If you get online using a Proxy, no one can trace your IP
address back to you and your computer.

For example, you can use a proxy to set up multiple EzineArticles.com accounts.
How do I get my site indexed?
Don’t bother submitting your site through the traditional methods. The fastest way to get a site to
appear in Google’s index is to create backlinks to it. Use social bookmarking sites to create lots
of easy win links from sites that are spidered regularly and submit any RSS feeds you’ve got to
directories.
If you’re really keen to get indexed as fast as humanly possible –
•	Stick Adsense on your pages (even if you remove it later) as this forces Google to spider you.
•	Setup an Adwords campaign to your domain (Google has to spider you to determine your quality score).
•	Search for your domain name.
•	Perform site: and link: searches on your domain.
•	Visit your site using accounts with some of the most widespread ISPs (eg AOL) since their logs are used to find new content.
•	Email links to your site to and from a Gmail account.

How do I get my backlinks indexed?
The slow way is to wait for the search engines to naturally find them. The faster way is to ping
the page after you leave a backlink. For truly fast backlinking, social-bookmark them or create
RSS feeds with the links in them.

How can I tell if my site has been visited by a spider/bot?
By checking your traffic logs and statistics. Most traffic analyzing software will recognize and
label the bots and spiders that crawl your site. You can also recognize these visitors manually, as
the “user agent” is usually labeled something obvious, such as “Google Bot.”
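If you prefer to check the raw logs yourself, a short script can do the same matching your analytics package does. The Python sketch below assumes a plain-text access log named access.log and a few well-known crawler user-agent strings; adjust both to your own setup.

# Scan a web server access log for well-known crawler user-agent strings.
BOT_SIGNATURES = ("googlebot", "bingbot", "slurp", "baiduspider")

def find_bot_hits(log_path):
    hits = []
    with open(log_path, encoding="utf-8", errors="replace") as log:
        for line in log:
            if any(sig in line.lower() for sig in BOT_SIGNATURES):
                hits.append(line.strip())
    return hits

for hit in find_bot_hits("access.log"):      # placeholder path - use your own log
    print(hit)
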
What percentage of people click on the first listing in Google?
Only Google knows for sure, but estimates range from about 40% to 50%. AOL once released
their data, which suggested that 42% click on the first listing. “Heat map” studies tend to lean
more towards 50% or more.

How do I use Google alerts to monitor the SERPs?
All you have to do is get a Google account and then go to Google Alerts. There you enter the
keywords you want the tool to monitor the SERPs for, choose “comprehensive,” choose the
frequency you want to receive the alerts and then enter your email address where you want to
receive the alerts.
Once you’ve completed those simple steps, you’ll get alerted when new pages that use your
keywords appear in the search engines.

You can also use this tool to monitor your backlinks as they appear in Google. Just enter this
search term into the alerts field:

link:www.yourdomain.com/filename.html

Replace the above URL with your actual link, of course.

How can I track the number of backlinks I have?
There are a variety of tools available to you, such as using the Yahoo! Site Explorer, Google
Webmaster tools (check the links report) and SEO Quake.
Using these tools is preferable to searching directly in Google. That’s because searching
manually generally yields only a sample of the sites that are linking to your site.
Ultimately they’re all wrong! Don’t obsess about tracking these things; just focus on building
more.
What makes a good keyword?
A good keyword is one that your target market is searching for regularly. An even better
keyword is one that’s not only searched for regularly but also has very little competition in the
search engines. That means you have a good chance of ranking well for it.

How many people are searching for my keyword?
You’ll need to use a keyword tool to find out the answer. Example tools include the Google
keyword tool, WordTracker.com, MarketSamurai.com and any number of other similar tools.

What is the "true" competition for a keyword?
Forget all that rubbish you see in just about everyone’s WSO “proof” about how they outranked
a bajillion other sites for some phrase or other.

The only listings that matter are on page 1 so the only people you are competing with are on
page 1. I would much rather compete with a billion PR0 unrelated sites than 10 PR9s that have
been around over a decade and you should too!

Find out the page rank for the top ten listed pages and find the number of backlinks they have.
That’s your competition.

What are long tail keywords?
Highly niche searches. For example, “dog training” is a short tail keyword, while “how to train a
deaf dog” is a long tail keyword.

Long tail keywords tend to have fewer people searching for them than short tail keywords. On the
other hand, they also tend to have less competition in the search engines, so it can be easier for
you to get top rankings for these words.
What is the official Google/Yahoo/Bing policy on SEO?
The search engines encourage you to design your site and organize your content in a search
engine friendly way. This includes proper use of meta tags, creating information-rich sites,
including words on your site that your users are searching for, using site maps and more.

However, they all strongly discourage any attempts to manipulate your search engine rankings,
such as keyword stuffing, link spamming, cloaking and similar practices.

Why doesn't Google tell me how many links I have?
Google only shows a sample of backlinks, because generally it’s only webmasters who are
seeking this information. Webmasters who knew ALL of their competitors’ backlinks
could just go and get links from the same sources (which may be viewed as manipulating the
rankings). By only showing a sample, Google helps reduce this practice somewhat.

They also make some claim about the amount of resources required to list all this information
which I guess would be true if they didn’t have to have it stored for a million other reasons.
Bottom line, they don’t want you to have it, get over it.

Who is Matt Cutts?
Matt Cutts is a Google employee specializing in SEO issues, and thus he’s seen as the authority
on all things Google. He frequently talks about SEO practices, Google’s policies, link strategies
and other Google issues on his personal blog.
He’s an incredibly talented and influential individual, but never forget that he has Google’s best
interests at heart. Not everything he says can be taken as gospel.

Google webmaster tools
Google offers webmasters a variety of free tools that allow you to do things like: submit your site
map, get more info about how often the Google bot is crawling your site, get a report of any
errors the bot found, see the internal and external links pointing to your site, determine how your
URL is displayed in the SERPs, etc.
You can access the full set of Webmaster Tools here.
Automation, Outsourcing and 3rd Party Stuff

Can anyone guarantee a 1st place ranking?
No. Because the search engines can and do change their algorithms, and because a third-party
site may drop or change your links, no one can guarantee a first place ranking for a keyword.
However, SEO experts can create high rankings – even first place – for certain keywords. They
just can’t guarantee those placements, as the algorithms and third-party links are not under their
control.

What is a backlink packet?
Instead of searching for high-PR, .edu, .gov and authority sites to place your backlinks, you can
save time by purchasing a “packet” that lists these types of sites for you. These packets typically
include “do follow”:
•	Blogs where you can make comments.
•	Forums where you can set up profiles.
•	Directories where you can post content and backlinks.
…and similar sites.
The bonus of these packets is that they save you time since you don’t have to seek out these sites
yourself. The downside is that sometimes the webmasters change their policies once they get an
onslaught of these links. For example, the owner of a high-PR blog may change to “no follow”
links or disallow comments altogether.

I bought a packet of "high pr links" but all my links are PR0, what happened?
Usually this is because the main page of the website – such as the main page of the forum – has a
high PR. However, the actual place where you put your link – such as your profile page – is PR0
because you basically just created the page when you created your profile.
What automation tools are there?
There are a variety of tools you can use to help automate the SEO process, including:
•	Tools to structure your content in a search engine friendly way. (Hint: Content management systems and blogs like WordPress do this naturally, but you can also use SEO plugins to make your blog content even more search-engine friendly.)
•	Keyword tools.
•	Tools to automatically submit or update content, such as tools that submit to directories or tools that automatically update your social media content (such as ping.fm).
•	Tools that automate social bookmarking.
•	Tools that help automate tasks like building link wheels.
•	Tools to create content, such as article spinners, scrapers and autoblog tools.
•	Pinging tools (like pingomatic.com or pingler.com).
•	Tools that automate link-building, such as blog and guest book commenting tools.

What SEO service should I use?
This question is far too contentious for a forum FAQ like this so I’m not going to name specific
services. Instead here’s some general advice on selecting SEO services.
Don’t fall for hype about “ranking for the most competitive terms in the SEO industry”. SEO
companies that do this are pouring their resources into a highly competitive game because of
the PR boost it’s worth, and ultimately that cost has to go somewhere. Instead, find SEO firms that
focus on customer testimonials showing good results.
Don’t get involved in “my links are better than your links” battles. Nothing annoys me more than
seeing arguments about how so-and-so’s link packet is more effective than such-and-such’s. Just
focus on building a large variety of links and you’ll be fine.

What does an SEO host give me that a regular one doesn't?
Multiple C-class IP addresses. So even if you host multiple websites with one host, you get
different addresses. And that means you can build a mininet more easily without being detected.

301 A permanent server redirect - a change of address for a web page, configured for example in the .htaccess file
on Apache servers. Also useful for dealing with canonical issues.

Adwords Google’s Pay Per Click contextual advertisement program, a very common way of doing basic
website advertising.

Adsense site (MFA) Made For Adsense - websites that are designed
from the ground up as a venue for Google Adsense advertisements. This is usually, but not always, a bad
thing. TV programming is usually Made For Advertisement.

Affiliate An affiliate site markets products or services that are actually sold by another website
or business in exchange for fees or commissions.

Algorithm (algo) A program used by search engines to determine what pages to suggest for a
given search query.
ALT text A description of a graphic, which usually isn’t displayed to the end user, unless the
graphic is undeliverable, or a browser is used that doesn’t display graphics. Alt text is important
because search engines can’t tell one picture from another. Alt text is the one place where it is
acceptable for the spider to get different content than the human user, but only because the alt
text is accessible to the user, and when properly used is an accurate description of the associated
picture. Special web browsers for visually challenged people rely on the alt text to make the
content of graphics accessible to the users.
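A quick way to audit this is to scan your pages for img tags with a missing or empty alt attribute. The Python sketch below uses the standard library's html.parser on a made-up snippet; in practice you would feed it your own page source.

from html.parser import HTMLParser

class AltTextAudit(HTMLParser):
    """Record the src of any <img> tag that has no usable alt text."""
    def __init__(self):
        super().__init__()
        self.missing = []

    def handle_starttag(self, tag, attrs):
        if tag != "img":
            return
        attrs = dict(attrs)
        if not (attrs.get("alt") or "").strip():
            self.missing.append(attrs.get("src", "(no src)"))

sample = '<img src="logo.png" alt="Acme logo"><img src="banner.png">'
audit = AltTextAudit()
audit.feed(sample)
print("Images missing alt text:", audit.missing)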

Analytics A program which assists in gathering and analyzing data about website usage. Google
analytics is a feature rich, popular, free analytics program.

Anchor text The user visible text of a link. Search engines use anchor text to indicate the
relevancy of the referring site and of the link to the content on the landing page. Ideally all three
will share some keywords in common.
Astroturfing (the opposite of full disclosure) attempting to advance a commercial or political
agenda while pretending to be an impartial grassroots participant in a social group. Participating
in a user forum with the secret purpose of branding, customer recruitment, or public relations.

Authority (trust, link juice, Google juice) The amount of trust that a site is credited with for a
particular search query. Authority/trust is derived from related incoming links from other trusted
sites.

Authority site A website which has many incoming links from other related expert/hub sites.
Because of this simultaneous citation from trusted hubs an authority site usually has high trust,
pagerank, and search results placement. Wikipedia is an example of an authority site.

B2B Business to Business.

B2C Business to Consumer

Back link (inlink, incoming link) Any link into a page or site from any other page or site.

Black hat Search engine optimization tactics that are counter to best practices such as the
Google Webmaster Guidelines.

Blog A website which presents content in a more or less chronological series. Content may or
may not be time sensitive. Most blogs use a Content Management System such as WordPress
rather than individually crafted web pages. Because of this, the blogger can choose to concentrate
on content creation instead of arcane code.

Bot (robot, spider, crawler) A program which performs a task more or less autonomously. Search
engines use bots to find and add web pages to their search indexes. Spammers often use bots to
“scrape” content for the purpose of plagiarizing it for exploitation by the Spammer.
Bounce rate The percentage of users who enter a site and then leave it without viewing any
other pages.

Bread crumbs Web site navigation in a horizontal bar above the main content which helps the
user to understand where they are on the site and how to get back to the root areas.

Canonical issues (duplicate content) canon = legitimate or official version - It is often nearly
impossible to avoid duplicate content, especially with CMSs like Wordpress, but also due to the
fact that www.site.com, site.com, and www.site.com/index.htm are supposedly seen as dupes by
the SEs - although it’s a bit hard to believe they aren’t more sophisticated than that. However
these issues can be dealt with effectively in several ways including - using the noindex meta tag
in the non-canonical copies, and 301 server redirects to the canon.
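The same idea can be expressed as a small normalization routine: pick one form of each URL as the canon and map the common variants onto it. The Python sketch below assumes you prefer the www host and treat index.htm/index.html as the root page; those choices are illustrative, and the real fix on the site is still the noindex tag or 301 redirect described above.

from urllib.parse import urlsplit, urlunsplit

# Collapse common duplicate-URL variants onto one assumed canonical form.
def canonicalize(url):
    scheme, netloc, path, query, fragment = urlsplit(url)
    netloc = netloc.lower()
    if not netloc.startswith("www."):        # assumed preference: keep the www host
        netloc = "www." + netloc
    if path.lower() in ("", "/index.htm", "/index.html"):
        path = "/"                           # index pages collapse to the root
    return urlunsplit((scheme, netloc, path, query, ""))

for url in ("http://site.com", "http://www.site.com/", "http://www.site.com/index.htm"):
    print(url, "->", canonicalize(url))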

Click fraud Improper clicks on a PPC advertisement, usually by the publisher or his minions, for
the purpose of undeserved profit. Click fraud is a huge issue for ad agencies like Google,
because it lowers advertiser confidence that they will get fair value for their ad spend.

Cloak The practice of delivering different content to the search engine spider than that seen by
the human users. This black hat tactic is frowned upon by the search engines and carries a virtual
death penalty of the site/domain being banned from the search engine results.

CMS Content Management System - programs such as Wordpress, which separate most of the
mundane webmaster tasks from content creation so that a publisher can be effective without
acquiring or even understanding sophisticated coding skills if they so choose.

Code swapping (bait and switch) Changing the content after high rankings are achieved.

Comment spam Posting blog comments for the purpose of generating an inlink to another site.
The reason many blogs use link condoms.
Content (text, copy) The part of a web page that is intended to have value for and be of interest
to the user. Advertising, navigation, branding and boilerplate are not usually considered to be
content.

Contextual advertisement Advertising which is related to the content.

Conversion (goal) Achievement of a quantifiable goal on a website. Ad clicks, sign-ups, and
sales are examples of conversions.

Conversion rate Percentage of users who convert - see conversion.

CPC Cost Per Click - the rate that is paid per click for a Pay Per Click Advertiser

CPM (Cost Per Thousand impressions) A statistical metric used to quantify the average value /
cost of Pay Per Click advertisements. M - from the Roman numeral for one thousand.

Crawler (bot, spider) A program which moves through the worldwide web or a website by way
of the link structure to gather data.

Directory A site devoted to directory pages. The Yahoo directory is an example.

Directory page A page of links to related WebPages.

Doorway (gateway) A web page that is designed specifically to attract traffic from a search
engine. A doorway page which redirects users (but not spiders) to another site or page is
implementing cloaking. - Previous Definition revised based upon advice from Michael Martinez

Duplicate content Obviously, content which is similar or identical to that found on another
website or page. A site may not be penalized for serving duplicate content, but it will receive little
if any trust from the search engines compared to the copy that the search engine considers to be
the original.

E-commerce site A website devoted to retail sales.

Feed Content which is delivered to the user via special websites or programs such as news
aggregators.

FFA (Free For All) A page or site with many outgoing links to unrelated websites, containing
little if any unique content. FFAs are intended only for spiders, have little if any value
to human users, and thus are ignored or penalized by the search engines.

Frames A web page design where two or more documents appear on the same screen, each
within its own frame. Frames are bad for SEO because spiders sometimes fail to correctly
navigate them. Additionally, most users dislike frames because it is almost like having two tiny
monitors, neither of which shows a full page of information at one time.

Gateway page (doorway page) A web page that is designed to attract traffic from a search
engine and then redirect it to another site or page. A doorway page is not exactly the same as
cloaking but the effect is the same in that users and search engines are served different content.

Gadget see gizmo

Gizmo (gadget, widget) small applications used on web pages to provide specific functions such
as a hit counter or IP address display. Gizmos can make good link bait.

Google bomb The combined effort of multiple webmasters to change the Google search results,
usually for humorous effect. The “miserable failure” - George Bush, and “greatest living
American” - Stephen Colbert Google bombs are famous examples.
Google bowling Maliciously trying to lower a site's rank by sending it links from the “bad
neighborhood” - kind of like yelling “Good luck with that infection!” to your buddy as you get
off the school bus. There is some controversy as to whether this works or is just an SEO urban myth.
Google dance The change in SERPs caused by an update of the Google database or algorithm.
The cause of great angst and consternation for webmasters who slip in the SERPs. Or, the period
of time during a Google index update when different data centers have different data.

Google juice (trust, authority, pagerank) trust / authority from Google, which flows through
outgoing links to other pages.
Googlebot Google’s spider program

GYM Google - Yahoo - Microsoft, the big three of search

Hit Once the standard by which web traffic was often judged, but now a largely meaningless
term replaced by pageviews AKA impressions. A hit happens each time that a server sends an
object - documents, graphics, include files, etc. Thus one pageview could generate many hits.

Hub (expert page) a trusted page with high quality content that links out to related pages.
HTML (Hyper Text Markup Language) directives or “markup” which are used to add
formatting and web functionality to plain text for use on the internet. HTML is the mother
tongue of the search engines, and should generally be strictly and exclusively adhered to on web
pages.

Impression (page view) The event where a user views a webpage one time.

In bound link (inlink, incoming link) Inbound links from related pages are the source of trust
and pagerank.

Index Noun - a database of WebPages and their content used by the search engines.

Index Verb - to add a web page to a search engine index.
Indexed Pages The pages on a site which have been indexed.

Inlink (incoming link, inbound link) Inbound links from related pages are the source of trust and
pagerank.

Keyword - key phrase The word or phrase that a user enters into a search engine.

Keyword cannibalization The excessive reuse of the same keyword on too many web pages
within the same site. This practice makes it difficult for the users and the search engines to
determine which page is most relevant for the keyword.

Keyword density The percentage of words on a web page which are a particular keyword. If this
value is unnaturally high the page may be penalized.
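As a rough calculation, keyword density is just the number of words belonging to the keyword divided by the total words on the page. The Python sketch below implements that literal definition for a short sample sentence; real tools differ in how they count multi-word phrases, so treat the exact percentage as approximate.

import re

# Keyword density: words that are part of the keyword phrase as a percentage
# of all words on the page (one common way of counting it - conventions vary).
def keyword_density(text, keyword):
    words = re.findall(r"[a-z0-9']+", text.lower())
    kw = keyword.lower().split()
    if not words:
        return 0.0
    hits = sum(words[i:i + len(kw)] == kw for i in range(len(words) - len(kw) + 1))
    return 100.0 * hits * len(kw) / len(words)

page = "Dog training tips: dog training made simple for every dog owner."
print(f"keyword density: {keyword_density(page, 'dog training'):.1f}%")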

Keyword research The hard work of determining which keywords are appropriate for targeting.

Keyword spam (keyword stuffing) Inappropriately high keyword density.

Keyword stuffing (keyword spam) Inappropriately high keyword density.

Landing page the page that a user lands on when they click on a link in a SERP

Latent semantic indexing (LSI) This mouthful just means that the search engines index
commonly associated groups of words in a document. SEOs refer to these same groups of words
as “Long Tail Searches”. The majority of searches consist of three or more words strung
together. See also “long tail”. The significance is that it might be almost impossible to rank well
for “mortgage”, but fairly easy to rank for “second mortgage to finance monster truck team”. Go
figure.

Link An element on a web page that can be clicked on to cause the browser to jump to another
page or another part of the current page.
Link bait A webpage with the designed purpose of attracting incoming links, often mostly via
social media.

Link building actively cultivating incoming links to a site.

Link condom Any of several methods used to avoid passing link love to another page, to
avoid the possible detrimental results of endorsing a bad site by way of an outgoing link, or to
discourage link spam in user generated content.

Linkerati internet users who are the most productive targets of linkbait. The Linkerati includes social taggers, forum posters, resource maintainers, bloggers and other content creators, etc who are most likely to create incoming links or link generating traffic (in the case of social
networkers). Suggested by lorisa.

Link exchange a reciprocal linking scheme often facilitated by a site devoted to directory pages.
Link exchanges usually allow links to sites of low or no quality, and add no value themselves.
Quality directories are usually human edited for quality assurance.

Link farm a group of sites which all link to each other.- Previous Definition revised based upon
advice from Michael Martinez

Link juice (trust, authority, pagerank)

Link love An outgoing link, which passes trust, unencumbered by any kind of link condom.

Link partner (link exchange, reciprocal linking) Two sites which link to each other. Search
engines usually don’t see these as high value links, because of the reciprocal nature.

Link popularity a measure of the value of a site based upon the number and quality of sites that
link to it.
Delhi Call Girls South Delhi 9711199171 ☎✔👌✔ Whatsapp Hard And Sexy Vip CallDelhi Call Girls South Delhi 9711199171 ☎✔👌✔ Whatsapp Hard And Sexy Vip Call
Delhi Call Girls South Delhi 9711199171 ☎✔👌✔ Whatsapp Hard And Sexy Vip Call
 
VIP Call Girls Service Jamshedpur Aishwarya 8250192130 Independent Escort Ser...
VIP Call Girls Service Jamshedpur Aishwarya 8250192130 Independent Escort Ser...VIP Call Girls Service Jamshedpur Aishwarya 8250192130 Independent Escort Ser...
VIP Call Girls Service Jamshedpur Aishwarya 8250192130 Independent Escort Ser...
 
Low Rate Call Girls Gorakhpur Anika 8250192130 Independent Escort Service Gor...
Low Rate Call Girls Gorakhpur Anika 8250192130 Independent Escort Service Gor...Low Rate Call Girls Gorakhpur Anika 8250192130 Independent Escort Service Gor...
Low Rate Call Girls Gorakhpur Anika 8250192130 Independent Escort Service Gor...
 
VIP Russian Call Girls in Amravati Deepika 8250192130 Independent Escort Serv...
VIP Russian Call Girls in Amravati Deepika 8250192130 Independent Escort Serv...VIP Russian Call Girls in Amravati Deepika 8250192130 Independent Escort Serv...
VIP Russian Call Girls in Amravati Deepika 8250192130 Independent Escort Serv...
 

SEO Interview FAQ

What is SEO?
SEO stands for Search Engine Optimization: getting your site ranked higher so that more people show up at your doorstep. In theory we are interested in all search engines; in practice, SEO mostly means Google.

What are SERPs?
SERPs is an acronym for Search Engine Results Pages. They are simply the search results you receive when you run a search at a search engine.

What is anchor text? Why is it important?
Anchor text is the visible hyperlinked text on a page. For example, let's examine this code:
<a href="http://www.sitepoint.com/forums/">Webmaster Forums</a>
The anchor text for this link is "Webmaster Forums". This matters for search engine rankings because search engines use anchor text to help determine the relevance of the page being linked to for those keywords.
By having this link pointing to their forum's page, SitePoint Forums will perform better in searches for the phrase "webmaster forums" (and other similar phrases as well).

What are Meta tags?
Meta tags live in the source code of your pages, and their content can show up in search results. They are meant to help search engines and browsers understand what your site's pages are about. The large majority of search engines do not use meta tags as part of their ranking algorithm. Some will claim Google uses meta tags in its algorithm; this is untrue. Google will, however, use a meta description tag if it is unable to discern a description for a web page on its own (if the page has no text and no description in the Open Directory [dmoz], it is likely Google will use the meta description tag in its SERPs). Note that it only uses this description in its SERPs, not in its algorithm.

Should you use meta tags on your site?
Yes. They do have some effect in some search engines, and even though that effect is almost zero, it is still more than zero, so they are worth the time (a short example is sketched below).

How much time should I spend on my meta tags?
Ten minutes. Write a concise description of your page and throw in a sampling of keywords (which you should have handy if you have optimized your pages properly). You should spend no more time than this on them. Use your time to promote your site and get quality inbound links.

How many keywords should I use?
As many as you want. If you start to think you may have too many, you probably do. That means you need to divide your page into subpages, each taking on its own topic.
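As a minimal sketch of the meta tags discussed above (the description and keyword values here are placeholders, not recommendations), both tags live in the page's <head>:
Code:
<head>
  <!-- May be used as the SERP snippet if the engine cannot build its own description -->
  <meta name="description" content="A one- or two-sentence summary of what this page is about.">
  <!-- Largely ignored by the major engines, but quick to add -->
  <meta name="keywords" content="sample keyword, another keyword, third keyword">
</head>
As the answers above note, the description is the only one of these with much practical value today, and even then mainly as a possible snippet rather than a ranking factor.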
Which are the most important areas to include your keywords?
The page title and the body text are the most important areas in which to include keywords for SEO purposes.

What is a Title?
The "title" of a web site is probably the single most important element for natural search engine positioning. The title is placed within the "head" of the HTML, is generally 12-15 words long and should be descriptive in nature.

What is a keyword?
A "keyword" or "keyword phrase" is the word or words a person types into the search box on a search engine to look up subject matter on the Internet. If you are looking for a flag for your home or office, you might type in "American Flags". The search engine screens its database of the web sites it has gathered and looks for the words "American Flags". Through its programming, it then finds the web sites it believes to be a match and displays them in order of relevancy. With proper design of a web site, you should have a keyword meta tag area within the head of your HTML to list the words or "keywords" which best describe your web site. It is important to reflect carefully when choosing your keywords. If you sell boats, but you are only licensed to do so in Maine, then your keywords might best be "boats for sale in Maine" or "Maine Boats", etc.

What is a Description?
The "description" of your web site also resides within the "head" of your HTML and is usually a sentence or two of approximately 15 words which best describe your web site.

What is "body content relevance"?
"Body content relevance" is the written "non-image" text on a page of the web site which is descriptive in nature and relates to the title, description and keywords. It is not mandatory to have relevant body content, but it most definitely will assist your ranking on the search engines.
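Tying the title, keyword and body-content questions together, here is a minimal page sketch that reuses the "boats for sale in Maine" illustration from the keyword answer above (the wording is only an example):
Code:
<html>
<head>
  <title>Maine Boats for Sale - New and Used Boats in Maine</title>
</head>
<body>
  <!-- Keywords in the heading and the opening body text support "body content relevance" -->
  <h1>Boats for Sale in Maine</h1>
  <p>Browse our current listings of new and used boats for sale across Maine...</p>
</body>
</html>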
What is an algorithm?
An algorithm (pronounced "AL-go-rith-um") is a procedure or formula for solving a problem. The word derives from the name of the Persian mathematician al-Khwarizmi (c. 825 AD). A computer program can be viewed as an elaborate algorithm. In mathematics and computer science, an algorithm usually means a small procedure that solves a recurrent problem.

What is Google Webmaster Tools?
Google Webmaster Tools is a free web service by Google for webmasters. It allows webmasters to check indexing status and optimize their websites. It has tools that let webmasters submit and check sitemaps, generate and check robots.txt files, list internal and external pages linking to the site, view statistics related to how Google crawls the site, and more. In short, it is where you can get indexing data, backlink information, crawl errors, search queries, CTR, malware warnings, and submit your XML sitemap.

What is Google Analytics?
Google Analytics helps you analyze visitor behavior on your site. Analytics tools can tell you how many visitors you had each day, which pages they viewed, how long they stayed on each page, and so on. Google Analytics is an invaluable tool for improving your site's ability to attract and keep visitors.

What's an XML sitemap?
An XML sitemap is a list of the pages of a web site accessible to crawlers or users. It lists the pages on a web site, typically organized in hierarchical fashion. This helps visitors and search engine bots find pages on the site.

What is a robots.txt file?
A robots.txt file on a website functions as a request that specified robots ignore specified files or directories when they crawl the site. This might be, for example, out of a preference for privacy from search engine results, or the belief that the content of the selected directories might be misleading
or irrelevant to the categorization of the site as a whole, or out of a desire that an application only operate on certain data. If you do not wish to block any files from the search engines then you do not need a robots.txt file. Having one will not improve your rankings by itself, nor make your site more attractive to the search engines. In fact, you should only use one if you actually need it, as an error in your robots.txt file may result in important pages not being crawled and indexed, and you will never know unless you check the file for errors at some point. If you do want to use a robots.txt file simply to prevent 404 errors in your logs, make this the only content in the file (a sketch that blocks specific directories appears a little further down):
Code:
User-agent: *
Disallow:

What is keyword proximity?
Keyword proximity is a measure of how close together keywords appear within the page title, meta description and body text.

What is keyword prominence?
Keyword prominence refers to where the keywords are located in the page title, meta description and body text, i.e. how early and how visibly they appear.

What is the difference between exit rate and bounce rate?
Bounce rate is the percentage of visitors who leave a website after viewing only a single page; exit rate is the percentage of visitors who leave the site from a particular page.

What was the Caffeine update?
The Caffeine update was rolled out by Google in June 2010. Its main purpose was to bring fresher results into the search index, reportedly around 50 percent fresher.
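Returning to the robots.txt question above: if you do need to keep compliant crawlers out of part of a site, a minimal sketch looks like this (the directory names are placeholders):
Code:
User-agent: *
Disallow: /private/
Disallow: /cgi-bin/
Everything not listed under Disallow remains crawlable; as noted above, an empty Disallow value allows everything.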
What is a 301 redirect?
A 301 redirect tells search engine spiders that a page of content has permanently moved to another location. It helps ensure that there are no "dead" or non-working links pointing at the old address within your site (a one-line example is sketched a little further down).

What is a 302 redirect?
It is a temporary redirect.

What is a 404?
It is an HTTP status code returned by the server when a requested web page or file is missing from the web host.

What is PageRank and is it important?
PageRank is a way for search engines to "grade" a site and its associated pages. No one is completely certain of the exact science behind how PageRank is formulated, but it is understood that a number of elements, such as age, number of backlinks and amount of content, are used to calculate it.

What is a Landing Page?
A landing page puts your customers close to the final sale. A good landing page offers intriguing copy and an opportunity for your visitors to make a purchase or complete a desired conversion, depending on the goal of your site.
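For the 301-redirect question above, a minimal sketch on an Apache server (the file names and domain are placeholders) is a single line in an .htaccess file:
Code:
Redirect 301 /old-page.html http://www.example.com/new-page.html
A temporary (302) move uses Redirect 302 in the same way; the .html-to-.php and domain-change examples later in this FAQ show the equivalent pattern for whole-site redirects.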
What is referrer spam?
Referrer spam is when a spammer sends fake referrers to your server. They do this because they know most web statistics packages list referrers as hyperlinks. They then submit your stats pages to the search engines in the hope that the engines will crawl your stats and find that link. They also hope you will click on the link yourself.

What is a doorway page?
Doorway pages are web pages created to rank highly in search engine results for particular phrases, with the purpose of sending visitors on to a different page. They are also known as landing pages, bridge pages, portal pages, zebra pages, jump pages, gateway pages, entry pages and by other names.

What is cloaking?
Cloaking is a search engine optimization technique in which the content presented to the search engine spider is different from that presented to the user's browser; this is done by delivering content based on the IP address or the User-Agent HTTP header of whatever is requesting the page. The only legitimate use for cloaking used to be delivering content that search engines couldn't parse, like Macromedia Flash. However, cloaking is often used to try to trick search engines into giving a site a higher ranking; it can also be used to trick search engine users into visiting a site based on the search engine description, when the site turns out to have substantially different, or even pornographic, content. For this reason some search engines threaten to ban sites using cloaking.

Hidden text / hidden DIVs
Hidden text and DIVs are only bad if you are using them to manipulate the SERPs. There are many practical uses of hidden text and DIVs that enhance a web page without being malicious. Good uses of hidden text/DIVs: dynamic menus and dynamic page content (see the sketch below). Bad uses of hidden text/DIVs: text that is present on the page but cannot be viewed by human beings at any time.
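As an illustration of a legitimate hidden DIV of the kind described above, here is a sketch of a dropdown menu whose submenu is hidden until the visitor interacts with it (the class names and links are placeholders):
Code:
<li class="menu-item">
  Products
  <!-- Hidden by default and revealed on hover or click; the text is still written for humans -->
  <ul class="submenu" style="display: none;">
    <li><a href="/widgets.html">Widgets</a></li>
    <li><a href="/gadgets.html">Gadgets</a></li>
  </ul>
</li>
The same markup stuffed with keywords that no visitor can ever reveal is the "bad use" the answer warns about.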
What's on-page SEO?
On-page SEO refers to the things you do on your own site to enhance its ranking in the search engines. This includes, but is not limited to:
• Creating content around specific keywords.
• Formatting/designing your site so that the most important keywords are emphasized and appear near the top of the page.
• Including the chosen keywords in meta tags.
• Including the keywords in the navigation menu and other links.
• Using your keywords in other parts of your site, such as the title of the page, the file name, etc.
• Using related keywords on the site (see the question on LSI for more information).

What's off-page SEO?
Off-page SEO refers to the things you do outside of your own web pages to enhance their rankings in the search engines. This is a glorified way of saying "get links", and did I mention, "more links".

What's the difference between SEO and SEM?
While some people use SEO and SEM interchangeably, SEO (search engine optimization) is actually a part of SEM (search engine marketing). SEO refers to the process of using on-page and off-page factors (typically free) to get your web pages ranked for your chosen keywords in order to get more search engine traffic to your sites. SEM takes it a step further to include using paid search engine listings and paid inclusion to get more traffic to your websites.

What's the difference between paid and organic search listings?
Organic search engine listings are the main results users see when they do a Google search. The websites appearing in the organic listings appear because those sites are most relevant to the user's keywords. Indeed, most of these sites appear at the top of the search engine results because their webmasters have used SEO tactics to ensure top rankings. The paid (or "sponsored") listings usually appear above, below and to the right of the regular organic listings. Usually these are pay-per-click (PPC) ads, which means the website owner only pays when someone clicks on the ad (as opposed to paying for impressions). This isn't an either/or game: just because you do SEO doesn't mean you can't or shouldn't use PPC, and vice versa.
SEO is not free traffic; it takes time and/or money to get good organic rankings, but in the long run it is usually cheaper than PPC.

Why do I need SEO services?
SEO services help your site rank better in the search engines. Better rankings drive more traffic to your site, creating the potential for better exposure and new revenue streams.

How does Google view my site?
Google crawls each site often, and somewhat sporadically, using "spider bots". The bots read your pages and help Google catalog your site and its associated pages.

How long does it normally take to see SEO results?
Many sites engage in an SEO program for at least six months in order to achieve good results, yet "desired results" will vary with each client. Patience is one of the best aids to an SEO campaign. It may be best to understand that SEO tactics are working all the time towards your desired results; and once those results are achieved, they are long lasting.

What is link popularity?
Link popularity refers to the number of web pages on the Internet which a search engine recognizes as having a hyperlink to your site, or in other words which are "pointing" to your web site as a reference.

How do backlinks affect my rankings?
Backlinks help improve your rankings. Search engines see backlinks as positive "votes" for your site and strongly associate your site's backlinks with its ability to satisfy a searcher's wishes.

What is a quality link?
A quality link is:
1) On topic (the page linking to your page is about the same main topic)
2) Ranked well for the keyphrase you are after (in the top 1,000)
3) Contains the keywords you wish to rank well for
4) Has high PR (PR 4 or higher)
I left out high traffic because that is irrelevant from an SEO point of view. But if you're looking at the big picture, that would be number five.

How many backlinks do I need?
There is no fixed, "golden" number of backlinks. Ideally you want to acquire backlinks from reputable sites in an ongoing fashion.

How do I get a lot of backlinks to point to my site?
A good place to start is to submit to directories. Start with the free ones and then decide whether the paid ones are worth it for you. There are directory lists available that sort directories by PR, Alexa rank (largely worthless) and more.

What is the best criterion to identify the value of a backlink?
The authority of the domain, the quality of the content on the page where the backlink is placed, and then the PageRank of the website.

I was thinking of doing <seo trick here> to my site but I'm afraid the search engines might think it is spam. Should I do it?
No. Why? If you're not sure whether it will get you in trouble with the search engines, then it's probably something you shouldn't be doing. Another good reason not to do it is accessibility. Many webmasters employ hacks and tricks in an effort to increase their search engine ranking. Often these tricks come at the expense of the usability of their website, not only for those who have disabilities but for anyone trying to navigate the site.

What does the submission process actually do?
The submission programs send your web site address (URL) to search engines and directories using what are referred to as add-a-URL strings. After receiving the URL, engines use a "spider" that parses through the HTML code looking for tags that begin with "<a href=".
After the entire page has been parsed, a small web bot travels the links it found, searching for more links using the same procedure, until all of the pages at that URL address have been found.

When will my submissions appear on the engines?
Every engine and directory is different. In some cases your submission will appear within a few days. In other cases it may take much longer, and in some instances your web site may never get listed from that submission. Because of this, the idea is that the more engines you submit to, the better your visibility will be, and if you submit regularly (every month), you have a better chance of getting added to the engines that didn't add you the last time. Many engines and directories put you in a queue. Some will manually add you when they get a chance. Some will wait to check your site for content.

What is the difference between submission and placement, and when will my first-page paid placement list on the search engines?
With search engine submission, we do not guarantee that a search engine will place your web site. With search engine placement, we ask you to allow ten days for placement on the search engines. You will receive a ranking report at the email address you provided on your order form.

What happens if I use includes for my pages? Will the search engines see them?
The search engines don't care what server-side technology you use. All they see is the (X)HTML your server-side code generates. To see what they see, simply load your page in your favorite web browser and view the source. What you see is exactly what they see.

Should I submit my website to the search engines by hand or use software?
Do it by hand. It will not take long and will ensure that you submit each form successfully with the correct information. There is a constant debate about how search engines feel about automated submission software. Since there is a reasonable chance these tools are frowned upon by the search engines, and since you can do anything they can do on your own, you might as well avoid them.
How often should I submit my website to the search engines?
Once. Resubmitting your URL does not get you indexed faster or improve your rankings. Also, resubmitting your site will never cause your site to be banned. If it did, all you would need to do is submit your competitors' sites repeatedly until they were banned.

Does the file extension of my pages (.html, .php, etc.) affect my rankings?
This is a very common myth that is 100% untrue. The file extension does not affect your rankings in any way. After all, no matter what server-side programming language you use, and what extension you choose to use with it, they all just spit out HTML in the end. That's all a web browser will see, and all a search engine will see.

Do sites with .com rank higher than sites with <TLD here>?
This is another common myth that is untrue. The only time a domain extension can affect your ranking is if the search is country-based. The country-specific TLDs (e.g. .co.uk) will have priority over non-country-specific TLDs (e.g. .com or .net) in those searches. One observation many make is that .coms tend to rank higher than other domain extensions, and they assume it is because .coms are given preferential treatment. This is a poor assumption. .coms seem to rank higher than other extensions because they are by far more popular than any other domain extension (there are more .coms than .net, .org, .biz, .edu, .gov and .info combined), so they naturally have a greater chance of ranking higher through sheer quantity alone. .coms also tend to be older sites, so they have had a chance to establish themselves, whereas newer domain extensions have not. They have also used this time to acquire more backlinks, which is an important factor in search engine algorithms. It is also commonly believed that .gov and .edu sites are given preferential treatment by search engines. This is also untrue. Web pages on .edu and .gov domains tend to rank well because they contain quality content and many webmasters link to that content as a result. Both of these are key elements in SEO. The fact that they are .edu or .gov domains does not benefit them directly in the SERPs.
Do pages with query strings rank worse than pages without query strings?
Another common myth that is untrue. The only way variables in a query string can affect a site in the SERPs is if it contains a session ID or something that looks like a session ID (e.g. id=123456). These usually prevent indexing of those pages or limit the number of pages indexed. But query strings do not affect a page's ranking, neither positively nor negatively.

Should I use relative links or absolute links?
Absolute links. This is recommended by Google, as it is possible for crawlers to miss some relative links.

I just changed from .html to .php. How can I switch without losing my rankings?
There are two ways to do this:
1) Tell Apache to parse all .html files as PHP files. Using this method you do not have to change any file extensions or worry about any redirects. To do this, place this code in your httpd.conf file:
Code:
AddType application/x-httpd-php .php .html
2) Use a 301 redirect to send requests for the .html files to the .php files. You can do that by placing this code in an .htaccess file in the root directory of your website:
Code:
RedirectMatch 301 ^/(.*)\.html$ http://www.yourdomain.com/$1.php

I just changed my domain name. How can I switch without losing my rankings?
You'll need to do a 301 redirect from the old domain to the new domain. Fortunately this is not difficult to do. Add the following lines to a file called .htaccess and place it in the root directory of the old domain:
Code:
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?old-domain\.com$ [NC]
RewriteRule ^(.*)$ http://www.new-domain.com/$1 [R=301,L]
Why aren't all my pages being indexed?
If your site is less than six months old, stop reading now. Your site is too new to be worrying about getting all of your pages indexed. Be patient; it takes time to crawl through your whole website and add your pages to the index. If you are sure your pages are search engine friendly, then you have nothing to worry about. If your site is six months old or older, you need to check your website to make sure all of your pages can be found and indexed. Have you:
1) Made a human sitemap?
2) Made a Google or Yahoo sitemap?
3) Used search engine friendly URLs?
4) Used search engine friendly navigation?
An additional note: get incoming links. These are important for the search engines' algorithms and may play an important part in how deeply the search engines will crawl your website.

How do I check if my site is search engine friendly?
Turn off JavaScript, CSS and cookies in your web browser and view your website. This is roughly how the search engines see your website. If you can still view your content and navigate your website, your site is mostly search engine friendly. The only other thing to check is your URLs: not using a session ID or "id=" in your query strings is also very helpful.

What does it mean to have your site indexed by the search engines?
To be indexed by the search engines means your web pages have been crawled and included in the search engines' databases. Your pages are now available to be included in the search results for user queries. This doesn't mean your pages are guaranteed to appear; it just means they are available. The pages will still need to be relevant to the search terms before they will be included in the SERPs.
Which is better for a domain name and/or URL: hyphen (-), underscore (_), or plus sign (+)?
Hyphens and underscores are the best keyword delimiters you can use in your domain name or URL. They are seen as equal by all of the major search engines. Many say that separators are not necessary, as search engines can find keywords in URLs without assistance. They are smart and can most likely pick some keywords out of a URL, but they are not that smart: sometimes it is not obvious where one keyword ends and another begins. For example, expertsexchange.com can be read as "experts exchange" or as "expert sex change". These are obviously two very different topics. In this case a hyphen or underscore would clearly separate the keywords and solve the problem.

Will too many hyphens in your domain name cause the search engines to label your site as spam?
No. This is a myth caused by many spam sites using multiple hyphens in their domain names. Many people have wrongly concluded that only spam sites would need to use more than one hyphen. The truth of the matter is that having more than one hyphen in your domain name will not result in your site being penalized. The more likely scenario is that multiple hyphens will result in a flag being set at the search engines and a manual review being done to see whether the site is spammy or legitimate. One thing to keep in mind when choosing a domain name with hyphens in it: your users. A domain with multiple hyphens is more difficult for your human visitors to remember and type in. Domain names with more than one hyphen should only be used if you are attempting to market your website primarily through the search engines. If you plan on doing offline advertising, including word of mouth, one hyphen or less is recommended.

Does the order of the keywords used in a search affect the search results?
Yes. Do a search and see for yourself.

Does the order of the keywords in a domain name/URL matter?
Yes. You will typically rank better in the SERPs for phrases that use the words in the same order as your domain and URL than for phrases where they are not in the same order.
Does using automated SEO software cause a website to be penalized?
No. This is a common myth that is untrue. If it were true, you could get your competitor penalized or banned by using automated SEO software to resubmit their website every 60 seconds. Naturally this does not happen (nor should it). Some webmasters will point out that Google's guidelines say you shouldn't use automated software like Web Position Gold. The reason for this is that most of these tools scrape Google's SERPs to find your site's ranking information, which is a violation of Google's terms of service. Software that uses Google's API is acceptable for querying their servers. Also, if you constantly use SEO software to query a search engine's servers, you might find that it bans your IP address to prevent you from using its resources any further. However, this has no effect on your web pages' rankings.

Can search engines see password-protected pages?
Search engines are not different from regular users in most ways. They cannot go anywhere that a regular user cannot go. If you have a password-protected area of your website that cannot be accessed without a login and password, then the search engines cannot see it.

Which is better for SEO: text links or graphical links?
Text links are better for SEO. Text links can contain the anchor text that your page wishes to rank well for, and that is an important factor in all three major search engines, especially Google. Image links are still valuable but have fewer benefits compared to text links. This is true despite image tags having the ALT attribute available: the ALT attribute can contain keywords, but thanks to keyword stuffing they are now virtually worthless for ranking. (You should be using your ALT attributes for usability and accessibility, not SEO, anyway.) See the sketch below for a comparison.

Does validation help your ranking?
Short answer: no. Longer answer: no, but having a web page that validates is a good idea. A web page that has been validated to a W3C standard contains no errors and therefore can be easily parsed and understood by the search engine crawlers. An invalid web page runs the risk of being misinterpreted or just not read at all.
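To illustrate the text-versus-graphical-link answer above (example.com and the file names are placeholders):
Code:
<!-- Text link: the anchor text itself carries the keywords -->
<a href="http://www.example.com/blue-widgets.html">Blue Widgets</a>

<!-- Image link: only the alt text hints at the topic, and it carries far less weight -->
<a href="http://www.example.com/blue-widgets.html">
  <img src="blue-widgets.png" alt="Blue Widgets">
</a>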
Can the search engines read JavaScript?
Probably not, but we can't say no for certain, because some JavaScript is so easy to read that it is hard to imagine it does not get interpreted. An example of an easy-to-interpret snippet of JavaScript would be:
Code:
<script type="text/javascript">
document.write('<a href="http://www.example.com">Trying to hide this link from search engines</a>');
</script>
To ensure that the search engines don't read your JavaScript, it should be placed in an external file and that file's directory should be blocked using robots.txt.

Why should I not launch an incomplete website?
1) Users will remember that your site was incomplete and will be less willing to come back.
2) Search engines may index incomplete pages, cache them, and then not refresh their cache for months or years.
3) Other webmasters will not exchange links with incomplete sites.
4) Directories won't accept submissions from incomplete sites.
Keep in mind this generally covers "under construction" kinds of incomplete sites. You certainly can launch a site and then continually add to it and grow it, even adding whole new sections. But a site that is obviously incomplete just shouldn't be set loose in the wild until it is ready to go.

What is the best way to determine my online marketing budget?
Determine your potential return on investment. Online marketing tactics help bring more traffic and business to your site, raising your revenue. There is always a need for online marketing; just be sure your provider is presenting you with quantifiable results.
What are the best ways to optimize my site?
Search engine optimization involves a large number of tactics, which all help to optimize your site. A combination of online marketing and search engine optimization is a good way to achieve great results. Unlike short-term advertising, search engine optimization delivers lasting results.

How quickly will I see results?
If you target long-tail keywords you can see results fairly quickly, but always remember that SEO is a long-term strategy, not a set-and-forget exercise. If you're after more competitive keywords, prepare to commit to at least three months of consistent effort.

Should I rank my own content or articles on other sites?
Yes to both, but let's qualify that. Because you can't control what third-party sites do, you should focus the vast majority of your efforts on ranking content on your own sites. However, you can leverage high-ranking third-party sites by posting SEO'ed content on them and including a link back to your own site. Not only do you get the SEO benefits of the backlinks, you'll also get indirect search engine traffic from people clicking through to your main site.

How many keywords should I put into my <title>, <a>, and <h1>..<h6> tags?
Only the few keywords that are most relevant to your page. The more you put in each tag, the more you dilute the value each keyword is given. <h1>Advanced PHP Programming</h1> is better than <h1>Advanced PHP Programming Is Really Cool And Stuff Dude</h1>.

What's the difference between organic SEO and paid results?
When users conduct a search, they are confronted by both organic results and paid results (the latter are highlighted and usually placed at the very top or on the right-hand side of the page). It is the quest of every business to achieve first-page organic results, because they are long lasting and organic results are more respected by searchers.
What is the best way to maximize how frequently search engines crawl your website?
Frequently add new, original, quality content to the website.

What is keyword density and how does it help?
Keyword density is the ratio of a particular keyword to the rest of the copy on the page. For example, a keyword that appears five times in a 250-word page has a density of about 2%. Having a reasonable keyword density improves the likelihood that both search engines and readers will associate your site's content with your chosen keywords.

Why do I need to write copy for my web site?
Content is king when it comes to the Web. Remember that the Web's purpose is to provide information to users; regularly adding fresh copy to your site is one of the top ways to achieve good rankings and to entice visitors to return.

How can social media be used for SEO?
Social media presents opportunities to acquire backlinks to your site's pages, articles, press releases, etc. Social media is a popular and ever-growing aspect of the Web. Engaging in it generates good publicity for your site while helping your SEO initiatives as well.

Why does my company need reputation management?
Having an online business means you are open all the time, and competition can be fierce in many industries. Reputation management helps your business build and maintain a good name within your industry and with customers.

What other factors affect rankings besides backlinks?
Where you're getting your links, the quality of those links, their relevancy, how many links you have and what keywords you're using as the anchor text all affect your rankings. But there are other factors, including but not limited to:
• On-page optimization factors: how well you've optimized your tags, content, formatting, keyword proximity, site map, and links on your web page. This also includes whether you use your keywords near the top of your page and in your "alt" attributes (both good things).
• Having a lot of outgoing or reciprocal links pointing to "bad" sites (like link farms) can negatively impact rankings.
• Whether you have unique content (which the search engines like).
• How frequently you update your site. Faster isn't necessarily better; check what ranks well for your niche and aim to match it.
• Whether your domain includes your primary keywords.
• Your domain's age, reputation, IP address and whether it's a top-level domain (e.g., a .com is better than a .info, although probably not by much).
• Shady practices such as keyword stuffing or using text that's the same color as the background can negatively affect your rankings. This is only an issue if your site gets manually inspected and you don't have a legitimate reason for it.
• Showing one page to the search engines and another page to visitors negatively affects your rankings (cloaking and doorway pages).
• Frames negatively affect your rankings.
• Using content that the search engines can't read, like audio, Flash, videos, or graphics without alt text.
• Whether you have a robots.txt file that tells the search engine bots to stop crawling or indexing your site.

Does domain age help?
Yes. Search engines view older domains as more trustworthy, which means older domains may have a slight advantage. But this is only true if the older domain has a good reputation (e.g., it hasn't been blacklisted, penalized or banned by the search engines).

Why would I want to 301 redirect an aged domain?
Google passes link juice/authority/age/ranking strength (call it what you like) from one domain to another if you do a 301 redirect. For the less tech-savvy out there, the 301 code means "permanently moved" and is a way to announce that your site that was once "here" is now "there".
The upshot of this is that you can buy an aged domain and "301" it to the site you're trying to rank, instantly passing on all the lovely ranking power it has acquired just by sitting in some domain squatter's account for 10 years. Just make sure they do a domain push at the same registrar it was originally registered at, or all these effects are lost. Also, you may have to wait up to two weeks to see the benefits. They are not instant!

What is rel="canonical"?
If you have two or more pages with similar content, you can tell Google which is your preferred page to show in the search engine results. This is referred to as your "canonical" page. If Google agrees that this designated page is the best version, it will show this preferred page in its index. To tell Google which page you want listed as the canonical page, add the following bit of code to the head section of the similar (non-canonical) pages:
Code:
<link rel="canonical" href="http://www.example.com/filename.html"/>
Naturally, you should replace example.com/filename.html with your actual domain name and file name. For example, suppose example.com/file1.html is your preferred canonical page, the one you want displayed in the search engine results. You don't have to add any tags to this page. Example.com/file2.html and example.com/file3.html have content similar to example.com/file1.html, so you'd place the canonical code within the <head> tag of these two pages to tell Google that example.com/file1.html is the most important page.
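Continuing that example, file2.html and file3.html would each carry the tag in their <head>, pointing at file1.html:
Code:
<head>
  <title>A page with content similar to file1.html</title>
  <!-- Tells Google that file1.html is the preferred (canonical) version -->
  <link rel="canonical" href="http://www.example.com/file1.html"/>
</head>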
The most common reason to do this is to tell Google that these pages are all the same:
• Example.com
• www.example.com
• www.example.com/index.html
• Example.com/index.html
Don't go overboard with this, and certainly don't use it on things like paginated comment pages: although each page repeats the same post, the comments give the pages enough unique content to be treated as unique, and Google will start to ignore your legitimate canonicals if it finds too many instances of you misusing the tag. Yes, Google thinks it's smarter than you; deal with it and move on.

What's the truth about duplicate content?
There is no duplicate content penalty when it comes to multiple sites. Otherwise, your shady competitors could just create near-clones of your site to make your site disappear. But that doesn't happen. Indeed, run a search for a PLR article and you'll likely see many search results for that same article. TIP: Nonetheless, it's better to have unique content, rather than competing with others for the same keywords using similar content. What about duplicate content on your OWN site? In other words, what happens if you have two web pages with the same content but different file names? In that case, refer to the question on rel="canonical" for instructions on how to deal with it.

What is a doorway page/cloaking?
Cloaking refers to showing one page to a search engine and a different page to your human visitors. Doorway pages are optimized pages that pull in search engine traffic, but this traffic is immediately redirected (either manually or automatically) to a different page. Google and the other search engines do NOT like these practices.
What are meta tags?
Meta tags are information that you put within the <head> tag of your web page's source code. These tags primarily tell search engines and other user agents about your site's content (description), keywords, formatting, title, and whether you want the search engines to crawl (and index) the page. There are also some head tags that are shown to the user, such as the title tag (which is the title that appears at the top of your browser). Note that the big search engines no longer take these tags into consideration when ranking your web pages (with the exception of the title tag). Some smaller and more specialized search engines still use the keywords and description tags when ranking and displaying your site.

What is the "freshness" factor?
Search engines such as Google prefer "fresh" (newly updated) web pages and content over stale content. That's why when you first add content to your site, such as a new blog post, the page may sit high in the rankings for a while before eventually sinking to a more realistic position. It's this "freshness factor" that allows your pages to get those higher rankings, even if the ranking is temporary. Thus, updating your pages frequently can help push them to the top of the rankings. This is one of the primary reasons you hear people say "Google loves blogs". Google doesn't love blogs; Google loves regularly updated sites.

Do links from the same IP address or C-Class block matter?
A computer's IP address is its address on the Internet. A C-Class block of IPs is a group of addresses that sit next to each other. Links from the same IP have very limited value. Links from the same C-Class IP block have a little more value, but still not much. Links from different C-Class IPs are worth the most.
This is not as important as it once was, especially for sites hosted on huge shared server clusters like those at HostGator/ThePlanet, BlueHost and others; the shortage of available IP addresses is driving this. Most importantly, tons of domains all on the same IP or C-Class block that all interlink are the fastest way to announce to Google that you're trying to cheat the system. This may have worked a couple of years ago; now it's just a flashing neon sign telling Google to de-index you.

What is LSI?
LSI is short for latent semantic indexing. It refers to different words that have the same or similar meanings (or words that are otherwise related). For example, "housebreaking a dog" and "housetraining a puppy" are two entirely different phrases, but they mean about the same thing. This is important because Google analyzes web pages using this kind of analysis to help it return the most relevant results to the user. For example, a page that has the keyword "housebreaking a dog" but NO other similar words (like housetraining, paper training, potty training, puppy, dogs, puppies, etc.) probably isn't really about housebreaking. End result: Google won't rank it as highly as a web page that does include a lot of relevant, related terms. What does this mean to you? When you create a web page around a keyword, be sure to also include the keyword's synonyms and other related words. Pure LSI analysis isn't scalable enough to handle the volumes of data that Google processes; instead they use more streamlined and scalable content-analysis algorithms that have some basis in LSI and related technologies. It also appears that this analysis is ongoing and not just a one-time run through the system. Cliff Notes version: don't write content that a drunk 4th grader would be ashamed of. Spend the extra couple of minutes to write decent stuff and you'll be fine.
Should I build links for human beings or the search engines?
Both, but make sure you know which one you're going for at any point. If you want human beings to click the link, then make sure your content is high quality and worth that click. If it's never going to be seen by a human, then don't spend a week writing a beautifully crafted piece of prose; use automation or anything else you can lay your hands on to get links fast.

What is an XML Sitemap?
This is a listing of all the pages on a website, along with important information about those pages (such as when they were last updated and how important they are to the site); a minimal example is sketched below. The reason to create a sitemap is so that the search engines can easily find and crawl all your web pages. It is really only important if you have a large and complex site that won't be crawled easily: a 10-20 page HTML mini-niche site doesn't really need one, while a 20,000-page product catalog might benefit from one. Also avoid automating this on WordPress autoblogs, since sitemap generation is a processor hog and can get you kicked off shared hosting.

What's the sandbox?
The disappointment webmasters feel when Google's stupid algorithms don't appreciate their site. It can't be them, so it must be Google's fault.
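For the XML sitemap question above, a minimal sketch in the standard sitemaps.org format (the URLs, dates and values are placeholders) looks like this:
Code:
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2012-01-15</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>http://www.example.com/about.html</loc>
    <lastmod>2011-11-02</lastmod>
    <changefreq>monthly</changefreq>
    <priority>0.5</priority>
  </url>
</urlset>
Most major crawlers can also discover this file through a "Sitemap: http://www.example.com/sitemap.xml" line in robots.txt (robots.txt itself is covered in the next question).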
What is robots.txt for?
This is a file placed in the root directory of a website. Search engine robots (bots) look at this file to see whether they should crawl and index pages on your site, certain file types, or even the entire site. The absence of this file gives them the green light to crawl and index your site. If you don't want search engine bots to crawl your site at all, create a robots.txt file in your root directory that contains this:
Code:
User-agent: *
Disallow: /
You can also use a meta tag to keep the search engines from indexing a page:
<meta name="robots" content="noindex">
Important: only "well-behaved" bots obey robots.txt, so don't use it to "protect" content on your site; it only keeps compliant crawlers like Google from indexing things. Most importantly, be aware that malicious bots will read robots.txt to find the pages you're asking not to be indexed and visit them with priority to see why.

What's a spamblog?
A spamblog (or splog) is a blog used primarily to create backlinks to another site. Splogs tend to be populated with fake articles, commercial links and other garbage content. In other words, they provide little or no value to a human reader. As such, the search engines tend to de-index these sites once they discover them.

What's an autoblog?
An autoblog uses an automation tool to pull in content from other sources and post it on the blog. In other words, it's an easy way to automatically and frequently update a blog. Autoblogs are a great way to build foundation sites that provide link juice to your higher-ranking, more competitive sites, but also a good way to get sites banned if you don't know what you are doing. Most importantly, there is a lot of discussion about how legal they are, since they reproduce other people's content. I'm definitely not going to get involved in that discussion, and I ask you not to turn this thread into a flame fest discussing it.
What's an "authority" site?
An authority site is one that is seen as influential and trustworthy by search engines, and thus it tends to rank well. Authority sites tend to be well-established sites that have a lot of high-quality, relevant content as well as links from other authority sites. Obviously, getting your own website recognized as an "authority site" will boost your rankings. However, it's also beneficial to get backlinks from these authority sites.

What are "supplemental" results?
These are results that are displayed in Google's index after the main results, especially if Google's trusted websites didn't return many results. These supplemental results are no longer labeled as "supplemental" results. However, this secondary database still exists to index pages that have less importance, such as duplicate content on your site or orphaned pages. For example, if you have multiple pages on your site with the exact same content, then Google will index your most important page in the main index and place the duplicate page in the supplemental index.

Does changing hosting affect my ranking?
No. Your webhosting does not affect your rankings. You can change hosts without it affecting your rankings. The only issue you may run into is if you fail to make a smooth transition to your new webhost. Downtime will naturally prevent the search engine crawlers from crawling your site properly. Extended downtime may cause indexing issues. To switch hosts properly, follow these easy steps:
1) Set up your website on your new webhost.
2) Change your DNS to point your domain name to your new webhost.
3) Leave the website up on the old server for at least one week to make sure DNS has propagated completely. After one week you can safely take down the site from the old server.
The most common mistake users make when switching hosts is not leaving the old site up while DNS propagates. Make sure you don't wait until the last minute when switching hosts or you may run into trouble.

What is a "good crawlable design"?
• Don't use Flash: Flash is SEO suicide. Some say the search engines can crawl Flash, but even if they can, they certainly can't crawl it as well as HTML (see below).
• Don't use JavaScript to create page content: for the most part, search engines don't read JavaScript. If you use JavaScript to create your pages' content, it is as good as not being there when the search engines come around.
• Interlink your pages: search engines find your pages by following other links in your pages. Be sure to link to your pages liberally, especially important pages.
• Use search engine friendly URLs: although search engines can crawl query strings just fine, using a URL that appears static is a good thing. Errors can occur on long or complex query strings and this eliminates that possibility, plus it is a great chance to get keywords into your URL.
• Use semantic markup: HTML is a powerful tool that search engines use to determine the context of a page (that's another reason why Flash sucks for SEO: no HTML). Use HTML properly to give keywords more weight within your pages. See the Search Engine Optimization FAQ for more on that.
• Use a sitemap: sitemaps make sure your pages are easily found by the search engines (good for humans, too).

Flash and SEO
An all-Flash website is handicapped versus a semantic (HTML) website. Even optimizing the non-content aspects of your pages will still put an all-Flash website at a severe disadvantage. The problems with using Flash include:
1) It's a one-page site. How many one-page sites do you know that rank well?
2) You lose the power of semantic markup. No HTML = no clues for the search engines as to the importance of keywords.
• 32. 3) Expanding on point 2, you don't have any anchor text since you don't have any internal links. That just kills you in Google.
4) There isn't a whole lot of proof that the search engines can read Flash as well as HTML.
You have only one available tool for trying to SEO the site, and its effect is minimal: put alternative content between the <object> tags. This has the same effect as the <noscript> tags for JavaScript. If you are making an all-Flash site, your only real hope is a massive incoming link campaign. Otherwise you have to target marginally competitive keywords or niche keywords, as you virtually don't have a prayer of ranking for anything even remotely competitive. Your only other option is to create a second version of the site so it can be read by search engines, users with accessibility issues, and users who don't have Flash. Of course, you've doubled your development costs by doing this, as you have two websites to maintain now.
Is Ajax bad for SEO?
Any content available via Ajax should also be available without Ajax if a site is designed properly (i.e. accessible). That means search engines should still be able to access that data even though they don't support Ajax/JavaScript. If they cannot, it isn't a flaw in using Ajax; it is a flaw in the development of the site.
Do outbound links help my rankings?
No. This is a common myth. The whole outgoing-link idea has been repeated by amateurs for going on 4 or 5 years now; no one has ever proved it, and many have shown evidence of it not mattering. People newer to SEO tend to see Google's goal as policing the webmaster community and making sure everyone plays fair. Google is not a referee; they are a search engine, and they care about serving relevant results.
• 33. People newer to SEO also fail to understand what PageRank is. PageRank is a measure of perceived page quality: if an outgoing link adds to a page's quality, that page will get more incoming links and thus rank better. If it doesn't add to a page's quality, no bonus will be had. There is absolutely no reason for Google to second-guess themselves and add arbitrary blanket bonuses or penalties to all sites because of a perceived notion that a certain attribute always makes a site better or worse. So, in short, because Google measures incoming links, they have no need to measure outgoing links, or anything else that supposedly marks a site as having a higher "quality." In the end, if a site truly does have higher quality, it will get more incoming links naturally.
Then there is the fact that outgoing links are strictly under the control of the webmaster, like meta tags, and so assigning them any weight leads to the same problems that brought about the downfall of meta tags. Finally, there are the thousands or millions of sites and pages that rank perfectly well without any outgoing links. Certain types of sites, such as blogs, normally have outgoing links and it would look abnormal for them not to. However, most other site types normally do not have outgoing links and haven't traditionally had them, going back to the 90s, long before Google came about. Most business, commercial, ecommerce, or service sites do not have outgoing links. Not because they're hoarding PR, but because they're trying to sell something and do not want to distract from the user experience or send users away.
You may not remember a time before incoming-link algorithms. In those times, to measure quality, search engines had to guess based on on-page factors, and it was hard to impossible. With the invention of incoming-link analysis, or PageRank, search engines had a perfect way to measure the quality of a site, and so then only had to discern topicality. Why would they take a step backwards and again start using on-page factors to measure quality? Google has a lot of smart people working for them. They realize that if external links truly do add to the usefulness of a site, then that site is already receiving a bonus, because more useful sites garner more incoming links. This is also true for anything else that supposedly adds usefulness. They aren't going to say, "Hey, we have this really good algorithm here, but let's second-guess it
• 34. and make an assumption that pages without outgoing links need to be penalized for being less useful." Why would you ever make an assumption about something that you can already measure directly? Also, do not forget that Google itself created the nofollow link attribute to give webmasters an easier way to block links. In the end, if Google did give value to external links, it'd be meaningless. As soon as it was confirmed (which no one has been able to do), all the spammers and everyone else would just add one or two links to their pages. It would do nothing to increase relevance.
Does a page's traffic affect its rank?
No, and here's why:
1) The search engines don't have access to the data they would need to use this as a ranking factor. They do not know how much traffic a web page gets, as it is not publicly available, and thus cannot use it to determine its rank. (For those of you who want to say, "But there is Google Analytics," that service is used only by a small percentage of websites, and unless every website decided to use it on every web page, the data is far too incomplete to be used this way.)
2) It would be a self-fulfilling prophecy if the search engines used their own SERPs as a means of determining their search engine results. Obviously the number one ranked page for a search is going to get more traffic than a page not on the first page. If traffic were the indicator of where a page belonged, there would be little or no way for a page to ever move up, simply because the pages ranked higher would be receiving more traffic from the search engine based on the mere fact that they are ranked higher.
3) Traffic volume can be manipulated. Spammers and black hats could easily write bots to artificially inflate their page views and thus their rankings. Plus you can purchase traffic from traffic providers or buy expired domains and redirect them to your site. It would just be too easy to do. (I can also see newbies hitting refresh for hours on end....)
4) Traffic is not an indicator of quality content. It is only an indicator of good marketing.
• 35. What about reciprocal links?
In general, reciprocal links are bad for SEO and should be avoided. Here's why:
1) They are a clear attempt to manipulate the search results, which is a big no-no. That's why Google specifically calls them out in their webmaster guidelines. Basically they see it as vote swapping. If you have an excessive amount of reciprocal links you run the risk of incurring penalties. (No one knows how many it takes to incur a penalty, so it isn't wise to push your luck.)
2) You risk being considered part of a link farm. If you link to a website that is considered a link farm and they link back to you, you may be seen as being part of the link farm. Link farms violate the search engines' TOS and are a quick way to get banned.
3) The links themselves carry virtually no value, or worse, cause you to lose strength from your pages. The fact that the links are on unrelated pages, or on pages that have little value for your niche (e.g. wrong context), prevents them from holding any value in the search engine's eyes. What little value they may have gets lost when you send a link back to their website, thus negating any value that link may have had. Even worse, if your link is "worth" more than their link, you will actually be hurting your site with that link exchange.
4) Many webmasters are dishonest and will remove your link or hide it from the search engines. No incoming link means no gain for you.
Link exchanges should be saved for websites in your niche that are well established and ahead of you in the rankings.
What keyword tools should I use and how do I use them?
Good tools to use for keyword research are Google's Adwords Keyword Suggestion Tool and Google Trends. You have to keep in mind that tools like Google's Adwords Keyword Suggestion Tool and Wordtracker are not to be taken literally. You are supposed to look at the search volumes of keywords relative to each other and to the major search terms. That will give you an idea of how frequently a search term is being used. The exact number isn't important unless you're conducting trend analysis over an extended period of time. Even then, the exact number doesn't really offer any useful information. A number within 5% - 10% (or maybe more) of the exact
• 36. number is just as useful. Those rough numbers will clearly expose which terms are popular and which are not. The fact that Google's Adwords Keyword Suggestion Tool or Wordtracker showed no results for a keyword means that its search volume is extremely low, which is all you need to know. Whether it is 1 search or 100 searches doesn't matter. You now know what kind of volume it has and what to expect in terms of competitiveness and traffic.
For example, let's use these fictitious results for 'stymieebot':
stymieebot - 15000
stymieebot clone - 6000
stymieebot repellent - 5500
stymieebot stickers - 5200
stymieebot t-shirts - 4950
stymieebot hoolahoop - 300
stymieebot mask - 180
stymieebot uzi - 15
stymieebot cologne - 1
What we can tell is that 'stymieebot' is clearly the most popular search term related to 'stymieebot'. The number of searches could be 18,000 or 12,000 and it would still clearly be the primary search term we would hope to rank well for, and (most likely) the most competitive.
'stymieebot clone', 'stymieebot repellent', 'stymieebot stickers', and 'stymieebot t-shirts' make up the second tier of results. They're grouped relatively close together and their order really is irrelevant. Their order will almost certainly change month-to-month but their average search volume will most likely remain the same. They'll always be searched far less than just 'stymieebot' but still get a decent number of searches each month. Their numbers don't matter because we know how popular they are relative to 'stymieebot' and that they are searched often enough to be worth targeting.
• 37. 'stymieebot hoolahoop', 'stymieebot mask', 'stymieebot uzi', and 'stymieebot cologne' make up the third tier of results. They're seldom searched for and will either be long tail keywords or be ignored completely. The exact numbers of searches are irrelevant because, relative to the first two tiers, we can see that traffic from these terms will be sporadic at best and can assume they will be easy to target.
How do I improve my rankings for country-specific search?
To rank better in country-specific search you should:
1) Use the country-specific TLD
2) Host the site in that country
3) Set the geographic location for the site in Google Webmaster Tools
What directories should I submit my site to?
There are four tiers of directories:
1) Truly quality directories - These directories are well known, actually used by some people (although not really a whole lot), and their links have decent value (just decent value, not great value). You can count the number of these directories on two hands and probably have fingers left over. These directories include Dmoz and Yahoo. Links from these sites are the most valuable types of links you can get from a directory. However, even then they are not that strong. Links from related websites are much better.
2) Quality niche directories - These directories exclusively list sites in a certain niche: yours. These directories don't carry the weight of the first tier, but because they are in a related niche they are better than general directories.
3) General directories with an editorial process - These are your run-of-the-mill, just-like-everyone-else directories that litter the Internet. What separates these from the bottom tier of directories is that these directories actually monitor their listings and try to list only quality sites and reject spam sites and Internet waste. Links from these directories are not worth very much. Basically, if you are seeking links from these kinds of directories you are going for volume as opposed to quality. Over time these can be helpful in your rankings for long tail and medium-competitiveness keywords.
• 38. 4) General directories with no editorial process - These directories accept anyone. They are full of crap sites and probably engage in a lot of link exchanges. These directories are worthless and should be avoided.
What is the story with Alexa?
Alexa's rankings are generally considered to be inaccurate at best. Their rankings depend on a user having their toolbar or their spyware installed in order to track their surfing habits. Plus, their software is limited to the Windows operating system, further limiting the reach of their software and the accuracy of their results. With the possible exceptions of selling/buying a website and applying to an ad service, Alexa serves no useful purpose, and important decisions should not be made based on its results. If you want to improve your ranking in Alexa, just install the toolbar into your browser. Be sure to visit your site daily. This will cause your site to jump in the rankings after a few weeks. Get your friends to do it, too, and you can make a significant impact on your rankings.
• 40. What is Page Rank?
Page Rank (PR) is a numeric value from 0-10 that Google assigns to your individual web pages, and it's a measure of how important that page is. Google determines this importance by looking at how many other high-quality, relevant pages link to a particular page. The more links – and the better quality those links are – the more "votes" a page gets in terms of importance. And the more "votes" a page gets, generally the higher the PR. (A simplified version of the underlying formula is sketched at the end of this slide.)
How often does Google update Page Rank?
It used to be every 3 months, but it's becoming more and more erratic.
Does PR matter?
Yes and no. Originally PR was all that mattered in the search rankings, but today that's just not true, since there are a myriad of other factors that Google considers when deciding who should appear where. That said, high PR is always worth having; just don't obsess over it.
What is the "Google Dance"?
When "stuff" changes, the SERPs fluctuate, sometimes wildly. One day your site could be number 1 and the next nowhere to be seen. One of the main contributing factors to that is how Google sees your backlinks (which you're consistently building, right?). Don't obsess over it; just keep building and you'll be fine.
How does Google personalize my results?
If you're signed into Google, then Google keeps track of what search engine results you've clicked on. And even if you're not signed in, Google keeps track of what results people who use your computer click on. Over time, Google starts to detect a pattern. For example, if you seem to always click on Wikipedia results, then Google will start showing you more Wikipedia results. If you always click on health results from webmd.com, then you'll get more webmd.com results when you run a health-related search.
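For reference, the simplified formula published in the original PageRank paper by Brin and Page looks like this:
PR(A) = (1 - d) + d * ( PR(T1)/C(T1) + ... + PR(Tn)/C(Tn) )
where T1 through Tn are the pages linking to page A, C(T) is the number of outgoing links on page T, and d is a damping factor, usually set around 0.85. The 0-10 toolbar value is generally believed to be a roughly logarithmic scale of this raw score rather than the raw score itself.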
• 42. What is a backlink?
This is when a third-party website links to your website. For example, if you write and submit an article to an article directory, then you'll have an incoming link – a backlink – from the directory. The search engines prefer one-way incoming backlinks from high-quality, relevant websites.
What is anchor text?
When you create a link, the anchor text is the clickable part of the link. For example, in the phrase "go to Google," Google is the anchor text. The reason this is important is that you want to use your keywords as your anchor text on incoming links. So if you're trying to rank for "gardening secrets," then those two words should make up the anchor text for several of your backlinks.
What is a do-follow/no-follow link?
There are two types of "nofollow" attribute.
The robots meta tag version:
<meta name="robots" content="nofollow" />
This tells (well-behaved) bots/crawlers/spiders not to follow any links on the page.
And the link attribute:
<a href="http://www.google.com" rel="nofollow">
This tells search engines not to count the link in terms of ranking pages.
In theory these links are worthless for boosting your search engine rankings. In practice you'll often see some benefit, especially when mixed in with a load of dofollow links.
• 43. Links are automatically "dofollow" in the absence of the rel="nofollow" attribute. There is no rel="dofollow" attribute.
Types of backlinks? TBD
Can paid links harm my ranking?
Google's official stance is that buying links is an attempt to manipulate rankings – and Google frowns on this practice. In reality, however, it's very hard for Google to penalize you for buying links (and they wouldn't be able to tell for sure anyway). Indeed, if there were a penalty, then you could destroy a competitor simply by purchasing links to their site and then reporting them to Google. Poof, competition gone. Of course it doesn't work that way. As such, if there's any "penalty," it may just be that Google doesn't "count" links from paid sources.
TIP: Google does penalize the sites that are selling these backlinks – so if you buy backlinks, be sure that the backlinks aren't coming directly from the penalized sites.
Are reciprocal links bad?
They're not bad, per se, especially if they're coming from relevant, high-quality websites. However, one-way incoming links tend to be more valuable in terms of SEO.
What is a one-way link?
This is a non-reciprocal link. That means that Site A links to Site B, but Site B does NOT link back to Site A. The search engines prefer to see one-way links from relevant, quality sites.
• 44. What is three-way linking?
Three-way linking is a way for two webmasters to exchange links so that each person's website gets a one-way link (rather than a reciprocal link). In order to make this work, at least one of the webmasters has to have a second site in the same niche. Here's how it works: Webmaster 1 links his Site A to Webmaster 2's Site B. Webmaster 2 links his own Site B to his second site, Site C, and then links Site C to Webmaster 1's Site A. Thus Sites A, B and C all have one-way incoming links, like this:
Site A -> Site B -> Site C -> Site A
What is a site wide link?
These are links that are found on every page of a website. For example, many people have a link to their "home" page (the index page) on every other page of their web site. That's a site wide link.
What is pinging?
Pinging is informing web-crawling bots (such as search engines or directories) that you've updated the content on your web page. The goal is to get these bots to crawl and index your new content immediately. For example, if you post a new article on your blog, you can use pingomatic.com or pingler.com to inform multiple bots about this change. (A sketch of what a ping request looks like under the hood follows below.)
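Under the hood, most ping services accept a simple XML-RPC call named weblogUpdates.ping, which takes your site's name and its URL. Roughly, the request looks like the sketch below; the Ping-O-Matic endpoint shown is an assumption, and the blog name and URL are placeholders, so check your ping service's documentation for the real details:
POST / HTTP/1.1
Host: rpc.pingomatic.com
Content-Type: text/xml

<?xml version="1.0"?>
<methodCall>
  <methodName>weblogUpdates.ping</methodName>
  <params>
    <param><value>My Example Blog</value></param>
    <param><value>http://www.example.com/blog/</value></param>
  </params>
</methodCall>
Services like pingomatic.com simply relay this same call to many directories and search services at once, which is why a single ping can notify multiple bots.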
• 46. What is link velocity?
This refers to how quickly you gain backlinks. For best results, maintain a consistent link velocity. Most importantly, don't build a load of backlinks (especially with fast indexation techniques) and then stop. Google sees this as a news article that was interesting for a short period of time but is no longer relevant, so it stops ranking it. "Too many links" or "links built too fast" are rarely a problem, but inconsistency is.
Can I build links too fast?
Yes and no. If you've got a brand new domain name and you fire up some of the more powerful link-spamming automation software, you'll get your domain flagged quicker than you can say, "help me, my site is gone". If you're building links manually or controlling your usage of serious spam software, you'll be hard pressed to build links too fast on any domain that's already been aged a bit. Just be consistent. If you think you can build links too fast on any site, here's an experiment for you next time you're having a slow weekend: go out and buy the fastest, spammiest link building software you can lay your hands on and pick a Wikipedia article that currently ranks quite well. Go nuts. All you will do is strengthen its position.
What is page rank sculpting?
There are various techniques available to channel link juice through the links you actually want to receive it, and thus rank those pages higher. In theory Google has corrected this, but several experiments have shown that isn't entirely the case, although the actual PR passed through the links no longer gets affected.
What is a link wheel?
A link wheel refers to setting up multiple pages on multiple third-party websites (usually at least five) as a means of getting backlinks to your main site.
• 47. You link these properties to each other, but not reciprocally. For example, you link your EzineArticles article to your Squidoo page, then link your Squidoo page to HubPages… and so on. Finally, you link each of these third-party pages to your main site. By using sites with a ton of content (and other SEOs backlinking them) you're naturally tapping a bigger seam of link juice. Take advantage of this by writing high quality content for them so that human beings follow the links as well, since these pages will rank alongside your money site.
What is a mininet?
This is like a link wheel, except that you own all the sites that you're linking together. You may link together a series of smaller niche sites, with each smaller site linking to your main site. For example, you might link your dog housetraining site to your dog obedience site, and then link your dog obedience site to a site about training dogs to do tricks. All of these smaller niche sites would then link to your main dog training site.
What makes a good site for a link wheel?
Web 2.0 properties and other websites that have a high Page Rank. The best ones are sites where you get a page that will be automatically linked to from all over the site. Article directories like EzineArticles are perfect for this since you get tons of internal links to kick things off with.
What is link bait?
This means "baiting" others into linking to your site. Typically, this means posting unique, controversial, extremely useful or otherwise entertaining content or tools so that others naturally link to your web page. In other words, you create and post viral content.
What is a link farm?
Link farms consist of large networks of sites whose sole purpose is to generate pages that can be used to link out to other sites that are actually worth something.
• 48. They are pretty much essential to rank for more highly competitive keywords, but don't attempt this unless you really know what you are doing. Google is smarter than you!
What is a footprint? TBD
How do I search for footprints? TBD
What is a proxy?
A proxy server is one that sits between your computer and the Internet, and using one allows you to go online somewhat anonymously. If you get online using a proxy, websites see the proxy's IP address rather than your own, which makes it much harder to trace the activity back to you and your computer. For example, you can use a proxy to set up multiple EzineArticles.com accounts.
• 50. How do I get my site indexed?
Don't bother submitting your site through the traditional methods. The fastest way to get a site to appear in Google's index is to create backlinks to it. Use social bookmarking sites to create lots of easy-win links from sites that are spidered regularly, and submit any RSS feeds you've got to directories. If you're really keen to get indexed as fast as humanly possible:
• Stick Adsense on your pages (even if you remove it later) as this forces Google to spider you.
• Set up an Adwords campaign to your domain (Google has to spider you to determine your quality score).
• Search for your domain name.
• Perform site: and link: searches on your domain.
• Visit your site using accounts with some of the most widespread ISPs (e.g. AOL) since their logs are used to find new content.
• Email links to your site to and from a Gmail account.
How do I get my backlinks indexed?
The slow way is to wait for the search engines to naturally find them. The faster way is to ping the page after you leave a backlink. For truly fast backlink indexing, social bookmark the pages or create RSS feeds with the links in them.
How can I tell if my site has been visited by a spider/bot?
By checking your traffic logs and statistics. Most traffic analyzing software will recognize and label the bots and spiders that crawl your site. You can also recognize these visitors manually, as the "user agent" is usually labeled something obvious, such as "Googlebot." (See the example log line below.)
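For example, a raw Apache-style access log entry for a Googlebot visit typically looks something like this; the IP, timestamp and path are made-up placeholders, while the user-agent string is the one Google publishes:
66.249.66.1 - - [15/Mar/2011:06:25:17 +0000] "GET /blog/my-post.html HTTP/1.1" 200 8542 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
Keep in mind that anyone can fake a user agent, so if you need to be certain a visit really came from Google, a reverse DNS lookup on the IP should resolve to a googlebot.com (or google.com) host name.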
• 52. What percentage of people click on the first listing in Google?
Only Google knows for sure, but estimates range from about 40% to 50%. AOL once released their data, which suggested that 42% click on the first listing. "Heat map" studies tend to lean more towards 50% or more.
How do I use Google Alerts to monitor the SERPs?
All you have to do is get a Google account and then go to Google Alerts. There you enter the keywords you want the tool to monitor the SERPs for, choose "comprehensive," choose the frequency you want to receive the alerts, and then enter the email address where you want to receive the alerts. Once you've completed those simple steps, you'll get alerted when new pages that use your keywords appear in the search engines. You can also use this tool to monitor your backlinks as they appear in Google. Just enter this search term into the alerts field:
link:www.yourdomain.com/filename.html
Replace the above URL with your actual link, of course.
How can I track the number of backlinks I have?
There are a variety of tools available to you, such as Yahoo! Site Explorer, Google Webmaster Tools (check the links report) and SEO Quake. Using these tools is preferable to searching directly in Google. That's because searching manually generally yields only a sample of the sites that are linking to your site. Ultimately they're all wrong! Don't obsess about tracking these things; just focus on building more.
• 54. What makes a good keyword?
A good keyword is one that your target market is searching for regularly. An even better keyword is one that's not only searched for regularly but also has very little competition in the search engines. That means you have a good chance of ranking well for that keyword.
How many people are searching for my keyword?
You'll need to use a keyword tool to find out the answer. Example tools include the Google keyword tool, WordTracker.com, MarketSamurai.com and any number of other similar tools.
What is the "true" competition for a keyword?
Forget all that rubbish you see in just about everyone's WSO "proof" about how they outranked a bajillion other sites for some phrase or other. The only listings that matter are on page 1, so the only people you are competing with are on page 1. I would much rather compete with a billion PR0 unrelated sites than 10 PR9s that have been around over a decade, and you should too! Find out the page rank of the top ten listed pages and the number of backlinks they have. That's your competition.
What are long tail keywords?
Highly niche searches. For example, "dog training" is a short tail keyword, while "how to train a deaf dog" is a long tail keyword. Long tail keywords tend to have fewer people searching for them than short tail words. On the other hand, they also tend to have less competition in the search engines, so it can be easier for you to get top rankings for these words.
• 56. What is the official Google/Yahoo/Bing policy on SEO?
The search engines encourage you to design your site and organize your content in a search engine friendly way. This includes proper use of meta tags, creating information-rich sites, including words on your site that your users are searching for, using sitemaps and more. However, they all strongly discourage any attempts to manipulate your search engine rankings, such as keyword stuffing, link spamming, cloaking and similar practices.
Why doesn't Google tell me how many links I have?
Google only shows a sample of backlinks, because generally it's only webmasters who are seeking this information. Webmasters who know ALL of their competitors' backlinks can just go and get links from the same sources (which may be viewed as manipulating the rankings). By only showing a sample, Google helps reduce this practice somewhat. They also make some claim about the amount of resources required to list all this information, which I guess would be true if they didn't have to have it stored for a million other reasons. Bottom line: they don't want you to have it, get over it.
Who is Matt Cutts?
Matt Cutts is a Google employee specializing in SEO issues, and thus he's seen as the authority on all things Google. He frequently talks about SEO practices, Google's policies, link strategies and other Google issues. You can find his personal blog here. He's an incredibly talented and influential individual, but never forget that he has Google's best interests at heart. Not everything he says can be taken as gospel.
Google Webmaster Tools
Google offers webmasters a variety of free tools that allow you to do things like: submit your sitemap, get more info about how often the Google bot is crawling your site, get a report of any errors the bot found, see the internal and external links pointing to your site, determine how your URL is displayed in the SERPs, etc.
• 57. You can access the full set of Webmaster Tools here.
Automation, Outsourcing and 3rd Party Stuff
Can anyone guarantee a 1st place ranking?
No. Because the search engines can and do change their algorithms, and because a third-party site may drop or change your links, no one can guarantee a first place ranking for a keyword. However, SEO experts can create high rankings – even first place – for certain keywords. They just can't guarantee those placements, as the algorithms and third-party links are not under their control.
What is a backlink packet?
Instead of searching for high-PR, .edu, .gov and authority sites to place your backlinks, you can save time by purchasing a "packet" that lists these types of sites for you. These packets typically include "do follow":
• Blogs where you can make comments.
• Forums where you can set up profiles.
• Directories where you can post content and backlinks.
…and similar sites. The bonus of these packets is that they save you time since you don't have to seek out these sites yourself. The downside is that sometimes the webmasters change their policies once they get an onslaught of these links. For example, the owner of a high-PR blog may change to "no follow" links or disallow comments altogether.
I bought a packet of "high PR links" but all my links are PR0. What happened?
Usually this is because the main page of the website – such as the main page of the forum – has a high PR. However, the actual place where you put your link – such as your profile page – is PR0 because you basically just created that page when you created your profile.
• 58. What automation tools are there?
There are a variety of tools you can use to help automate the SEO process, including:
• Tools to structure your content in a search engine friendly way. (Hint: Content management systems and blogs like WordPress do this naturally, but you can also use SEO plugins to make your blog content even more search-engine friendly.)
• Keyword tools.
• Tools to automatically submit or update content, such as tools that submit to directories or tools that automatically update your social media content (such as ping.fm).
• Tools that automate social bookmarking.
• Tools that help automate tasks like building link wheels.
• Tools to create content, such as article spinners, scrapers and autoblog tools.
• Pinging tools (like pingomatic.com or pingler.com).
• Tools that automate link-building, such as blog and guest book commenting tools.
What SEO service should I use?
This question is far too contentious for a forum FAQ like this, so I'm not going to name specific services. Instead, here's some general advice on selecting SEO services. Don't fall for hype about "ranking for the most competitive terms in the SEO industry". SEO companies that do this are pouring their resources into this highly competitive game because of the PR boost it's worth. Ultimately that cost has to go somewhere. Instead, find SEO firms that focus on customer testimonials showing good results. Don't get involved in "my links are better than your links" battles. Nothing annoys me more than seeing arguments about how so-and-so's link packet is more effective than such-and-such's. Just focus on building a large variety of links and you'll be fine.
What does an SEO host give me that a regular one doesn't?
Multiple C-class IP addresses. So even if you host multiple websites with one host, you get different addresses. And that means you can build a mininet more easily without being detected.
• 60. 301 - A permanent server redirect: a change of address for a web page, set up in the .htaccess file on Apache servers. Also useful for dealing with canonical issues.
Adwords - Google's Pay Per Click contextual advertisement program, a very common way of basic website advertisement.
Adsense site (MFA) - Made For Adsense Advertisements: websites that are designed from the ground up as a venue for Google Adsense advertisements. This is usually, but not always, a bad thing. TV programming is usually Made For Advertisement.
Affiliate - An affiliate site markets products or services that are actually sold by another website or business in exchange for fees or commissions.
Algorithm (algo) - A program used by search engines to determine what pages to suggest for a given search query.
ALT text - A description of a graphic, which usually isn't displayed to the end user unless the graphic is undeliverable or a browser is used that doesn't display graphics. Alt text is important because search engines can't tell one picture from another. Alt text is the one place where it is acceptable for the spider to get different content than the human user, but only because the alt text is accessible to the user, and when properly used is an accurate description of the associated picture. Special web browsers for visually challenged people rely on the alt text to make the content of graphics accessible to the users.
Analytics - A program which assists in gathering and analyzing data about website usage. Google Analytics is a feature-rich, popular, free analytics program.
Anchor text - The user-visible text of a link. Search engines use anchor text to indicate the relevancy of the referring site and of the link to the content on the landing page. Ideally all three will share some keywords in common.
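To make the 301 entry above concrete, here is a minimal sketch of what permanent redirects can look like in an Apache .htaccess file, assuming mod_rewrite is enabled; the file names and domain are made-up placeholders:
# Redirect a single moved page permanently
Redirect 301 /old-page.html http://www.example.com/new-page.html

# Redirect the non-www version of the domain to the www version (one way to handle canonical issues)
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]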
• 61. Astroturfing - (The opposite of full disclosure.) Attempting to advance a commercial or political agenda while pretending to be an impartial grassroots participant in a social group; for example, participating in a user forum with the secret purpose of branding, customer recruitment, or public relations.
Authority (trust, link juice, Google juice) - The amount of trust that a site is credited with for a particular search query. Authority/trust is derived from related incoming links from other trusted sites.
Authority site - A website which has many incoming links from other related expert/hub sites. Because of this simultaneous citation from trusted hubs, an authority site usually has high trust, pagerank, and search results placement. Wikipedia is an example of an authority site.
B2B - Business to Business.
B2C - Business to Consumer.
Back link (inlink, incoming link) - Any link into a page or site from any other page or site.
Black hat - Search engine optimization tactics that are counter to best practices such as the Google Webmaster Guidelines.
Blog - A website which presents content in a more or less chronological series. Content may or may not be time sensitive. Most blogs use a Content Management System such as WordPress rather than individually crafted web pages. Because of this, the blogger can choose to concentrate on content creation instead of arcane code.
Bot (robot, spider, crawler) - A program which performs a task more or less autonomously. Search engines use bots to find and add web pages to their search indexes. Spammers often use bots to "scrape" content for the purpose of plagiarizing it for exploitation by the spammer.
• 62. Bounce rate - The percentage of users who enter a site and then leave it without viewing any other pages.
Bread crumbs - Website navigation in a horizontal bar above the main content which helps the user understand where they are on the site and how to get back to the root areas.
Canonical issues (duplicate content) - Canon = legitimate or official version. It is often nearly impossible to avoid duplicate content, especially with CMSs like WordPress, but also due to the fact that www.site.com, site.com, and www.site.com/index.htm are supposedly seen as dupes by the SEs - although it's a bit hard to believe they aren't more sophisticated than that. However, these issues can be dealt with effectively in several ways, including using the noindex meta tag in the non-canonical copies and 301 server redirects to the canon. (See the .htaccess sketch earlier in the glossary, and the noindex sketch at the end of this slide.)
Click fraud - Improper clicks on a PPC advertisement, usually by the publisher or his minions, for the purpose of undeserved profit. Click fraud is a huge issue for ad networks like Google, because it lowers advertiser confidence that they will get fair value for their ad spend.
Cloak - The practice of delivering different content to the search engine spider than that seen by the human users. This black hat tactic is frowned upon by the search engines and carries a virtual death penalty of the site/domain being banned from the search engine results.
CMS - Content Management System. Programs such as WordPress which separate most of the mundane webmaster tasks from content creation so that a publisher can be effective without acquiring or even understanding sophisticated coding skills if they so choose.
Code swapping (bait and switch) - Changing the content after high rankings are achieved.
Comment spam - Posting blog comments for the purpose of generating an inlink to another site. The reason many blogs use link condoms.
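As mentioned in the canonical issues entry above, the noindex approach is just a one-line robots meta tag placed in the <head> of each non-canonical copy; a minimal sketch:
<meta name="robots" content="noindex, follow" />
The "follow" value keeps the links on the page crawlable even though the page itself stays out of the index.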
• 63. Content (text, copy) - The part of a web page that is intended to have value for and be of interest to the user. Advertising, navigation, branding and boilerplate are not usually considered to be content.
Contextual advertisement - Advertising which is related to the content.
Conversion (goal) - Achievement of a quantifiable goal on a website. Ad clicks, sign-ups, and sales are examples of conversions.
Conversion rate - Percentage of users who convert - see conversion.
CPC - Cost Per Click: the rate that is paid per click by a Pay Per Click advertiser.
CPM (Cost Per Thousand impressions) - A statistical metric used to quantify the average value/cost of Pay Per Click advertisements. M is from the Roman numeral for one thousand.
Crawler (bot, spider) - A program which moves through the worldwide web or a website by way of the link structure to gather data.
Directory - A site devoted to directory pages. The Yahoo directory is an example.
Directory page - A page of links to related web pages.
Doorway (gateway) - A web page that is designed specifically to attract traffic from a search engine. A doorway page which redirects users (but not spiders) to another site or page is implementing cloaking. (Previous definition revised based upon advice from Michael Martinez.)
Duplicate content - Content which is similar or identical to that found on another website or page. A site may not be penalized for serving duplicate content, but it will receive little if any trust from the search engines compared to the content that the SE considers to be the original.
• 64. E-commerce site - A website devoted to retail sales.
Feed - Content which is delivered to the user via special websites or programs such as news aggregators.
FFA (Free For All) - A page or site with many outgoing links to unrelated websites, containing little if any unique content. FFA pages are only intended for spiders, have little if any value to human users, and thus are ignored or penalized by the search engines.
Frames - A web page design where two or more documents appear on the same screen, each within its own frame. Frames are bad for SEO because spiders sometimes fail to correctly navigate them. Additionally, most users dislike frames because it is almost like having two tiny monitors, neither of which shows a full page of information at one time.
Gateway page (doorway page) - A web page that is designed to attract traffic from a search engine and then redirect it to another site or page. A doorway page is not exactly the same as cloaking, but the effect is the same in that users and search engines are served different content.
Gadget - See gizmo.
Gizmo (gadget, widget) - Small applications used on web pages to provide specific functions such as a hit counter or IP address display. Gizmos can make good link bait.
Google bomb - The combined effort of multiple webmasters to change the Google search results, usually for humorous effect. The "miserable failure" (George Bush) and "greatest living American" (Stephen Colbert) Google bombs are famous examples.
Google bowling - Maliciously trying to lower a site's rank by sending it links from a "bad neighborhood" - kind of like yelling "Good luck with that infection!" to your buddy as you get off the school bus. There is some controversy as to whether this works or is just an SEO urban myth.
• 65. Google dance - The change in SERPs caused by an update of the Google database or algorithm; the cause of great angst and consternation for webmasters who slip in the SERPs. Or, the period of time during a Google index update when different data centers have different data.
Google juice (trust, authority, pagerank) - Trust/authority from Google, which flows through outgoing links to other pages.
Googlebot - Google's spider program.
GYM - Google, Yahoo, Microsoft: the big three of search.
Hit - Once the standard by which web traffic was often judged, but now a largely meaningless term replaced by pageviews, AKA impressions. A hit happens each time a server sends an object - documents, graphics, include files, etc. Thus one pageview could generate many hits.
Hub (expert page) - A trusted page with high quality content that links out to related pages.
HTML (Hyper Text Markup Language) - Directives or "markup" which are used to add formatting and web functionality to plain text for use on the Internet. HTML is the mother tongue of the search engines, and should generally be strictly and exclusively adhered to on web pages.
Impression (page view) - The event where a user views a webpage one time.
Inbound link (inlink, incoming link) - Inbound links from related pages are the source of trust and pagerank.
Index (noun) - A database of web pages and their content used by the search engines.
Index (verb) - To add a web page to a search engine index.
• 66. Indexed pages - The pages on a site which have been indexed.
Inlink (incoming link, inbound link) - Inbound links from related pages are the source of trust and pagerank.
Keyword / key phrase - The word or phrase that a user enters into a search engine.
Keyword cannibalization - The excessive reuse of the same keyword on too many web pages within the same site. This practice makes it difficult for the users and the search engines to determine which page is most relevant for the keyword.
Keyword density - The percentage of words on a web page which are a particular keyword. If this value is unnaturally high the page may be penalized.
Keyword research - The hard work of determining which keywords are appropriate for targeting.
Keyword spam (keyword stuffing) - Inappropriately high keyword density.
Keyword stuffing (keyword spam) - Inappropriately high keyword density.
Landing page - The page that a user lands on when they click on a link in a SERP.
Latent semantic indexing (LSI) - This mouthful just means that the search engines index commonly associated groups of words in a document. SEOs refer to these same groups of words as "long tail searches". The majority of searches consist of three or more words strung together. See also "long tail". The significance is that it might be almost impossible to rank well for "mortgage", but fairly easy to rank for "second mortgage to finance monster truck team". Go figure.
Link - An element on a web page that can be clicked on to cause the browser to jump to another page or another part of the current page.
• 67. Link bait - A webpage with the designed purpose of attracting incoming links, often mostly via social media.
Link building - Actively cultivating incoming links to a site.
Link condom - Any of several methods used to avoid passing link love to another page, to avoid the possible detrimental results of endorsing a bad site by way of an outgoing link, or to discourage link spam in user generated content.
Linkerati - Internet users who are the most productive targets of link bait. The Linkerati include social taggers, forum posters, resource maintainers, bloggers and other content creators, etc., who are most likely to create incoming links or link-generating traffic (in the case of social networkers). Suggested by lorisa.
Link exchange - A reciprocal linking scheme, often facilitated by a site devoted to directory pages. Link exchanges usually allow links to sites of low or no quality, and add no value themselves. Quality directories are usually human edited for quality assurance.
Link farm - A group of sites which all link to each other. (Previous definition revised based upon advice from Michael Martinez.)
Link juice (trust, authority, pagerank) - See the authority entry above.
Link love - An outgoing link which passes trust, unencumbered by any kind of link condom.
Link partner (link exchange, reciprocal linking) - Two sites which link to each other. Search engines usually don't see these as high value links, because of the reciprocal nature.
Link popularity - A measure of the value of a site based upon the number and quality of sites that link to it.