This document provides an overview of SEO best practices from the perspective of an SEO expert. It discusses common issues SEOs see like indexation problems, duplication of content, and canonicalization issues. It provides examples and explanations of each problem and recommends solutions like using canonical tags and 301 redirects. The presentation emphasizes planning early for SEO and implementing best practices from the start to avoid issues later. It also notes that responsive design helps with SEO and mobile users.
9. Things SEOs start doing
• Messing with the navigation
• Telling you to move pages around
• Changing around the images
• Asking you to change how the site behaves
• Messing with URLs
• Giving UX/UI advice
52. New content, same URL
Serving new content through JavaScript or AJAX
53. New content, same URL
[Slide diagram: a page at www.bestsiteever.biz/page2.html with Header, Navigation, and two Content panes; one pane swaps to "New Content!" while the URL stays the same]
54. Sure-fire way to keep SEOs away:
• Make sure every page can be reached through clicks
• Whenever content changes, the URL changes
55. THE THREE BIG PROBLEMS
Indexation
Duplication
Canonicalization
Your site says the same thing as a bunch of other sites
114. Fixing these problems
1. Diagnosis
2. Implementation
3. Fixing Internal Links
4. Fixing External Links
5. De-indexing content
6. Re-indexing content
7. Begging Google to please give us one last chance!
I wanted to discuss some of the more common conflicts that can happen between SEOs and other web folk, particularly designers and developers. The goal is to give a high-level understanding of some of the more technical and sometimes invasive SEO issues that can come up. At the very least you'll understand them, but hopefully you can avoid them altogether, so everyone can make a website that lives up to its potential.
So let's start at the beginning. It all starts with a website. A business owner or marketer decides to create a website or revamp an existing one. They assemble their team. Then, at last, the SEO.
Everyone gets a site together they love. So they bring in the SEO to make the pages rank. He or she…
Then you start to feel like you're the lady right here, with the SEO being this awesome dude with the elbow pads. I'm sure you're all familiar with this guy: the annoying guy hovering behind you, nitpicking where you click, when all you want to do is get your work done. Nobody likes this guy. I mean, just look at him. So this brings me to the secondary title of the webinar, the one I think is actually more accurate…
Because this is really what it's about. Everyone has stuff they need to get done and work they need to do. You don't need elbow-pads guy here butting into your day-to-day. So… I'm speaking mostly to designers and devs about how you can avoid the big issues so the SEO will leave you alone. Marketers and business owners, you can learn what the potential conflicts might be and avoid them. SEOs, if you don't learn anything else, you should at least learn that these are pain points.
Like many things, most conflicts come from expectations. What I've found… Things people expect… Because this is what people…
It may start out with that, but then everyone starts to get a bunch of stuff they didn’t sign up for.
So we aren't just trying to make your life miserable (most of the time). Whenever an SEO ventures out of the world of keywords and content and deep into dev and design territory, it's because there is something specific they are looking for.
We are worried. Specifically, about searches. Also, I'm going to skip over a lot of the basics, so I assume you already know what works…
… and what doesn't work anymore.
And instead focus on the major technical problems that will result in conflicts with the SEO that you might not be aware of. Typically these problems fall into three major categories. I'll go over each of these specifically, and hopefully give an understanding of the underlying concerns the SEO has when attempting to address them.
Let's start with indexation. It can be caused by many things, but basically it's this…
No content = a content problem. But if you have the content and it isn't showing up = an indexation problem.
It begins with a query. Google then goes out and finds all the sites that mention the same word you typed.
After it has all those sites, it orders them into what you see…. It does this using things like offsite factors, a little bit of PageRank, trust factors, and whatever else it uses. But what's important is that none of that happens until…
But before any of that happens, it starts with actual words on your site. There are some exceptions, but not many. When you don't have the words on your site that match the query, you have a content problem.
The most direct way this happens is also fairly common; SEOs and devs will be familiar with it… the classic robots block. It makes sense…
Sometimes you might wonder
Your site is linked to other sites. The more links, the more connected you are.
If you are a big brand and you launch with robots blocked, you'll probably get blocked.
The robots.txt is the most common overlap. Most developers are familiar with it, but it still slips through all the time.
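One way to catch this before launch is a quick automated check with Python's standard-library robots.txt parser. This is a minimal sketch; the robots.txt contents and URLs are hypothetical examples.

```python
# A pre-launch sanity check for the classic robots.txt block: parse the
# rules and confirm key pages are actually crawlable. The rules and
# URLs below are hypothetical examples.
from urllib.robotparser import RobotFileParser

# A staging block accidentally left in place at launch
staging = RobotFileParser()
staging.parse([
    "User-agent: *",
    "Disallow: /",
])

# Every page is now off-limits to well-behaved crawlers
print(staging.can_fetch("Googlebot", "https://www.example.com/"))          # False
print(staging.can_fetch("Googlebot", "https://www.example.com/products"))  # False

# The fixed file blocks only what should stay private
fixed = RobotFileParser()
fixed.parse([
    "User-agent: *",
    "Disallow: /admin/",
])
print(fixed.can_fetch("Googlebot", "https://www.example.com/products"))    # True
```

Running a check like this against every important URL as part of the deploy process makes the classic robots block much harder to ship by accident.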
Still there are some other indexation issues that SEOs will tend to get pushy about.
And they use something called the "rational surfer model."
A classic example would be Data from TNG.
Or Fassbender from Aliens Ep 1
Or Zuckerberg from The Social Network. A robot pretending to be a human.
So they get to your website and start clicking around
And clicking around
One of these is when great text is placed in an image.
Another of the more common ones is putting great content inside an image, which search engines can't read.
But this is what the search engine sees. I understand the importance of branding…
Sometimes this is not an option like banners and logos
Avoid this
There are some solutions – some bad (this was a legit tactic a few years ago)
Use webfonts instead. You can load any font on the site.
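As a sketch, loading a brand font with `@font-face` keeps the headline as real, crawlable text instead of pixels. The font name and file paths here are hypothetical examples.

```css
/* Load the brand font so headlines can stay as text, not images.
   Font name and file paths are hypothetical. */
@font-face {
  font-family: "BrandFont";
  src: url("/fonts/brandfont.woff2") format("woff2"),
       url("/fonts/brandfont.woff") format("woff");
  font-display: swap; /* show fallback text while the font loads */
}

h1.hero {
  font-family: "BrandFont", Georgia, serif;
}
```

The `font-display: swap` line is a common touch so users see fallback text immediately rather than a blank heading while the font file downloads.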
Those are 2 of the more direct ways….
Another thing SEOs sometimes do is try to get you to put links in your navigation you don’t want.
When the SEO is getting involved with UX and UI stuff like navigation and architecture, it’s because they’re concerned about this.
Often this is because there are other things robots can't use, like search bars and filters. So if you have pages that can only be reached through these, the search engines won't ever see them.
It's because they're trying to create something like this. Each page should have a clickpath that is both descriptive of its content and demonstrative of its relevance to the rest of the site. The page's home should reflect the site's hierarchy and architecture.
And before we get any further, it doesn’t mean putting a tiny little link in the footer. Search engines ignore this. The link needs to be in the main navigation in a place where a person might actually use it.
But no one ever clicks there! Yes, still. Sorry, but there needs to be a way for robots to get to the site. Basically…
When the content on the page changes but the URL doesn't, that means there's no way for search engines to reach the new content, but also no way for users to share it.
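One common way to keep the URL in step with JavaScript-loaded content is the browser History API. This is a browser-only sketch under stated assumptions: the endpoint path and element id are hypothetical examples.

```html
<div id="content"></div>
<script>
  // Load new content, then give it its own URL so robots and people
  // can reach and share it. Endpoint and element id are hypothetical.
  async function showPage(path) {
    const response = await fetch(path); // e.g. "/page2.html"
    document.getElementById("content").innerHTML = await response.text();
    history.pushState({ path }, "", path); // content changed, so the URL changes
  }

  // Back/forward should restore the matching content for each URL
  window.addEventListener("popstate", (event) => {
    if (event.state && event.state.path) showPage(event.state.path);
  });
</script>
```

For this to fully solve the indexation side, the server should also be able to render each of those URLs directly, so a crawler (or a shared link) that requests the URL cold still gets the content.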
Another group of issues that SEOs are always looking for are problems with duplication, or duplicate content. Most people understand what this is, but just in case…
Duplication is when your site says the same thing as a bunch of other sites.
These are the things most often associated with duplicated content. Your site shouldn't be…
In fact, this is such a big deal that Google has launched a series of updates specifically to address it.
The basic reasoning is this. Right here we have the top 4 results. These will get around 66% of clicks. That means that the people at Google know that if these 4 things aren’t what someone is looking for, they’ll bail. Either to another search or, worse, to Bing. So Google’s not going to show a result if it contains stuff that is already found all over the place on other sites.
Search engines only show unique content
Scraping, copying, and pulling in other content are all variations of this. But there's another fairly common cause of duplicated content that people don't think of as much.
Which is another thing SEOs bring up: pushing out your own content to other sites. You might be thinking that it's OK because you're the original provider, and you probably are a bit better off, but you still have to look at it from the POV of the people running the search engines.
They can either…
Relate back to devs and designers
So here are your backup plans to give to the SEO:
But what about when it's not on purpose?
The last group of big issues comes from what’s called canonicalization. This will typically cause the most headache for an SEO, but first let’s go over what exactly it is before the causes. Since this has a lot of crossover with duplicated content, I want to spend a few minutes going over the differences.
And your date shows up at your door.
Then comes a new date. And he just copied everything the first guy did.
And the third guy says all the same things as well. 3 different dudes, they all are saying the exact same thing. That’s a duplication problem.
A canonicalization problem happens when date #1 comes up.
Then comes date #2 and it’s the exact same person.
Then again, maybe there’s a new color or something but still it’s the exact same person. That’s a canonicalization problem.
But how does it happen? And why should I worry about this? SEOs worry about this because…
Google is crawling every site online, so it has a set amount of time to crawl your site. Parameters, especially spider traps, mean that it can spend all day on one page and not actually get to any of your other content.
Still, what does this have to do with it?
This most commonly occurs right on the homepage. Let's say you've got it showing up here… but also here. Two different URLs, same page. But then this can also happen: suppose your homepage can also be indexed under the individual page name. Maybe there's no capitalization normalization in place. Now let's say there's a query parameter from mobile; there's another one. Now let's say your site also appears as a secure version; there's another version of all the previous URLs.
The way to solve this is URL normalization. In Apache you can fix this in the .htaccess file or a server config file like it. This will also address page-level normalization if you're using folder-based pages. In IIS, it's handled in the Internet Information Services Manager.
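As a sketch, Apache-style normalization rules in .htaccess might look like this. The domain is a hypothetical example, mod_rewrite must be enabled, and real sites will need rules tuned to their own URL patterns.

```apache
RewriteEngine On

# Send plain HTTP to the secure version
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]

# Fold the bare domain into the www host so only one version indexes
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]

# Collapse /index.html (at any level) down to the folder URL
RewriteCond %{THE_REQUEST} \s/+(.*/)?index\.html[\s?] [NC]
RewriteRule ^ /%1 [R=301,L]
```

Each rule issues a 301 so any authority the duplicate URLs have already collected is passed along to the single canonical version.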
But the other, more complicated way this can happen is something that can cause the biggest conflict between a developer and an SEO: parameter issues. There are two basic problems that happen with these, so we'll start with the simple one.
Let's start with a product on a product page, and talk about why people use parameters.
The problem is you have a bunch of different URLs that are all for the same product page. This is a simple parameter issue, meaning that you've got a single page that is being indexed in several different ways. It can be a problem because Google won't know which to index, it splits your authority, and Google may not display the page at all because of it.
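The effect is easy to see in a few lines of Python. This is a sketch of how tracking parameters multiply URLs for one page and how a canonical form can be computed by stripping them; the parameter names (ref, utm_source, sessionid) and URLs are hypothetical examples.

```python
# Tracking parameters create many URLs for one page; stripping them
# yields a single canonical form. Parameter names are hypothetical.
from urllib.parse import parse_qsl, urlencode, urlsplit, urlunsplit

TRACKING_PARAMS = {"ref", "utm_source", "utm_medium", "sessionid"}

def canonicalize(url: str) -> str:
    """Drop tracking parameters so every variant maps to a single URL."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query)
            if k not in TRACKING_PARAMS]
    return urlunsplit((parts.scheme, parts.netloc, parts.path,
                       urlencode(kept), ""))

variants = [
    "https://www.example.com/product/widget",
    "https://www.example.com/product/widget?ref=homepage",
    "https://www.example.com/product/widget?utm_source=email&sessionid=42",
]

# Three URLs a search engine could index separately, but really one page
print({canonicalize(u) for u in variants})
# → {'https://www.example.com/product/widget'}
```

Note that parameters that genuinely change the content (a different color's product page, say) should be kept; only decorative tracking parameters get stripped.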
Let’s say….
Now you got a super popular…
Instead of…
If something needs a unique view, like a size or color or slight variation, you can use a # (hash) parameter, which is currently ignored by search robots but can still be used to deliver content (at least for now). This is sort of the opposite of the indexation issue I discussed earlier.
The only thing worse than a little parameter problem is a big parameter problem. This is the worst thing that can happen to a site. Problems with this can take months to undo, so avoiding the issue altogether will save a ton of headache. Typically, big parameter problems happen from some version of this: you have a product page or a category page, and then some other links, maybe to a related product.
Clicking on a referring product tacks on a reference parameter. Then you click back on A
And then another parameter is tacked on.
Solutions in analytics
The biggest solution is, you guessed it, to avoid parameters that aren't specific to a piece of content. If the content doesn't change, the URL shouldn't.
People like this. SEOs like this, but they'll still ask you to do other things, and… You can also use the canonical tag, but you shouldn't depend on it. Here's why.
In case you're not familiar with it, it's a great little tag that you put in your page that basically says: "Hey, I know the URL up there says we're here, but it's really this one over here." If honored, it transfers authority and can address canonicalization issues. But it's far from perfect, and here's why. It IS a great little tag.
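The tag itself is a single line in the page's `<head>`; the URL here is a hypothetical example.

```html
<!-- On every parameterized or duplicate variant of the page, point
     back at the one true URL. The href is a hypothetical example. -->
<link rel="canonical" href="https://www.example.com/product/widget" />
```

Every duplicate variant carries the same tag pointing at the one canonical URL, which the canonical page itself can also self-reference.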
But in the end it's just that: one little tag on your page. And Google doesn't have to follow it. If you're not using it right, Google will ignore it altogether.
And maybe for good. So if that's your only defense, you've got nothing if Google chooses to ignore it.
So if there's one thing you can learn from this about how to keep SEOs from bugging you about canonicalization, it's this: one page should have one URL. Always. If you can make a site that does that, you will never have an SEO mention canonicalization or say that sort-of-irritating word to you.
Still, what does this have to do with it?
And that’s this.
So in summary, here are the big things. As long as the content can be found, the content is original, and each piece of content displays under one URL, you will avoid all major SEO issues and concerns.
Actually – it’s a lot. And if you make a site that avoids those three issues then you’ll end up with an SEO doing what SEOs actually like doing.
…because preventing
But fixing
So the best solution?
Do this. This way you can…
Plus, here's a secret: if all the content on the site is indexable, original, canonical, and shareable… we don't have much else to do in the prelaunch.
Which are the exact things that you probably expected them to be doing in the first place.