Robots.txt is a plain text file that you upload to the root of your site. It acts as a notice to search engine spiders, instructing them which parts of the site they may crawl. To block ALL content on your website, use:

User-agent: *
Disallow: /

To block only specific folders, list each one on its own Disallow line:

User-agent: *
Disallow: /myfolderiwantblocked
Disallow: /otherfolderiwantblocked

(A quick way to test rules like these before uploading is sketched at the end of this section.)

301 redirects are the most economical & expedient solution to accidentally duplicated URLs. If, for instance, http://lmax.com needed to be redirected to http://www.lmax.com, the following would need to be placed in the .htaccess file & uploaded to the root directory (the snippets below use example.com as a placeholder domain):

Options +FollowSymLinks
RewriteEngine on
RewriteCond %{HTTP_HOST} ^example\.com
RewriteRule ^(.*)$ http://www.example.com/$1 [R=permanent,L]

Or alternatively:

Options +FollowSymLinks
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]

Both forms are equivalent: R=permanent is a synonym for R=301 in mod_rewrite, and the [NC] flag simply makes the hostname match case-insensitive.
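If you want to confirm the redirect is behaving before search engines recrawl, here is a minimal sketch using Python's standard http.client module, which (unlike most HTTP libraries) does not follow redirects automatically; the hostname and path are placeholders, not values from the rules above:

import http.client

# Placeholder domain -- substitute the non-www host you are redirecting.
conn = http.client.HTTPConnection("example.com")
conn.request("GET", "/somepage")
resp = conn.getresponse()

# If the .htaccess rule is live, expect a 301 status and a
# Location header pointing at the www host.
print(resp.status)                 # e.g. 301
print(resp.getheader("Location"))  # e.g. http://www.example.com/somepage
conn.close()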
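And to sanity-check robots.txt rules without waiting for a crawler, Python's built-in urllib.robotparser interprets them the same way a well-behaved spider would. This is a minimal sketch, with the blocked folder names taken from the example earlier in this section and example.com standing in for your domain:

from urllib.robotparser import RobotFileParser

rules = """\
User-agent: *
Disallow: /myfolderiwantblocked
Disallow: /otherfolderiwantblocked
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# A compliant spider ("*") may not fetch the blocked folders...
print(rp.can_fetch("*", "http://www.example.com/myfolderiwantblocked/page.html"))  # False
# ...but everything else remains crawlable.
print(rp.can_fetch("*", "http://www.example.com/index.html"))  # True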