- You have information that you do not want turning up in search results: private data, family pictures, coupon codes, limited offers, test/demo pages, etc.
- You have duplicate content: landing pages for e-mail campaigns, highly-templated pages, etc.
- Use a robots.txt exclusion file to tell the search engine spiders crawling your site not to fetch particular pages or folders
- Add appropriate robots meta tags to individual pages, which tell search engine spiders to exclude those particular pages from their results
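For the first approach, a minimal robots.txt looks like the sketch below. The file goes in the root of your web server; the folder and file names here are only placeholder examples.

```
# robots.txt — must live at the root of the site, e.g. http://example.com/robots.txt
# "User-agent: *" means the rules apply to all crawlers
User-agent: *
Disallow: /private/
Disallow: /coupons/spring-offer.html
```

Note that robots.txt is advisory: well-behaved spiders honor it, but it is not an access control mechanism, so truly sensitive data still needs real protection.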
You may choose to use the meta tags if you do not have the ability to upload files to the root of your web server (for example, pages on a shared site like Yahoo! Geocities). The tag to use is a robots meta tag, which belongs inside the <head> section of each HTML page you want excluded.
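A standard form of that tag is shown below; "noindex" asks spiders not to list the page, and "nofollow" asks them not to follow its links.

```html
<head>
  <title>My private page</title>
  <!-- Tell compliant crawlers not to index this page or follow its links -->
  <meta name="robots" content="noindex, nofollow">
</head>
```

Use "noindex" alone if you still want the links on the page to be followed.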