
4.1 Cloaking

Cloaking is the technique of returning different pages to search engines than to people. The reason is that good SEO often requires sacrificing some of the visual attractiveness of the page and changing the textual content into something that may look unattractive to human visitors. Furthermore, if you apply very good SEO and you are worried that someone may steal your pages, cloaking is the solution: the spiders will see the optimised page and the users the graphically appealing one.
Identification is usually done by checking either the visitors' IP address or their user-agent string.
The first technique is more reliable, but it requires maintaining an up-to-date database of the crawlers' IP addresses, which change often; you may have to buy the list periodically.
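For example, both checks fit in a few lines of server-side code. The following is a minimal Python sketch: the user-agent fragments, the IP range and the file names are illustrative assumptions, not a maintained crawler database.

import ipaddress

# Illustrative placeholders; a real deployment needs an up-to-date database.
CRAWLER_AGENTS = ("googlebot", "bingbot", "slurp")
CRAWLER_NETWORKS = [ipaddress.ip_network("66.249.64.0/19")]  # example range

def is_crawler(user_agent, remote_ip):
    # User-agent check: cheap, but the string is trivially spoofed.
    if any(bot in user_agent.lower() for bot in CRAWLER_AGENTS):
        return True
    # IP check: more reliable, but the list must be kept current.
    addr = ipaddress.ip_address(remote_ip)
    return any(addr in net for net in CRAWLER_NETWORKS)

def serve_page(user_agent, remote_ip):
    # Cloaking: spiders get the optimised page, humans the appealing one.
    page = "optimised.html" if is_crawler(user_agent, remote_ip) else "human.html"
    with open(page) as f:
        return f.read()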
Given the risk of being banned, you need to cloak carefully.
First of all, since Google often saves a copy of the page in its cache, you need the tag:
<META NAME="GOOGLEBOT" CONTENT="NOARCHIVE">
This is necessary to prevent human users from seeing the over-optimised version of the page through the cached copy.
Secondly, it is better that the title, the meta description and the first row of text are the same in both the search-engine-optimised page and the human-visitor page. It is also better that the sizes of the two pages are close to each other.
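One way to keep the two versions in step is a small consistency check run before publishing. The sketch below uses only Python's standard library to extract the title and meta description from both files and compare them; the file names are assumptions.

from html.parser import HTMLParser

class HeadExtractor(HTMLParser):
    # Collects the <title> text and the meta description of a page.
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""
        self.description = ""
    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self.in_title = True
        elif tag == "meta":
            d = dict(attrs)
            if d.get("name", "").lower() == "description":
                self.description = d.get("content", "")
    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False
    def handle_data(self, data):
        if self.in_title:
            self.title += data

def head_of(path):
    parser = HeadExtractor()
    with open(path) as f:
        parser.feed(f.read())
    return parser.title.strip(), parser.description.strip()

# The two versions should agree on title and meta description.
assert head_of("optimised.html") == head_of("human.html")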
Cloaking can also be useful to solve the problem of session ids. Indeed, some spiders of the major search engines do not spider pages that have session ids in their URLs. In fact, every time a spider arrived at the website it would receive a different session id; since the URLs would contain that id, every page it spidered would look like a new page to it, and it would run the risk of spidering a potentially infinite number of pages. However, by spotting page requests from the spiders and delivering modified pages without the normal session ids in the link URLs, you allow all the crawlers to spider your website, as in the sketch below.
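Stripping the session id from link URLs for spiders is straightforward. Here is a small Python sketch, assuming the session id travels in a query parameter called "sid" (the parameter name is an assumption).

from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

def strip_session_id(url, param="sid"):
    # Remove the session-id parameter so every spidered URL is stable.
    parts = urlsplit(url)
    query = [(k, v) for k, v in parse_qsl(parts.query) if k != param]
    return urlunsplit(parts._replace(query=urlencode(query)))

print(strip_session_id("http://example.com/page?sid=abc123&cat=2"))
# -> http://example.com/page?cat=2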
Sometimes the word "cloaking" is misused.
Cloaking is not simply "IP delivery": IP delivery means serving different pages according to the visitor's IP address, whereas cloaking only distinguishes between search engines and human users, and it involves hiding the normal pages of a website from the search engines.
Nor is cloaking hidden text: cloaking assumes two different pages, not a single page that merely looks different to humans and to spiders, as hidden text does.
Finally, cloaking is not auto-redirection based on the IP address, as when Google sends users to its local search engine version after they type the .com address into the browser's address bar.