
Saturday 17 September 2011

Blackhat SEO: What is Cloaking?

There are various blackhat SEO methods for driving high traffic to your website, and one of them is cloaking.
Cloaking is a search engine optimisation technique in which different content is presented to search engines than to human users. It is a deceptive method of hoodwinking search engines in order to obtain the desired rankings for desired keywords.

When a user requests a particular web page, the site responds with its normal page. But when the same request comes from a search engine spider, a different page, specially designed for the spider, is returned instead, and the normal page is hidden from the spider. Such a page is said to be cloaked.

Cloaking decides what to serve based on the requester's IP address and user-agent HTTP header. A requester identified as a search engine is shown a carefully prepared and optimised page built to rank for the desired keywords, while non-search-engine visitors are shown the site's real content.

The main purposes of cloaking are to hide the code of high-ranking pages from visitors, so the content cannot be stolen, and to provide search engine spiders with pages that are highly optimised.

Most search engines consider cloaking to be spam, and this sort of behaviour goes against their webmaster guidelines. Google's webmaster guidelines identify the following cloaking methods:
• Cloaking for JavaScript – A user with JavaScript enabled is shown one version of the website, while the same user with JavaScript turned off is shown another version.
• Header cloaking for HTTP referrer – Users arriving from a particular website are shown a cloaked version of the site, selected according to the HTTP referrer header.
• Cloaking for IP address – The cloaking system checks the IP address of the requester; if it belongs to a search engine's spider, the search-engine page is returned. Note that IP address delivery is also used for legitimate purposes, such as serving location-based content by geographical region, so IP address delivery is not itself cloaking.
• Cloaking for user agent – Different versions of the website are delivered based on the user-agent header. If the system identifies a search engine's crawler by its user-agent signature, it serves the cloaked version of the website (a minimal sketch of this check appears after this list).
• Header cloaking for HTTP accept-language – This serves different language versions of a website based on the Accept-Language header sent by the user's web browser, without asking the user.
• Auto redirecting – Auto redirecting is sometimes called the poor man's cloaking, but it is not cloaking at all: the same page is returned to all requesters, with no distinction made between search engine spiders and people.
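As a rough illustration of the IP and user-agent checks mentioned above, here is a minimal Python sketch. The user-agent substrings and IP prefix used here are illustrative assumptions, not an authoritative list of crawler signatures.

# A minimal sketch of user-agent and IP-based spider detection.
# The signatures and IP prefix below are hypothetical examples.

SPIDER_USER_AGENTS = ("googlebot", "bingbot", "slurp")   # assumed crawler signatures
SPIDER_IP_PREFIXES = ("66.249.",)                        # assumed crawler IP range


def is_search_engine_spider(ip_address: str, user_agent: str) -> bool:
    """Return True if the request looks like it comes from a search engine spider."""
    ua = user_agent.lower()
    if any(signature in ua for signature in SPIDER_USER_AGENTS):
        return True
    return ip_address.startswith(SPIDER_IP_PREFIXES)


# Example: a normal browser request vs. a crawler request.
print(is_search_engine_spider("203.0.113.7", "Mozilla/5.0 (Windows NT 10.0)"))   # False
print(is_search_engine_spider("66.249.66.1", "Googlebot/2.1 (+http://www.google.com/bot.html)"))  # True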

How Cloaking Works
Every website contains one or more ordinary web pages. If we want a page to be cloaked, we create a second page alongside it, designed so that it will rank highly in the search engines. If we are targeting more than one search engine, a separate page is created for each one, because every search engine uses different algorithms and different criteria to rank web pages, and each cloaked page is designed with those criteria in mind.

The pages designed for search engines are generally different from the normal content page. For example, a page may rank higher in a particular search engine if it is stuffed with keywords, so such a page is created specifically for that engine.
Whenever a web page of the site is requested, a programme embedded in the site detects who is making the request. If the request comes from a normal visitor, the normal page is returned; if a search engine spider requests the page, the special page designed for that engine is returned instead. In this way the search engine spiders never see the normal page, and normal users never see the page designed for search engines.
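The dispatch described above might look something like the following sketch, which reuses the detection idea from the earlier example. The page contents and crawler names are hypothetical placeholders, not a real implementation.

# A sketch of the dispatch logic: the same URL returns a normal page to
# visitors and a keyword-optimised page to each known spider.

NORMAL_PAGE = "<html><body>Normal content for human visitors</body></html>"

# One optimised page per search engine, since each engine ranks differently.
CLOAKED_PAGES = {
    "googlebot": "<html><body>Page tuned for Google's ranking criteria</body></html>",
    "bingbot": "<html><body>Page tuned for Bing's ranking criteria</body></html>",
}


def serve_page(user_agent: str) -> str:
    """Return the cloaked page for a known spider, otherwise the normal page."""
    ua = user_agent.lower()
    for spider_name, cloaked_page in CLOAKED_PAGES.items():
        if spider_name in ua:
            return cloaked_page
    return NORMAL_PAGE


print(serve_page("Googlebot/2.1"))        # cloaked version for Google's spider
print(serve_page("Mozilla/5.0 Firefox"))  # normal version for people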
Is Cloaking Ethical?
Views on this differ. Many people think that cloaking is an unethical way of reaching the top of the search engine rankings, but that is not necessarily true. According to these people it is wrong because the search engine ranks the page according to what it believes the page to be, not according to what it actually is. That objection is purely a matter of principle and has nothing to do with ethics.

But it can be considered ethical in this sense: if the ranking is based on the topic or content, visitors will see the page listed in the search results, and if clicking through takes them to the correct result, then it is perfectly ethical. It doesn't matter how the page came to be ranked in that position; what matters is that the end result is correct and the ranking matches the topic.

Cloaking can be used unethically, by sending people to sites and topics that they did not expect to go to when clicking on a listing in the search results, and that is an excellent reason to be against the misuse of cloaking, but it doesn't mean that cloaking is unethical. It just means that, like many other things, it has unethical uses.

Conclusion
Any type of cloaking used for fraudulent purposes may result in a ban from the search engines, so be cautious when using it. Search engines re-crawl pages at regular intervals and can catch cloaked pages, so website owners who use cloaking to secure their rankings will sooner or later be taken unawares. Once the practice is identified, the web pages may be removed temporarily or permanently from the indexes of the various search engines.

