A variety of strategies can be used to segment content delivery. The basic idea is to serve content that is not meant for search engines in a format spiders cannot read, such as text placed in images, Flash files, or plugins. However, do not use these formats for the purpose of cloaking; use them only when they bring substantial benefit to users. Conversely, if you want to show content to the search engines that you do not want visitors to see, you can use CSS formatting (preferably not display:none, as the engines may have filters to detect it).
Keep in mind that search engines are very wary of webmasters using such tactics. Use cloaking only if it brings substantial user benefit.
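For illustration only, the snippet below shows the kind of CSS-based hiding that engine filters look for (the class name and text are hypothetical examples, not values from this article). It is shown so you can recognize the pattern, not as a recommended tactic:

```html
<!-- Hiding text with display:none; engines may flag this pattern -->
<p class="hidden-keywords" style="display:none">
  keyword-stuffed text that visitors never see
</p>

<!-- Off-screen positioning is another hiding pattern filters can detect -->
<p style="position:absolute; left:-9999px">more hidden text</p>
```

Both variants leave the text in the HTML source while making it invisible to visitors, which is exactly the mismatch the engines' filters are built to catch.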
Tactics to show different content to search engines and users
Robots.txt file: This file is located at the root level of your domain (www.domain.com/robots.txt) and can be used to 1) prevent crawlers from accessing non-public parts of your website, 2) block search engines from accessing and indexing scripts, utilities, or other types of code, and 3) enable auto-discovery of XML sitemaps.
The file must reside in the root directory, and the filename must be entirely lowercase; any other location or capitalization is not valid for search engines.
Syntax: The basic syntax of robots.txt is fairly simple. You specify a robot name, such as Googlebot, and then specify an action. One of the major actions you can specify tells the bots not to index certain pages in the SERPs (this can be used when you wish to hide duplicate-content pages on your site).
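As a minimal sketch, a robots.txt covering the three uses above might look like this (the paths and sitemap URL are hypothetical examples, not values from this article):

```
# 1) Prevent all crawlers from accessing a non-public directory
User-agent: *
Disallow: /private/

# 2) Block a specific bot (here, Googlebot) from a scripts directory
User-agent: Googlebot
Disallow: /cgi-bin/

# 3) Auto-discovery of the XML sitemap
Sitemap: http://www.domain.com/sitemap.xml
```

Each record pairs a User-agent line naming the robot with one or more Disallow actions; the Sitemap line stands alone and points crawlers at your XML sitemap.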