How do you identify problems on your site that search engines fail to see, problems that cause search spiders to miss or skip indexing your pages?
Problem 1: Here is a simple scenario: your webmaster is working on the site on a staging server. Because the staging copy duplicates your live pages, you don't want search engines to index it, so its pages are kept marked "NoIndex". The tag does nothing visible to a human visitor, but it tells search engines to skip the page.
Normally, when you move the site from the staging server to the live server, you should remove the NoIndex tags. If you forget to remove them, you will see a phenomenal drop in traffic.
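As a safeguard, a short script can sweep your key URLs right after launch and flag any page still carrying the tag. Below is a minimal stdlib-only sketch: the URL list is illustrative, and the regex is a crude check (it assumes the name attribute comes before content); a thorough audit would parse the HTML properly and also inspect the X-Robots-Tag HTTP header.

```python
# Post-launch sweep: flag pages that still carry a noindex robots meta tag.
# URL list is illustrative -- substitute your own key pages.
import re
import urllib.request

URLS = [
    "https://www.example.com/",
    "https://www.example.com/products/",
]

# Crude pattern; assumes name= appears before content= in the meta tag.
NOINDEX_RE = re.compile(
    r'<meta[^>]+name=["\']robots["\'][^>]*content=["\'][^"\']*noindex',
    re.IGNORECASE,
)

for url in URLS:
    html = urllib.request.urlopen(url).read().decode("utf-8", errors="ignore")
    if NOINDEX_RE.search(html):
        print(f"WARNING: noindex still present on {url}")
    else:
        print(f"OK: {url}")
```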
Problem 2: Some webmasters place a robots.txt file on the staging server that prohibits crawling of the site. If this file gets copied over when the site is switched to the live server, the consequences will be just as bad as in the NoIndex example.
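A similar post-launch sweep works here. The sketch below uses Python's standard urllib.robotparser to check whether a few key paths on the live site are crawlable; the domain and paths are placeholders.

```python
# Sanity check that the live robots.txt does not block crawlers.
# Stdlib only; example.com and the paths are placeholders.
import urllib.robotparser

rp = urllib.robotparser.RobotFileParser()
rp.set_url("https://www.example.com/robots.txt")
rp.read()

for path in ["/", "/products/", "/blog/"]:
    allowed = rp.can_fetch("Googlebot", "https://www.example.com" + path)
    print(f"{path}: {'crawlable' if allowed else 'BLOCKED'}")
```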
Problem 3: A key difference between a person using a browser and a search engine spider is that the person can manually type a URL into the browser window and retrieve the page it points to. Search engine crawlers lack this capability. Instead, they are forced to rely on links they find on web pages to discover other pages. This is one of the reasons why inbound links are so crucial.
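To make the distinction concrete, the sketch below mimics what a spider actually does: it fetches a page and collects the links found there, because those links are its only route to the rest of your site. A page nothing links to is simply invisible to it. The URL is a placeholder.

```python
# How a spider discovers pages: fetch HTML, extract the <a href> links,
# and queue those as the next pages to crawl. Stdlib-only sketch.
from html.parser import HTMLParser
import urllib.request

class LinkExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        # Collect the href of every anchor tag on the page.
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

page = urllib.request.urlopen("https://www.example.com/").read().decode(
    "utf-8", errors="ignore")
parser = LinkExtractor()
parser.feed(page)
print(f"Discovered {len(parser.links)} links to crawl next")
```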
Problem 4: The best way to detect this and take appropriate action is to use analytics software to find pages on your site that get page views but no referring search traffic. This on its own is not conclusive, but it does provide a clue about what is going wrong on your site. The reverse is also true: if you see content on your site getting search referrals you don't want or expect, you may want to hide that content.
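As an illustration, if your analytics tool can export page-level data to CSV, a short script can surface the suspect pages. The file name and column names here (page, pageviews, search_referrals) are hypothetical; adjust them to whatever your export actually contains.

```python
# Flag pages with views but zero search referrals -- a possible sign
# the engines cannot see them. Column names are hypothetical.
import csv

with open("analytics_export.csv", newline="") as f:
    for row in csv.DictReader(f):
        views = int(row["pageviews"])
        referrals = int(row["search_referrals"])
        if views > 0 and referrals == 0:
            print(f"Check: {row['page']} ({views} views, 0 search referrals)")
```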
Auditing what you have missed in search optimisation: Another data point you can use to find out whether search engines are able to see your content is to check how much of it is actually being indexed. For example, if you have a site with 1,000 pages and good inbound links, and after three months you see that only 20 pages are indexed, that is a clear clue that there is a problem.
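A rough way to run this audit is to compare the URL count in your sitemap against the indexed count you read from the engine's webmaster console. The sketch below parses a sitemap with the standard library; the sitemap URL is a placeholder, and the indexed count (20, matching the example above) is entered by hand.

```python
# Compare sitemap URL count with the indexed count taken manually from
# the search engine's console. The 20-of-1000 figures are illustrative.
import urllib.request
import xml.etree.ElementTree as ET

SITEMAP = "https://www.example.com/sitemap.xml"
INDEXED = 20  # read manually from the webmaster console

tree = ET.parse(urllib.request.urlopen(SITEMAP))
ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
total = len(tree.findall(".//sm:loc", ns))

print(f"{INDEXED} of {total} pages indexed "
      f"({100 * INDEXED / max(total, 1):.0f}%)")
if total and INDEXED / total < 0.5:
    print("Large gap -- check noindex tags, robots.txt and inbound links")
```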