Wednesday, October 28, 2009

Having Issues with Page Crawling and robots.txt?

Google engineer Matt Cutts discusses the major questions about sites having trouble getting their URLs crawled by Google's search bot, Googlebot. The crawler reads your robots.txt file, which allows or blocks search engines from accessing certain areas of your site, or the entire site, depending on how you have the file configured. He gets to the bottom of a lot of key questions. Very helpful for anyone having crawling issues. Enjoy. xxxliveshows.tv
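If you just want to see what this looks like in practice, here is a minimal sketch of a robots.txt file. The /private/ path is only a hypothetical example, not something from the video:

# Hypothetical example: keep all crawlers out of /private/,
# but leave the rest of the site open for crawling
User-agent: *
Disallow: /private/

And if you are not sure whether a given URL on your site is blocked, Python's standard-library robotparser can read your robots.txt and tell you. The domain below is just a placeholder:

import urllib.robotparser

# Point the parser at the (placeholder) site's robots.txt
rp = urllib.robotparser.RobotFileParser()
rp.set_url("https://www.example.com/robots.txt")
rp.read()

# Check whether Googlebot is allowed to fetch specific pages
print(rp.can_fetch("Googlebot", "https://www.example.com/private/page.html"))
print(rp.can_fetch("Googlebot", "https://www.example.com/index.html"))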


