#102 of the things to keep in mind when working on a big website: if you have a giant robots.txt file, remember that Googlebot will only read the first 500 kB ( http://code.google.com/web/controlcrawlindex/docs/robots_txt.html ). If your robots.txt is longer than that, the file gets cut off at that point, which can truncate a line (and a directive) in an unwanted way. The simple solution is to keep your robots.txt files to a reasonable size :-).
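A minimal sketch of what this means in practice, assuming the limit is 500,000 bytes (as the "big B, little b" exchange later in this thread confirms it is bytes, not bits) — the helper names here are made up for illustration:

```python
# Sketch: check whether a robots.txt file fits within the read limit
# mentioned in the post, and show how a crawler that stops reading at
# that limit can end up with a directive cut off mid-line.

GOOGLEBOT_LIMIT_BYTES = 500_000  # 500 kB, per the post (bytes, not bits)

def robots_txt_fits(content: bytes, limit: int = GOOGLEBOT_LIMIT_BYTES) -> bool:
    """Return True if the robots.txt content is within the read limit."""
    return len(content) <= limit

def truncated_view(content: bytes, limit: int = GOOGLEBOT_LIMIT_BYTES) -> bytes:
    """What a crawler reading only `limit` bytes would see.
    The last line it sees may be an incomplete directive."""
    return content[:limit]

# Example: an oversized robots.txt built from many Disallow lines.
rules = b"User-agent: *\n" + b"Disallow: /private/page\n" * 25_000
print(robots_txt_fits(rules))       # False: the file is over the limit
print(truncated_view(rules)[-30:])  # note the final line is cut mid-directive
```

Running this against your own robots.txt (e.g. `robots_txt_fits(open("robots.txt", "rb").read())`) is a quick sanity check before the file grows unnoticed.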
- John, do you agree with all this? Thanks for your answer!!! (Feb 1, 2012)
- you just want me to eat that hat, don't you? :)
There is a good chance that John will not answer. It's not rudeness. One of the benefits of social media like Google+ (over, say, email) is that we can all help each other out; John would quickly become overwhelmed if he were expected to answer every question anyone fired at him. On the other hand, if he happened to see someone posting factually incorrect information on a thread he started, he would probably jump in and correct it. (Feb 1, 2012)
- Hope so. I think this question needs an official answer from a Googler. Hopefully +John Mueller will confirm whether our robots.txt file is too large (and update the documentation) :) (Feb 1, 2012)
- Big b, little B, they look so similar :). Yes -- this is for bytes. I'll tweak that doc with the next update. Thanks! (Feb 1, 2012)
- Thank you, John!! (Feb 1, 2012)
- And +Alan Perkins, sorry, you will have to eat something else tonight ;))) (Feb 1, 2012)