A robots.txt file can be important for a sufficiently large website that wants to keep some of its pages away from search engines. Most smaller websites won't need this file at all, but for larger sites it can be difficult to create and maintain a correct robots.txt, because finding the directives within a large file that are blocking individual URLs can be tricky. For that, you need a debugging or testing tool to check your robots.txt for problems. And as always, Google is there to help.
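Before reaching for a dedicated tool, you can also do a quick programmatic check. Below is a minimal sketch using Python's standard-library urllib.robotparser to test whether specific URLs are blocked for a given crawler; the example.com URLs and the paths being tested are placeholders, not real pages.

```python
from urllib.robotparser import RobotFileParser

# Placeholder robots.txt location -- swap in your own site's file.
rp = RobotFileParser("https://www.example.com/robots.txt")
rp.read()  # fetch and parse the live robots.txt

# Check whether individual URLs are blocked for Googlebot.
for url in [
    "https://www.example.com/",
    "https://www.example.com/private/report.html",  # hypothetical page
]:
    allowed = rp.can_fetch("Googlebot", url)
    print(url, "->", "allowed" if allowed else "blocked")
```

This only tells you whether a URL is blocked, not which directive is responsible, which is exactly why a dedicated testing tool is still useful for large robots.txt files.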