According to Wikipedia:
"The Robots Exclusion Standard, also known as the Robots Exclusion Protocol or robots.txt protocol, is a convention to advise cooperating web crawlers and other web robots about accessing all or part of a website which is otherwise publicly viewable."
Website owners use the /robots.txt file to give instructions about their site to web robots; this is called the Robots Exclusion Protocol. With it, specific folders or files can be excluded from search engine indexing.
By using robots.txt we can keep our sensitive folders out of search results in a simple and effective manner. Keep in mind, though, that the protocol is purely advisory: it only works with cooperating crawlers and does not actually block access, so it should not be relied on as a security control.
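To see how this works in practice, here is a minimal sketch using Python's standard-library `urllib.robotparser`. The rules and the `/admin/` folder are hypothetical examples; a cooperating crawler performs a check like `can_fetch` before requesting each URL.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt contents: ask crawlers to stay out of /admin/,
# leaving everything else crawlable.
ROBOTS_TXT = """\
User-agent: *
Disallow: /admin/
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# A cooperating crawler checks each URL against the rules before fetching.
print(parser.can_fetch("*", "https://example.com/admin/config.php"))  # False
print(parser.can_fetch("*", "https://example.com/index.html"))        # True
```

Note that the check happens entirely on the crawler's side: a robot that ignores robots.txt can still request `/admin/` directly, which is why the file is a courtesy convention rather than protection.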