Technical · Beginner
Robots.txt
A text file that tells search engine crawlers which pages or sections of a website they should or shouldn't access.
Detailed Explanation
The robots.txt file implements the Robots Exclusion Protocol, a standard websites use to communicate with web crawlers. Placed at the root of a site (e.g. example.com/robots.txt), it can ask search engines not to crawl certain areas, like admin pages or duplicate content, while allowing access to important pages. Note that it is advisory: well-behaved crawlers honor it, but it does not enforce access control, so it should never be relied on to hide sensitive content.
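To see how crawlers interpret these rules, here is a minimal sketch using Python's standard-library `urllib.robotparser`; the file contents and paths are illustrative, not from any real site:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content for illustration
robots_txt = """\
User-agent: *
Disallow: /admin/
Allow: /assets/css/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# A compliant crawler would check permission before fetching a URL
print(rp.can_fetch("*", "https://example.com/admin/login"))      # blocked
print(rp.can_fetch("*", "https://example.com/assets/css/a.css")) # allowed
```

In practice a crawler would call `rp.set_url(...)` and `rp.read()` to download the live file instead of parsing a string.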
Examples
Blocking search engines from /admin/ directory
Allowing access to CSS and JavaScript files
Specifying crawl delay for different bots
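Combined into a single file, the examples above might look like the following sketch (paths and bot names are illustrative; note that Crawl-delay is a non-standard directive that Bing and Yandex honor but Google ignores):

```txt
# Block all crawlers from the admin area
User-agent: *
Disallow: /admin/

# Explicitly allow static assets
Allow: /css/
Allow: /js/

# Ask a specific bot to wait 10 seconds between requests
User-agent: Bingbot
Crawl-delay: 10
```

The file must be served from the site root (e.g. https://example.com/robots.txt) to take effect.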