What is Robots.txt?

A robots.txt file is a plain text file located in a website’s root directory that tells bots which parts of the site they may crawl and which they should ignore. Its main purpose is to prevent crawlers from overloading a site with requests, not to keep a web page out of Google, and it is honored only by well-behaved bots such as search engine crawlers. It plays a crucial role in shaping a website’s presence in search results and can significantly impact SEO, while also conserving server resources, protecting sensitive files, and pointing crawlers to the sitemap.

Here is a simple robots.txt file with two rules: the first blocks Googlebot from any URL whose path starts with /nogooglebot/, the second allows all other crawlers to access the entire site, and the final line tells crawlers where to find the sitemap:

User-agent: Googlebot
Disallow: /nogooglebot/

User-agent: *
Allow: /

Sitemap: https://www.example.com/sitemap.xml
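As a quick sketch of how these rules are evaluated, Python's standard-library urllib.robotparser can parse the two rule groups above and answer "may this bot fetch this URL?" (the example.com URLs are the placeholders from the sample file):

```python
from urllib.robotparser import RobotFileParser

# The same two rule groups as the sample file above.
rules = """\
User-agent: Googlebot
Disallow: /nogooglebot/

User-agent: *
Allow: /
""".splitlines()

parser = RobotFileParser()
parser.parse(rules)

# Googlebot is blocked from /nogooglebot/ but nothing else.
print(parser.can_fetch("Googlebot", "https://www.example.com/nogooglebot/page.html"))  # False
print(parser.can_fetch("Googlebot", "https://www.example.com/about.html"))             # True

# Every other crawler falls under the "User-agent: *" group and may fetch anything.
print(parser.can_fetch("SomeOtherBot", "https://www.example.com/nogooglebot/page.html"))  # True
```

Real crawlers do the same check before requesting a page; a bot that ignores the answer is exactly the kind of "bad bot" robots.txt cannot stop.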
