
robots.txt Basics

robots.txt is a plain-text file at the root of your site that tells search engine crawlers which pages or folders they may request and which they should stay out of. Note that it controls crawling, not indexing.

Impact on SEO

A misconfigured robots.txt can block search engines from crawling key pages. Pages that can't be crawled can't be properly indexed or ranked, so important content effectively disappears from search results and organic traffic drops.
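
The most common catastrophic mistake is a blanket Disallow. The two lines below tell every crawler to skip the entire site; shipping this to production (for example, leftover from a staging environment) hides every page from search engines:

User-agent: *
Disallow: /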

Example

# Rules for all crawlers
User-agent: *
Disallow: /admin/      # keep the admin area out of crawls
Disallow: /private/    # same for private content

# Tell crawlers where to find the sitemap
Sitemap: https://example.com/sitemap.xml

How to Fix

Never Disallow pages you want indexed, and don't block access to your sitemap.xml. Use Disallow only when genuinely necessary, and always test changes in Google Search Console before relying on them.
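
If you want to sanity-check rules before deploying, Python's standard-library urllib.robotparser can evaluate a robots.txt file against specific URLs. A minimal sketch, assuming example.com and the paths shown are placeholders for your own site:

from urllib import robotparser

# Fetch and parse the live robots.txt (example.com is a placeholder).
rp = robotparser.RobotFileParser()
rp.set_url("https://example.com/robots.txt")
rp.read()

# Check whether a generic crawler ("*") may fetch each URL.
for url in ["https://example.com/", "https://example.com/admin/login"]:
    verdict = "allowed" if rp.can_fetch("*", url) else "blocked"
    print(url, "->", verdict)

Running this prints an allowed/blocked verdict per URL, which makes it easy to confirm that /admin/ stays blocked while the homepage remains crawlable.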
