
Robots.txt in 2026: how to create, test, and avoid breaking indexation

A practical robots.txt guide: what to block, what not to block, why robots.txt does not remove pages from Google, and how to connect it with your sitemap.


Visual guide

Robots.txt decision tree

Robots.txt manages crawling. It is not a privacy or deindexing tool.

  • Need it crawled? Allow.
  • Wasting crawl budget? Disallow.
  • Need it hidden from search? Use noindex.
  • Need a map? Add a Sitemap line.

Robots.txt is a small file with large risk. One line can block important pages from crawling, while another can leave filters, parameters, and technical folders open for crawl waste.

The main rule: robots.txt controls crawling; it does not guarantee removal from search. Google explicitly warns that robots.txt should not be used as a way to hide web pages from search results.
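Deindexing works the other way around: the crawler must be able to fetch a page in order to see its noindex signal. A minimal sketch (the URL itself must not be disallowed in robots.txt):

```html
<!-- On the page you want removed from search results.
     If robots.txt blocks this URL, Googlebot never fetches
     the page and never sees this tag. -->
<meta name="robots" content="noindex">
```

For non-HTML files such as PDFs, the same signal can be sent with the `X-Robots-Tag: noindex` HTTP response header.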

What robots.txt is good for

| Job | Use robots.txt? | Note |
| --- | --- | --- |
| Reduce crawl waste | Yes | filters, parameters, technical folders |
| Block CSS/JS | Usually no | Google needs them to understand the page |
| Remove a page from the index | No | use noindex or a URL removal request |
| Point to the sitemap | Yes | useful for search engines |
| Hide private data | No | use authentication, not robots.txt |

Basic example

User-agent: *
Disallow: /wp-admin/
Disallow: /*?sort=
Allow: /wp-admin/admin-ajax.php
Sitemap: https://example.com/sitemap.xml

Before publishing, check that you are not blocking service pages, blog pages, images, CSS, or JavaScript needed for rendering.
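A quick sanity check is possible with Python's standard library. One caveat: `urllib.robotparser` implements the original exclusion standard only, so wildcard rules like `Disallow: /*?sort=` and Google's longest-match precedence for `Allow` are not evaluated; for those, use Search Console's robots.txt tester. A sketch for the simple rules:

```python
from urllib.robotparser import RobotFileParser

# Simple prefix rules only: urllib.robotparser ignores Google's
# wildcard extension, so patterns like "/*?sort=" need another tool.
ROBOTS_TXT = """\
User-agent: *
Disallow: /wp-admin/
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# Pages that must stay crawlable
print(parser.can_fetch("*", "https://example.com/blog/post-1"))  # True
print(parser.can_fetch("*", "https://example.com/style.css"))    # True

# The admin folder is blocked
print(parser.can_fetch("*", "https://example.com/wp-admin/options.php"))  # False
```

Running this against a draft robots.txt before deployment catches the most expensive mistake: accidentally disallowing pages you need indexed.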

Common competitor mistakes

  • blocking the entire site with Disallow: /;
  • blocking theme resources (CSS/JS) and hurting rendering;
  • blocking URLs and expecting them to disappear from the index;
  • forgetting the Sitemap directive;
  • not re-checking robots.txt after a redesign or migration.
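The first mistake deserves a concrete illustration, because a single character decides everything: Disallow with a slash blocks the whole site, while an empty Disallow allows everything.

```
# Mistake: blocks every URL on the site for all crawlers
User-agent: *
Disallow: /

# Intended: an empty Disallow value blocks nothing
User-agent: *
Disallow:
```

This is why robots.txt must always be re-checked after a migration: staging environments often ship with Disallow: / on purpose, and the line sometimes survives the move to production.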

Next step: if you are unsure, start with a technical audit, check the site in UNmiss, and add sem.chat so users do not get stuck after search visits.

Sources

SEOquick

Want to apply this to your site?

We will review the current situation, find the first growth levers, and suggest a practical working format.