Disallow only the homepage (/) and allow all other pages in robots.txt


I need to prevent Google's web crawler from crawling only my homepage, located at /

But I need to allow all the other pages to be crawled. How can I achieve that?

I tried doing:

User-agent: *
Disallow: /

User-agent: *
Disallow:

But it isn't working.

1 Answer

Tobias Schwarz (accepted answer)

You need to use the following for this:

User-agent: *
Disallow: /$

The path of each URL is compared against the Disallow directives. $ designates the end of the match pattern, so this Disallow directive matches only https://example.com/ and not https://example.com/foo. (The $ end-anchor is supported by Google and is part of RFC 9309, the Robots Exclusion Protocol standard.)
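To see why the anchor matters, here is a minimal sketch of the matching logic a $-aware crawler applies. The `matches` helper is hypothetical (not part of any library); it translates a robots.txt pattern into a regular expression, treating a trailing $ as an end-of-path anchor and an unanchored pattern as a prefix match.

```python
import re

def matches(pattern: str, path: str) -> bool:
    """Sketch of robots.txt pattern matching with '*' and trailing '$' support."""
    anchored = pattern.endswith("$")
    core = pattern[:-1] if anchored else pattern
    # '*' matches any character sequence; everything else is literal.
    regex = "".join(".*" if ch == "*" else re.escape(ch) for ch in core)
    if not anchored:
        regex += ".*"  # unanchored patterns match any continuation of the path
    return re.fullmatch(regex, path) is not None

print(matches("/$", "/"))     # homepage: disallowed
print(matches("/$", "/foo"))  # other pages: still crawlable
print(matches("/", "/foo"))   # unanchored '/' would block everything
```

This illustrates the difference between the original attempt (`Disallow: /`, which matches every path as a prefix) and the answer (`Disallow: /$`, which matches only the bare homepage path).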