Keeping robots.txt blank


I have a couple of WordPress sites, and with the current Google SEO algorithm update a site should be mobile friendly (here).

My query is as follows. Currently I have rules in robots.txt to disallow crawling of the URLs that start with wp-:

User-agent: *
Disallow: /cgi-bin
Disallow: /wp-admin
Disallow: /wp-includes
Disallow: /wp-content/plugins
Disallow: /feed
Disallow: /*/feed
Disallow: /wp-login.php

I don't want Google to crawl the above URLs. Earlier this worked fine, but since the recent Google algorithm update, disallowing these URLs causes errors in the mobile-friendly test (here), because all my CSS and JS files sit behind the wp- URLs. How can I fix this? Any suggestions appreciated.


1 Answer

Seb

If you keep the crawler away from those files, your page may look and work differently for Google than it does for your visitors. This is what Google wants to avoid. There is no problem in allowing Google to access the CSS or JS files, since anyone who can open your HTML source and follow the links can access them anyway.

Therefore Google definitely wants to access the CSS and JS files used on your page: https://developers.google.com/webmasters/mobile-sites/mobile-seo/common-mistakes/blocked-resources?hl=en

Those files are needed to render your pages.

If your site’s robots.txt file disallows crawling of these assets, it directly harms how well our algorithms render and index your content. This can result in suboptimal rankings.

If you depend on mobile rankings, you should follow Google's guidelines. If not, feel free to block the crawler.
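
One common approach is a sketch like the following (the exact paths are assumptions; adjust them to where your theme and plugins actually keep their assets). It keeps the sensitive areas blocked but adds Allow rules for static CSS and JS files. Googlebot supports Allow and the * wildcard, and when rules conflict it applies the most specific (longest) matching rule, so the Allow lines win over the broader Disallow lines:

# Sketch only: paths below are examples, adapt to your setup
User-agent: *
Disallow: /cgi-bin
Disallow: /wp-admin
Disallow: /wp-includes
Disallow: /wp-content/plugins
Disallow: /feed
Disallow: /*/feed
Disallow: /wp-login.php
Allow: /wp-includes/*.css
Allow: /wp-includes/*.js
Allow: /wp-content/plugins/*.css
Allow: /wp-content/plugins/*.js

Note that wildcard and Allow handling is not part of the original robots.txt standard; Google (and Bing) support it, but other crawlers may ignore it. After changing the file, re-run the mobile-friendly test to confirm the blocked-resources errors are gone.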