I have enabled the Yoast Premium setting "Prevent crawling of internal site search URLs", i.e. "Add a disallow rule to your robots.txt file to prevent the crawling of URLs like `?s=`, `/search/` and `/page/*/?s=`", under the Advanced Crawl optimization settings. It does not seem to be working on my WordPress site (example.com) where the Yoast Premium plugin is installed: according to the setting's description, a disallow rule should be added to robots.txt, but that does not happen at my end.
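Based on the option text, I would expect disallow rules along these lines to appear inside the Yoast block (this is my reading of the setting's description, not output I have been able to confirm):

```
User-agent: *
Disallow: /?s=
Disallow: /search/
Disallow: /page/*/?s=
```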
This is my robots.txt after enabling and saving the option "Add a disallow rule to your robots.txt file to prevent the crawling of URLs like `?s=`, `/search/` and `/page/*/?s=`":
```
# START YOAST BLOCK
# ---------------------------
User-agent: *
Disallow:
Sitemap: https://example.com/sitemap_index.xml
# ---------------------------
# END YOAST BLOCK
```
No change is made to this robots.txt file placed in the root of the WordPress site. Why?
I have since observed that while the crawl directives are not added to the physical robots.txt file in the root, they have already been added to the dynamically generated robots.txt output here: https://example.com/?robots=1. The code added there is the Yoast output, with disallow rules like the ones shown above.
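For context, my understanding of the mechanism is that WordPress serves its virtual robots.txt (the `?robots=1` output) only when no physical robots.txt file exists in the web root, and that plugins append their rules through the core `robots_txt` filter. Here is a minimal sketch of that hook, with illustrative rules rather than Yoast's actual code:

```php
<?php
// Minimal sketch of how a plugin extends WordPress's dynamically
// generated robots.txt. Core applies the 'robots_txt' filter to the
// virtual output; a physical robots.txt file in the web root is served
// directly by the web server, so filtered rules never reach it.
// The disallow rules below are illustrative, not Yoast's actual code.
add_filter( 'robots_txt', function ( $output, $public ) {
    $output .= "Disallow: /?s=\n";
    $output .= "Disallow: /search/\n";
    $output .= "Disallow: /page/*/?s=\n";
    return $output;
}, 10, 2 );
```

If that is correct, it would explain why the directives show up at https://example.com/?robots=1 but not in the file in the root.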