I am getting the Soft 404 error in Google Search Console. A soft 404 happens when a URL returns a 200 (success) status code along with a page informing the user that the page does not exist. In other words, the page shows a user-friendly message such as "Sorry, No Results found" but does not return a 404 HTTP response code. Based on its content, the page is an error page, so Google Search Console reports it as a soft 404 in the site's page indexing report, and such pages are not indexed or served on Google.

Suppose the page is https://example.com/?s={search_term_string} and I do not want Googlebot to even discover it when it crawls the site's pages through a submitted sitemap. How can I disallow it in the robots.txt file so that Googlebot ignores it during the next crawl? Thanks
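To make the "soft 404" idea concrete: it is identified by the body's content, not by the status code. The helper below is purely hypothetical (Google's own soft-404 detection is far more sophisticated), but it illustrates the distinction the question describes:

```python
# Hypothetical illustration of what a "soft 404" is: the HTTP status says
# success, but the body is really an error page.
def looks_like_soft_404(status_code: int, body: str) -> bool:
    error_phrases = ("no results found", "page does not exist", "not found")
    # A soft 404 pairs a 200 status with error-page wording in the body.
    return status_code == 200 and any(p in body.lower() for p in error_phrases)

print(looks_like_soft_404(200, "Sorry, No Results found"))  # True  (soft 404)
print(looks_like_soft_404(404, "Not Found"))                # False (a real 404)
print(looks_like_soft_404(200, "Welcome to my shop"))       # False (normal page)
```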
You can edit your robots.txt file in WordPress by following the steps below:

1. Install and activate the FileSter plugin. It lets you manage site files such as .htaccess, robots.txt, etc.
2. Locate the robots.txt file and open it with the textarea editor.
3. Add the following code to the robots.txt file and save it (the line that matters for your case is Disallow: /?s=*):
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /*?show
Disallow: /*search?search
Disallow: /?s=*
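If you want to sanity-check rules like these before deploying them, Python's standard-library urllib.robotparser can evaluate a robots.txt against a URL. One caveat: urllib.robotparser implements the original prefix-matching rules and does not understand the * wildcard that Googlebot supports, so this sketch uses the plain prefix form Disallow: /?s= (which Googlebot also honors, since it blocks every URL beginning with /?s=):

```python
from urllib.robotparser import RobotFileParser

# Rules equivalent to the ones above, written in the prefix form that
# urllib.robotparser understands (no "*" wildcards).
rules = """\
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /?s=
""".splitlines()

rp = RobotFileParser()
rp.parse(rules)

# The internal search URL is blocked...
print(rp.can_fetch("*", "https://example.com/?s=shoes"))  # False
# ...while normal pages remain crawlable.
print(rp.can_fetch("*", "https://example.com/about/"))    # True
```

This only checks crawling rules locally; Search Console's robots.txt report remains the authoritative check for how Googlebot itself interprets the file.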
4. Go back to Google Search Console, open the soft 404 error report, and start validation again. This time the page will be ignored.

You can also refer to Google's own robots.txt file at any time to understand how this file works.