SEO

Robots.txt That Returns A 500/503 HTTP Status Code For An Extended Time Will Remove Your Site From Google

Written by admin



Gary Illyes from Google said on LinkedIn that if your server returns a 500/503 HTTP status code for an extended period of time for your robots.txt file, then Google may remove your website completely from Google Search.

That is the case even if the rest of your website is accessible and not returning a 500 or 503 status code.

It isn't only a 500/503 HTTP status code that you need to worry about; it is also a problem if your website has network timeout issues.
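To make those failure modes concrete, here is a small sketch that fetches a site's robots.txt and labels the outcome along the lines described in this article. This is my own illustration, not Google's actual logic; the function names and risk labels are hypothetical.

```python
import socket
import urllib.error
import urllib.request


def classify_robots_status(status_code=None, timed_out=False):
    """Rough risk label for a robots.txt fetch, per the behavior
    described above. The labels are illustrative, not official."""
    if timed_out:
        return "risky: network timeout, treated like a server error"
    if status_code in (500, 503):
        return "risky: 5xx for an extended period can deindex the site"
    if status_code == 404:
        return "ok: missing robots.txt means crawling is allowed"
    if status_code == 200:
        return "ok: robots.txt served normally"
    return "unclear: check how Google handles this status code"


def check_robots(url, timeout=10):
    """Fetch a robots.txt URL and classify the result."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return classify_robots_status(resp.status)
    except urllib.error.HTTPError as err:  # 4xx/5xx responses
        return classify_robots_status(err.code)
    except (urllib.error.URLError, socket.timeout):  # DNS errors, timeouts
        return classify_robots_status(timed_out=True)
```

Something like `check_robots("https://example.com/robots.txt")` would then tell you at a glance whether your robots.txt is in the danger zone Gary describes.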

Again, it needs to be for an “extended period of time,” which was not defined, but I assume it is more than just a day or two.

Gary wrote, “A robots.txt file that returns a 500/503 HTTP status code for an extended period of time will remove your site from search results, even if the rest of the site is accessible to Googlebot.” “Same goes for network timeouts,” Gary added.

Gary referred to the HTTP docs and added, “if we can't determine what's in the robotstxt file and the server doesn't tell us a robotstxt file doesn't exist, it'd be much more harmful to crawl as if everything was allowed (eg. we'd index martin's awkward hat pictures by accident).”

We know that Google recommends using a 503 server status code for when your website goes offline or down temporarily for less than a day (hours, not several days). If it goes offline for longer, then try to put up a static version of your website instead.
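For a planned outage of a few hours, one way to follow that advice is to answer every request with a 503 plus a Retry-After header so crawlers know the outage is temporary. A minimal sketch using Python's standard-library http.server; the port and the one-hour retry hint are arbitrary choices for illustration, not a recommendation from Google:

```python
from http.server import BaseHTTPRequestHandler, HTTPServer


class MaintenanceHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # 503 signals a temporary outage; Retry-After hints when
        # crawlers should come back (here: one hour, in seconds).
        self.send_response(503)
        self.send_header("Retry-After", "3600")
        self.send_header("Content-Type", "text/html; charset=utf-8")
        self.end_headers()
        self.wfile.write(b"<h1>Down for maintenance</h1>")

    def log_message(self, fmt, *args):
        # Keep this sketch quiet; a real server would log requests.
        pass


# To run: HTTPServer(("", 8080), MaintenanceHandler).serve_forever()
```

The key point is that the 503 applies to robots.txt too, which is fine for hours, but per Gary's warning, leaving it in place for an extended period risks the whole site dropping out of search results.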

Just be careful with long downtime, not that you likely have a choice.

Forum discussion at LinkedIn.
