Website outages and blackouts the right way

tl;dr: Use a 503 HTTP status code but read on for important details.

Sometimes webmasters want to take their site offline for a day or so, perhaps for server maintenance or as political protest. We're currently seeing some recommendations being made about how to do this that have a high chance of hurting how Google sees these websites, so we wanted to give you a quick how-to guide based on our current recommendations.

The most common scenario we’re seeing webmasters talk about implementing is to replace the contents on all or some of their pages with an error message (“site offline”) or a protest message. The following applies to this scenario (replacing the contents of your pages) and so please ask (details below) if you’re thinking of doing something else.

1. The most important point: Webmasters should return a 503 HTTP status code for all the URLs participating in the blackout (parts of a site or the whole site); a minimal configuration sketch follows the list below. This helps in two ways:

a. It tells us it's not the "real" content on the site and won't be indexed.

b. Because of (a), even if we see the same content (e.g. the “site offline” message) on all the URLs, it won't cause duplicate content issues.

2. Googlebot's crawling rate will drop when it sees a spike in 503 headers. This is unavoidable but as long as the blackout is only a transient event, it shouldn't cause any long-term problems and the crawl rate will recover fairly quickly to the pre-blackout rate. How fast depends on the site and it should be on the order of a few days.

3. Two important notes about robots.txt:

a. As Googlebot is currently configured, it will halt all crawling of the site if the site’s robots.txt file returns a 503 status code for robots.txt. This crawling block will continue until Googlebot sees an acceptable status code for robots.txt fetches (currently 200 or 404). This is a built-in safety mechanism so that Googlebot doesn't end up crawling content it's usually blocked from reaching. So if you're blacking out only a portion of the site, be sure the robots.txt file's status code is not changed to a 503.

b. Some webmasters may be tempted to change the robots.txt file to have a “Disallow: /” in an attempt to block crawling during the blackout. Don’t block Googlebot’s crawling like this as this has a high chance of causing crawling issues for much longer than the few days expected for the crawl rate recovery.

4. Webmasters will see these errors reported in Webmaster Tools; it will show that we saw the blackout. Be sure to monitor the Crawl Errors section particularly closely for a couple of weeks after the blackout to ensure there aren't any unexpected lingering issues.

5. General advice: Keep it simple and don't change too many things, especially changes that take different times to take effect. Don't change the DNS settings. As mentioned above, don't change the robots.txt file contents. Also, don't alter the crawl rate setting in WMT. Keeping as many settings constant as possible before, during, and after the blackout will minimize the chances of something odd happening.
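To make point 1 concrete, here's a minimal sketch of one way to set this up on Apache with PHP, along the lines of the example shared in the comments below. The file name blackout.php, the message, and the 24-hour Retry-After value are illustrative assumptions, not a prescribed setup; note that robots.txt is deliberately excluded from the rewrite so it keeps returning 200, per point 3a.

# .htaccess - send every request except robots.txt and the blackout
# script itself to the script that returns the 503
RewriteEngine On
RewriteBase /
RewriteCond %{REQUEST_URI} !^/robots\.txt$
RewriteCond %{REQUEST_URI} !^/blackout\.php$
RewriteRule ^(.*)$ /blackout.php [L]

<?php
// blackout.php - send the 503 status plus a Retry-After hint, then
// print the "site offline" or protest message.
header('HTTP/1.1 503 Service Temporarily Unavailable', true, 503);
header('Retry-After: 86400'); // 24 hours; adjust to your planned downtime
?>
<html><body><h1>Site temporarily offline</h1></body></html>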

Questions? Comment below or ask in our forums:
I'll share this out a bit later ... hopefully we can catch as many as possible before things go horribly wrong.
There is a WordPress plugin that people are advocating to automatically black out your WP site during the protest ( It should be noted that the plugin uses a 302 redirect, not a 503 status code.

I think that since the plugin is using a 302 (temporary) rather than a 301 (permanent) it should not negatively impact Google crawling, right? This article from Matt Cutts seems to indicate that it might cause URLs to be re-indexed to the new destination page until it is recrawled (

I'll defer to the google experts on what impacts that may have.
Can I fetch a 'blackout' message with AJAX and overlay it on the page for all pages, or will this too be considered duplicate content?
Pierre, what about a 502 status code for blackouts? It'll do the same for Google. We're currently using this when our main application is down but the HTTP proxy is still up.
You could use a jQuery dialog to simply create a modal window and display the text. We just need to make users aware of the protest. This will have the least impact on your domain and content.
+Adam Covati Cross-domain 302 redirects need careful consideration, as Matt explained in his post. If you think about it, for a 302 redirect from URL A to URL B (where they're on different domains), most of the time we'd want to show the user the destination. An important example situation where this comes up is if URL A is a URL-shortener URL; in that case users would really want us to return URL B in our search results.

Also consider if the destination's servers will be able to handle your and other websites' traffic being redirected to it.

So it's best to stick with having URL A return a 503 header and perhaps link to URL B.

+Yuli Cherkashin Depends on how you've set it up. Please see this blog post for considerations: If you have a specific implementation, please ask with details on the forums.

+Matthias Dietrich 502 would work too; actually, any 5xx error status code would. We use 503 because it's the most semantically apt status code in this situation.
+Adam Covati I would suggest using a 503 response and a JavaScript redirect if you want to do that - it tells Googlebot to ignore this and come back later, but will still redirect real users.
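To illustrate that suggestion, here's a rough sketch under assumptions (not an official snippet): the server sends the 503 to everything, and a small client-side script sends human visitors on to a placeholder protest page.

<?php
// Sketch: the 503 status tells Googlebot the page is temporarily
// unavailable and to come back later; browsers still render the body.
header('HTTP/1.1 503 Service Temporarily Unavailable', true, 503);
header('Retry-After: 3600'); // an illustrative one-hour hint
?>
<html>
<body>
<p>We're offline in protest today. <a href="http://example.com/protest">Read why</a>.</p>
<script>
// Placeholder destination for human visitors; adjust or remove as needed.
window.location.href = "http://example.com/protest";
</script>
</body>
</html>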
Sounds like the perfect case for cloaking. I know, I know, bad girl for even saying the word out loud.
For those of you thinking of using JS-based solutions:
will that be processed by G?
Will it show in their render?
What happens to your Instant Preview if G crawls and sees it?

Stick to a Standard and Established method - use a 503 response.
Why not pause Googlebot for the blackout? :P
This all depends on how much indexed content you want Google to retain. Issuing 503 Service Unavailable halts crawling, therefore Google's cache retains more content from a previous indexing. If, however, you want to use HTTP codes for their purpose (a temporary replacement page is not a server error, so the most simplistic content replacement still returns 200 OK) then a redirection is more appropriate. 303 See Other is close, but I consider 307 Temporary Redirect to be more accurate. The only problem is old non-HTTP/1.1-capable browsers that don't understand 307 Temporary Redirect but do understand 303 See Other (and treat them like 302 Found).

+Matthias Dietrich 502 Bad Gateway is even worse than 503 Service Unavailable. Altering the well-defined meanings of error codes would also make it problematic if a true 502/503 occurred. And what about all the site monitoring systems that look for error codes? You've got to turn them off (and remember to turn them on again once you're done).
Great information. Thank you.

+Pierre Far What about the use of an overlay (loaded via JS) which leaves the content intact and still barely visible (in modern browsers)? It's not a "real blackout" in that the content can be accessed with a bit more effort. Here's a quick sample of what I refer to: It's a lot simpler to add this to several sites in my case than to do the 503 on each one of several, but I don't want to misstep.
+Dale Allyn
As I just mentioned - that may result in G indexing the "page" as that ... and it showing in your Instant Preview?

Aside from that, I think the overlay approach should be safe enough ....... so long as you provide a way to turn it off?
(else it could poss. be construed as cloaking?)
> Googlebot's crawling rate will drop when it sees a spike in 503 headers

+Pierre Far are you saying we need to take great care any time we deliberately implement a policy to return an HTTP 503 response for some URLs, or will it only happen when most or all requests result in a 503? We were about to start issuing 503 responses for a class of URLs we would really rather people/bots weren't requesting, which represent about 30% of our millions of indexed URLs, but if this is going to cause us problems with the other 70% of URLs that we do want crawled and indexed then we'll need to rethink that strategy and maybe use a 400-series response instead.
I remember, once upon a time when I was a TC, asking for G to provide proper information about how G handles the different responses.

Maybe, one day ...... they will?
+Lyndon NA Right. In the cases where I'm considering using the approach I mention, the text color and content are far from the overlay color, so I'm expecting that it should pass the "cloaking test". The content is still available to Safari Reader and such as well. Delicate mine field, this. ;)
+Pierre Far +John Mueller I think 503's are really useful and we always encourage them. It may sound odd, but some of our biggest clients are the ones that have the most problems with this due to the complexity of load balancers, firewalls and general datacenter setup. We can't always account for all server outages, but in most cases we know in advance when scheduled downtime is about to happen. I expect some other big websites have the same problem. Therefore would it be possible to add a "Scheduled Downtime" feature to Google Webmaster Tools that had the same effect on crawling as a 503? This could work in one of two ways:

1) The simplest and probably safest. Through the console we request a retry-after period and have an option of, say, 1hr-24hrs. The request expires automatically at the end of the period.

2) Like the Google Search Appliances give us an option to schedule or more specifically un-schedule crawls.

I think option 1 is the safest for most use cases though, and would also be handy for lots of webmasters/designers who aren't comfortable tweaking status headers at server level.
The suggestion by +Edward Cowell would also cover the number of site owners/designers that either lack access to alter server responses, or the technical skills to do it (without risk of screwing themselves).
+Edward Cowell Would you also want the clients' sites to disappear from the search results at the same time ... or perhaps remain in the results but be tagged with a "Site Is Down For Scheduled Maintenance" message to discourage (but not prevent) clickthrough?
+Lyndon NA that's what I was thinking too. Your average web designer doesn't really know how to do this, and if they have to do a technical implementation probably wouldn't. A GWT option would be very useful for them.
Hi +Alan Perkins for now I am purely thinking of a way of overcoming the issue of implementing 503's, not a functional change to their impact on results. As I see it, they are fit for purpose, just tricky to implement in some instances. Also it's prudent to try and keep the workings as close to the current 503 specification as possible so that it requires the least changes in Googlebot functionality. Effectively I propose Google just update Googlebot to make Webmaster Tools an accepted 'alternative' first source for the 503 status headers for a given website, the same as they might receive from that website anyway... if the website could implement them :-)

Changing the display results? - It might be a good idea, I imagine that's a more complex and different consideration, as then you are also bringing query handling into the problem. The Googlers might have some thoughts on this.

ps. See you at SES London.
+Edward Cowell
LOL. Well, I would have said most SEOs also don't know how ... but as SEOmoz seems to have caught up with the few of us (we covered this ages ago in GWC forums), that may mean at least a higher % of SEOs at least know about it (betting most still haven't got a clue how to implement it though :D).

It would be a nice feature ... but I wouldn't hold your breath. G tends not to implement anything unless it is loudly requested by the masses, or benefits them in some way.

+Alan Perkins
Why would you want the SERPs updated automatically?
There are instances when you want the site active, but simply not crawled ... the 503 is perfect for that.
So if G were to implement it (unlikely), you'd also want some sort of flag to indicate whether you want the site noted as "down" or not.
+Mark Hughes +Dale Allyn Using a DIV on top of the page could result in the content from that DIV being indexed temporarily. If you use JavaScript to inject it from a robotted (robots.txt-disallowed) JavaScript file, that will help, but it may still show up on the Instant Preview. It usually cleans itself up quickly though (the preview image might take longer to refresh), but if that's ok with you, then it'll work as well.

+Jake Weisz Pausing Googlebot would mean that updates on active websites won't get indexed either (eg maybe there are other kinds of breaking news that people want to know about).

+Alan Perkins If these are URLs that people & crawlers shouldn't access, maybe a 404 or 410 would be better suited? Using a 503 as a permanent state isn't optimal because it might be interpreted as a 404 in the long run anyway.

+Edward Cowell A "scheduled downtime" feature in Webmaster Tools sounds interesting, but I'm really wary about it adding unnecessary complexity (and errors) for most users. Other possibilities could be to just block (503) the robots.txt (keeping in mind that it's cached for about a day), or to set the DNS to a low TTL and direct traffic to a different host for the interim.
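For anyone weighing that first option, here's a hedged sketch assuming a reasonably recent Apache with mod_rewrite (the rule and its placement in .htaccess are illustrative, not an official recommendation): returning 503 for robots.txt alone halts crawling of the whole site while the rule is in place, as described in point 3a of the post, while every other URL keeps serving normally.

# .htaccess sketch - return 503 only for the robots.txt fetch
# (assumes mod_rewrite); remove the rule to let crawling resume.
RewriteEngine On
RewriteRule ^robots\.txt$ - [R=503,L]

The appeal is that it's a single rule to add and later remove, rather than touching every URL on the site.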
+John Mueller Honestly, I would hope people needing breaking news would go directly to the sources. Though perhaps on the 18th, SOPA is all people need to hear about anyhow? :P I would just suggest that pausing Googlebot would be almost like putting the pause button on Google, without actually shutting down the worldwide need of having access to reliable internet search.

With a bunch of major sites doing full blackouts, Google could practically "pause the internet" for a day. ;)
Thank you +John Mueller . That is consistent with my understanding and is a suitable option for some of my needs. In the case of my example, the DIV contains no content (or a small bit of content I don't mind being indexed) and is simply styled to mask the page. And yes, the DIV is to be injected after page load. The preview refresh is the only issue and will be a minor one in some cases (nearly all of mine).
For those interested, my company just released a JavaScript widget that will allow you to black out your site with an interstitial on the 18th. This will allow you to join in the protest on the 18th without having to worry about the details of building a custom page. Since it's just JavaScript, it's Google friendly. You can learn more, see what it looks like and grab the code at:
I just posted this on Hacker News and it's worth highlighting here when thinking about a Javascript-based blackout scenario (as opposed to a site outage situation).

A well-implemented Javascript overlay for the blackout message is a valid option in terms of search-friendliness, but keep in mind the following when thinking about it:

1. Googlebot does run some types of Javascript. We blogged about this recently:

This means that the content included via Javascript may be indexed until the next time we crawl and process the page and find it's gone.

2. Consider how the JS will affect your pages' instant previews. Overlays are likely to show up in the previews, and these will take some time to update as we re-generate the previews.

3. Consider your users. Some webmasters are suggesting keeping the blackout overlay visible without any means of hiding it. That may not be the best user experience, and may be annoying to your non-US based users.
Nice - G is getting faster.
That only took 2:30 hours :D

(Yes, a little sarcastic - but also genuinely impressed, that IS fast for official from G!)
I recall writing an include file for Apache for our previous blackout campaign at La Quadrature du Net:

It causes a complete blackout of the server.
Feel free to reuse and adapt to your needs, the html code at least.

Another reason for using 503 instead of a redirect, as mentioned in this file, is that redirects may make the browser send private information that the redirect destination has no reason to obtain.
John, it's usually technically no more difficult to 503 a robots.txt than an entire website, so in those cases where we can implement them that's not the problem. However, having a switch in GWT would be a heck of a lot easier than getting it past some of the IT teams we work with, and I don't think it adds complexity. If anything it makes it easier.
That's actually a fair point +Enrico Altavilla ... else we may find a bunch of people using just the 503 status, and that may get a little ugly.
That said - can G confirm that they do adhere/acknowledge/at least try to pay attention to the Retry-After value?

(If JohnMu responds, can someone please C&P his answer, in case I don't see it)
Also, I would like to discourage the use of any type of server-side redirection (3xx), regardless of what you read in some articles out there. While a 503 status puts Google in a "stasis", a 3xx status code gives updated information about the URLs of your resources and you cannot be sure about how that new information will be processed and used by Google. Don't mess with the URLs and the canonicalization of your resources: even a 302 -> 503 scenario can produce unwanted effects, difficult to recover from. I would avoid that.
+Lyndon NA (and +Edward Cowell) >> Why would you want the SERPs updated automatically? <<

Because the idea is that the site is down for scheduled maintenance, so the searcher experience is going to be lessened if they visit that site.

>> There are instances when you want the site active, but simply not crawled ... the 503 is perfect for that. <<

Yep, I was talking specifically about a "site is down for scheduled maintenance" flag in WMT - not a 503, which you could continue to use as at present.

+John Mueller >> If these are URLs that people & crawlers shouldn't access, maybe a 404 or 410 would be better suited? <<

Yes, maybe. Technically a 503 is more accurate but it seems a 404 would be a more prudent response.
Random point - old versions of IE used to replace the server-returned error message with a standard one if the server-returned error page was less than some number of bytes (512, IIRC, but maybe 1024). If you want to be sure people browsing to the site see the protest message, make sure the content of the error page is large enough not to cause IE to display its stock message.

New versions of IE may also do this, but I haven't had to touch windows in a while so can't verify myself.
+Alan Perkins
Thanks for the clarification.

If I remember correctly, and if G hasn't changed things ...
... I believe that if GoogleBot hits a lengthy 503 (multiple tries and the same response), it will drop the result from the SERPs?
It's important to note - it does not drop it from the Index, it simply filters it out of the SERPs.

Maybe they could flag/tag the listing with a "seems out of order?" message ... and possibly slap in a nice "cached version here" link?
(As I'm sure most people don't realise there is a cache, nor how to use it?)
+Lyndon NA Good idea, I like it.

+Ralf Bachmann All the answers are in Dutch so I'm not sure if someone said this exactly (despite having run the responses through Google Translate!), but in that instance I'd close the shop but not the site. Many Jewish retail sites, for example, have a message like "Please note weekend orders, placed on Fridays after 11:00 AM ET, will be processed on Monday." Bots continue to crawl and index as normal, but no transactions between humans take place.
The simplest thing that could be done is to pull out all your server power cables (or hard lines, as some may call it) for a few hours and then plug them back in again ... this way your energy is saved too, and let Google show whatever error it wants to show ... no messing around with codes or Google bots. Surely, what is the meaning of a blackout if you want your site to still retain the same rank in Google? What if we drop a few ranks ... no problem, but we want people to know what happens when someone messes with internet censorship.
What's the best way to do this on a Google Blogger site?
Anyone have a "layman's term" version of this? I would like to take my website offline in solidarity with the protest, but all this is way over my head (though I would love to learn what it all means...).
+Pierre Far Actually, I think I got it. In web.config, it must be set up as such:

<httpErrors errorMode="Custom">
  <error statusCode="503" path="yourfile.htm" responseMode="File" />
</httpErrors>

This looks to be returning a 503 correctly and still showing the file.
That's the problem: I wouldn't mind switching over to a 503 page but don't know how on Blogger, Posterous, or the Typo blog on my own host.
Hi +Nathan Chase. I'm not familiar with ASP.NET so I can't confirm. It's best if you ask your server admins or your hosting provider as they would know the best configuration change for your particular setup.
Heh, well, I am the server admin and the host is Windows Azure, so it was all on me to figure it out. It does work (despite Microsoft technical docs stating you can't do it). :)
+John Mueller URL is, I'm not sure what kind of site it is, I use Dreamweaver to edit it. I know remarkably little about this stuff despite designing and maintaining my own website for over 10 years. I highly doubt the site actually gets any traffic anyway, but thank you so much for the offer of assistance.
Hi Pierre. Thanks for your article. I am comfortable with Apache and will be serving my site up with a 503 tomorrow. I am wondering if you have any suggestions (from the Google Spider point of view) for people who cannot control the headers or don't know how to change them. Is there something that can be done in the HTML for the blackout page? Thanks.
Pierre, thanks a million for this. I have a question about how caching services treat 503s. My site is Akamaized, and I woke up wondering if a caching site wouldn't treat a 503 as a "Don't fetch new content" signal and continue to display the cached page -- i.e. the site content I'm trying to black out. That would effectively foil my plan to serve up a SOPA-specific page instead of my standard "Service Unavailable" page. Know it's probably a question for elsewhere, but any insights?
+Brian Fitzgerald It probably depends on how your CDN handles that. I'd try to test that in advance just to be sure, or otherwise consider using something like the JavaScript from (with the caveat that it may result in any Instant Preview images that we generate during that time also showing that interstitial).
For some reason, my Googlebot visits have decreased a LOT during the last 4 hours (from thousands per hour to 10 per hour). Maybe it is related to SOPA? I have not increased 503 pages (at least not deliberately).
If you use PHP, for quick reference!

# .htaccess - send every request (except sopa.php itself) to the blackout script
RewriteEngine On
RewriteBase /
RewriteCond %{REQUEST_URI} !^/sopa\.php$
RewriteRule ^(.*)$ /sopa.php [L]

# sopa.php - return the 503 status with a Retry-After of 24 hours
<?php
header('HTTP/1.1 503 Service Temporarily Unavailable', true, 503);
header('Status: 503 Service Temporarily Unavailable');
header('Retry-After: 86400');
Hello everyone. We realize many webmasters are concerned about the medium-term effects of today's blackout. As a precaution, the crawl team at Google has configured Googlebot to crawl at a much lower rate for today only so that the Google results of websites participating in the blackout are less likely to be affected.
I'm in! I will be blacking out my website today!
+Pierre Far you know, that "Scheduled Downtime" feature for Google Webmaster Tools I was suggesting might have come in really handy today :-)
+Pierre Far Those are AWESOME INSTRUCTIONS!!! Well, what can you expect from an employee who works at Google? Brilliant!
I wasn't aware that returning a 503 for robots.txt will result in crawling being halted. Thanks a lot for mentioning it!!!
We are taking our blog down for a month! Is that too long a period for a 503 with an additional retry date set?

I haven't found any specific information about that. And how should having your site closed for a month (and then returning with the same content) be handled so as not to lose index rankings?

Any help would be appreciated!
Would this be acceptable:  IIS 7.5

Response.Status="503 Service is Unavailable"
Response.AddHeader "Location", "  /indexnew.jsp"

This is a redirect to our mobile site which will not be down
If these bills are, in fact, enacted, users who stream copyrighted content 10 times in 6 months may face up to five years in prison. Ridiculous!
Pierre - great and helpful info. How would I go about enabling an HTTP 503 status code for the site's robots.txt ONLY?
+Janne Kurkinen did you ever get an answer to your question about taking your blog down for a month - or did you ultimately try this 503 method?  We're facing a similar question.  Thanks for any help/suggestions you can provide!