What confuses or annoys you about moving your site? What would be your wish list for (say) Webmaster Tools or Google's indexing systems to help you? I want to know!

Let's define site moves properly as there are many kinds (and they're all in scope for this question :) ):

1. Site moves without URL changes. Here you change the underlying CMS or move to a different hosting provider without changing anything visible to users.

2. Site moves with URL changes. Here you're moving from HTTP to HTTPS URLs, or moving to a different hostname, or the paths of the URLs change (e.g. from to 

We're NOT talking about site redesigns.

Fire away :)
Hello our engineer Pierre, could you please explain this further in Arabic? ^_^
It would be nice to get a website migration report card indicating the number of migrated URLs, possible mismatches and errors, progress status of indexation of your new site.
+Pierre Far - I had some confusion around the correct application of a 301 redirect when moving from http to https. I don't know if that falls specifically into the WMT remit, but some [idiot's] guides would always be useful.
In the list of 404 errors you have a button to mark the error as fixed, but not all 404s need fixing. It would be nice to have an 'ignore' button (or a 'removed permanently' button).

After a site migration, Google often tries all the URLs it ever came across. Even with redirects in place, the list of errors can be very long (>10,000). If you could ignore URLs that are not relevant (asking Google not to show them again), it would be easier to find the URLs that do need redirecting.
I'd really like GWT to not pester me for years after I've removed a site from my account. Getting messages saying that it's not accessible is not really helpful when I no longer care.
Hi +Pierre Far. I sometimes have to restructure a website's folder structure similar to the process you have described above. How can I ensure a well-indexed and well-trafficked website holds its position in #serps? Is performing a page-for-page mapping in .htaccess (301) sufficient? Also, does this apply to moving a language section of a website onto a new subdomain?

Thanks and great idea.
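To illustrate the page-for-page .htaccess mapping asked about above: each old path gets its own `Redirect 301` line pointing at an absolute URL on the new host. A minimal Python sketch that generates those lines (the paths and hostname are hypothetical examples, not from this thread):

```python
# Sketch: generate Apache "Redirect 301" lines from an old -> new path mapping.
# All paths and the target host below are hypothetical examples.
def redirect_lines(mapping, new_host):
    """Build one 'Redirect 301 /old-path https://host/new-path' line per entry."""
    return [
        f"Redirect 301 {old} https://{new_host}{new}"
        for old, new in sorted(mapping.items())
    ]

for line in redirect_lines(
    {"/en/about": "/about", "/en/contact": "/contact"},
    "en.example.com",
):
    print(line)
```

Generating the rules from a single mapping table like this makes it easy to review every old path in one place before deploying the .htaccess file.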
When migrating a site between domains, handling the traffic to the new domain while the migration hasn't completed, so that the sum of the traffic to the original domain and the new domain roughly matches what the original alone used to receive.

I had a great discussion [1] with +Eric Wu on Google+ about migrating between sites. When possible, he opts for rel=canonical and, once the traffic is moving over to the new destination URL, then implements 301 redirects. I'd love to hear if there is merit in this approach and under what conditions it might be worse/neutral/better than just going cold turkey with 301 redirects.

Great question! I think it would be helpful to be able to get a comparison report of the current site and the new site in a dev environment, in order to pick out holes in site structure and meta info. Basically, a differential report on what Google will see as changed and whether or not the changes are good.
Great questions and ideas so far; thanks! Keep 'em coming.
I find lots of sites with legacy pages and subdomains that site owners have forgotten about.  It would be great if Google notified webmasters about old versions of pages that are indexed.
I have very recently moved from one domain name to a new one, but the pages are an exact carbon copy. I simply re-branded.

For example from to 

It is WordPress and I created a 301 in .htaccess. It appears that everything is working fine. My biggest concern is the social signals. Once I changed, I lost all my plusses that were displayed on the page. I expected this.

But, unlike the other networks, it appears that these signals have followed the 301 and are now displayed on my new URL. Is this possible? 

Do you have it set up somehow so that the plusses and share counts are transferred to the new URL? I am very excited because this appears to be the case! +Pierre Far
I once gave advice to a company that saw a big drop in search traffic after a migration. I found out they made a mistake in their redirects. Their redirects looked like this: 'Redirect 301 old-url new-url' (they forgot the domain in the new-url). In the browser this mistake was corrected automatically (the redirect seemed to work when tried), but it looked like Google ignored the redirect because of the syntax error. Fixing the error made the search traffic come back. 

To make a long story short: it would be nice if WMT showed the issues Google has with redirect statements on a web server. 
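The broken rule in the story above failed because Apache's `Redirect` directive expects its target to be an absolute URL (or at least a URL-path starting with a slash); a bare string like `new-url` with the domain missing is invalid. A small sketch of a pre-deployment check using only Python's standard library (the URLs are hypothetical):

```python
from urllib.parse import urlparse

def is_valid_redirect_target(target):
    """Check a Redirect target the way Apache's mod_alias expects it:
    either an absolute URL (scheme + host) or a URL-path starting with '/'.
    A bare relative string like 'new-url' is the mistake described above."""
    parsed = urlparse(target)
    if parsed.scheme and parsed.netloc:
        return True  # absolute URL, e.g. https://www.example.com/new-url
    return target.startswith("/")  # URL-path is also accepted

# The broken rule from the story: domain (and leading slash) missing.
print(is_valid_redirect_target("new-url"))
# The corrected rule: absolute URL.
print(is_valid_redirect_target("https://www.example.com/new-url"))
```

Running every target through a check like this before uploading the .htaccess file would have caught the error the browser silently papered over.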
+Pierre Far I am about to move my site from Godaddy to Drupal. All URLs will be the same. All content will be the same but I anticipate minor changes in layout that will only enable the user to find what they need in a clearer fashion. I am really worried about losing my rank on long tail keywords but again all content will be the exact same, I made a database of all the content and it will be used on the new CMS. So, will I suffer is the question? I hope that by bringing a better UX that my onsite time spent goes up, my bounce rate goes down and goal would then be to increase in ranking. Whatever it is...I know that providing useful content to users is the bottom line in an easy to manage fashion...hopefully the new site will be far superior to the one I am with now. Thanks for opening up this dialogue. 
In both cases, ideally the possibility to submit a CSV or XML file in order to say to Google: OK, this was address A, please transfer the search results/ranking to address B, without implementing 301 redirects before submitting the CSV/XML, and allow Google to validate this mapping. Once Google is OK with it, a "switch" to tell Google to use the new addresses, to be flipped concurrently with the actual server change. This would take some heat off the actual moment of the server switch, when under the current process you're supposed to have all the 301 redirects in place at that same moment.
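As a sketch of what such a mapping file might look like (this two-column CSV format is only the commenter's suggestion, not an existing Google feature), here is a minimal parser that also validates that every entry is an absolute URL before the mapping would be submitted:

```python
import csv
import io
from urllib.parse import urlparse

def parse_mapping(csv_text):
    """Read hypothetical 'old_url,new_url' rows into a dict, rejecting any
    entry that is not an absolute URL so bad rows are caught up front."""
    mapping = {}
    for old, new in csv.reader(io.StringIO(csv_text)):
        for url in (old, new):
            parsed = urlparse(url)
            if not (parsed.scheme and parsed.netloc):
                raise ValueError(f"not an absolute URL: {url!r}")
        mapping[old] = new
    return mapping

text = "https://old.example.com/a,https://new.example.com/a\n"
print(parse_mapping(text))
```

Validating the file locally first mirrors the "let Google validate the mapping, then flip the switch" workflow proposed above.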
+James Lane ditto that idea for the http to https 301 guide for idiots.
What about something in search results? For example, a search for info:old-site-dot-com could show "this site now redirects to new-site-dot-com". It would help webmasters to see at a glance that the redirects have been absorbed by the machine, and users would also see at a glance what is happening to the site they are searching about.

We have had QA sites indexed and, whilst we follow guidelines about removing them from the index, I still get reminders and notes questioning their health. Is it possible to mark them as QA sites to stop Google trying to crawl them, and to exclude them from SERPs? Notifications from Google for those sites could then be disabled too.
Thanks for opening this up for suggestions +Pierre Far, I'm actually looking at a domain migration right now!

I think it would help if we could submit a URL mapping to Google, telling you guys which URLs have moved and where to, rather than having to wait for Googlebot to crawl all the 301s on its own.
Also: the ability to tell you which URLs have not moved and will be culled.
It would be nice if we could 'request a full crawl' - perhaps a limit of two per account to avoid it being used for the wrong purpose.
Ability to specify the new domain but ALSO notify you if CDN file paths are changing.
Perhaps some kinds of alerts after change of address submission, telling us of major fluctuations in traffic, indexation or crawl errors on the new site.
Better accommodation of moving from sub domain to full site, and vice versa.
Maybe a specific section of the GWT interface to help webmasters iron out any issues post-migration and compare performance (search terms and CTR, avg. position) of new site vs. old site.

Thanks again!
I would like the ability to branch off a subdomain, folder, or page into its own site on its own domain name using the Google Webmasters site move tool. 

I had the situation where I started my currency converter as a subdomain of my personal website.  Once it became popular, it was clear that it deserved its own domain name. I temporarily lost a lot of traffic during that move because the site migration tool didn't support a subdomain to domain name move.
We are planning to move all our subdomains to our main domain. We are going to create a folder for each subdomain and will place a 301 redirection from each subdomain to its folder. Will it be a good strategy if we implement this kind of change? Two of our subdomains lost their rankings in 2013 (no manual spam message in GWT), maybe due to low-quality links. We did everything: uploaded a disavow file, contacted webmasters to remove the low-quality links, and received a good number of responses. Please share your thoughts.
I think 301 redirects should be made possible completely through Webmaster Tools. Dealing with servers should be something Google parses. I also think most bloggers and website owners simply don't know how to do technical SEO.
I would love a little more intelligent assumptions around handling 301 redirects. Restructuring a website's CMS or its underlying heuristics can be a nightmare, e.g. writing 301 rules for an ecommerce site with 1000s of products, each with a dozen or so variations, becomes overly tedious very quickly. I'd love for GWT to 'catch' URLs that I may miss during an all-day 301 rule-writing session. Ultimately, an (old) indexed page in a SERP would not end in a 404 error but would be intelligently mapped to a page whose content uniquely matches a (newly) cached page. Then provide me with a '301' or similar note in my dashboard instead of a 404 error, so I can go out and add a rule for it later. Obviously the intelligent redirect would only occur for pages which do not have a 301 rule currently written.

This would be a huge help, but I'm assuming would add an additional dimension to the cubing mechanism for mapping cached and indexed results. I can dream though, right? lol
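A rough sketch of the 'catching' idea above: for old URLs that have no 301 rule yet, suggest the closest new URL as a candidate redirect. Here plain path similarity (via Python's difflib) stands in for the content matching imagined in the comment, and all URLs are hypothetical:

```python
import difflib

def suggest_redirects(missing_old_urls, new_urls):
    """For each old URL with no 301 rule, suggest the most similar new URL.
    Path-string similarity is only a stand-in for the content matching
    imagined above; a real system would compare the cached pages themselves."""
    suggestions = {}
    for old in missing_old_urls:
        match = difflib.get_close_matches(old, new_urls, n=1, cutoff=0.5)
        suggestions[old] = match[0] if match else None
    return suggestions

print(suggest_redirects(
    ["/products/blue-widget-v1"],
    ["/shop/blue-widget", "/shop/red-widget"],
))
```

The suggestions would still need a human sign-off, which matches the comment's idea of a dashboard note you act on later rather than an automatic redirect.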
+David Sanders, interesting. I think some of the indexing problems of websites that have to 301-redirect thousands of old URLs to new URLs arise because, until all the old URLs are recrawled, they stay in the Google cache as duplicate content of the new URLs.
