Important SEO Updates & News from Google Webmaster Hangouts
Many thanks to +Steve Martin for providing this useful summary of tips, insights, and juicy bits from a recent Webmaster Hangout with +John Mueller of Google.
New SEO Panda Leaks! Google Will Auto-301 Redirect Your Old Clone Sites
Highlights of the +John Mueller Feb 14th and Feb 24th hangouts.
Panda and Content Spidering: If you are adding new content to your website daily, make sure that crawl activity is increasing in Webmaster Tools. If you are adding content and crawling drops, it is likely a Panda quality problem.
HTTP and HTTPS: If you serve both versions, you need a robots.txt file for each version, or you will cause duplicate content.
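For reference, a robots.txt file is just a plain-text file served at the root of each protocol version (http://example.com/robots.txt and https://example.com/robots.txt). A minimal sketch, with example.com as a placeholder domain:

```text
User-agent: *
Disallow:

Sitemap: https://example.com/sitemap.xml
```

An empty Disallow line allows all crawling; the Sitemap line is optional but helps crawlers find the right sitemap for that protocol version.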
Disavowed Links: If you remove links from your disavow file that Google has been ignoring, those links will count again, good or bad. The disavow file works like a robots.txt file: a list of instructions to follow. Whether a link is suspect or good, it gets its link juice back once removed from the file.
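For reference, the disavow file Google accepts is a plain-text list with one entry per line; lines starting with # are comments, and domain: entries cover every URL on that host. A sketch with placeholder domains:

```text
# Paid links from a link network
http://spammy-directory.example/listing?id=123

# Disavow everything on this host
domain:low-quality-links.example
```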
Don't Use HTML Sitemap Pages: Google treats these as low-quality pages, will not trust them, and they could be a Panda issue.
Does Google Pass Juice to Links in Files? John Mueller says that Google will read links in any file type (PDF, XLS, DOC, etc.) but will not follow them with link juice. Only proper HTML anchor-tagged links pass link juice.
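To illustrate the distinction, only a real HTML anchor element carries link equity; a URL pasted as plain text does not (example.com is a placeholder):

```html
<!-- A proper anchor-tagged link: crawled and passes link juice -->
<a href="https://example.com/target-page/">descriptive anchor text</a>

<!-- A bare URL in text: may be discovered, but passes no link juice -->
https://example.com/target-page/
```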
Does Google Use CTR Tracking? Mueller denied it, but if you read between the lines you could figure out that they do. Click-through analysis, from SERP to site and back to SERP, is mentioned in the book "In the Plex". If visitors are bouncing from a page, add some fresh content, video, or links from internal pages that are already ranking.
Panda: They use internal usage metrics, how the result looks in the SERPs, and the quality of on-page content. "You have to have the site optimized before sending traffic of any kind." This implies they may watch the bounce rate of paid traffic for Panda, although Mueller has denied it in past hangouts.
301 Redirects: A 301 is a directive, but Google can ignore it. If you forward page A to page B and they prefer A, they will ignore the page you are redirecting to.
New Site Honeymoon: Make sure your traffic is responding to your pages during the "new site honeymoon"; run A/B split tests to see what performs well. This implies that the algorithm puts your site in a category for user metrics, specifically "sharing". Social proof is something they look at, and if your website is not getting social traffic it will disappear. Google has algorithms to rank, to trust, and to de-rank according to social signals.
Site Errors: 500, PHP, and MySQL errors can be Panda quality factors. In our testing, do not let errors persist for more than 48 hours.
Google Is Machine Learning: It is alive, it is SkyNet. It is roughly equivalent to a three-year-old child.
Geo-targeting Algorithm Works on a Best-Guess Basis: Even if you have a .com or .uk domain, your content has to reflect your hreflang geo-targeting. Based on quality and trust, they decide whether you are relevant to a specific geo location, not just by the URL. Everything is proportional and machine-learned; that is what they use on the fly to rank websites.
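Geo and language targeting can be declared explicitly with hreflang annotations in each page's head element; a minimal sketch with placeholder URLs:

```html
<link rel="alternate" hreflang="en-us" href="https://example.com/page/" />
<link rel="alternate" hreflang="en-gb" href="https://example.co.uk/page/" />
<link rel="alternate" hreflang="x-default" href="https://example.com/page/" />
```

Each language/region version should list all of its alternates, including itself, and x-default marks the fallback page for unmatched visitors.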
Share Your Disavow File as Public: John Mueller stated that when you supply a Google spreadsheet for your disavow list, make sure it is set to "Public"; otherwise the webspam team cannot access it. The webspam team does not have access to your Google account backend; under Google's terms and conditions they cannot get into your account.
Review Snippets: John Mueller implied that the review snippet is a quality factor. It's also a great way to share the site.
Cloned Sites: These are multiple URLs with the same design and topic, created just to rank for keywords. You cannot use clones, but you can have an exact-match domain if it is trusted; pointing non-exact-match keywords at the site is preferable.
Google can detect that the IP is the same, that the sites are registered by the same person, and that the design, backlinks, keywords, and content topic are the same. They will then choose the version they determine to be the main site and canonicalize it for you.
Does Forwarding from a Penalized Domain to a New One Reset a Penalty? If you try to kill a domain that was hit by Penguin, create a new domain, and forward everything over, they will "help" by automatically setting a 301 redirect. They look at the old site's information and forward all the signals, good or bad, to the new site. They keep two indexes, a historical index and the ranking index, to compare and make a decision based on the mined data.
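For context, the site-wide forwarding described above is usually set up with a 301 redirect on the old domain; a sketch using Apache's mod_rewrite, with a placeholder new domain:

```apache
# .htaccess on the old domain: permanently redirect every path
RewriteEngine On
RewriteRule ^(.*)$ https://new-domain.example/$1 [R=301,L]
```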
"They don't want spammers to get away with cloning sites," John Mueller said.
They keep a changelog and can tell whether you are maintaining the pages. They can figure out that the sites are all the same and that you are trying to manipulate the SERPs.
Site Navigation Is a Ranking Factor: Internal navigation needs to be logical. For the page on your site that you want to rank highest, point your internal links at that page to "sculpt" your PageRank internally.
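As a sketch, internal "sculpting" just means that supporting pages consistently link to the page you most want to rank (the paths here are placeholders):

```html
<!-- On supporting pages: point internal links at the priority page -->
<nav>
  <a href="/main-guide/">Main topic guide</a>
  <a href="/main-guide/subtopic-a/">Subtopic A</a>
  <a href="/main-guide/subtopic-b/">Subtopic B</a>
</nav>
```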
If you are experiencing issues with Google's many algorithms, you can visit us at http://www.gobiya.com/ for answers.

I think I must add here that, regarding point 1, you may get an idea of what I am referring to by reading this excellent post from +Bill Slawski: http://www.seobythesea.com/2007/08/search-indexing-dead-ends-ibm-patent-explores-dangling-nodes/ Mar 10, 2014
+Mark Traphagen I was posting on our company G+ page and, coming back here, I forgot to switch to my account. I already deleted the comment and re-posted. My apologies for the inconvenience. Mar 10, 2014
Hi Mark, how about XML sitemaps auto-generated by a server? Some clients use web-based editors and hosting, e.g. Wix.com in my own case. I am an old 90s designer (my partner does the modern CMSs), and I think Wix is AJAX-based, but I'm not sure. Anyway Mark, do you think this is OK? It's difficult to get Wix to make any programming changes, especially with scripts, so thanks for any comment for those of us with clients on the free and paid online platforms. Web.com and many other ecommerce store sites run on similar platforms, and most of their XML sitemaps are auto-updated and seem to have a good connection with Google(TM) SERPs. So I'm just wondering whether I should recommend that a client add an XML sitemap along with robots.txt, possibly on an ecommerce (HTTPS or not) virgin server account before it's crawled by Google(TM) the first time, as with a start-up or small business that's new online. Thanks, cheers! Mar 16, 2014
+Ricky Wright if the directories and/or pages included in the XML sitemap are not blocked by the robots.txt or by a meta robots "noindex" directive, it would make sense to have XML sitemaps. Mar 16, 2014
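For readers following this thread, a minimal XML sitemap (whether hand-written or auto-generated by a platform) looks like this, with placeholder URLs:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2014-03-16</lastmod>
  </url>
  <url>
    <loc>https://example.com/products/</loc>
  </url>
</urlset>
```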
Great stuff +Mark Traphagen, that helped clear up some things I was seeing. Apr 14, 2014
Interesting. HTML site map is no good. Sep 26, 2016