anchor text seems to be a very likely contender!
 
Think I'm going to run with 'anchor text'.
DEJAN
 
Think about it. Which link signal seems useless? Anchor text can be useful to describe the next page.
 
Could be something as simple as font size or bold/italic as they are fairly old ways to tell the search engine the importance of the link (and therefore encourage bad practice). Having said this, I've seen some big movements recently on some sites and don't think that would have caused such an impact.
 
I'm in the nofollow counts for "something" camp.
 
I'm happy to rule out anchor text as being completely removed; a couple of sites that I know of are still ranking on the strength of linking anchor text.

The quote refers directly to the 'topic of a linked page' which suggests to me a descriptive signal. I'm going to go with alt text, and add that I still don't believe the title attribute carries any weight.
 
Anchor text may be handy to describe the next page, but it's also open to a lot of abuse, much like title attributes for links. There's been a lot of talk in the usual 'SEO predictions for 201x' articles about the phasing out of anchor text as a high-value signal.
 
Hmmm....how about giving a bit of value to the nofollow attribute on a link?
 
Funny but I don't think it's PageRank, otherwise they wouldn't bother updating it in February.
 
I can only guess, but all three signals mentioned above (anchor text, nofollow and PageRank) have been abused so much that Google could turn them off.
I think though it's rather something smaller. It's probably something like the number of links on one site pointing to another site. I think in the recent past 10k links were still better than just one, but by now I guess they could neglect the number completely. So whether a site links to you once or 10k times probably won't matter anymore.
 
Actually, I was intrigued by the many references to changes in how Google handles freshness, and I can think of some of the features they might have removed, and some of the new signals they might be looking at to do things like determine burstiness.

Regarding links, while that section starts by mentioning "link characteristics," they tell us that "we are turning off a method of link analysis that we used for several years." So is this something as simple as ignoring whether or not links have underlines, or does it involve a larger process or method of link analysis?

I think there's still some value in the use of anchor text and in PageRank, but there are many different methods of link analysis that Google could potentially turn off.

For example, the local interconnectivity patent approach (http://www.google.com/patents/US6725259) that was inferred as being turned on in 2003 in the book In the Plex might be a candidate. That involved looking at the top-n (10, 100, 1,000) results for a query and reranking them based upon how frequently they link to each other. There's still some value to looking at interlinking when it comes to determining if one or more results might be ideal navigational results for a query, but is it helping to send better results to the top of those results? It's something I would test on a regular basis to see if it does.
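Roughly, that re-ranking step could look something like the sketch below (purely illustrative; the scoring and the blend weight are my assumptions, not details taken from the patent):

```python
# Illustrative sketch of local-interconnectivity re-ranking in the spirit of
# US6725259: take the top-n results for a query and promote results that the
# other top results link to. Scores and blend weight are assumptions.

def rerank_by_interconnectivity(top_results, outlinks, blend=0.5):
    """top_results: list of (url, original_score) pairs, scores assumed roughly 0..1.
    outlinks: dict {url: set of urls that page links out to}."""
    candidates = {url for url, _ in top_results}

    reranked = []
    for url, original_score in top_results:
        # Count how many of the *other* top results link to this one.
        local_votes = sum(
            1 for other in candidates
            if other != url and url in outlinks.get(other, set())
        )
        local_score = local_votes / max(len(candidates) - 1, 1)
        reranked.append((url, blend * original_score + (1 - blend) * local_score))

    return sorted(reranked, key=lambda pair: pair[1], reverse=True)
```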
 
Interesting idea. So much like the ranking change in GWT - Google counts the highest quality link that points to your site from the same domain and ignores the rest? What about blogs then? (e.g. wordpress, blogger etc).
 
+Bill Slawski I also spotted the freshness focus and straight away connected to another research piece I read and blogged about here: http://dejanseo.com.au/detecting-active-blogs-through-user-centric-metrics/

In summary, using user-centric blogging characteristics to detect dead blogs and those that never took off, in order to remove a vast amount of spam from their index.

This only opens the question of how many other temporal factors they have at their disposal.

Just read the patent link you sent through. Wow. So they produce the initial set of results by observing global link graph and then re-rank by observing the micro-link graph comprised of the set of returned documents. I would imagine that would produce a lot of orphans?
 
For me, "several years" isn't the same as "almost since the beginning", so I don't think it's anchor text or Pagerank.

Anchor text has been abused a lot, but I'd love to see what the SERPs would look like without it. I think many users would complain. As for Pagerank, there's a difference between PR and TBPR - Google could switch off TBPR easily, but still use PR behind the scenes. I don't think this will happen very soon, though.

I agree with Tad that it will probably be something smaller and it really could be anything. Raw link numbers (sitewides), on-page link relevance, no longer placing extra weight on old links, how to deal with link spikes, or adjusting first-link-counts - just to name a few.

Off to do some testing & hoping someone will be able to squeeze something out of Matt Cutts at SMX West :)
 
+Wiep Knol Good luck with getting Matt to answer straight. I think we're getting skilled at interpreting vague answers from Google. The art is in crafting the right question to get more out of the answer!

We've done some basic observations and have eliminated sitewide footer links as being the element removed.

My guess it's something to do with temporal link observation, I just don't know which one.
 
The changes don't have to be live yet, as the post states "Most of the updates rolled out earlier this month, and a handful are actually rolling out today and tomorrow".
 
+Dan Petrovic The Google poster that you wrote about is a few years old, and I was hoping to see a followup research paper associated with it, but I can't say that I've seen one come out. I've suspected that Google has looked at a number of the heuristics described within it, and likely implemented a few of them.

I recently wrote a blog post about a recently granted Google patent that was originally filed in 2006, which described how they might filter some blogs out of blog search. The description included some really broad, outdated and not very good rules for deciding whether or not they would include blog posts within blog search. These included the number of links within a post (with too many being bad), the distance of links from the start of a post (posts whose links appeared too far from the start were not included), and the presence of links pointing back to the post or to other pages on the same domain.
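As a toy rendering of those rules (the thresholds below are placeholders of mine; the patent doesn't translate neatly into exact numbers):

```python
# Toy rendering of the blog-search filter rules described above. The specific
# thresholds are placeholders, not values from the patent.

def passes_blog_search_filter(post, max_links=30, max_first_link_offset=2000):
    """post: dict with 'domain' and 'links', where each link has 'href' and 'char_offset'."""
    links = post["links"]

    # Rule 1: too many links in a single post counts against it.
    if len(links) > max_links:
        return False

    # Rule 2: posts whose links only appear far from the start are excluded.
    if links and min(link["char_offset"] for link in links) > max_first_link_offset:
        return False

    # Rule 3: links pointing back to the post, or to other pages on the same
    # domain, count against it.
    if any(post["domain"] in link["href"] for link in links):
        return False

    return True
```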

I followed up that post with another one that (1) had 98 external links, (2) had links throughout the post instead of just a short distance from the start of the post, and (3) had 36 named anchor links towards the start of the post that linked back to different sections of the post. All three of those would potentially keep the post from being included in Google Blog search because that post broke three of the rules from the description of that patent. The post was showing in Google's blog search sometime shortly after I posted it.

While I suspect that Google did come up with filters to keep some blog posts from appearing in Google Blog search, I don't think many of the rules described within that patent were implemented as described in the patent.

But they could have been. All three were link analysis type heuristics, and if any of them were still in use by Google, they were ones that should be retired, because they were too broad and didn't do things like consider the target of the outgoing links (in my example, 97 of the 98 links were pointed to pages at the USPTO) or even the internal ones, which were named anchor links helping to make the blog post more usable by delivering readers to sections of the post that they might find most interesting.

Another of the rules from that particular patent would potentially filter some blog posts out of Google Blog search results if they linked to videos. The patent was originally filed a number of months before Google acquired YouTube. The intent was to avoid blog posts that might link to "undesirable" content, but it didn't distinguish between the kinds of content that those videos might contain. Again, a rule that was likely too broad when described in the patent, but which probably didn't get implemented as written.

I suspect that there are other "link analysis" methods that Google may have actually implemented that may have been based upon assumptions that didn't end up providing the value they were intended to give, or might have been based upon circumstances that have changed.
 
Really great post +Dejan SEO and +Dan Petrovic, and great conversation about new ranking signals and link value factors. :]

"We have changed the way in which we evaluate links; in particular, we are turning off a method of link analysis that we used for several years."

In my opinion, they are talking about a completely new method with a new set of ranking signals. Further, it could mean more value for social signals, especially from Google+, and less value for old-school ranking factors such as nofollow, number of outgoing links, number of internal links, the ALT attribute, exact match anchor text...

OFC, this is just guessing; we need to test, test and test again. :]

FYI: +Mark Traphagen, +Neil Patel, +Ian Lurie, +Neil Ferree
 
The Google Webmaster Help Forum post is for the Panda Update (its start date is early March of last year).

+Kerry Rodden is still with Google (though possibly working solely on YouTube these days), and is on Google Plus, and might show up and tell us something about that poster maybe (fingers crossed). It is interesting, and if there was anything published that followed up on it, it would be great to find out about it.
 
I'm not really sure. I do agree with Tad; Google could let go of those three (very important factors in determining relevance/weight: PageRank, anchor text, link attributes), though I'm also thinking that they'll probably want to start testing on smaller factors.

I'm guessing that it has something to do with the "age of the link/linking page" - removing the value from it - since people do use this as a way to buy links, Google is pretty much relying on social signals these days, and this as a ranking factor makes newer and more relevant content rank below older pages.
 
With everyone's feedback I am convinced it's not something tangible they have changed but instead a way of processing link information.

Maybe they are talking about something similar to the introduction of the Pregel framework to speed up link graph processing, as per the Google Research post here: http://bit.ly/yyTHaf and as I illustrate in 3D here: http://bit.ly/kiZpGS
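For anyone unfamiliar with it, Pregel is a vertex-centric, bulk-synchronous model: in each superstep every vertex processes its incoming messages, updates its value, and sends messages along its out-edges. A minimal single-process sketch of PageRank written in that style (the real framework shards vertices across many workers; dangling pages are ignored here for brevity):

```python
# Minimal single-process sketch of a Pregel-style (vertex-centric, superstep)
# PageRank loop. The real system distributes vertices across workers and passes
# messages between supersteps; this only mimics the control flow.

def pregel_pagerank(graph, supersteps=20, damping=0.85):
    """graph: dict {vertex: list of out-neighbour vertices}; every vertex is a key."""
    n = len(graph)
    rank = {v: 1.0 / n for v in graph}

    for _ in range(supersteps):
        # "Send": each vertex sends rank / out-degree to its out-neighbours.
        incoming = {v: [] for v in graph}
        for v, out in graph.items():
            if out:
                share = rank[v] / len(out)
                for w in out:
                    if w in incoming:
                        incoming[w].append(share)

        # "Compute": each vertex updates its value from the messages it received.
        rank = {v: (1 - damping) / n + damping * sum(incoming[v]) for v in graph}

    return rank
```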
 
No way it is anchor text. Given how much targeting we do with anchor text, there would already be plenty of people in the SEO community talking about how their sites were impacted.
 
I'm with Dan's last comment. I suspect it is something so subtle it will be tough to discern. 
 
I think nofollow might be the method they turned off. Gotta see what Matt Cutts would like to say about it.
 
I did some digging thanks to +Lyndon NA & +Sasch Mayer and found the paper published by Google in 2005. It fits the criteria of being a link analysis method that could have been used for a few years. It basically breaks down into two link analysis methods:

1) Query-Independent Connectivity-Based Ranking
2) Query-Dependent Connectivity-Based Ranking

Reference: http://static.googleusercontent.com/external_content/untrusted_dlcp/research.google.com/en//pubs/archive/9019.pdf
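If I'm reading that terminology correctly (this is my assumption, not a claim from the paper), query-independent connectivity-based ranking is essentially a PageRank-style score computed over the whole link graph, while query-dependent connectivity-based ranking is a HITS-style score computed only over a query-specific neighbourhood graph. A bare-bones HITS sketch to show the query-dependent flavour:

```python
# Bare-bones HITS sketch: hub/authority scores computed over a small,
# query-specific subgraph rather than the whole web graph.
import math

def hits(subgraph, iterations=20):
    """subgraph: dict {page: list of pages it links to}, limited to the query's neighbourhood."""
    hubs = {p: 1.0 for p in subgraph}
    auths = {p: 1.0 for p in subgraph}

    for _ in range(iterations):
        # Authority score: sum of hub scores of the pages linking to you.
        auths = {p: 0.0 for p in subgraph}
        for p, outlinks in subgraph.items():
            for q in outlinks:
                if q in auths:
                    auths[q] += hubs[p]

        # Hub score: sum of authority scores of the pages you link to.
        hubs = {p: sum(auths.get(q, 0.0) for q in subgraph[p]) for p in subgraph}

        # Normalise so the scores stay bounded.
        a_norm = math.sqrt(sum(a * a for a in auths.values())) or 1.0
        h_norm = math.sqrt(sum(h * h for h in hubs.values())) or 1.0
        auths = {p: a / a_norm for p, a in auths.items()}
        hubs = {p: h / h_norm for p, h in hubs.items()}

    return hubs, auths
```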

CC: +Barry Schwartz
 
Can hardly wait to see +Lyndon NA's comments about this. Probably 1000 words or more. :]
I am expecting more.
 
+Ivan Dimitrijević
ROFL.

Okay, succinct summary :D

* No idea what was changed.
* No idea why it was changed.
* Possibly being distracted (they said "... in particular ...", not "only" etc.) - so there may be several changes, not just one.
* We may be looking at them simply replacing/substituting old for new.
* If you want to look at a list of possible items, you can shorten it off the bat simply by age - some items are too old or too new to be counted (they said "... several years ...").

(Fuller details on +Bill Slawski's post: https://plus.google.com/106515636986325493284/posts/AnZS1YuCK4u)
:D

(there - short enough?)
AJ Kohn
 
I'm with the majority on this one. I can't see it being anchor text or PageRank. These are still fairly strong indicators for topical relevance and authority. They've been abused but I sense that Google may be getting better at normalizing the abuse. It's likely something more subtle.

I'm thinking it's something to do with link position or number of links. I recall that Google changed their guidelines on number of links on a page from <100 to the more vague 'reasonable number of links' per page. (I still like less than 100 BTW.)

That change was made, in part, because Google could now crawl and index more of each page. Their bandwidth problems had been solved by Caffeine.

So when I think about this change I think about what link problems Google was trying to address pre-Caffeine that were obviated by that launch.

I wish I had more time to cogitate on it but I have to practice my presentation and get on the road to San Jose.
 
It cannot be PageRank - as that is a metric, not a signal/characteristic. Further, the PR you see is a total of various other metrics (inc. LinkValue and SiteTrust/Authority).

I don't see it being Link Text ... that's an old signal, and is likely used for SiteAuthority/SubjectAuthority for sites.

So that's those two out.

Personally, I think g are simply ditching an old and using a new - nothing to overly panic about. They get rid of an old method and use a new/better one.

Failing that (hey, I could be wrong :D), I'd be looking at correlation factors ... such as keywords in the link URL? (that doesn't mean keywords in a URL are ignored, merely that keywords from inbound link urls may be ignored).
 
Since it said "link characteristics", it is probably something very subtle like formatting of the link or some combination of minor things. Some improvement in spotting anchor text spam would be nice, though. Apparently the link signal change isn't very significant, or we'd be hearing an outcry similar to all the Panda hand-wringing.
I am more interested in the refinements to local search that were mentioned. In particular, "we’re better able to detect when both queries and documents are local to the user..."
 
+Tadeusz Szewczyk Post is missing, but I'll take your word for it :-) I am sure there are plenty of people who have seen a drop, but so far I haven't seen any major freak-outs.
I am looking for them, though, since a pattern may be seen by looking at who gets hit the most. Within the last couple of weeks I did see some of my clients' competitors disappear almost completely, then reappear somewhat lower than before. Those were mostly keyword-rich domains.
 
Ah, "This post was originally shared with a limited audience". Let me cite then: "Google Update Finally Kills Sitewides?

I saw a few changes in my SERPs when I checked this morning, some pretty significant ones actually. A couple of players who have exact match domains and have been at number 1 for their respective search terms for around 2 years, have both suddenly dropped to the lower reaches of page 1.

I've been looking through a few posts for some suggestions or initial ideas, and came across this post on +Dejan SEO:
http://dejanseo.com.au/google-drops-one-link-signal/

The very last thing mentioned by +Tadeusz Szewczyk struck me:
"It’s probably something like number of links on a site linking to another site. I think in the recent past 10k links were still better than just one but by now I guess they could neglect the number completely. So whether a site links to you once or 10k times won’t matter probably anymore."

Bingo. Checking Site Explorer for the 2 sites I'm investigating shows me very high link/domain ratios for their target terms. Closer analysis shows plenty of sitewides in operation, lots of footer and sidebar links.

It has been said for some time that additional links on the same root domain don't count for extra, so perhaps this is Google finally pulling the plug?"
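Purely as an illustration of the "one link per linking domain" idea being floated here (the hostname grouping and the choice to keep the strongest link are my assumptions, not anything Google has confirmed):

```python
# Hypothetical illustration of collapsing sitewide links: count at most one
# inbound link per linking hostname, keeping the strongest one.
from urllib.parse import urlparse

def collapse_sitewide_links(inbound_links):
    """inbound_links: list of (source_url, link_strength) pointing at one target page."""
    best_per_host = {}
    for source_url, strength in inbound_links:
        host = urlparse(source_url).netloc.lower()
        # Keep only the strongest link from each linking host.
        if strength > best_per_host.get(host, 0.0):
            best_per_host[host] = strength
    return best_per_host

# 10,000 footer links from one blog end up counting the same as a single link.
links = [("http://example-blog.com/post-%d" % i, 0.1) for i in range(10000)]
links.append(("http://other-site.org/review", 0.4))
print(len(collapse_sitewide_links(links)))  # 2
```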
 
+Mark Shaw
The question is - what makes that site rank for that term?
The DomainName match itself - or the inbound links that happen to use the domain, and thus happen to have the keyword in the link text (or the other links with those words in the link text)?

It's seldom a single factor - and most points seem to hold various values at the same time.
g could ditch a single facet - and we'd likely barely notice.
 
There was some talk about trying to make it more difficult for exact match domains to rank for search terms. If Google used the domain within the link as a signal of relevance, then turning this off would reduce the ability of a site to easily rank for that term.
 
+Simon Dalley
I think someone saw a patent or similar about it...
... and I think G may have played with tweaking it up/down over the last few years.

The problem is - it's not a bad idea ... and there may be multiple aspects of it... and we don't know which parts G may be playing with.
 
After reading Bill's post, I wonder if it's something like: 4. Cross Language Information Retrieval, which will be hard to test for, but may be outdated in terms of what Google can now do with translation. Also I think #11 Links between Affiliated sites will be in the firing line at some point. If you follow the "don't make Google look stupid" rule of thumb, there have been a heap of posts around blog networks and how easy it is to get rankings using these systems; it could be that Google will take a closer look at these.
 
Hmmmm.

I'll be a little naughty here ]:P

It's not that "link schemes" fail, nor that "bought links" are easily detectable.
It's greed, stupidity and laziness that does most of the damage.

You can easily get away with arranged links, bought links etc. - but it still requires time, effort, research etc.
You know - the things that networks like these try to avoid.

It's like buying things out the back of a van - you cannot honestly expect them to be legitimate, of quality or to retain real value. Whereas if you go out bargain hunting, visit some auctions and actually talk with people ... you tend to get better results.

The networks should be shot down.
Simply for the damage they do and the lack of accountability they have.
 
+Lyndon NA that +1 button was very ugly (new design) but I still pressed it because you make an excellent point.
 
I'm not seeing any new button?
(not unless it looks similar to the old one and I'm just unobservant)
 
nope - I still have the old one :D
 
position (header/footer/or in content) of the links.


