Link data and keyword data. We lost both in significant ways in 2011. It's a big issue for SEOs, publishers and, to some degree, even searchers. Here's my very long look at the first pullback in publisher support services that I can recall the search engines ever making, after years of welcome advances. And I'm hoping this changes.
Actually, I was saying that link building and other SEO parameters are no longer enough; social parameters are also required.
Danny, this is a great piece. I am glad someone with your stature in the industry has been relentless in bringing the loss of keyword data and the lack of a link operator to the fore. I know you are not alone in this quest, but your voice carries a lot more weight. Nevertheless, with the current "silence" from Google and Bing on the topic and both companies' knack for not admitting mistakes, do you really foresee a change to accommodate a loud minority? Bing is losing tons of cash, so I am not sure it has the will to create any tools, and Google has no real "pressure" to change.
Google should just tell it like it is: "I'm sorry people, but doing SEO will become increasingly irrelevant during the next two or three years".
Danny, the problems you have highlighted are very valid, but they are caused by what will be an ever-increasing problem: the monopoly that Google currently holds over the search market. When it was reported that Bing was mimicking Google's SERP results, it made things even worse, because there isn't another viable algorithm in place from a good competitor. The ability to see only 30 days' worth of keyword search results in Webmaster Tools, plus the issue of undisclosed keywords in Analytics, is making things more difficult than ever before. But I think the biggest problem, as you have identified, is that they have given us such a large set of tools over the years, and now, by taking them away, they have caused the issue - we wouldn't have known any better if it had always been the case.

Maybe the larger issue is that, because of the increasing sophistication of SEO, Google has taken this stance to prevent competitors from accurately working out the algorithm behind the SERPs, so it can't be mimicked. Or maybe they are just so big, and so lacking a major competitor, that they can do what they want :)
Is this data that only the search engines can provide, or can some other company mine the data and provide it for a fee?
+Scott Supak For the most part, other companies have to crawl the web themselves to get this type of data, and they don't see exactly what Google or Bing know.
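For a rough sense of what "crawl the web themselves" means in practice, here is a minimal sketch in Python of how a third-party link index gets assembled; the seed URL, page limit and in-memory storage are purely illustrative assumptions, and real providers do this across billions of pages.

# Minimal sketch: fetch pages, extract outbound links, and record
# (linking page, linked page) pairs -- the raw material of a link graph.
# The seed URL, page limit and in-memory lists are illustrative only.
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen

class LinkExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        # Collect href targets from anchor tags.
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl_links(seed_url, max_pages=10):
    """Breadth-first crawl recording (source, target) link pairs."""
    to_visit, seen, link_pairs = [seed_url], set(), []
    while to_visit and len(seen) < max_pages:
        url = to_visit.pop(0)
        if url in seen:
            continue
        seen.add(url)
        try:
            html = urlopen(url, timeout=10).read().decode("utf-8", "replace")
        except Exception:
            continue  # skip pages that cannot be fetched or decoded
        parser = LinkExtractor()
        parser.feed(html)
        for href in parser.links:
            target = urljoin(url, href)
            if target.startswith("http"):
                link_pairs.append((url, target))
                to_visit.append(target)
    return link_pairs

# Example: crawl_links("https://example.com") yields the kind of link data
# that third-party indexes aggregate at a vastly larger scale.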
+Danny Sullivan Maybe Google would open up some of the data for a third party to crawl and then charge for the results? Seems like with the demand there, Google would want to tap it if they could make some cash off it.
+Danny Sullivan A great article, a definite (and troublesome) trend.

A small correction re Bing: they removed support for the link: (and linkdomain:) operators a long time ago. The search query you're showing with one result is doing a simple keyword match, not a link search.
I think this piece is a great summary of how Google has been offering support to SEOs right from the start, but it can do much more, as it now has all the data, including data about social signals.

Awareness of SEO has also improved over time, and if Google continues to share more and more information at this stage, it will become increasingly difficult for Google to maintain and improve the quality of search results. We saw that by 2010 content and link spam had reached such an extent that Google had to come up with the Panda update.

IMHO especially with regard to Keyword Referrer Data:

2011 was a year of changes, and I think it is a period of transition to a better web and better search results, as SEO is about much more than keywords and rankings.

When businesses lose access to the complete keyword data, the focus shifts to the search queries in WMT that have a good CTR, which is a true measure of quality over quantity.

This restriction makes the website owner think from a larger perspective and focus on the correlation of content and keywords rather than rankings. It will take SEO campaigns beyond the metrics of keywords and rankings, and the focus will be on other quality metrics like CTR, conversions and bounce rate. That will improve the quality of the web overall, as websites, besides being rich in content, will have to focus on good landing pages, a proper call to action, page load speed and good navigation, which will ensure a better UX.

This lack of data will draw a line of distinction between a PPC campaign and an SEO campaign. The quality metrics will be CR and CTR, which again will make the client focus on content and landing page design. That will be another quality step towards a better web: rather than discussing keywords, the client will be open to discussing content and design.
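
To make that concrete, here is a small illustrative sketch in Python of the kind of per-query quality metrics described above; the field names and figures are assumed placeholders, not an actual Webmaster Tools or Analytics export format.

# Sketch: per-query CTR and conversion rate from an assumed query export.
# The column names (impressions, clicks, conversions) and the numbers below
# are hypothetical placeholders, not a real WMT/Analytics schema.
query_stats = [
    {"query": "example keyword a", "impressions": 1200, "clicks": 90, "conversions": 9},
    {"query": "example keyword b", "impressions": 800, "clicks": 12, "conversions": 0},
]

for row in query_stats:
    ctr = row["clicks"] / row["impressions"] if row["impressions"] else 0.0
    cr = row["conversions"] / row["clicks"] if row["clicks"] else 0.0
    print(f'{row["query"]}: CTR {ctr:.1%}, conversion rate {cr:.1%}')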

Have shared my views also on
Read today a nice post on SEOmoz about a widely felt lack of trust towards many of the best-known analytics services... Sad to acknowledge. Wouldn't it be better to get great results from companies that could? The web is evolving and getting better every day.