I'd love to see more of how the Webmaster Tools API is being used, or where people are otherwise reusing data from Webmaster Tools. Who has some neat examples, or ideas? What would you love to see in an updated API? 
 
I'd like to see the rankings for the domain and for Google+, +John Mueller (more modern and more developed than now), and I'd also like a view/follower ratio.
 
+John Mueller I'd love to see more control over the filters, starting with the Search Traffic area.

Other items that would be good to get via the API:
- top queries + top pages
- internal links report
- index status
- content keywords (to watch for spam/hacked sites)
- the ability to create new URL removal requests
- much more detail on crawl errors + crawl stats
- URL parameter reports/data
- structured data reports
 
Idea for the long term: how about listing ALL links pointing to a site in Webmaster Tools and allowing webmasters to check off whether they want each link to count or not? This would help fight negative SEO and links from unwanted sources, and also help clean up links that were missed in removals and disavows. Much easier than the disavow tool, and a good way to fight spam.
 
Webmaster Tools data would be a lot more widely used if it were actually useful and safer to get.

1. There are scripts that log in to Webmaster Tools to download CSV files that are not available through the API. These scripts use a username/password, people store those on servers, and they often use their primary Google login credentials.
At the very least, please make this data just an API call away.
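
For context, a minimal sketch of the kind of script being described, assuming the legacy ClientLogin endpoint and a hypothetical CSV download path (the real WMT download URLs are undocumented and may change):

```python
# Sketch of the credential-based CSV scraping pattern described above.
# The CSV endpoint below is hypothetical -- illustrative only.
import requests

EMAIL = "user@example.com"   # primary Google credentials, stored in plain text
PASSWORD = "secret"          # exactly the risk being pointed out
SITE = "http://www.example.com/"

# Legacy ClientLogin: exchanges username/password for an auth token.
resp = requests.post(
    "https://www.google.com/accounts/ClientLogin",
    data={
        "accountType": "HOSTED_OR_GOOGLE",
        "Email": EMAIL,
        "Passwd": PASSWORD,
        "service": "sitemaps",  # the Webmaster Tools service code
        "source": "example-wmt-exporter",
    },
)
token = dict(
    line.split("=", 1) for line in resp.text.splitlines() if "=" in line
)["Auth"]

# Hypothetical CSV endpoint -- not part of any documented API.
csv = requests.get(
    "https://www.google.com/webmasters/tools/top-search-queries-dl",
    params={"siteUrl": SITE},
    headers={"Authorization": f"GoogleLogin auth={token}"},
)
open("top_queries.csv", "wb").write(csv.content)
```

Storing the password on the server and reusing primary credentials is the whole problem; an OAuth-scoped API call would remove both risks.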

2. I think what we would all really like to see is useful data:

Return all keywords for a particular URL including position data & CTR based on a particular filter.

It might cost money to provide that data, and it still isn't as good as providing it in real time in our analytics, but it would restore at least a small part of the data we have effectively lost.

It may be that some of this ability is already out there in some beta implementation.

Alternatively we can all start using headless browsers to scrape partial data from WMT, but I doubt that is the kind of implementation you want to encourage.
 
I'd like to see the "Modify site settings" and "Add and remove sites from your account" methods removed - or even better: a system to manage the access rights of API keys.

Those methods are, in my opinion, too destructive for an "ordinary" API. For instance, a third party that gets a GWT API key might (by mistake) change settings like the default domain or similar.

Besides that, more detailed CTR (and maybe bounce rate) reports would be awesome features.

PS: First point refers to:
https://developers.google.com/webmaster-tools/docs/2.0/developers_guide_protocol#AD_Deleting
and
https://developers.google.com/webmaster-tools/docs/2.0/developers_guide_protocol#AD_PreferredDomain
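
As a rough illustration of why these methods feel too destructive: per the linked protocol guide, removing a site from an account is a single authenticated DELETE against the sites feed. A minimal sketch, assuming an already-obtained OAuth access token and the GData-style endpoint from the docs above:

```python
# Sketch: deleting a site via the GData-style sites feed, as described in
# the linked AD_Deleting section. ACCESS_TOKEN is assumed to be a valid
# token with Webmaster Tools scope, obtained elsewhere.
from urllib.parse import quote
import requests

ACCESS_TOKEN = "ya29.example"  # placeholder
site_id = quote("http://www.example.com/", safe="")

# One request is enough to drop the site from the account -- the point
# being made above about API keys that are too powerful by default.
resp = requests.delete(
    f"https://www.google.com/webmasters/tools/feeds/sites/{site_id}",
    headers={
        "Authorization": f"Bearer {ACCESS_TOKEN}",
        "GData-Version": "2",
    },
)
print(resp.status_code)
```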
 
I think if you allowed more use of the data, you would start seeing great implementations of webmasters monitoring their sites.
I have a very specific, useful, and fabulous implementation I would love to build for small business owners and normal webmasters, and one of the biggest things holding me back is not being able to give people their own data.
Please, John, do me a favor and let me know if you have anything opening up for this API. Thanks.
 
I've written a tool that downloads GWT data via the current API + a few tricks:

http://seo-website-designer.com/SWAT-Google-Webmaster-Tools-Exporter

Then I have a few other tools that use the GWT data:

A CTR v Position chart

http://seo-website-designer.com/GWT-Click-Through-Rate-Chart

Query analysis charts:

http://seo-website-designer.com/GWT-Search-Queries-Chart

And a backlink analysis tool:

http://seo-website-designer.com/SWAT-BackLink-Checker

When I get the time I'd like to pull these things more together, then add more charts and analysis tools. 

With the API I'd just like all the current data to be available in a clean and simple format. 
 
On GWT improvements: I would love a way to mark an error as "I don't care." That way we can take legitimate 404s out of the report and see the errors that are real problems.
 
Establishing a data connector between GWMT and +Adobe Marketing Cloud (Analytics) would be great. Losing the keyword data in analytics was a blow for user experience people, because we lose insight into user intent when visitors enter the site. Being able to see that keyword data through a secure channel would be very helpful, and doing it in a way that others could replicate, i.e., as a data connector, would make it useful to many businesses.
 
A distinction between follow and nofollow links
 
Please include search queries in the API.
 
I would love to see improved documentation for getting up and running with the API. At the moment it is only Java-focused.
 
It would be great if we could get related search queries per page.
 
It would be great if we could get more link data.
 
+John Mueller: What's the reason the Google Analytics "Search Engine Optimization" report still shows rounded search query data while Webmaster Tools shows more or less exact data? These numbers should be exactly the same, from my point of view.
 
The biggest limiting factor with using the Webmaster Tools API is that it uses the old data feed style API and isn't supported yet in the normal Google API PHP library.

We do a ton of stuff with AdSense, Analytics and Google+ APIs, and just yesterday we were going to start building some stuff with the Webmaster Tools API, but ended up putting that project on hold until the official library supports it (our backend is pretty tightly integrated with the current API, so rebuilding all that so we can use a different type of API just wasn't worth the effort).

Hopefully Webmaster Tools is added to the official PHP library before too long so we can revisit.
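
To illustrate the "data feed style" being contrasted here: the current API is Atom XML over plain HTTP rather than the discovery-based JSON services that the official client libraries wrap. A minimal sketch of fetching the sites feed directly, assuming a valid OAuth access token:

```python
# Sketch: the GData "data feed" style -- Atom XML over HTTP, not the
# discovery-based JSON API the official PHP/Python clients support.
# ACCESS_TOKEN is assumed, obtained elsewhere.
import xml.etree.ElementTree as ET
import requests

ACCESS_TOKEN = "ya29.example"  # placeholder

resp = requests.get(
    "https://www.google.com/webmasters/tools/feeds/sites/",
    headers={
        "Authorization": f"Bearer {ACCESS_TOKEN}",
        "GData-Version": "2",
    },
)

# The response is an Atom feed; each <entry><title> is a site in the account.
root = ET.fromstring(resp.content)
atom = "{http://www.w3.org/2005/Atom}"
for entry in root.findall(f"{atom}entry"):
    print(entry.findtext(f"{atom}title"))
```

Parsing feeds like this by hand is exactly the integration work the official library would otherwise absorb.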
 
I want to build a dashboard in my company's backend for my SEO team to use: one place to view analytics, rank data, and GWT data. I'd love to see top queries, top pages, HTML improvements, and links on that dashboard. We can currently only pull in messages & crawl errors (which are helpful but don't paint the whole picture).
 
+Pete Bruhn I have an SEO company and we have done some of this and work with in-house SEO teams or provide them access to our cloud-based SEO software.

1. We have Google Analytics, rank data, and other API data (plus a few other tools) aggregated into a web dashboard. We have an 11:00 demo video at https://vimeo.com/87632780

2. Later this week we will be adding Reports for Top Queries and Top Pages to our system.

If you want to learn more, let me know.  
 
Crawl stats (hits, hour, bot type) for each URL, filterable by regex.

That way we would know which pages cost Googlebot the most time, and we could detect errors in pagination or parameters, for example.
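
Until something like that exists in the API, a rough approximation is possible from server access logs: count Googlebot hits per URL and filter by regex. A sketch, assuming the common combined log format (verifying Googlebot via reverse DNS is left out for brevity):

```python
# Sketch: per-URL Googlebot hit counts from an access log, filtered by a
# URL regex -- a stand-in for the requested per-URL crawl stats.
# Assumes the combined log format; the path and pattern are examples.
import re
from collections import Counter

LOG_PATH = "access.log"                          # example path
URL_FILTER = re.compile(r"/category/.*\?page=")  # e.g. paginated URLs

# combined log: ... "GET /path HTTP/1.1" status size "referer" "user-agent"
line_re = re.compile(r'"(?:GET|HEAD) (?P<url>\S+) HTTP[^"]*".*"(?P<agent>[^"]*)"$')

hits = Counter()
with open(LOG_PATH) as log:
    for line in log:
        m = line_re.search(line)
        if m and "Googlebot" in m.group("agent") and URL_FILTER.search(m.group("url")):
            hits[m.group("url")] += 1

for url, count in hits.most_common(20):
    print(f"{count:6d}  {url}")
```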
 
Please increase the Top 2000 daily search query limitation!
 
Often when I look at crawl errors, and in particular 404s, the sample URLs provided that link to the offending page are outdated. This makes the investigation quite tricky.
More complete information about backlinks (internal and external) would help.
When you cannot authenticate at the (sub)domain level, you don't get all the information (crawl stats, for example) that you otherwise get.

Overall the tools are quite good, but I think there could be more consideration for those who have to use them on a daily basis (which I think this discussion is about). So thanks for the interest, and hopefully you will implement some of the suggestions.

I guess the majority of people want to play by the rules; it is sometimes just too complicated to understand what the rules are. If you could focus on this thought when implementing changes to the tools, I think that would be a good thing!
 
NOT API, but this needs to happen... You should be able to set an estimated page count for your websites. Then whenever the number of indexed pages grows to, say, 4 or 5 times that number, you should get an alert that summarizes not-ignored parameters and title & description duplication... I manage dozens and dozens of websites, and it is a pain to go into each one and figure out if there is a problem and what it is. It sucks that the information is spread out in several different places.
 
We use the API to automatically download the data, archive it in a database, and incorporate it into client dashboards, etc., though the limited library means getting pretty creative to store some data.

I'd like to see:

API - 

1) Live, Direct Tableau Integration [like the GA connector]
2) > 90 days of data. [!important]
3) Make extracting crawl errors easier. The Java library breaks once it hits a soft 404 in the data; I had to write PHP to get around it; not ideal.
4) Create a DAILY site summary feed. Extracting all lines is great, but a site-summary feed of all metrics would be valuable: ranked KW count, ranked page count, median page/KW rank, impressions, clicks, 404s, 500s, etc. Every item in the feed is a day; go beyond 90 days.
5) Report more than 2000 KWs and Pages per day. Now that we're all 'not-provided' in GA, the 2000 daily limit presents a big challenge for larger brands [lots of branded terms].
6) Expand to more supported client libraries than just Java.
7) Add multi-site extraction to the API capabilities, and add a field for site.
8) 'Linked From' is available in the error/issue reports, but 'Internal Linked From' would be helpful.

For the GWT Interface [initially]:
9) Add Redirect Chains to Crawl or Content Issues report and API data: [ex: 301 > 301 > 200].
10) Google Trends integration - not really an API request [yet], but incorporate Category>Subcategory indices, geography, and 'related sites & searches' into GWT interface.
11) Custom alerts, Anomaly Detection, AND integration with GA Intelligence Alerts. It should be easier to catch and diagnose issues as they come up, even if we have to define the criteria initially.
12) Better filtering - let me see 'ranked KW count' or 'ranked pages count' for the criteria I specify/search for.
13) More seamless integration between GWT & GA, with data from both showing in both. Make GWT 'home base' for my SEOs and GA 'home base' for my analysts and marketers.
 
We are using top queries and top pages data. We would also love to get manual actions. And it would be great if all this data could be retrieved via the API with OAuth 2.0 credentials.
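
For what it's worth, the OAuth 2.0 side is already the standard pattern for other Google APIs; a sketch of exchanging a stored refresh token for an access token (the client ID/secret and token below are placeholders, and the eventual Webmaster Tools scope is an assumption):

```python
# Sketch: standard Google OAuth 2.0 refresh-token exchange. All credential
# values are placeholders.
import requests

resp = requests.post(
    "https://accounts.google.com/o/oauth2/token",
    data={
        "grant_type": "refresh_token",
        "client_id": "YOUR_CLIENT_ID.apps.googleusercontent.com",
        "client_secret": "YOUR_CLIENT_SECRET",
        "refresh_token": "STORED_REFRESH_TOKEN",
    },
)
access_token = resp.json()["access_token"]
# access_token then goes in an "Authorization: Bearer ..." header on API calls.
print(access_token[:12], "...")
```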
 
It would be nice if search queries could be exported with all the filters available in the web interface applied (all/web/mobile and country).
 
If any of you have time & a bit of experience with the programming required, feel free to let me know if you'd like to try some of the early versions out. We'd love to get as much feedback as possible, even if it's still very early. Send me a note directly if you're interested.
 
I often work with Google APIs in PHP: Analytics, AdWords, and WMT.
 
I really would like to have an EASY way to export ALL crawl errors (not only 5,000) and also ALL "linked from" data for those crawl errors.

That way I could decide which crawl errors are important enough to take care of...
 
+John Mueller We are definitely interested. Within Rocket Internet we are doing benchmarking across more than 50 countries and 70+ companies. Currently we are mainly restricted to the AdWords API and the Google Analytics API; our marketing IT team would be able to integrate the Webmaster Tools API within a few days and provide valuable feedback to your team.
 
I'd really like to be able to extract the data used to build the crawl stats graphs, so that would be per-day figures for pages crawled, KB downloaded, and page download time. Just like with GA, I'd like to be able to retrieve those stats for any date. Might as well have per-day figures for pages indexed, pages blocked, pages removed, and crawl errors, and then the daily figures for search stats: queries, page impressions, clicks.
All easy, right?
 
PHP library please!!!

I'm going crazy, but I'll get it and share :)
 
John, when will there be a WMT API update? The information you can request is so limited that the API has practically no use.

Thank you,
Asher.