A friend of mine got such a message. Not only does it concern a link from the Google cache, but it also points to a page that Googlebot is not even allowed to crawl because of the robots.txt file (User-agent: * Disallow: /search).
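For reference, here is a minimal sketch using the Python standard library that checks the rule quoted above; the example.com URLs and query path are hypothetical, only the "User-agent: * / Disallow: /search" directive comes from the message.

from urllib.robotparser import RobotFileParser

# The exact rules quoted in the message above.
rules = [
    "User-agent: *",
    "Disallow: /search",
]

parser = RobotFileParser()
parser.parse(rules)

# Any path under /search is off-limits to compliant crawlers, Googlebot included.
print(parser.can_fetch("Googlebot", "https://example.com/search?q=foo"))  # False
print(parser.can_fetch("Googlebot", "https://example.com/about"))         # True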
There are more and more examples of bad links that turn out to be bugs. I'm curious whether, in this case, the webmaster will still have to wait a few weeks before finding out that everything may actually be fine with his site and this was only a mistake.

That was exactly what I meant ;) (Jan 7, 2014)
I asked the team about this. Sometimes they do this if the link is on the cached page but no longer on the live one (e.g. if it's using a rotating link scheme). (Jan 9, 2014)
Thanks for the answer. So a webmaster should just wait until Googlebot re-caches the website and sees that the link is no longer there, right? (Jan 9, 2014)
No. If the webmaster is taking part in a link scheme that uses rotating / random link placement, they need to get out of that. Otherwise a web-spam manual review (e.g. for a reconsideration request) will just bubble up the new links. (Jan 9, 2014)
No, he's already removed the link. It's not a link exchange system. (Jan 9, 2014)
That sounds good then :) (Jan 9, 2014)