A friend of mine got such a message. Not only does it concern a link from the Google cache, but it also points to a page that Googlebot is not allowed to crawl because of the robots.txt file (User-agent: * Disallow: /search).
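Whether a given URL is actually blocked by those robots.txt rules can be verified with Python's standard-library `urllib.robotparser`. This is a minimal sketch; the `example.com` domain and the query string are placeholders, not the friend's real site:

```python
from urllib import robotparser

# The robots.txt rules quoted in the post
rules = """User-agent: *
Disallow: /search
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# Googlebot matches the wildcard user-agent, so /search is blocked for it
blocked = rp.can_fetch("Googlebot", "https://example.com/search?q=test")
print(blocked)  # False

# Pages outside /search remain crawlable
allowed = rp.can_fetch("Googlebot", "https://example.com/about")
print(allowed)  # True
```

So the page the warning complains about is one Google itself should never have crawled in the first place.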

There are more and more examples of bad-link warnings that turn out to be bugs. I'm curious whether, in this case, the webmaster will still have to wait a few weeks before finding out that everything may be fine with his site and this was only a mistake.