John Vantine
295 followers
Posts

I took some photos of a "fogbow" from a boat in the SF Bay yesterday, and SFGate ran a story about the phenomenon. Pretty cool!

What is the relationship between subdomain and root domain crawl rate?

If I introduce a number of (legit, non-spammy) subdomains with a lot of pages on them, should I expect my crawl rate on the root domain to decrease as Google allocates resources to crawling the subdomains? Or does Google crawl them as separate sites?
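One way to watch this empirically would be to tally Googlebot requests per hostname from the server access logs, before and after the subdomains launch, and see whether root-domain hits drop. A minimal sketch, assuming a log format that records the virtual host and the user agent (the field layout and sample lines below are placeholders; adjust the regex to your actual log format):

```python
import re
from collections import Counter

# Assumes each log line starts with the hostname (e.g. Apache's %v)
# and ends with the quoted user-agent string.
LOG_RE = re.compile(r'^(?P<host>\S+) .* "(?P<agent>[^"]*)"$')

def googlebot_hits_by_host(lines):
    """Count Googlebot requests per hostname."""
    counts = Counter()
    for line in lines:
        m = LOG_RE.match(line)
        if m and "Googlebot" in m.group("agent"):
            counts[m.group("host")] += 1
    return counts

# Placeholder sample lines standing in for real access-log entries.
sample = [
    'www.example.com 66.249.66.1 - - [10/May/2016:00:01:02 +0000] "GET / HTTP/1.1" 200 1234 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    'blog.example.com 66.249.66.1 - - [10/May/2016:00:01:03 +0000] "GET /post HTTP/1.1" 200 2345 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    'www.example.com 203.0.113.5 - - [10/May/2016:00:01:04 +0000] "GET / HTTP/1.1" 200 1234 "-" "Mozilla/5.0"',
]
print(googlebot_hits_by_host(sample))
# Counter({'www.example.com': 1, 'blog.example.com': 1})
```

Comparing daily counts per host over a few weeks should show whether the root domain's share of the crawl budget actually shrinks.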

I've been looking into data-driven internal link solutions such as BloomReach and Myers Media Group. I'd like to compare these to a few alternatives, but I'm having trouble identifying comparable products. Does anyone know of other SaaS internal linking solutions?

Our old sitemaps were full of 404s. We were submitting 32.6 million URLs (!!!), and 4 million of them were indexed (12%).

We rebuilt the sitemap generator and removed the vast majority of the 404s (now less than 1% 404 rate). We’re now submitting 6.9 million URLs, and we have 2.3 million indexed (33%).

Why do we now have fewer pages indexed if we’re submitting a higher-quality batch of URLs to Google? A higher percentage of the pages are indexed now, but the overall index count is still lower.

The new sitemaps have been live for about 2 weeks now, and the number of pages indexed is not increasing. Is this to be expected - does Google only index a certain percentage of total URLs submitted on larger sites like this? Or is something else at play here?

Does anyone have a sense of how Google evaluates pagespeed for SPA (single page application) sites? A site I'm working on was recently relaunched in React+Redux with isomorphic rendering, with an emphasis on site speed.

After a lot of performance enhancements, the site feels very fast. However, the initial payload is inherently large (resulting in 5-6s load times), while subsequent navigations are lightning fast: we use HTML5 pushState to update the URL for each one, and rather than fetching the entire page again, JS fetches just the data it needs from the backend.

I have several page speed monitoring solutions in place (Monitis, GA, New Relic) but none of them give direct insight into how Googlebot might be evaluating performance with this type of setup.

Do they evaluate each page on a URL-by-URL basis, and look at the initial payload (and corresponding slow-ish load time) for each? Or is it possible that once they load the initial JS/HTML, they continue to crawl from there? Are they essentially "refreshing" for each URL, therefore associating each URL with a higher load time?
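One rough way to approximate the worst case is to time every URL as a cold, stateless fetch, on the assumption that Googlebot doesn't carry SPA state between URLs and so pays the initial-payload cost on each one. A minimal sketch (the demo URL is a placeholder; swap in real page URLs — and note this only captures HTML fetch time, not client-side rendering):

```python
import time
from urllib.request import Request, urlopen

def cold_load(url, timeout=30):
    """Fetch a URL with no cached state; return (seconds, payload bytes)."""
    req = Request(url, headers={
        # Hypothetical UA string for illustration; not Googlebot's exact UA.
        "User-Agent": "Mozilla/5.0 (compatible; Googlebot/2.1)",
    })
    start = time.time()
    with urlopen(req, timeout=timeout) as resp:
        body = resp.read()
    return time.time() - start, len(body)

# Placeholder using a data: URL so the sketch runs offline;
# in practice you'd loop over your sitemap URLs instead.
for url in ["data:text/html,<html>placeholder page</html>"]:
    secs, size = cold_load(url)
    print(f"{url}: {secs:.3f}s, {size} bytes")
```

If every URL's cold load is in the 5-6s range, that's plausibly what a stateless crawler sees on each request, regardless of how fast pushState navigation feels to a warm browser.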

We all knew this one was coming. Time to get creative with your UX if you need information from your users!

Check out the latest track from my brother's new musical project, Starsick. Warning: it'll get stuck in your head.

Regarding search engines crawling JS, we all know that Google announced support, but has anyone heard of other search engines planning to follow suit?

I've been keeping my eyes/ears open, and have seen limited evidence of Bing crawling JS, but have not heard anything official.

Wanted to double-check here in case I missed something :)

Last year I spent a week riding through the Midwest. Finally took the time to write about the experience and share some photos.

Last week I shared a rich snippet I discovered where an automotive site had some of its proprietary ratings displayed in the SERPs without the help of structured markup (https://plus.google.com/u/1/+JohnVantine01/posts/57qGcEvJxDc).

I just came across another example that I wanted to share. My daughter and I have been playing a lot of Super Mario Maker for Wii U, and we were looking for level codes. Once again, Google is displaying them right in the SERPs without the assistance of structured markup. Pretty cool.

Anyone else notice anything like this recently?