Jannick Garthen
94 followers
Posts

Post has shared content
An update (March 2016) on the current state & recommendations for JavaScript sites / Progressive Web Apps [1] in Google Search. We occasionally see questions about what JS-based sites can do and still be visible in search, so here's a brief summary of today's state:

# Don't cloak to Googlebot. Use "feature detection" & "progressive enhancement" [2] techniques to make your content available to all users. Avoid redirecting to an "unsupported browser" page. Consider using a polyfill or other safe fallback where needed. The features Googlebot currently doesn't support include Service Workers, the Fetch API, Promises, and requestAnimationFrame.
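As a minimal sketch of the feature-detection approach described above (the helper names `hasFetch` and `getJSON` are hypothetical, not from the post): test for the API itself rather than sniffing the user agent, and fall back safely when it is missing.

```javascript
// Feature detection: check for the capability, not the browser.
function hasFetch(globalObj) {
  return typeof globalObj.fetch === 'function';
}

function getJSON(globalObj, url) {
  if (hasFetch(globalObj)) {
    return globalObj.fetch(url).then(function (res) { return res.json(); });
  }
  // Fallback for clients without the Fetch API (such as Googlebot at
  // the time of this post): load a polyfill, use XMLHttpRequest, or
  // rely on server-rendered content instead of failing outright.
  return Promise.reject(new Error('fetch unavailable; use a fallback'));
}
```

The same pattern applies to Service Workers, Promises, and requestAnimationFrame: detect, then enhance, so the baseline content stays reachable for every client.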

# Use rel=canonical [3] when serving content from multiple URLs is required.

# Avoid the AJAX-Crawling scheme on new sites. Consider migrating old sites that use this scheme soon. Remember to remove "meta fragment" tags when migrating. Don't use a "meta fragment" tag if the "escaped fragment" URL doesn't serve fully rendered content. [4]

# Avoid using "#" in URLs (outside of "#!"). Googlebot rarely indexes URLs with "#" in them. Use "normal" URLs with path/filename/query-parameters instead, and consider using the History API for navigation.
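A small sketch of the migration away from fragment URLs (the helper `toCrawlableUrl` is hypothetical): rewrite an old "#!" URL into a plain path URL, then let the History API handle in-page navigation.

```javascript
// Map an old "#!" fragment URL onto an equivalent path-based URL
// that Googlebot can index.
function toCrawlableUrl(url) {
  var idx = url.indexOf('#!');
  if (idx === -1) return url; // already a "normal" URL
  return url.slice(0, idx).replace(/\/$/, '') + url.slice(idx + 2);
}

// In the browser, navigate without touching the fragment:
// if (window.history && typeof history.pushState === 'function') {
//   history.pushState({}, '', toCrawlableUrl(location.href));
// }
```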

# Use Search Console's Fetch and Render tool [5] to test how Googlebot sees your pages. Note that this tool doesn't support "#!" or "#" URLs.

# Ensure that all required resources (including JavaScript files / frameworks, server responses, 3rd-party APIs, etc.) aren't blocked by robots.txt. The Fetch and Render tool will list any blocked resources it discovers. If resources are blocked by a robots.txt outside of your control (e.g., 3rd-party APIs) or are otherwise temporarily unavailable, ensure that your client-side code fails gracefully.
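One way to sketch that graceful failure (the function name and fallback shape here are hypothetical): catch the error and render the core content in a degraded mode instead of letting the whole page break.

```javascript
// Load optional widget data, but never let an unavailable resource
// (e.g. a robots.txt-blocked third-party API) take down the page.
function loadWidgetData(fetchFn, url) {
  return fetchFn(url)
    .then(function (res) {
      if (!res.ok) throw new Error('HTTP ' + res.status);
      return res.json();
    })
    .catch(function () {
      // Fall back to an empty, clearly-marked degraded state so the
      // rest of the page still renders for users and for Googlebot.
      return { items: [], degraded: true };
    });
}
```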

# Limit the number of embedded resources, in particular the number of JavaScript files and server responses required to render your page. A high number of required URLs can result in timeouts & rendering without these resources being available (e.g., some JavaScript files might not be loaded). Use reasonable HTTP caching directives.

# Google supports the use of JavaScript to provide titles, descriptions & robots meta tags, structured data, and other metadata. When using AMP, the AMP HTML page must be static as required by the spec, but the associated web page can be built using JS/PWA techniques. Remember to use a sitemap file with correct "lastmod" dates for signaling changes on your website.
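A sketch of providing the title and description meta tag from JavaScript (the helper `applyPageMeta` and the meta object are hypothetical; in the browser you would pass the real `document`):

```javascript
// Set the page title and description meta tag from client-side data.
function applyPageMeta(doc, meta) {
  doc.title = meta.title;
  var tag = doc.querySelector('meta[name="description"]');
  if (!tag) {
    // Create the meta tag if the server-rendered HTML didn't include one.
    tag = doc.createElement('meta');
    tag.setAttribute('name', 'description');
    doc.head.appendChild(tag);
  }
  tag.setAttribute('content', meta.description);
}
```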

# Finally, keep in mind that other search engines and web services accessing your content might not support JavaScript at all, or might support a different subset.

Looking at this list, none of these recommendations are completely new & limited to today -- and they'll continue to be valid for the foreseeable future. Working with modern JavaScript frameworks for search can be a bit intimidating at first, but they open up some really neat possibilities to make fast & awesome sites!

I hope this was useful! Let me know if I missed anything, or if you need clarifications for any part.

Links:
[1] PWA: https://developers.google.com/web/progressive-web-apps
[2] Progressive enhancement: https://en.wikipedia.org/wiki/Progressive_enhancement
[3] rel=canonical: https://support.google.com/webmasters/answer/139066
[4] AJAX Crawling scheme: https://developers.google.com/webmasters/ajax-crawling/docs/specification
[5] https://support.google.com/webmasters/answer/6066468

Post has attachment
This is just great! 
Best of all: "You can start already by adding our evergreen build badge to your README" :D

Post has attachment
I really hate these "download our app, it is so much better" layers. In most cases I just want to use the website, because it is as good as, or sometimes better than, the app. It is just annoying to click a close button, sometimes two of them, before accessing the content I am interested in. Furthermore, I really don't need apps consuming space and memory on my phone all the time! I just wonder if those website owners do not trust their own product...
Google+: A case study on App Download Interstitials
googlewebmastercentral.blogspot.jp

Post has shared content
Don't guess it, test it!
"The Hamburger Menu Doesn't Work": bit.ly/1h87cYa - good critique and design tips. Definitely an abused pattern...

Post has attachment
Let's follow this appeal to update picturefill! I just added a reminder for Monday morning ;-)

My team and I won a Sony Smartwatch at this year's #hackathonHH with our project LifeBits, where we collected data from the Sony Lifelog API to show the highlights of your day: https://github.com/garthenweb/lifebits
We had a lot of fun; thanks to the organizers and sponsors for making it happen!

Post has attachment
Good insights into HTTP/2
High Performance Browser Networking
chimera.labs.oreilly.com

The user agent string of the current Internet Explorer on the Windows 10 Developer Preview looks quite interesting:
Mozilla/5.0 (Windows NT 10.0; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/39.0.2171.71 Safari/537.36 Edge/12.0

Particularly in comparison to Chrome:
Mozilla/5.0 (Windows NT 10.0; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/41.0.2272.101 Safari/537.36

Do you know how certain manufacturers will use the WebView updater, or whether they are able to work around it? I tried the Galaxy S6 and found that the default browser was at version 38... As far as I know, only versions 37 and 40 have been released so far.

Post has shared content
I am still amazed by the cadence of browser updates:
2006: every 3 years 
2014: every 6 weeks

Getting performance upgrades, security improvements, and more features into users' and developers' hands at an ever-increasing pace.

https://speakerdeck.com/paulkinlan/this-is-the-web-platform?slide=8