Alessandro Cuomo
682 followers
Alessandro's posts

Post has shared content
Which speaker triggered the strongest emotional impact at the Wired Next Fest? Check out the analysis of the emotions in the talks in our post. #wnf16 #TSWexperience http://bit.ly/TSW-Emozioni-Wired

Post has shared content
An update (March 2016) on the current state & recommendations for JavaScript sites / Progressive Web Apps [1] in Google Search. We occasionally see questions about what JS-based sites can do and still be visible in search, so here's a brief summary for today's state:

# Don't cloak to Googlebot. Use "feature detection" & "progressive enhancement" [2] techniques to make your content available to all users. Avoid redirecting to an "unsupported browser" page. Consider using a polyfill or other safe fallback where needed (see the feature-detection sketch after this list). The features Googlebot currently doesn't support include Service Workers, the Fetch API, Promises, and requestAnimationFrame.

# Use rel=canonical [3] when serving content from multiple URLs is required.

# Avoid the AJAX-Crawling scheme on new sites. Consider migrating old sites that use this scheme soon. Remember to remove "meta fragment" tags when migrating. Don't use a "meta fragment" tag if the "escaped fragment" URL doesn't serve fully rendered content. [4]

# Avoid using "#" in URLs (outside of "#!"). Googlebot rarely indexes URLs with "#" in them. Use "normal" URLs with path/filename/query-parameters instead, and consider using the History API for navigation (see the History API sketch after this list).

# Use Search Console's Fetch and Render tool [5] to test how Googlebot sees your pages. Note that this tool doesn't support "#!" or "#" URLs.

# Ensure that all required resources (including JavaScript files / frameworks, server responses, 3rd-party APIs, etc.) aren't blocked by robots.txt. The Fetch and Render tool will list the blocked resources it discovers. If resources are blocked by robots.txt in ways you can't control (e.g., 3rd-party APIs) or are otherwise temporarily unavailable, ensure that your client-side code fails gracefully (see the graceful-failure sketch after this list).

# Limit the number of embedded resources, in particular the number of JavaScript files and server responses required to render your page. A high number of required URLs can result in timeouts & the page being rendered without these resources available (e.g., some JavaScript files might not be loaded). Use reasonable HTTP caching directives (see the caching sketch after this list).

# Google supports the use of JavaScript to provide titles, description & robots meta tags, structured data, and other metadata (see the metadata sketch after this list). When using AMP, the AMP HTML page must be static as required by the spec, but the associated web page can be built using JS/PWA techniques. Remember to use a sitemap file with correct "lastmod" dates for signaling changes on your website.

# Finally, keep in mind that other search engines and web services accessing your content might not support JavaScript at all, or might support a different subset.
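
A minimal feature-detection / progressive-enhancement sketch (illustrative only, not from the original post; the two helper functions are hypothetical):

```typescript
// Keep the server-rendered content for everyone; enhance only when the
// features Googlebot (and older browsers) may lack are actually present.
function renderEnhancedApp(): void {
  // Hypothetical enhanced client-side rendering path.
  document.body.insertAdjacentHTML("beforeend", "<p>Enhanced view</p>");
}

function keepBasicContent(): void {
  // Intentionally does nothing: the server-rendered HTML already contains
  // the content, so crawlers and unsupported browsers still see it.
}

const supportsModernFeatures =
  "serviceWorker" in navigator &&
  typeof window.fetch === "function" &&
  typeof Promise !== "undefined" &&
  typeof window.requestAnimationFrame === "function";

if (supportsModernFeatures) {
  renderEnhancedApp();
} else {
  keepBasicContent(); // or load a polyfill first, then enhance
}
```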
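
A History API navigation sketch with crawlable URLs (illustrative; `loadView` is a hypothetical rendering helper):

```typescript
// Links use normal paths (no "#" / "#!"), so they stay indexable even
// without JavaScript; client-side code intercepts clicks and updates the
// URL via the History API instead of a fragment.
function loadView(path: string): void {
  // Hypothetical: fetch and render the content that belongs to `path`.
  document.title = `Example page for ${path}`;
}

document.addEventListener("click", (event) => {
  const target = event.target as Element | null;
  const link = target?.closest("a");
  if (!link || link.origin !== location.origin) return;
  event.preventDefault();
  history.pushState({}, "", link.pathname); // e.g. /products/shoes, not #!/products/shoes
  loadView(link.pathname);
});

// Keep the view in sync when the user navigates back/forward.
window.addEventListener("popstate", () => loadView(location.pathname));
```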
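
A graceful-failure sketch for when a 3rd-party resource is blocked or unavailable (illustrative; the endpoint and element are hypothetical):

```typescript
// If the optional widget's API is blocked by its robots.txt or temporarily
// down, hide the widget instead of letting the whole page's rendering break.
async function loadReviewsWidget(container: HTMLElement): Promise<void> {
  try {
    const response = await fetch("https://reviews.example/api/reviews"); // hypothetical 3rd-party API
    if (!response.ok) throw new Error(`HTTP ${response.status}`);
    const reviews: string[] = await response.json();
    container.textContent = reviews.join(" · ");
  } catch {
    container.hidden = true; // core page content still renders without it
  }
}

const widget = document.querySelector<HTMLElement>("#reviews");
if (widget) loadReviewsWidget(widget);
```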
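
A caching sketch (illustrative; it assumes a plain Node.js server, but the same Cache-Control values apply whatever server you use):

```typescript
import { createServer } from "node:http";

createServer((req, res) => {
  if (req.url?.startsWith("/static/")) {
    // Long-lived caching for fingerprinted JS/CSS assets reduces how often
    // they have to be re-fetched when pages are rendered.
    res.setHeader("Cache-Control", "public, max-age=31536000, immutable");
  } else {
    // HTML: allow caching but revalidate on each request.
    res.setHeader("Cache-Control", "no-cache");
  }
  res.end("...");
}).listen(8080);
```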
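
A metadata sketch showing JavaScript-provided tags, which also covers the rel=canonical recommendation above (illustrative; the URLs and values are hypothetical):

```typescript
// Title, description, a rel=canonical link, and JSON-LD structured data,
// all injected into <head> from client-side code.
function setMeta(name: string, content: string): void {
  let tag = document.querySelector<HTMLMetaElement>(`meta[name="${name}"]`);
  if (!tag) {
    tag = document.createElement("meta");
    tag.name = name;
    document.head.appendChild(tag);
  }
  tag.content = content;
}

document.title = "Example product page";
setMeta("description", "Short, page-specific description.");

// Preferred URL when the same content is reachable from several URLs.
const canonical = document.createElement("link");
canonical.rel = "canonical";
canonical.href = "https://www.example.com/products/example";
document.head.appendChild(canonical);

// Structured data added dynamically as JSON-LD.
const jsonLd = document.createElement("script");
jsonLd.type = "application/ld+json";
jsonLd.textContent = JSON.stringify({
  "@context": "https://schema.org",
  "@type": "Product",
  name: "Example product",
});
document.head.appendChild(jsonLd);
```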

Looking at this list, none of these recommendations are completely new or limited to today -- and they'll continue to be valid for the foreseeable future. Working with modern JavaScript frameworks for search can be a bit intimidating at first, but they open up some really neat possibilities to make fast & awesome sites!

I hope this was useful! Let me know if I missed anything, or if you need clarifications for any part.

Links:
[1] PWA: https://developers.google.com/web/progressive-web-apps
[2] Progressive enhancement: https://en.wikipedia.org/wiki/Progressive_enhancement
[3] rel=canonical: https://support.google.com/webmasters/answer/139066
[4] AJAX Crawling scheme: https://developers.google.com/webmasters/ajax-crawling/docs/specification
[5] Fetch and Render: https://support.google.com/webmasters/answer/6066468

Post has shared content
We were at Search Marketing Connect: international SEO strategies, productivity, and updates. Read the recap by our SEO Manager +Alessandro Cuomo on our blog! http://bit.ly/TSW-SMConnect

Post has shared content
💻 Useful figures on Twitter users in Italy.
(PDF http://bit.ly/chi-usa-twitter-italia)

READ ALSO
The state of active and registered social media users in Italy and worldwide
http://bit.ly/Social-Media-User-2015

#SMM #SocialMedia #Twitter #TwitterItalia #Numeri #Statistiche

Post has shared content
Hi everyone,

I'd like to have a chat/hangout with the Italian experts or webmasters who have worked on cleaning up hacked sites, who have received manual actions because of hacked sites, or who have seen the label in the SERPs. Above all, I'd like to find out who these experts are, how many sites they "clean up", and what kinds of services they offer if they do this professionally. The goal is to understand the problem better, find solutions together, and improve communication on the topic :) If you're interested in taking part, let me know :)

Post has shared content
There are more than 600 million internet users in China, most of whom use mobile devices and social media. Find out how to engage these people with a digital marketing strategy for China on the TSW blog: http://bit.ly/1eiyPeA

Post has shared content
Here's the TL;DR of what we found by testing Googlebot and JavaScript:

1. Google can execute and then follow many, many different types of JS links (see the links-and-redirects sketch after this list).

2. Google can execute and follow several types of JS redirects, and the result is similar to how a 301 updates the index and preserves rank.

3. Google can read the DOM and then index dynamically inserted content (see the dynamic-content sketch after this list).

4. Google can read metadata inserted in the DOM, such as rel=canonical tags, robots noindex tags, and meta descriptions. We even inserted structured data dynamically, and it resulted in rich snippets in Google's SERP.
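
A links-and-redirects sketch illustrating the kinds of JavaScript navigation the first two findings refer to (generic examples, not the actual Merkle test pages):

```typescript
// A link whose navigation happens purely in JavaScript.
const jsLink = document.createElement("a");
jsLink.textContent = "Category page";
jsLink.href = "javascript:void(0)";
jsLink.addEventListener("click", (event) => {
  event.preventDefault();
  window.location.href = "/category/shoes"; // target URL only exists in JS
});
document.body.appendChild(jsLink);

// A JavaScript redirect; per the tests it behaves much like a 301.
if (window.location.pathname === "/old-page") {
  window.location.replace("https://www.example.com/new-page");
}
```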
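
A dynamic-content sketch in the sense of the third finding (illustrative; the endpoint is hypothetical):

```typescript
// Text that never appears in the initial HTML response, only in the DOM
// after client-side code runs.
async function injectDescription(): Promise<void> {
  const res = await fetch("/api/product/123/description"); // hypothetical endpoint
  const { text } = (await res.json()) as { text: string };

  const section = document.createElement("section");
  section.id = "description";
  section.textContent = text;
  document.querySelector("main")?.appendChild(section);
}

injectDescription();
```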

These tests were designed and executed by the +Merkle SEO technical team, led by +Jody O'Donnell and +Max Prin.