Profile

Robert Ramirez
Works at Bruce Clay, Inc.
Attended UCLA
Lives in South Pasadena, CA
785 followers | 112,092 views

Stream

Robert Ramirez

Shared publicly
+Google Webmasters Structured Data Testing Tool gets a new design, moves to a new address #structureddata #semanticweb #SEO. Wonder why they double-index it ;) Perhaps they forgot to tag it semantically :0 (which they did; not a bit of structured data on the page, actually not even a rel=canonical). +John Mueller BTW, isn't the color a bit Bingish?
11 comments on original post

Robert Ramirez

Shared publicly
schema.org v3.0 Released

Full release details at the call-out link, but here are some highlights.

New extension: http://meta.schema.org

Terms "primarily designed to support the implementation of the Schema.org vocabulary itself."

New extension: http://pending.schema.org

A "staging area for work-in-progress terms which have yet to be accepted into the core vocabulary."

New extension: http://health-lifesci.schema.org

A "new home for our existing Medical and healthcare related terms."

*New document, "How we work": http://schema.org/docs/howwework.html*

A "document providing an overview of the project's approach to schema development, collaboration, versioning and change review." A treasure trove for those interested in, or who have a vested interest in, the development of the vocabulary.

*New property: http://schema.org/disambiguatingDescription*

A sub property of description, "used to disambiguate from other, similar items."
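
To make the new property concrete, here is a minimal sketch (the product names and wording are hypothetical) of building a JSON-LD block that uses disambiguatingDescription alongside description:

```python
import json

# Hypothetical example: a product whose name alone could be confused with
# a similar item, distinguished via the new disambiguatingDescription
# property (a sub-property of description).
product = {
    "@context": "http://schema.org",
    "@type": "Product",
    "name": "Widget Pro",
    "description": "A professional-grade widget.",
    "disambiguatingDescription": "The 2016 model, not the original Widget Pro.",
}

# Serialize for embedding in a <script type="application/ld+json"> block.
print(json.dumps(product, indent=2))
```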

audience, brand, logo, isRelatedTo, isSimilarTo

These previously Product-specific properties can now also be applied to Service.

*New class: http://schema.org/DigitalDocument*

"An electronic file or document", with sub-types NoteDigitalDocument, PresentationDigitalDocument, SpreadsheetDigitalDocument, and TextDigitalDocument.

*New type: http://schema.org/ComputerLanguage*

With modifications to http://schema.org/Language to stress that it refers to natural languages, and with support for standard BCP 47 language codes via the alternateName property (for SEOs, this brings the functionality of language types and properties close to that provided by hreflang).

*New enumeration: http://schema.org/Monday*

And (surprise!) the rest of the days of the week as well, replacing the purl.org/goodrelations URIs on which these are based - thanks +Alexandre Bertails! Also, http://schema.org/specialOpeningHoursSpecification has been added, which allows webmasters to override general opening hours.
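
A minimal sketch of how the new Monday enumeration and specialOpeningHoursSpecification could combine in JSON-LD (the business name and dates here are hypothetical):

```python
import json

# Hypothetical example: a business open Mondays 09:00-17:00, with a
# specialOpeningHoursSpecification entry overriding the general hours
# (no opens/closes given, i.e. closed) for a holiday date range.
business = {
    "@context": "http://schema.org",
    "@type": "LocalBusiness",
    "name": "Example Store",
    "openingHoursSpecification": [{
        "@type": "OpeningHoursSpecification",
        "dayOfWeek": "http://schema.org/Monday",  # the new enumeration value
        "opens": "09:00",
        "closes": "17:00",
    }],
    "specialOpeningHoursSpecification": [{
        "@type": "OpeningHoursSpecification",
        "validFrom": "2016-12-25",
        "validThrough": "2016-12-26",
    }],
}

print(json.dumps(business, indent=2))
```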

Not seeing the current version? I am, but +Dan Brickley notes: "there seem to be some further issues with old versions of pages being cached. It was looking like this had resolved itself, but I see now that this isn't the case. At least the homepage and /docs/schemas.html seem to be intermittently showing old versions of those pages. You can force a fresher version by appending '/' to the domain name, e.g. http://schema.org////docs/schemas.html, but obviously this needs attention."

#schemaorg  
Schema.org is a set of extensible schemas that enables webmasters to embed structured data on their web pages for use by search engines and other applications.
1 comment on original post

Robert Ramirez

Shared publicly
Bing releases preview of its new search API

I thought this was an interesting tidbit for members of this Community:

The new APIs are REST APIs that follow the latest structured data standards (Schema.org, JSON-LD), making them easy to implement, with the same reliability and support that has made Bing a trusted search service for many industry leaders.

#bing #structureddata #apis
The Bing team knows that developers want solutions that are easy to understand, simple to implement, and can make your apps and experiences smarter and more engaging for your users. Based on feedback from developers, partners and customers, we have created the next-generation of search APIs. Today we are giving a preview of the new...
8 comments on original post

Robert Ramirez

Shared publicly
Mandatory reading for #seo practitioners.
 
An update (March 2016) on the current state & recommendations for JavaScript sites / Progressive Web Apps [1] in Google Search. We occasionally see questions about what JS-based sites can do and still be visible in search, so here's a brief summary for today's state:

# Don't cloak to Googlebot. Use "feature detection" & "progressive enhancement" [2] techniques to make your content available to all users. Avoid redirecting to an "unsupported browser" page. Consider using a polyfill or other safe fallback where needed. The features Googlebot currently doesn't support include Service Workers, the Fetch API, Promises, and requestAnimationFrame.

# Use rel=canonical [3] when serving content from multiple URLs is required.

# Avoid the AJAX-Crawling scheme on new sites. Consider migrating old sites that use this scheme soon. Remember to remove "meta fragment" tags when migrating. Don't use a "meta fragment" tag if the "escaped fragment" URL doesn't serve fully rendered content. [4]
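
For sites still migrating off the deprecated scheme, this is roughly the URL mapping it defined (a sketch based on the spec at link [4]; useful for recognizing both forms in crawl logs while you migrate):

```python
from urllib.parse import quote

def escaped_fragment_url(url: str) -> str:
    """Map a '#!' URL to its '_escaped_fragment_' form, per the
    (now-deprecated) AJAX crawling scheme."""
    if "#!" not in url:
        return url
    base, fragment = url.split("#!", 1)
    # Append to an existing query string if there is one.
    sep = "&" if "?" in base else "?"
    # The scheme URL-escapes special characters in the fragment ('=' stays).
    return base + sep + "_escaped_fragment_=" + quote(fragment, safe="=/")

print(escaped_fragment_url("http://example.com/page#!key=value"))
# http://example.com/page?_escaped_fragment_=key=value
```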

# Avoid using "#" in URLs (outside of "#!"). Googlebot rarely indexes URLs with "#" in them. Use "normal" URLs with path/filename/query-parameters instead, and consider using the History API for navigation.

# Use Search Console's Fetch and Render tool [5] to test how Googlebot sees your pages. Note that this tool doesn't support "#!" or "#" URLs.

# Ensure that all required resources (including JavaScript files / frameworks, server responses, 3rd-party APIs, etc.) aren't blocked by robots.txt. The Fetch and Render tool will list the blocked resources it discovers. If resources are blocked by robots.txt files outside your control (e.g., 3rd-party APIs) or otherwise temporarily unavailable, ensure that your client-side code fails gracefully.

# Limit the number of embedded resources, in particular the number of JavaScript files and server responses required to render your page. A high number of required URLs can result in timeouts & rendering without these resources being available (e.g., some JavaScript files might not be loaded). Use reasonable HTTP caching directives.

# Google supports the use of JavaScript to provide titles, description & robots meta tags, structured data, and other meta-data. When using AMP, the AMP HTML page must be static as required by the spec, but the associated web page can be built using JS/PWA techniques. Remember to use a sitemap file with correct "lastmod" dates for signaling changes on your website.
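
The sitemap advice above can be sketched as follows (the URLs and dates are hypothetical); the point is that each <url> entry carries a lastmod reflecting when the page actually last changed:

```python
from datetime import date
from xml.sax.saxutils import escape

# Hypothetical page list: (URL, date of last meaningful change).
pages = [
    ("https://example.com/", date(2016, 3, 14)),
    ("https://example.com/products", date(2016, 3, 1)),
]

entries = "".join(
    "<url><loc>%s</loc><lastmod>%s</lastmod></url>" % (escape(loc), d.isoformat())
    for loc, d in pages
)
sitemap = (
    '<?xml version="1.0" encoding="UTF-8"?>'
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">'
    + entries + "</urlset>"
)
print(sitemap)
```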

# Finally, keep in mind that other search engines and web services accessing your content might not support JavaScript at all, or might support a different subset.

Looking at this list, none of these recommendations are completely new & limited to today -- and they'll continue to be valid for the foreseeable future. Working with modern JavaScript frameworks for search can be a bit intimidating at first, but they open up some really neat possibilities to make fast & awesome sites!

I hope this was useful! Let me know if I missed anything, or if you need clarifications for any part.

Links:
[1] PWA: https://developers.google.com/web/progressive-web-apps
[2] Progressive enhancement: https://en.wikipedia.org/wiki/Progressive_enhancement
[3] rel=canonical: https://support.google.com/webmasters/answer/139066
[4] AJAX Crawling scheme: https://developers.google.com/webmasters/ajax-crawling/docs/specification
[5] https://support.google.com/webmasters/answer/6066468
29 comments on original post

Robert Ramirez

Shared publicly
This article is really great. Lots of actionable insights. #seo
 
+David Harry, +Bill Slawski, Hamlet Batista, and I contributed to this post on Ecommerce SEO. I hope it helps along the way.
Four SEO experts talk about the most common SEO issues eCommerce websites face

Robert Ramirez

Shared publicly
WHAT?! Native advertising allowing advertisers to buy SEO?

+aimClear has exposed a "loophole" in Google's organic algorithm — a case where advertiser-created content can be indexed for Google News and Web organic search along with a publisher's editorial content.

The case involves +Mashable and its BrandSpeak program, which enables a brand to pay $20K to have an article published on the Mashable platform. Though links to the brand's website are nofollowed, the article itself can rank organically both in Google News and web search results.

With this violation of Google's intent, the FTC's guidelines, and the basic principles of #SEO exposed, it can't be long before Google closes the loophole that lets an advertorial take up space in the search results that belongs to true organic listings.

A fascinating read by +Marty Weintraub - 
http://www.aimclearblog.com/2016/02/03/seo-for-sale-exposing-google-loopholes-in-light-of-ftc-native-ad-guidelines/

Robert Ramirez

Shared publicly
I wish Google would stop doing things like this. I have a very specific set of URLs I would like indexed; I would prefer that Google didn't try to generate variations I never link to out of the blue. I foresee issues here for #webmasters and #seo.
3 comments
 
If you don't want your HTTPS pages to be indexed you can just use a noindex. But if we can crawl them and if the TLS certificate is valid, we will index them and treat them as canonical pages.

Robert Ramirez

Shared publicly
New Title & Description Lengths for Google SEO in Search Results AKA Longer Titles Yay!
If you haven’t given your title and description meta tags some love lately, it might be a good thing you procrastinated… there are new title and description lengths in the Google Search Results. Google has increased the width of the search results for nearly all users (although they could roll it back at any time). …

Robert Ramirez

Shared publicly
A second mobile-friendly #SEO ranking boost is coming, says Google. Will there be degrees of mobile friendliness? Do AMP pages get a boost? +Robert Ramirez explains what you need to know.
Google says it's turning up the volume on the ranking boost given to mobile-friendly sites starting this May. Can your site capitalize on these gains?

Robert Ramirez

Shared publicly
A public service announcement from +John Mueller...
 
I was using my phone more on the weekend, and your mobile-friendly sites blew me away. Way too many of these are just horrible. Subscription interstitials, app interstitials, browser popups asking for my location, impossible-to-fill-out search forms, login interstitials, tiny UI elements, cookie & age interstitials, "you're in the wrong country, idiot" interstitials, full-screen ads, "add to homescreen" overlays, etc. One - popular & well-known - site had four levels of popups/overlays on a page. Four, Bob. Tip: it's not a contest to see who can hassle the user in the most obnoxious ways. If people aren't using your mobile website to convert and buy something, do you think this will help?

I was just going to rant about this internally, and I know Google doesn't always do it perfect either, but seriously: webmasters of mobile websites, you folks need to get your act together. Get these sites fixed, purge all those user-hateful UI patterns for good.

Here's an idea: do something that would require a bit of research (where you need to use a variety of sites) and use your phone instead. Remember the design patterns that drive you crazy, and be vocal when someone considers implementing them on sites you work on. 

(FWIW this is purely a personal post)
22 comments on original post

Robert Ramirez

Shared publicly
Google’s new Resizer tool lets mobile Web designers and developers test any URL
Google's new tool for mobile Web designers and developers is really fun (and powerful)
4 comments on original post

Robert Ramirez

Shared publicly
Good stuff from +John Mueller. I had always submitted old and new versions of XML sitemaps, but I had neglected to include/update the modified date. Makes perfect sense. #seo
 
Planning on moving to HTTPS? Here are 13 FAQs! What's missing? Let me know in the comments and I'll expand this over time, perhaps it's even worth a blog post or help center article. Note that these are specific to moving an existing site from HTTP to HTTPS on the same hostname. Also remember to check out our help center at https://support.google.com/webmasters/answer/6073543

# Do I need to set something in Search Console? No, just add the HTTPS site there. The change-of-address setting doesn't apply for HTTP -> HTTPS moves.

# How can we do an A/B test? Don't cloak to Googlebot specifically; use 302 redirects + rel=canonical to HTTP if you want to test HTTPS but not have it indexed. Don't block via robots.txt. More about A/B testing at https://googlewebmastercentral.blogspot.ch/2012/08/website-testing-google-search.html (302 redirects aren't cached.)

# Will the rel=canonical guarantee that the HTTP URL is indexed? No, but it's a very strong signal when picking the indexed URL.

# What's the next step after testing? Follow our site-move documentation ( https://support.google.com/webmasters/answer/6033049 ). Use 301 redirects from HTTP to HTTPS, confirm the new version by adding a rel=canonical on the HTTPS page, pointing to itself, and submit sitemaps including both HTTP & HTTPS URLs with new change-dates (in the long run, just keep the HTTPS sitemap).
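
The redirect step can be sketched as a simple URL mapping: for a same-hostname move only the scheme changes, so a 301 target is computed like this (an illustration of the mapping, not production redirect code):

```python
from urllib.parse import urlsplit, urlunsplit

def https_redirect_target(url: str) -> str:
    """Compute the 301 redirect target for a same-host HTTP -> HTTPS move:
    only the scheme changes; host, path, and query are preserved."""
    parts = urlsplit(url)
    if parts.scheme != "http":
        return url  # already HTTPS (or something else): nothing to redirect
    return urlunsplit(("https",) + tuple(parts)[1:])

print(https_redirect_target("http://example.com/page?x=1"))
# https://example.com/page?x=1
```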

# What about the robots.txt file? The HTTPS site uses the HTTPS robots.txt file. Check that it's reachable or serves a 404 result code, and check that your HTTP URLs aren't blocked by the HTTP robots.txt file.

# Is it OK to have just some pages on HTTPS? Yes, no problem! Start with a part, test it, add more.

# Should I move everything together, or is it fine to do sections? Moving in sections is fine.

# Will I see a drop in search? Fluctuations can happen with any bigger site change. We can't make any guarantees, but our systems are usually good with HTTP -> HTTPS moves.

# Which certificate do I need? For Google Search, any modern certificate that's accepted by modern browsers is acceptable.

# Do I lose "link juice" from the redirects? No, for 301 or 302 redirects from HTTP to HTTPS no PageRank is lost.

# Will we see search keywords in Google Analytics when we're on HTTPS? This won't change with HTTPS, you can see the search queries in Search Console.

# How can I test how many pages were indexed? Verify HTTP / HTTPS separately in Search Console, use Index Status for a broad look, or the sitemaps indexed counts for sitemap URLs.

# How long will a move from HTTP to HTTPS take? There are no fixed crawl frequencies, it depends on the size of your site, and the speed of crawling that's possible. The move takes place on a per-URL basis.


Hope this helps clarify some of the open questions! Let me know if there's anything missing.

97 comments on original post
People
In his circles
1,132 people
Have him in circles
785 people
AUDREY MANCINI's profile photo
scott polk's profile photo
Arthur Navarette's profile photo
Luis Angel Espinoza's profile photo
Ron Burress's profile photo
Gary Lovelace Jr.'s profile photo
Marty Vettel's profile photo
Harvey Levin - TMZ Fan Page's profile photo
Lộc Phát's profile photo
Education
  • UCLA
    1992 - 1997
  • Loyola High School
    1988 - 1992
Basic Information
Gender
Male
Looking for
Networking
Birthday
June 3
Relationship
In a relationship
Story
Tagline
SEO Manager at Bruce Clay, Inc., Internet Visibility Specialist, Legal Marketing Consultant, Digital Marketing Junkie, Writer, Blogger, Foodie, Father, Husband, Man About Town
Introduction
I am the SEO Manager at Bruce Clay, Inc. and an SEO Analyst in my own right. I enjoy helping businesses increase their online revenue through search engine optimization. I specialize in legal internet marketing and enjoy helping businesses succeed in getting their piece of the online revenue pie.
Work
Occupation
SEO, Internet Marketing Strategist, Legal Marketing Expert, Online Visibility Consultant
Skills
SEO, SEM, PPC, Online Marketing, Website Design, Website Development
Employment
  • Bruce Clay, Inc.
    SEO Manager, 2015 - present
  • Bruce Clay, Inc.
    SEO Analyst, 2013 - 2015
  • Local ReZults
    CTO, 2011 - 2013
  • Banafsheh, Danesh & Javid, P.C.
    Director of Online Marketing (In-House), 2012 - 2013
  • Increase Visibility
    Chief SEO Analyst, 2010 - 2011
  • Sipe & Associates
    SEO Analyst, 2007 - 2010
Places
Currently
South Pasadena, CA
Previously
Eagle Rock, CA - San Francisco - Oakland