Jeevan Katwaru
always stare up at the stars, not down at your feet

Jeevan's posts

Our client recently switched to a full JavaScript site (Angular). We noticed all their review rich snippets have disappeared from the SERPs, even though the markup is on the site and validates in the rich snippet testing tool. Are Angular and rich snippets a known issue? Has anyone else experienced something like this?
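For reference, this is roughly what review markup looks like when it is present in the rendered HTML (product name and rating values here are placeholders). If Angular only injects this after JavaScript executes, the testing tool — which renders the page — may validate it even though the markup never appears in the raw HTML a non-rendering crawler sees:

```html
<!-- Hypothetical JSON-LD review snippet; names and values are placeholders. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Widget",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.6",
    "reviewCount": "128"
  }
}
</script>
```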

Display: None for Mobile/Duplicate Content?

Basically, the problem is that the order of the sections (on a new design) is slightly different in the mobile layout compared to desktop. Because of this, one approach the dev team is considering is to duplicate the content in the HTML of the page, then use CSS to hide one of the duplicates on desktop and the other on mobile.

Do you think this will be flagged as cloaking or perceived as keyword stuffing?
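For what it's worth, if the sections are siblings in the markup, this kind of reordering can often be done in CSS alone without duplicating any content. A minimal sketch, assuming hypothetical section classes:

```css
/* Hypothetical markup: a <main> containing .hero, .features, .reviews as siblings. */
main {
  display: flex;
  flex-direction: column;
}

/* Desktop order: hero, features, reviews */
.hero     { order: 1; }
.features { order: 2; }
.reviews  { order: 3; }

/* Mobile: swap reviews above features without touching the HTML */
@media (max-width: 767px) {
  .reviews  { order: 2; }
  .features { order: 3; }
}
```

This keeps a single copy of each section in the HTML, so there is nothing hidden to be perceived as cloaking.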

Apple Maps Bulk Upload
I have a client with about 700 locations. We have bulk uploading set up in Google and Bing, but we're trying to figure this out with Apple. According to what we've found, they only allow bulk uploads for businesses with over 10K locations — is that true?

Also, we are trying to update one location with UTM params to track traffic, but Apple does not seem to allow campaign URLs. Does anybody have experience with this?
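For context, the campaign URL in question is just the location's landing page with standard UTM query parameters appended, along these lines (domain, path, and parameter values are placeholders):

```
https://www.example.com/locations/store-123?utm_source=apple_maps&utm_medium=organic&utm_campaign=local_listings
```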

I have a US .com client whose hosting and server are located in Germany (German IP address). Search Console is set to target the US, but the client cannot move to a US server. I know it's considered best practice to have an IP address in the target country; will a CDN using US IPs be good enough?

I recently discovered that over 80% of the blog posts on my client's website are duplicate content. They had authors contributing posts to the site, but didn't realize the authors were just copying and pasting the content from their own personal sites. The personal site posts came first, obviously.

There's a ton of content. I have a few options:
1. Block the duplicate posts in robots.txt, post by post.
2. Add noindex,nofollow to the duplicate posts.
3. Block the entire blog directory in robots.txt and create a new blog directory.

404'ing the pages is not an option here.

Any ideas? 
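For clarity on what options 1 and 2 look like in practice (paths are placeholders): option 1 is a robots.txt rule per post, option 2 is a meta robots tag in each duplicate post's head. One caveat worth noting: robots.txt blocks crawling but doesn't necessarily remove already-indexed URLs, while noindex requires the page to remain crawlable in order to be seen.

```
# Option 1: robots.txt, one rule per duplicate post (hypothetical paths)
User-agent: *
Disallow: /blog/duplicate-post-1/
Disallow: /blog/duplicate-post-2/
```

```html
<!-- Option 2: meta robots tag in the <head> of each duplicate post -->
<meta name="robots" content="noindex,nofollow">
```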

So, per Google's new recommendation on Ajax, they can now crawl and render a page the way a normal browser would.

What if the main LP has tabs rendered via Ajax using fragment URLs? Each tab has unique content and a clean URL of its own that's in the index.

Since Google can now crawl the main LP with all the content, and the tabbed pages are in the index too, isn't that duplicate content?

Viewing the text-only cached versions of both pages from the SERPs, I see the same content.

I've been moving a client over to GTM while they still have the hard-coded GA snippet on the site. For GTM I'm using a new property ID under the same account. Both properties are collecting data, but the GTM property never matches the hard-coded one: it consistently under-reports sessions (off by an average of 6,000). Also, the bounce rate in GTM is 10% while the hard-coded property looks normal (60%).

Any tips? 
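One pattern worth ruling out — an assumption on my part, not a diagnosis: a bounce rate that drops to around 10% after a GTM migration is often caused by an event tag firing on page load as an interaction hit. In Universal Analytics terms, marking such an event as non-interaction keeps it from suppressing bounce rate; the event names below are placeholders:

```javascript
// Hypothetical analytics.js event; 'Scroll'/'pageload' are placeholder names.
// nonInteraction: true keeps this hit from counting as engagement,
// so it no longer drives the bounce rate toward zero.
ga('send', 'event', 'Scroll', 'pageload', {
  nonInteraction: true
});
```

In GTM, the equivalent is the "Non-Interaction Hit" setting on the Universal Analytics event tag.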

What is the best way to install two Google Analytics tracking IDs via GTM? Is it a matter of installing two tags with different UA numbers?
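For comparison, the hard-coded analytics.js equivalent of two tags is two named trackers, each sending its own hits (the UA IDs below are placeholders):

```javascript
// Hypothetical analytics.js setup with two trackers; UA IDs are placeholders.
ga('create', 'UA-XXXXX-1', 'auto');            // default tracker
ga('create', 'UA-YYYYY-2', 'auto', 'second');  // named second tracker
ga('send', 'pageview');                        // pageview for the default tracker
ga('second.send', 'pageview');                 // pageview for the second tracker
```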

Organic Coming in as Direct
We have a Drupal site where, after relaunching on an updated CMS, all organic traffic is coming in as direct. I tested in real time, and it happens on all browsers and every device. We're using the basic Universal Analytics snippet, and the code is in the header. Any ideas?

Think there's a correlation here? We have a pretty large site (300K URLs), and on days when "pages crawled per day" was high, we were getting indexed pretty quickly. Now that crawls per day have dropped, indexation has stalled, and I've noticed the time spent downloading a page has increased. Anyone else seen this?