Stream

Mark Taylor

General Tech Talk  - 
 
From an SEO perspective, is there anything I should be mindful of when moving our company's eCommerce website to PHP 7?
7 comments
 
Well, there's no urgency to migrate to 7 for a while. I volunteer on the PrestaShop open-source shopping cart, and we're only now adding unit tests for PHP 7.

Mark Taylor

General Tech Talk  - 
 
I'm seeing this non-secure warning (see image) on UK sites (it might be elsewhere too) for non-HTTPS sites.

Does anyone know if this is a test or a roll-out?

How do people feel about https becoming a ranking factor going forward?
5 comments
 
+Dave Ashworth true that

Łukasz Rogala

General Tech Talk  - 
 
Hi!

So we're basically almost done with our website and have started the indexation process. Everything has been added to GSC (starting from today, so it's a fresh topic).

The general idea is to redirect users to the proper website version based on their user-agent language (if you come from Germany we redirect you to the domain.de version, from Poland to the .pl one, and so on).

After adding the sitemap.xml for the .co.uk version of the website, I've noticed that Google Search Console returns a lot of errors saying the URLs are blocked by robots.txt (I guess it's because the service was locked until today).

To dig a little deeper into the case, I've put some URLs from the sitemap.xml into the "Fetch as Google" tool and noticed that it reports a redirect to the .com (x-default) version of the website.

Is it because:

1. We have an x-default URL in the <head> pointing to our .com domain (used when we don't have your language version)?
2. Google has problems with this kind of solution?
3. Something else?

I didn't think it would be so hard, based on how Skyscanner.com does it. :)

I can share the URL with you guys via private message and discuss it a little bit more.
17 comments
 
Some of the popular sites using hreflang: https://nerdydata.com/search?query=hreflang - it might help to look into the ones where you see the right set of data for country-specific queries.
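
For reference, with the x-default setup described in the post, each page would normally list every language version plus the x-default fallback. A minimal Python sketch that prints such a tag set (the domains are illustrative, based on the versions mentioned above):

# Minimal sketch: print a complete hreflang cluster, including the x-default
# fallback, for one page. Domains are illustrative, based on the post.
versions = {
    "en-gb": "https://www.domain.co.uk",
    "de": "https://www.domain.de",
    "pl": "https://www.domain.pl",
    "x-default": "https://www.domain.com",  # served when no language version matches
}

def hreflang_tags(path):
    # Every language version of the page should output the whole set, itself included.
    return "\n".join(
        '<link rel="alternate" hreflang="%s" href="%s%s" />' % (lang, root, path)
        for lang, root in versions.items()
    )

print(hreflang_tags("/some-page/"))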

Craig J. Mount

Technical Audits  - 
 
I have always been curious about URLs that are temporary but still need to be indexed. For example: an eCommerce store with a limited-supply product, a car dealership with a specific used car, or a real estate company with a listing - none of these need to be indexed for long. What is best practice for keeping a clean house?

Would the site redirect the transient URLs once they become 404s? Wait for Google to de-index them? Something creative like a custom 404 page with links? I'm sure this is case-by-case, but I get the feeling there is likely a general guideline.

Curious to hear thoughts...




3 comments
 
Just return 404. You could use sitemaps to speed up de-indexing (once de-indexed, the URLs should be removed from the sitemap).
301 is fine if you have a (very) similar product. But don't 301 to the homepage or category pages because such redirects will be treated as soft 404.
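
As an illustration of the sitemap point above, a rough Python sketch (using the third-party requests library; the sitemap URL is a placeholder) that lists sitemap entries now returning 404 or 410, so they can be dropped once Google has de-indexed them:

# Rough sketch: list sitemap entries that now return 404/410, so they can be
# removed from the sitemap once Google has de-indexed them.
# Uses the third-party "requests" library; the sitemap URL is a placeholder.
import xml.etree.ElementTree as ET
import requests

SITEMAP_URL = "https://www.example.com/sitemap.xml"
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

sitemap = requests.get(SITEMAP_URL, timeout=10)
root = ET.fromstring(sitemap.content)
urls = [loc.text.strip() for loc in root.findall(".//sm:loc", NS)]

for url in urls:
    status = requests.head(url, allow_redirects=True, timeout=10).status_code
    if status in (404, 410):
        print(status, url)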

Nikko Marasigan

General Tech Talk  - 
 
Hey everyone,

Not sure where to put this question.
1. We have the non-www version as the preferred version on Google Search Console
2. The www version is set as the URL on the PPC campaign

Will there be any problems with this setup? Thanks!


 
There's no problem with this setup. The PPC user will either land and stay on the www URL or be redirected to the non-www, depending on your setup.



Will Hattman

General Tech Talk  - 
 
Hey there everybody. Let's talk 301 redirects.

Specifically, let's talk about why in the world Google would retain a URL in its index after it has been 301 redirected to a new URL.

I have a client whose site has undergone all kinds of changes over the years to its IA, its content, and (most recently) its platform, and every time we 301 redirect an expiring URL as part of one of these changes, the old one stays in the index. So for every page on the site that's ever undergone such a change, the current version has to compete for visibility in the SERPs with all the ghosts of its former selves.

I've never had this problem with any other site in my life. How about you?
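
As a basic sanity check in situations like this, a quick Python sketch (third-party requests library; the URLs are placeholders) that prints the redirect chain and final status for each retired URL, to confirm every old URL really resolves in a single clean 301 hop:

# Quick sanity check: print the redirect chain for each retired URL to confirm
# it resolves in a single 301 hop to the intended destination.
# Uses the third-party "requests" library; the URLs below are placeholders.
import requests

old_urls = [
    "https://www.example.com/old-section/old-page/",
    "https://www.example.com/2014/another-old-page/",
]

for url in old_urls:
    resp = requests.get(url, allow_redirects=True, timeout=10)
    hops = [(r.status_code, r.url) for r in resp.history]  # intermediate redirects
    hops.append((resp.status_code, resp.url))               # final destination
    print(url)
    for status, hop_url in hops:
        print("  %s  %s" % (status, hop_url))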
6 comments
 
Thanks, all. I appreciate it!

Martin Reed

General Tech Talk  - 
 
Your site is being crawled at a rate calculated as optimal by Google.

Hi guys, I'm working with a client whose pages crawled per day in Search Console has dropped to a tenth of its previous rate and flatlined for the past two months. Looking at the crawl rate in site settings, the usual option to select a crawl rate (or let Google decide) has been replaced with the message above.

We've improved server performance and capacity, removed unnecessary rate limiting, and have seen the number of 5xx server errors drop. We're also working on improving page speed to reduce the time spent downloading a page. The crawl rate is slowly increasing - a few thousand more pages a day - but it very much feels like this would have happened anyway as Google works out the 'optimal' rate, and it's still well short of its former glory.

Any suggestions on how we can get the crawl rate back up? It’s becoming problematic because we’re finding that new content isn’t getting indexed.

I did find a page that lets you report a problem with how Googlebot crawls your site (https://www.google.com/webmasters/tools/googlebot-report) but it seems to be intended for limiting crawls.

Many thanks,
Martin
8 comments
 
Hmm, there's no significant increase in the time to download a page. I'll check out getting TTFB added to the logs - that's a good one to know. Thanks, Alistair.
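
For anyone wanting to cross-check the "pages crawled per day" graph against the server's own records, a rough Python sketch (assuming Apache/nginx combined-format access logs; the log path is a placeholder) that counts Googlebot requests per day:

# Rough sketch: count Googlebot requests per day from an access log, as a
# cross-check against the "pages crawled per day" graph in Search Console.
# Assumes Apache/nginx combined log format; the log path is a placeholder.
# (Strictly, requests claiming to be Googlebot should be verified via reverse DNS.)
import re
from collections import Counter
from datetime import datetime

LOG_PATH = "/var/log/nginx/access.log"
date_re = re.compile(r"\[(\d{2}/\w{3}/\d{4})")  # e.g. [03/Oct/2016:10:21:45 +0000]

per_day = Counter()
with open(LOG_PATH) as log:
    for line in log:
        if "Googlebot" not in line:
            continue
        match = date_re.search(line)
        if match:
            per_day[match.group(1)] += 1

for day in sorted(per_day, key=lambda d: datetime.strptime(d, "%d/%b/%Y")):
    print(day, per_day[day])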

Katherine Watier Ong

Google & Bing Tools  - 
 
I'm looking for an outside perspective. I have a client who has seen a drop in impressions and CTR for their branded keywords in Google Search Console but NOT in rankings, and a corresponding drop in homepage organic traffic (as that's the page that receives most of the branded queries).

I'm a bit stumped as to what might have gone on. From a technical perspective, there doesn't seem to have been any change; the homepage design has stayed the same, etc.

They have a few associated login-type pages that are now 404ing which probably ranked for branded terms, but there don't seem to be enough of those to warrant the drop in traffic, and I can't wrap my brain around how rankings would stay the same but impressions and CTR would drop.

Any other things that I could check?

12 comments
 
I'd check for seasonality as suggested, but I'd also chuck the brand into Google Trends to see if the company has become less popular recently.

I'd also look at the SERPs and see if one of the competitors has increased their AdWords spend and is bidding on your branded terms.

Hyderali Shaikh

General Tech Talk  - 
 
Is Google no longer accepting sitemaps in HTML format?

I also read the Google guidelines, and they don't mention HTML files.

For security reasons, we've blocked .xml & .txt files in robots.txt. So I made the sitemap file in .html, but it has been three days and the status in Search Console is still showing 'pending'.

Currently our robots.txt file is as below:

Sitemap: http://[URL].com/sitemap.xml

User-agent: Googlebot
Allow: /sitemap.xml

User-agent: *
Disallow: /*.txt$
Disallow: /*.xml$

I still don't know how many URLs have been indexed in Google & how many are still pending.

Any solution on the above query?

Thanks.


2 comments
 
As far as I know, you can't submit an HTML sitemap the same way you would an XML sitemap.

+1 for what Randy suggested above: Fetch and Submit the relevant pages that make up your HTML sitemap and see how that goes.

Depending on the size of your site and your internal linking structure, you could get away without using an XML sitemap.

Let us know how it goes :)

Nikhil Raj

Google & Bing Tools  - 
 
Q on Search Console: the Index Status number in Search Console hasn't changed for the last 4 weeks, but the sitemap index count is growing. Anyone seeing a similar pattern in Search Console?

Hyderali Shaikh

General Tech Talk  - 
 
Hi,

One of my clients offers content in more than one language, i.e. Hindi, Portuguese, Turkish, Gujarati, etc. The main site is the .com.

The sub-directories are as below:

http://www.xyz.com/hi (for Hindi)
http://www.xyz.com/pt (for Portuguese)
http://www.xyz.com/tr (for Turkish)

After checking the view source, I found the tags below under the <head>:

<a title="xyz" href="http://www.xyz.com/hi/" rel="alternate" hreflang="hi"></a>

<a title="xyz" href="http://www.xyz.com/pt/" rel="alternate" hreflang="pt"></a>

I'm not very familiar with multilingual SEO, so I wanted to know: has the above code been implemented correctly by the developer, or do I have to feed him something else?

Also, I read somewhere that you have to create a multilingual sitemap for each subdirectory & submit them to Google. Is this true?

P.S. In Search Console we've set India as the international targeting. Should I select 'Unlisted'?
5 comments
 
Hi +Hyderali Shaikh, setting geo-targeting to "Unlisted" in GSC would have a positive effect for non-Hindi queries on non-Indian SERPs.
Regarding the possible impact on today's positioning in Indian SERPs, I don't really know; I'd at least restrict geo-targeting to India for /hi/ so as not to alter the current setting for Hindi content.

About this community

Technical SEO is a community to discuss the technical issues around building sites aimed at performing really well - for Google and users. We look at new technology and methodologies, help each other solve technical problems, and create experiments. We moderate! No self-promotion, no link dropping. We do allow links if you're creating a relevant discussion & adding value. If you have questions, ask a moderator. If you have basic SEO or marketing questions, we recommend checking out Google's Webmaster Central or one of the many other G+ communities.

Mark Taylor

General Tech Talk  - 
 
How does everyone feel about the Penguin 4.0 announcement?
8 comments
 
We confirmed it by asking +Gary Illyes on Twitter (https://twitter.com/methode/status/780457659409412096) who replied:
"@BruceClayInc we haven't changed our recommendations for the disavow tool with this launch"

John Vantine

Site Speed & Performance  - 
 
Does anyone have a sense of how Google evaluates pagespeed for SPA (single page application) sites? A site I'm working on was recently relaunched in React+Redux with isomorphic rendering, with an emphasis on site speed.

After a lot of performance enhancements, the site feels very fast. However, the initial payload is inherently large (resulting in 5-6s load times), while subsequent requests are lightning fast - we're using HTML5 pushState to update the URL for each subsequent request, but the browser isn't fetching the entire page each time... those requests are handled by JS fetching data from the backend.

I have several page speed monitoring solutions in place (Monitis, GA, New Relic) but none of them give direct insight into how Googlebot might be evaluating performance with this type of setup.

Do they evaluate each page on a URL-by-URL basis, and look at the initial payload (and corresponding slow-ish load time) for each? Or is it possible that once they load the initial JS/HTML, they continue to crawl from there? Are they essentially "refreshing" for each URL, therefore associating each URL with a higher load time?
 
Your best bet would be to analyze the log files, filter them down to Googlebot fetches, and then see how much time it takes to download each page.
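
As a rough sketch of that approach (assuming each access-log line ends with a response-time field such as nginx's $request_time; the log path and field position are assumptions about the setup), this Python snippet averages Googlebot download time per URL:

# Rough sketch: average the time Googlebot spends downloading each URL,
# taken from access logs.
# Assumes each log line ends with a response-time field (e.g. nginx $request_time);
# the log path and field position are assumptions about the specific setup.
from collections import defaultdict

LOG_PATH = "/var/log/nginx/access.log"
times = defaultdict(list)

with open(LOG_PATH) as log:
    for line in log:
        if "Googlebot" not in line:
            continue
        try:
            url = line.split('"')[1].split()[1]   # request line: GET /path HTTP/1.1
            seconds = float(line.split()[-1])     # trailing response-time field
        except (IndexError, ValueError):
            continue
        times[url].append(seconds)

# Print the 20 slowest URLs by average download time.
slowest = sorted(times.items(), key=lambda kv: sum(kv[1]) / len(kv[1]), reverse=True)
for url, samples in slowest[:20]:
    print("%.3fs  %s" % (sum(samples) / len(samples), url))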

Mark Taylor

General Tech Talk  - 
 
Does anyone know how this site is achieving the date snippet in this SERP? It doesn't appear to be due to Schema markup or OG code, unless I've missed it.

The URL is ... [https://] bloodwise.org.uk/event-challenges/cycle-bikeathon/north/wirral-bikeathon
15 comments
 
Looks to me like it's figured it out from a few different code snippets, such as the H1 with the class event-name, and the date being marked up with:
property="dc:date" datatype="xsd:dateTime"

I've seen snippets generated this way before, but mainly showing paginated data, e.g.

Results 1 - 24 of 10710

And whilst that code is more obvious, I have seen breadcrumbs generated in the SERP when no markup is present.

Mark Marino

Mobile SEO  - 
 
What would the correct (alternate) markup be for a domain using both a separate m-dot site and a Spanish website? Below is what I think to be the proper syntax.

www.site.com
m.site.com
es.site.com
es.site.com/mobile/

URL http://www.site.com
(source code)
<link rel="canonical" href="http://www.site.com" >
<link rel="alternate" href="http://m.site.com" >
<link rel="alternate" hreflang="es" href="http://es.site.com" >

URL http://m.site.com
(source code)
<link rel="canonical" href="http://www.site.com" >
<link rel="alternate" hreflang="es" href="http://es.site.com/mobile/" >

URL http://es.site.com
(source code)
<link rel="canonical" href="http://es.site.com" >
<link rel="alternate" href="http://es.site.com/mobile/" >
<link rel="alternate" hreflang="en" href="http://www.site.com" >

URL http://es.site.com/mobile/
(source code)
<link rel="canonical" href="http://es.site.com" >
<link rel="alternate" hreflang="en" href="http://m.site.com" >
3 comments
 
Agree with Bob. And also with the hreflangs you need a self-referential tag as well. So on http://es.site.com you need a <link rel="alternate" hreflang="es" href="http://es.site.com" > as well as the pointer to the www site. 

Kathy Alice Brown

Technical Audits  - 
 
Hi, hoping to get some input from Google Analytics-savvy folks on the best way forward. I have a client that does not have a top-level canonical 301 redirect. The vast majority (about 40K) of the URLs have been indexed without the www, although the home page (which gets the bulk of the organic traffic - 46%) is indexed as www. They have set up their GA profile as www.

Normally in this case I would recommend setting up the 301 as www -> non-www (so Googlebot doesn't have to process a ton of 301s) and creating a new GA profile as non-www. But in this case there are three subdomains that should be considered separately, although the subdomains are tagged with the same GA tag as the www profile, which I believe is a mistake.

My question is: are there any contraindications to going with a non-www GA profile, or should we bite the bullet and set up the redirect as non-www -> www?
3 comments
 
:-) 

Katherine Watier Ong

SEO Tools & SAAS Providers  - 
 
Anyone using URL Profiler for a Panda audit on a large site?

I wanted to give it a whirl, but the site I'm looking at has 500K+ pages and I think URL profiler might run forever and I'll be spending a pretty penny on proxies.

Any suggestions?

I'm also using Deepcrawl...
2 comments
 
That is interesting. Personally, I have been using a tool I developed to tell me the level of low-quality pages that Google is not sending traffic to.

Since I have a user-generated content site, it has helped me automate marking irrelevant pages as NOINDEX and reduce the chances of the site being affected by Panda.

I have been considering providing this tool to other SEOs or website managers.

Is this something you would be interested in using?

Joanna Laznicka

General Tech Talk  - 
 
I am presuming I am doing something wrong and would love some guidance. I understand that Google can come up with its own meta title and meta description, but my gut says something else is happening and that it's something I am doing wrong.

From time to time I search all my sites in Google with site:MYSITE just to see what is happening, and on one of my sites almost every meta title has the site name at the end of it, even though I have not requested this in the back end. See the screenshot of my search results and you will see VC List at the end of each title. I really don't want the wording VC List at the end of the title for a few reasons: it throws off the keywords I am targeting, and not every page of my site/publication is about a list of VCs - many give advice on other topics such as crowdfunding, valuation, or the many other things startups deal with.

I can't imagine my meta titles are so bad in Google's eyes that they have to take my blog title and then throw VC List at the end of each one, so where am I going wrong? I would really appreciate your guidance before I start messing with markup and breaking more things. I also see that the meta titles in Bing are correct and don't have a forced VC List at the end of them - so why is this happening in Google?

13 comments
 
Hi Joanna, just to double-check: do you see this behavior for all your queries, not just site:? Google can change title tags based on the query, and I have seen the behavior you described for site: queries before.

Chris Koszo

Mobile SEO  - 
 
Hello all, anyone know of a WordPress plugin for AMP that already supports pages and not just posts? +Joost de Valk I'm sure you guys have something coming down the pipe :D
2 comments
 
Hi, our plugin supports all content types - pages, posts, custom posts, taxonomies - https://wordpress.org/plugins/wp-amp-ninja/ - Would love to get some feedback.

Dave Ashworth

General Tech Talk  - 
 
Not sure if this is a bug or not, but the Index Status within Google Search Console doesn't seem to be updating - I've had the same figures across every website for 3 weeks in a row now.

Anyone else seeing similar?
14 comments
 
yeah, should be catching up now.
