Stream

Martin Reed

General Tech Talk  - 
 
Your site is being crawled at a rate calculated as optimal by Google.

Hi guys, I’m working with a client whose pages crawled per day in Search Console has dropped to a tenth of its previous rate and flatlined for the past two months. Looking at the crawl rate in Site Settings, the usual option to select a crawl rate (or let Google decide) has been replaced with the message above.

We’ve improved server performance and capacity, removed unnecessary rate limiting, and have seen the number of 5xx server errors drop. We’re also working on improving page speed to reduce the time spent downloading a page. The crawl rate is slowly increasing - a few thousand more pages a day - but it very much feels like this would have happened anyway as Google works out the ‘optimal’ rate, and it’s still well short of its former glory.

Any suggestions on how we can get the crawl rate back up? It’s becoming problematic because we’re finding that new content isn’t getting indexed.

I did find a page that lets you report a problem with how Googlebot crawls your site (https://www.google.com/webmasters/tools/googlebot-report) but it seems to be intended for limiting crawls.

Many thanks,
Martin
8 comments
 
Hmm, there's no significant increase in the time to download a page. I'll check out getting TTFB added to the logs. That's a good one to know. Thanks Alistair

Hyderali Shaikh

General Tech Talk  - 
 
Is Google no longer accepting sitemaps in HTML format?

I read the Google guidelines as well, and they didn't mention HTML files.

For security reasons, we've blocked .xml & .txt in robots.txt. So I made the sitemap file in .html, but it has been three days and the status in Search Console is still showing 'pending'.

Currently our robots.txt file is as below:

Sitemap: http://[URL].com/sitemap.xml

User-agent: Googlebot
Allow: /sitemap.xml

User-agent: *
Disallow: /*.txt$
Disallow: /*.xml$
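For what it's worth, in Google's documented robots.txt handling a crawler obeys only the most specific matching user-agent group, and within a group the most specific (longest) matching rule wins, with Allow winning ties. So a sketch like the following (your existing rules, annotated - assuming the file itself is reachable) should still let Googlebot fetch the XML sitemap:

```text
# Googlebot obeys only this group (its most specific user-agent match),
# so the Disallow rules in the * group below do not apply to it at all.
User-agent: Googlebot
Allow: /sitemap.xml

# All other crawlers: .txt and .xml URLs are blocked.
User-agent: *
Disallow: /*.txt$
Disallow: /*.xml$

Sitemap: http://[URL].com/sitemap.xml
```

If that holds, the 'pending' status would be down to the HTML format rather than the robots.txt rules.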

I still don't know how many URLs have been indexed in Google & how many are still pending.

Any solution on the above query?

Thanks.


2 comments
 
As far as I know, you can't submit an HTML sitemap the same way you would an XML sitemap.

+1 for what Randy suggested above: Fetch and Submit the relevant pages that make up your HTML sitemap and see how that goes.

Depending on the size of your site and your internal linking structure, you could get away without using an XML sitemap.

Let us know how it goes :)

Dave Ashworth

General Tech Talk  - 
 
Not sure if this is a bug or not, but the Index Status within Google Search Console doesn't look to be updating - I've had the same figures across every website for 3 weeks in a row now.

Anyone else seeing similar?
14 comments
 
yeah, should be catching up now.


Mark Taylor

General Tech Talk  - 
 
A friend of mine runs a local football club website and wants to add a section to offer football-related products, which will be fed from affiliate programs.

They have a good site that is technically strong from an SEO perspective, and I've warned him of the impact of doing this, but I understand why they want to.

Whilst it's good practice to block this section in robots.txt, we all know this is a guide for the search engines rather than a command.

What steps should they take to prevent the whole site's SEO devaluing, losing natural search traffic or worse, being penalised?

Is there any reason to choose a sub-domain over a sub-directory?
8 comments
 
Another option would be to promote affiliate links in their social media (following the FTC disclosure guidelines, of course) - that way they are not messing with the SEO of their site. I find promoting targeted quality products via social media a much better return than adding them to my sites.


Chris Bolton

General Tech Talk  - 
 
So my client had their Google My Business page hijacked by an offshoot of their company. The offshoot changed the name and the address of the Google business page, but the knowledge graph info is still showing for my client's brand name.

I've created a new Google My Business page with the correct info, but it's not showing yet.

Is there anything you guys would recommend to get the new Google Business page to usurp the one with incorrect info?
6 comments
 
Wha?? Someone had success with their Twitter? I've uncovered hordes of issues and spam... We did research weeks (maybe months) ago on drug rehab center SPAM on Google Maps. I noticed some weird trends with our addiction treatment clients and so we started looking closer. Turns out when Google decided to remove "maps" from Google+ and created Google My Business (GMB), they reverted the old Google Maps listings - ones that had since been claimed, cleaned up and optimized - back to earlier versions, and in doing so removed all claims. Rehab marketing "spammers" went ahead and claimed these listings for themselves. There is so much OUTRIGHT spam happening, it's insane. I guess I have to write a blog post. Sigh

Mark Taylor

General Tech Talk  - 
 
Question - if a business has collected a reasonable number of customer reviews using a third-party partner (Feefo, TrustPilot etc.) but has not previously added them to their website, will Google see it as a problem if a product page goes from zero to 10+ reviews between one crawl of the page and the next?
8 comments
 
Trust Pilot is tricky because they let you use their API to populate your reviews on your site, but it only works for users, because they block the JS files to bots on their side. Makes sense for them, since they want to get that traffic and not have duplicate content issues.

Anyone have experience in this and how to get around it? For one of my sites I have hundreds of Trust Pilot reviews, so I'm just thinking of manually pulling in a sample and writing my own JS on the site to pull from Trust Pilot's API to keep my aggregate score honest and up to date.

Another follow up question though (sorry): Is it ok to mark up a page on a site with an aggregate review schema without having any of the actual reviews on the company/site itself that's being reviewed?

Dave Ashworth

General Tech Talk  - 
 
Geo-Targeting

Have a site that, whilst its traffic is primarily UK (50%), also gets traffic from the US (20%) and around Europe (1-4% per country), so it's ranking well enough across all locations. The language tags etc. are set to EN, whilst a US-targeted site is due to be launched.

For one reason or another, ahead of the US site launch, the geo targeting has recently been set to UK within Search Console.

In doing so, would you expect:

- an improvement in UK performance, and/or
- a drop in performance for France, Germany etc.
5 comments
 
The site has to be built for geo-targeting.

I would, like the others, strongly recommend hreflang. There are many ways to implement it, and not knowing your current setup I can't give you any more information than that.

Hreflang will help you target other countries much more easily than using Search Console's geo-targeting.


Like most have said, you must add hreflang: https://moz.com/learn/seo/hreflang-tag

Your site having a setup for EN English does not make that big of a difference.

Remember, if you're going to target the United States, the small differences in the way words are spelled or used are extremely important to differentiate yourself from the United Kingdom.

Also remember the correct hreflang tag for the United Kingdom is en-GB (as uk is Ukraine).


Mark Marino

General Tech Talk  - 
 
Not necessarily. The red lock icon happens on 2 of our 3 websites, and only in Chrome for some (not all) visitors. All 3 websites have valid SSL certs through Network Solutions and CSRs through Rackspace, and get an A- rating on Qualys SSL Labs. This only happens in Chrome, so I think Chrome needs to fix their versions; I'm handcuffed and out of solutions for this Chrome issue.
4 comments
 
I use Firefox now; there everything works perfectly. Bye bye Chrome, it was a good browser!

karl Jenkins

General Tech Talk  - 
 
I have a question about an e-commerce platform I'm currently working on.

I'm struggling with large numbers of product pages with very little on-page content other than the product descriptions themselves. My issue is that where the product is the same with only variations such as pack size, flavours etc., all the descriptions are essentially the same.

My question - how will Google view this? Is this duplicate content? If so, do I need to make each description as unique as possible? I don't want to canonicalise anything, as I want each page to rank as highly as I can.
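For reference, if consolidation ever becomes preferable to per-variant uniqueness, the usual pattern is a canonical tag pointing from each variant page to the main product page. A hypothetical sketch (URLs are placeholders, not your site):

```html
<!-- In the <head> of /product-500g and /product-1kg (hypothetical variant URLs),
     telling search engines the main product page is the preferred version: -->
<link rel="canonical" href="http://www.example.com/product" />
```

The trade-off is exactly the one you describe: the variants then consolidate signals into one URL rather than ranking individually.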
7 comments
 
+karl Jenkins if you're still getting a duplication problem, it's better to convert all your duplicate content into images. That way you don't need to use a canonical tag or rewrite the content for all pages, but while doing that, do proper on-page optimisation on each and every page of your website.

Nikhil Raj

General Tech Talk  - 
 
Hi guys, please could you check my post regarding hacked content on the Webmaster Central forum: https://productforums.google.com/forum/#!topic/webmasters/AL-w4_vv-Oc Not sure what I should do! The website is not hacked! +John Mueller +Gary Illyes
14 comments
 
Search Console now says there is no hacked content on WWW. I hope within a few days these wrong notices will be removed from the DE domain also. I have filed a review again for DE. +John Mueller

Joanna Laznicka

General Tech Talk  - 
 
I am presuming I am doing something wrong and would love some guidance. I understand that Google can come up with its own meta title and meta description, but my gut feeling is that something else is happening, and it's something I am doing wrong.

From time to time I search all my sites in Google using site:MYSITE just to see what is happening, and on one of my sites almost every meta title in search has the site name at the end of it, even though I have not requested this in the back end. See the screenshot of my search results and you will see VC List at the end of each title.

I really don't want the verbiage VC List at the end of the title for a few different reasons: it throws off the keywords I am targeting, and not every page of my site/publication is about a list of VCs - many offer advice on other topics such as crowdfunding or valuation or the many other topics startups deal with.

I can't imagine my meta titles are so bad in Google's eyes that it has to take my blog title and throw VC List at the end of each one, so where am I going wrong? I would really appreciate your guidance before I start messing with markup and breaking more things. I also see that in Bing the meta titles are correct and don't have a forced VC List at the end of them - so why is this happening in Google?

12 comments
 
G'day +Joanna Laznicka, I don't think this is a problem. It doesn't affect your keywords at all. My remedy would be to capitalize on the brand angle by appending something like this to all your pages, telling Google to use your version: - VC LIST®

Seems official-looking to me and could increase your CTR. Just make sure to actually trademark it for $300.


Hyderali Shaikh

General Tech Talk  - 
 
Hi,

One of my clients offers content in more than one language, i.e. Hindi, Portuguese, Turkish, Gujarati etc. The main site is on .com.

The sub-directories are as below:

http://www.xyz.com/hi (for Hindi)
http://www.xyz.com/pt (for Portuguese)
http://www.xyz.com/tr (for Turkish)

After checking the view source, I found the tags below near </head>:

<a title="xyz" href="http://www.xyz.com/hi/" rel="alternate" hreflang="hi"></a>

<a title="xyz" href="http://www.xyz.com/pt/" rel="alternate" hreflang="pt"></a>

I'm not very familiar with multilingual SEO, so I wanted to know: has the above code been implemented correctly by the developer, or do I have to feed him something else?

Also, I read somewhere that you have to create a multilingual sitemap for each subdirectory & submit those in Google. Is this true?

P.S. In Search Console we've set India for international targeting. Should I select unlisted instead?
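One thing worth checking with the developer: Google's documented mechanism is <link> elements in the <head>, not <a> anchor tags, and the annotations must be reciprocal and self-referencing - every version lists itself plus all alternates, usually with an x-default fallback. A sketch of what each version's <head> might carry (example.com stands in for your domain, and the exact set is an assumption based on the three subdirectories you list):

```html
<link rel="alternate" hreflang="hi" href="http://www.example.com/hi/" />
<link rel="alternate" hreflang="pt" href="http://www.example.com/pt/" />
<link rel="alternate" hreflang="tr" href="http://www.example.com/tr/" />
<link rel="alternate" hreflang="x-default" href="http://www.example.com/" />
```

The same information can alternatively be supplied via xhtml:link entries in the XML sitemap instead of on-page tags; you only need one of the two methods, not both.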
5 comments
 
Hi +Hyderali Shaikh, setting geo-targeting to "Unlisted" in GSC would have a positive effect for non-Hindi queries on non-Indian SERPs.
Regarding the possible impact on today's positioning in Indian SERPs, I don't really know; I'd at least restrict geo-targeting to India for /hi/ in order not to alter the current setting for Hindi content.

Chris Koszo

General Tech Talk  - 
 
Hello guys and girls, I can't figure this out. I'm looking at a website for a company that ranks well (hundreds of #1-2 rankings, has knowledge graph, featured snippets etc.), but for its actual product (it only sells one product) the rankings are nonexistent (200+ position in Google). This has been the case for them for 5+ years (since before Penguin). Since they're the largest company in their industry, I think searchers expect to see them on page 1, but they don't. What gives?
 
To me it seems like a simple case of Penguin where they did something to overoptimize for their main keyword and got hit, however, they've had this ranking issue years before Penguin even came out, and BEFORE they even hired any SEOs. Never in the company’s history did they rank for their main keyword in Google. In Bing their historic position varies from #1-15. They’re in a competitive niche but I think at least they’d deserve to rank on page 3 or something in Google? They have 1000 employees and links from Wikipedia and hundreds of other publications due to PR efforts.
 
 One theory going around is that since their domain is a prime 3 letter .com domain, and the previous company they purchased it from was so strong, that Google is still associating it with that/doesn’t trust them enough to rank the new site for their money keywords? It doesn’t make sense though because they’re in the top 3 for hundreds of mid to long tail queries. Their content is excellent and satisfies searcher intent. Also they don't and never had a manual penalty etc. either. One of their recent SEO companies acquired questionable links in 2015, but those were disavowed already by them.
 
Should they just sit out and wait for Penguin to actually run and see what that brings? I really can’t think of anything else. Feel free to PM me I will share the company name.
 
+John Mueller, this is the same question I mentioned to you on Wednesday Hangouts, but I think this provides more detail. Let’s see if +Gary Illyes  is on this board too :D  Thank you!!
13 comments
 
FYI, here's rankings for a variation of that keyword. So weird! https://i.imgur.com/aMmR5afh.jpg +Terry Van Horne 

Hyderali Shaikh

General Tech Talk  - 
 
Hi,

Today I got a message in our Search Console account saying there has been an increase in authorization permission errors, i.e. 403 errors. But when I checked the list, most of the URLs are login & logout pages.

Like this -> [URL]?data%5Bregflag%5D=0&data%5Bmd5res%5D=TestName&login_uid=TestName&login_password=Password&submit=Submit

We haven't added /login to robots.txt. Should I do that?

How do I fix this issue?

Thanks in advance.
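If you do decide to keep crawlers away from the login/logout URLs anyway, a minimal robots.txt sketch would be something like the following - the paths are assumptions, so match them to your actual login and logout URLs:

```text
# Block crawling of the login and logout endpoints (paths are hypothetical):
User-agent: *
Disallow: /login
Disallow: /logout
```

Note this only stops the URLs being crawled; it doesn't remove already-reported 403s from Search Console, and 403 on a login page is arguably correct behaviour anyway.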
9 comments
 
Unless it is a clickable link, Google crawlers wouldn't be able to execute it. You need not add anything in that case.

Dave Ashworth

General Tech Talk  - 
 
rel prev/next and canonical tag combination

When you have a series of paginated pages and they are being reported as duplicate content, I've always gone with rel prev/next to consolidate them.

As I understand it, canonical tags are used in a different way: to inform search engines which of many similar pages is the primary page.

So why is mixing rel prev/next and canonical tags common practice? More to the point, why does Google say it's OK to mix the two?

https://webmasters.googleblog.com/2011/09/pagination-with-relnext-and-relprev.html

The reason being: you have pages 1, 2, 3 & 4 - I would use rel prev/next to say these are a sequence and consolidate them.

But if each page then has a canonical tag specifying itself, is that not saying to treat each page independently of the others in the sequence? i.e. a conflicting signal?
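My reading of the blog post linked above is that the two signals answer different questions, so they can coexist: rel prev/next declares the sequence, while a self-referencing canonical only says "this URL (rather than, say, a parameter-laden duplicate of it) is the preferred version of this page", not "treat me independently". A sketch for page 2 of a hypothetical /shirts series:

```html
<!-- In the <head> of http://www.example.com/shirts?page=2 -->
<!-- Canonical: this exact URL is the clean version of page 2 itself. -->
<link rel="canonical" href="http://www.example.com/shirts?page=2" />
<!-- prev/next: page 2 sits between pages 1 and 3 in one sequence. -->
<link rel="prev" href="http://www.example.com/shirts?page=1" />
<link rel="next" href="http://www.example.com/shirts?page=3" />
```

The conflicting signal would be canonicalising every page in the series to page 1 while also using prev/next - that's the combination to avoid.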
5 comments
 
With sites that have a lot of filters there was a reason to have both. With the filters that we deemed as not significant we would canonicalize the URL to a more authoritative URL and then use rel prev/next to aggregate the sequence to the first page. We were not using a View All page.

Katherine Watier Ong

General Tech Talk  - 
 
Headless browsing?
I'm attempting to fix a site in Angular JS 1, and while I feel as though I'm relatively technical, I don't have a developer background.

Does anyone know of any resources to get me up to speed with using a headless browser to check the site? I've checked out the PhantomJS site, but the next steps on that site are not very clear.

I've read all of the posts on the Builtvisible site, and some of the articles by Mike King, but I still feel a bit lost about how to really crawl this thing and capture all of the issues.

Is setting up prerender.io really the best first step and then crawl after that's set up? The site has clean URLs, but no <meta name="fragment" content="!"> in the head....

Nikhil Raj

General Tech Talk  - 
 
For the usage rights filter in Google Images, to enable it for my images, should I add the license property to the ImageObject schema? Does it work? Or is it based on Google's classification of websites? Anyone know?
 
Found 3 ways: 1) rel=copyright in the header, 2) the rel-license microformat, 3) the license property on the ImageObject schema. Hope this works!
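For the third option, the ImageObject route could look like this as JSON-LD embedded in the page - a sketch only, with placeholder URLs (schema.org's ImageObject does accept a license property, inherited from CreativeWork):

```html
<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "ImageObject",
  "contentUrl": "http://www.example.com/images/photo.jpg",
  "license": "http://creativecommons.org/licenses/by/4.0/"
}
</script>
```

Whether Google's usage-rights filter actually consumes this markup is a separate question from whether the markup is valid.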

Dave Ashworth

General Tech Talk  - 
 
Correct Alternate Language Implementation

I have a site that currently runs on .com and targets everywhere - it gets traffic primarily from the UK, then the US, and plenty from France, Germany, Australia etc.

We're about to launch a new US site in a subfolder, so we'd have

www.domain.com/us/ for the US
www.domain.com for everywhere else

So I believe the alternate language tags would be as follows:

<link rel="alternate" hreflang="x-default" href="http://www.domain.com" />
<link rel="alternate" hreflang="en-us" href="http://www.domain.com/us" />

I am thinking x-default because, although the main site is in English, the home page doesn't have language selectors for other languages such as French / German. I don't want to specify it as:

<link rel="alternate" hreflang="en" href="http://www.domain.com" />

As I don't want to run the risk of losing non-English traffic, as they get it and it converts.

Is this the right way to do things?
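One detail worth adding to the plan: Google expects hreflang annotations to be reciprocal, so the same full set of tags would appear on both versions. A sketch of the two-URL setup as described (assuming the same tags in the <head> of both pages):

```html
<!-- In the <head> of both http://www.domain.com/ and http://www.domain.com/us/ -->
<!-- x-default: the fallback for all users not matched by a more specific tag. -->
<link rel="alternate" hreflang="x-default" href="http://www.domain.com/" />
<!-- en-us: English-language users in the United States get the /us/ version. -->
<link rel="alternate" hreflang="en-us" href="http://www.domain.com/us/" />
```

With this setup, US visitors are steered to /us/ while everyone else keeps getting the main site.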
 
I would use the hreflang="en" link, since the content is in English.
Let's not delude ourselves that Google cannot recognize English content: hreflang's purpose is to pick the best content for a user's localization, not so much for ranking. You can leave the "x-default" one - it doesn't hurt; in my opinion it doesn't help either in this case.

Mark Marino

General Tech Talk  - 
 
HTTPS red locks in Chrome... this is (and has been) happening on 2 of our 3 eCommerce sites. The invalid SSL warnings are intermittent and only affect some Chrome users. I cannot replicate it, but others in my office (shared IP) can.

All 3 of the sites' SSLs score an "A-" rating and it's only happening with comlax.com and puregoalie.com -- not purehockey.com

https://www.ssllabs.com/ssltest/analyze.html?d=www.puregoalie.com

We reissued a CSR and SSL cert for Pure Goalie two weeks ago, but I don't think that helped anything.


Andrea Moro

General Tech Talk  - 
 
Without searching the internet, what in your opinion is the "right" definition of "site architecture"?

7 comments
 
Pages and content categorised correctly in a hierarchical fashion, with navigation to them "easy" for both users and bots.