Calling experts: what are your views on the allocation of Googlebot crawl budget when the same page on a site is accessible via http://_, https://_, http://www._ and https://www._, and is not marked up with rel="canonical"? Is this in effect wasting crawl budget? In terms of indexing, Google's decision about which version to show in search is proving to be variable - it's a mish-mash. I do not have GSC access and cannot claim the domain; I can only make recommendations for this large national entity.
Opinions / ideas? Thanks in advance.
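For anyone who wants to see it from the outside, this is roughly the check I've been running against the four variants (a minimal sketch - example.com stands in for the real domain):

```python
# Minimal sketch: see how each protocol/host variant of the same page responds.
# "example.com" is a placeholder for the real domain.
import requests

VARIANTS = [
    "http://example.com/",
    "https://example.com/",
    "http://www.example.com/",
    "https://www.example.com/",
]

for url in VARIANTS:
    resp = requests.get(url, allow_redirects=True, timeout=10)
    hops = [r.status_code for r in resp.history]
    has_canonical = 'rel="canonical"' in resp.text.lower()
    print(f"{url} -> {resp.url} (redirect hops: {hops or 'none'}, "
          f"final status: {resp.status_code}, canonical tag: {has_canonical})")
```

In this case all four variants resolve with a 200 and no canonical, which is what makes me think Googlebot is effectively crawling four copies of every page.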

What is the best approach when using content from other websites (for legitimate purposes)?

A canonical tag pointing to the source?
Mention the source in the article (as it's done on Medium, with a link to the source)?
Or simply add a text-only reference, e.g. Source: Domain.com



Is there a benefit to making the last level of a breadcrumb a link to itself?


Huge number of backlinks detected - what to do?

Websites that use the Yotpo review solution can display product galleries like this: //imgur.com/4dHUh7O - original source page: http://skibox.fr/fr/veste-de-pluie-dynastar-long-shell.html

Every product in the gallery generates a link to https://yotpo.com such as https://yotpo.com/go/eAaQNjJh

This generates a huge number of links detected in yotpo.com's Google Search Console (GWMT).

And each of those links 301-redirects to a page on the website using the Yotpo review solution. Example: https://yotpo.com/go/eAaQNjJh redirects to http://skibox.fr/fr/batons-de-ski-leki-worldcup-lite-slalom-4683.html?#.VymNdr5_TwY

It seems similar to URL-shortening sites such as bit.ly (which are doing fine), but I would be glad to get your feedback.

Is this negatively influencing the (potential) rankings of pages on the https://www.yotpo.com subdomain?
What would you recommend doing?

Technical question: a client just sent this to me:

"We just found that the <lastmod> date in our sitemap is hardcoded for 2008-03-18 - I assume that’s a bad thing. How should this be handled? Hard coded for a new date and updated manually? Or updated automatically using some tool?

Also, what should the priority and change frequency be for a blog?"

This client has a homegrown CMS. Can anyone chime in with their recommendations?
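For context on the "updated automatically" option: if the homegrown CMS stores a modified timestamp for each page, the sitemap can pull <lastmod> from that rather than from a hardcoded date. A minimal sketch of what I mean - get_pages() and its fields are hypothetical stand-ins for whatever the CMS actually exposes:

```python
# Minimal sketch: build a sitemap whose <lastmod> comes from the CMS's own
# modified timestamps rather than a hardcoded date. get_pages() is hypothetical.
import xml.etree.ElementTree as ET

def build_sitemap(pages):
    urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for page in pages:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = page["loc"]
        # W3C date format (YYYY-MM-DD), taken straight from the CMS record
        ET.SubElement(url, "lastmod").text = page["modified"].strftime("%Y-%m-%d")
    return ET.tostring(urlset, encoding="unicode")

# pages = get_pages()  # hypothetical CMS call returning loc + modified datetime
# print(build_sitemap(pages))
```

That way lastmod stays accurate without anyone having to touch it by hand.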

Thanks in advance! 

I've had my fair share of clients who pay for my advice but never take it. We all know these clients - the ones that want the help but never make an effort to action anything. They coast along with average rankings, traffic and sales. And they don't seem to care that they've paid good money for advice that would improve their business. It can be quite disheartening. It's not that you want them to do well so you'll get another win for your portfolio - you want them to do well because you genuinely care.

How do you handle things when they don't bother? What do you do to ensure clients action your recommendations? Do you have arrangements in your contracts that include implementation of your recommendations?

How do you stay positive and move onto the next project?

#latenightthoughts #pensiveseo  

Has anyone noticed any ranking drops (or even gains) around 2-4 December?

Algoroo reports some action on 30 November and Mozcast reports similar movement around 28-29 November, which is close enough to the dates where I'm noticing some movement (could perhaps be a delay due to the region?).

I thought it might have been a bug with AWR Cloud, but it shows some sites dropping dramatically whilst others have improved (or been unaffected), hence the question.

I'm gaining many clients who have come from poor SEO experiences, and on reviewing their backlinks I continually see a lot of spammy link-building practices: social bookmarking sites, countless business directories and article submission sites. For one client, the majority of links - 99% of them - are nofollow. Is it worth going through and disavowing the majority of these?
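If disavowing does turn out to be worthwhile, this is roughly how I was planning to turn the link export into a disavow file while skipping the nofollow ones (a minimal sketch - the CSV columns and keyword filter are just assumptions about the export format):

```python
# Minimal sketch: build a disavow.txt from a backlink export, skipping nofollow
# links. Column names and the keyword filter are assumptions; adjust as needed.
import csv
from urllib.parse import urlparse

SPAM_HINTS = ("bookmark", "directory", "article")  # crude filter for the patterns above

domains = set()
with open("backlinks_export.csv", newline="") as f:
    for row in csv.DictReader(f):
        if row.get("nofollow", "").lower() == "true":
            continue  # nofollow links are rarely worth disavowing
        domain = urlparse(row["source_url"]).netloc
        if any(hint in domain for hint in SPAM_HINTS):
            domains.add(domain)

with open("disavow.txt", "w") as out:
    for domain in sorted(domains):
        out.write(f"domain:{domain}\n")
```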

Hi All, I'm about to help a client with a transfer over to HTTPS, and I've made the following checklist to hopefully minimise ranking loss. Some of it is from my own head and some is research, so I'd really appreciate it if you could let me know whether I've missed anything, or whether there's something on there that's not essential.

I would also appreciate discussion on whether this is a good idea or not, although from my end that's a little out of my hands - the client has made up their mind and is switching over! FYI, the site is a publisher and has over 700k indexed results.

cheers :)


* Update the links in templates (menu structure, footer, etc.) with the new HTTPS URL versions
* Redirect all HTTP requests to the equivalent HTTPS resource with a 301. Generally this can be done with a single rewrite rule
* Current redirects in .htaccess will need to point to their new HTTPS equivalents to avoid 'redirect chains' (see the sketch after this list)
* Do you serve any content from CDNs or from external ad platforms that don't use HTTPS? See if they have HTTPS versions and use those
* Are there any tools which use JS tracking code on your website that send requests that aren't HTTPS? They should have secure versions
* Are there canonical tags on your website? If so, they will need to be updated with the new HTTPS URLs
* Images can often bring up HTTPS (mixed content) errors; try to use relative paths
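To sanity-check the redirect and mixed-content points after the switch, I'm planning something along these lines (a minimal sketch - the URLs are placeholders for real pages on the site):

```python
# Minimal sketch: confirm each HTTP URL 301s to HTTPS in a single hop, and flag
# obvious mixed-content references on the final page. URLs are placeholders.
import requests

URLS_TO_CHECK = [
    "http://example-publisher.com/",
    "http://example-publisher.com/some-article/",
]

for url in URLS_TO_CHECK:
    resp = requests.get(url, allow_redirects=True, timeout=10)
    hops = [(r.status_code, r.url) for r in resp.history]
    single_301 = len(hops) == 1 and hops[0][0] == 301
    lands_on_https = resp.url.startswith("https://")
    mixed_content = 'src="http://' in resp.text or "src='http://" in resp.text
    print(f"{url}: single 301 {single_301}, lands on HTTPS {lands_on_https}, "
          f"possible mixed content {mixed_content}")
```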

Hi all, I'm working on a client's rather large new website launch - same domain but different URL structure. So far I've done:

- 301s (based on backlinks, core site structure of the old site, top 200 landing pages and more)
- GWMT fetching
- sitemap submission
- backups of the old site's HTML
- 404 monitoring

I'm not worried about a huge loss in rankings, but it would be nice to know if you guys do anything else for a new site launch with a structure change - i.e. have I missed anything really obvious off the list? I can't seem to find many solid articles on this.
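For what it's worth, the 301s are being spot-checked with something like this (a minimal sketch - redirect_map.csv with old_url/new_url columns is just how I happen to store the mapping):

```python
# Minimal sketch: verify each old URL 301s straight to its mapped new URL.
# redirect_map.csv (old_url,new_url columns) is an assumed format.
import csv
import requests

with open("redirect_map.csv", newline="") as f:
    for row in csv.DictReader(f):
        resp = requests.get(row["old_url"], allow_redirects=False, timeout=10)
        location = resp.headers.get("Location")
        ok = resp.status_code == 301 and location == row["new_url"]
        print(f"{row['old_url']} -> {resp.status_code} {location} {'OK' if ok else 'CHECK'}")
```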

cheers!