
Jake Mabey

Questions  - 
 
We had a site get hacked a while back. We got the hack cleaned up, but we've run into a weird issue. Our site interprets any URL with a ? as you would expect: as an internal query. So any page like domain.com/?=random-string simply shows the home page.

However, when we had hacked pages on our site, they were all of the form /?=hacked-url. So while we've done a disavow, all of these previously existing hacked pages are effectively being redirected to the home page.

I've been trying to do a 410 from the htaccess file, but with no luck at all. I've tried two different forms:
Redirect 410 /?hv=100mg+zoloft+generic
and
RewriteCond %{QUERY_STRING}  ^$
RewriteRule ^\?hv=100mg\+zoloft\+generic$ /? [R=410,NE,NC,L]

Rewrite engine is on, so that's not the problem. Does anyone see anything glaring in the htaccess code? The browser simply shows the home page and still has the questionable path appended.
4 comments
 
This should work for your example:

RewriteCond %{QUERY_STRING}  ^hv=100mg\+zoloft\+generic$ [NC]
RewriteRule ^.*$ - [R=410,NE,NC,L]

It will work with that parameter on any URL. Make sure you place it before any CMS rewrite rules.

Spaces (aka +) can be a pain. First test it with one without spaces. I personally use this pattern to capture a space in place of +:

(?:[\ +]|%20)

You can also use [G] in the RewriteRule. So an alternate form would be:

RewriteCond %{QUERY_STRING}  ^hv=100mg(?:[\ +]|%20)zoloft(?:[\ +]|%20)generic$ [NC]
RewriteRule ^.*$ - [G]

Or if you know all hv parameters are dodgy:

RewriteCond %{QUERY_STRING}  ^hv=.*$ [NC]
RewriteRule ^.*$ - [G]
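Before deploying, the query-string patterns above can be sanity-checked outside Apache. A minimal Python sketch (the regex is copied from the RewriteCond above; note that mod_rewrite does not URL-decode %{QUERY_STRING}, so the literal + and the encoded %20 both need to be handled):

```python
import re

# Pattern from the RewriteCond above: the [NC] flag maps to re.IGNORECASE,
# and (?:[ +]|%20) matches a space, a literal +, or an encoded space.
pattern = re.compile(r"^hv=100mg(?:[ +]|%20)zoloft(?:[ +]|%20)generic$",
                     re.IGNORECASE)

# Query strings exactly as Apache sees them (undecoded)
assert pattern.match("hv=100mg+zoloft+generic")
assert pattern.match("hv=100MG%20zoloft%20generic")
assert not pattern.match("hv=something-else")
print("pattern behaves as expected")
```

The same check explains why an unescaped + fails: in a regex, `g+` means "one or more g", so `100mg+zoloft` never matches the literal query string.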

Byron Trzeciak

Questions  - 
 
Does anyone have a good solution for blocking the spam referral traffic that we're all receiving in Google Analytics? I've tried a few solutions which have helped, but in some cases I still receive the traffic even though I've attempted to block it in the .htaccess file. At the moment I'm trialling the solution below. Can anyone provide their thoughts, or an example of an .htaccess that is working for you?

## STOP REFERRER SPAM ##
RewriteCond %{HTTP_REFERER} ^([^.]+.)*?4webmasters\.org [NC,OR]
RewriteCond %{HTTP_REFERER} ^([^.]+.)*?anticrawler\.org [NC,OR]
RewriteCond %{HTTP_REFERER} ^([^.]+.)*?addons\.mozilla\.org [NC,OR]
RewriteCond %{HTTP_REFERER} ^([^.]+.)*?bestwebsitesawards\.com [NC,OR]
RewriteCond %{HTTP_REFERER} ^([^.]+.)*?best-seo-solution\.com [NC,OR]
RewriteCond %{HTTP_REFERER} ^([^.]+.)*?best-seo-offer\.com [NC,OR]
RewriteCond %{HTTP_REFERER} ^([^.]+.)*?blackhatworth\.com [NC,OR]
RewriteCond %{HTTP_REFERER} ^([^.]+.)*?buttons-for-website\.com [NC,OR]
RewriteCond %{HTTP_REFERER} ^([^.]+.)*?buttons-for-your-website\.com [NC,OR]
RewriteCond %{HTTP_REFERER} ^([^.]+.)*?buy-cheap-online\.info [NC,OR]
RewriteCond %{HTTP_REFERER} ^([^.]+.)*?econom\.co [NC,OR]
RewriteCond %{HTTP_REFERER} ^([^.]+.)*?darodar\.com [NC,OR]
RewriteCond %{HTTP_REFERER} ^([^.]+.)*?hulfingtonpost\.com [NC,OR]
RewriteCond %{HTTP_REFERER} ^([^.]+.)*?Get-Free-Traffic-Now\.com [NC,OR]
RewriteCond %{HTTP_REFERER} ^([^.]+.)*?googlsucks\.com [NC,OR]
RewriteCond %{HTTP_REFERER} ^([^.]+.)*?free-share-buttons\.com [NC,OR]
RewriteCond %{HTTP_REFERER} ^([^.]+.)*?humanorightswatch\.org [NC,OR]
RewriteCond %{HTTP_REFERER} ^([^.]+.)*?ilovevitaly\.com [NC,OR]
RewriteCond %{HTTP_REFERER} ^([^.]+.)*?7makemoneyonline\.com [NC,OR]
RewriteCond %{HTTP_REFERER} ^([^.]+.)*?o-o-6-o-o\.com [NC,OR]
RewriteCond %{HTTP_REFERER} ^([^.]+.)*?priceg\.com [NC,OR]
RewriteCond %{HTTP_REFERER} ^([^.]+.)*?social-buttons\.com [NC,OR]
RewriteCond %{HTTP_REFERER} ^([^.]+.)*?semalt\.com [NC,OR]
RewriteCond %{HTTP_REFERER} ^([^.]+.)*?smailik\.org [NC,OR]
RewriteCond %{HTTP_REFERER} ^([^.]+.)*?theguardlan\.com [NC,OR]
RewriteCond %{HTTP_REFERER} ^([^.]+.)*?simple-share-buttons\.com [NC]
RewriteCond %{HTTP_REFERER} ^([^.]+.)*?free-social-buttons\.com [NC]
RewriteCond %{HTTP_REFERER} ^([^.]+.)*?event-tracking\.com [NC]
RewriteRule .* - [F]
## STOP REFERRER SPAM ##
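For what it's worth, a long chain like this can usually be collapsed into a single condition with alternation, which is easier to maintain and avoids a missed [OR] flag breaking the chain. A sketch covering a few of the domains above (note the dot inside the leading group is escaped here, which the original patterns omit):

```apacheconf
## STOP REFERRER SPAM (consolidated sketch) ##
RewriteCond %{HTTP_REFERER} ^([^.]+\.)*?(4webmasters\.org|semalt\.com|darodar\.com|buttons-for-website\.com|ilovevitaly\.com) [NC]
RewriteRule .* - [F]
```

Extending it is then just a matter of adding domains to the alternation group.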
9 comments
 
A hostname include filter seems to be more effective nowadays. Spammers have also started showing up under direct visits and search when you use exclude-referral filters!

Hardial Singh

Questions  - 
 
We are facing some hacking issues with the website nycollege(dot)edu. In Google's search results the homepage is flagged with 'This site may be hacked'. We cleaned the website once before and the message was removed, but now the same message is appearing in the search results again.

We have also found some unknown links generated under the homepage, which we later redirected to the homepage. When we check the cached version in Google, it shows some other site. Is there any tool to check what exactly the issue is, as Google Webmaster Tools is not providing any details of the hack?
4 comments
 
Never, ever, ever 301 any page that is the result of a hack... 410 them instead, mainly because all the links pointing at them are also likely from hacked sites! I dare say there are more pages still; you need to spider your site with Screaming Frog. The "hacked" notice in the SERPs is likely the result of you not finding all the pages... generally they create thousands.
 
I have a kind of stupid question to ask.
I've never come across this kind of problem before.
When I check http://dsignfurniture.com in GWT, 71 URLs are indexed.
When I search Google for "site:dsignfurniture.com", no results come up.
When I use SEO by Squirrly, a "Not Indexed" error shows up for every single URL.
I checked the robots.txt file: OK. I checked in Screaming Frog: OK. GWT: OK.
Can someone tell me why nothing comes up when I search for "site:dsignfurniture.com"?
Has any of you had a similar problem? And how did you fix it?
10 comments
 
No problem, it only takes a few minutes.

John Romaine

Questions  - 
 
Hey guys,
How can I go about taking a page on a site and finding out which pages link to it - internally, from the same domain? Is there a way to do this? Tool, software etc? Either I haven't had enough sleep or I've missed something....
6 comments
 
Screaming Frog is a better tool to use for this task than OSE - because OSE updates their index only ~once a month or so, and you could get an out of date picture. Screaming Frog will give you accurate data up to the time of crawl. If you don't have Screaming Frog you can use Beam Us Up for free. 
 
I'm afraid the answer to my question is a simple "No." But may as well take a shot in the dark. 

Is there a way to bulk remove pages from Google's index? The method I typically use is to go into Google Webmaster Tools, Google Index > Remove URLs - from there you can create a removal request, but only for one page at a time. 

Here's the problem. I just cleaned up the vulnerabilities of a hacked site, and the next day Google is throwing 272 errors... because there are that many now-404 pages after cleaning up the 272 pages of spam the hack created. 

It's easy to get a list of all 272 pages; it's incredibly tedious and time consuming to manually remove each one from the index. Yes, they'll probably fall out of the index over time, but some of them are showing up on the first few pages for brand searches. 

What I've done so far is just a site:domain.com search and manually removed every spam page in the top 20 pages of results - which caught about ~50 of the spam pages. Hopefully the other ~200 some won't show up in other searches until they fall out of the index?
16 comments
 
There's usually a daily limit per IP address.
Try again the next day, or 24 hours later.
This applies to manual removal as well.
I don't know why Google does this, but they do sometimes.

Rick Eliason

Questions  - 
 
Hey all,

A question if you don't mind;

One of my clients serves a local audience here within the UK - literally several counties within the country only.

However, Analytics reports that 11% of traffic comes from outside the UK. In what ways can I make the site not rank overseas, so that my overall data isn't skewed by poor-quality international traffic?

(Side question, is there any way I can delve deeper into location traffic - i.e. Google Analytics splits visitors down to England, Wales, Scotland et.al. - is there a way to tell WHERE in England this traffic comes from, either using GA or another tool?)

Thanks in advance all!
Rick  
12 comments
 
+John Romaine or if you're the Prince's only surviving relative and are inheriting his estate.

John Romaine

Questions  - 
 
Quick question... does anyone know of a tool (online or otherwise) that allows you to enter a URL and have it check all of your internal linking to ensure there are no broken links or links pointing to non-existent pages? I'm just looking at Screaming Frog and this doesn't appear to do what I want. Essentially I'd like some kind of report that says "broken link - points to non-existent page" or something along those lines. Something that provides me with insights into any given website's internal linking structure. Cheers.
20 comments
 
SEM Rush has a site audit feature which does this

Jaaved Khatree

Questions  - 
 
XML sitemap question: what would be the impact of hosting XML sitemaps on a subdomain instead of on the main domain? Could this still work in terms of letting Google know about content on the site (talking about 50k URLs per sitemap and having many of these)? TIA.
6 comments
 
I'd personally list my pages on sitemaps that are placed on the same domain or subdomain. It would avoid any issues if some search engines require that.
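For reference, the sitemaps.org cross-submission rules do allow a sitemap hosted on a different host to list URLs for the main domain, provided ownership is proven; the simplest proof is a Sitemap directive in the robots.txt of the domain whose URLs are listed. A sketch with hypothetical hostnames:

```
# robots.txt served from https://www.example.com/robots.txt
# (example.com and sitemaps.example.com are hypothetical)
Sitemap: https://sitemaps.example.com/sitemap-index.xml
```

Verifying the subdomain in Search Console is the other accepted proof of ownership.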

Jenny Munn

Questions  - 
 
Would love to hear y'all's thoughts on sidebar links. 

I can't seem to find any articles supporting this, but I thought it was a known theory that site-wide sidebar links are mostly ignored by Google (kind of like footer links) and don't really pass along a lot of juice. Have you heard, tested or experienced this? 

Or, do you believe they count just the same as any other link on a page? Looking for any evidence to pass info along to a client. Thanks!
7 comments
 
Thanks John and Terry for chiming in. Appreciate it!

Jaaved Khatree

Questions  - 
 
Would your site become extra sensitive to penalties if you've been hit by a penalty previously? Say you were hit by Penguin for low quality links but you cleaned that up and recovered.. would you then be on Google's radar as a site to watch in case you do something else that might be penalty-worthy?
4 comments
 
What about a site that already has been hit by TWO manual penalties in the past. I've seen cases like this and worked on recovery.
 
Hi, I've got a question regarding images shown in Google Web search results. Why, for example, do the results for "ferrari" on google.de show no images, while google.pl does show images for the same keyword? See pictures attached.
2 comments
 
Thx Michael, I checked again and you were right: the images do show below the fold in the SERPs on google.de; I had presumed they didn't display at all. However, I know other brands for which images show in Web search on one Google domain and not on another. What does that depend on?

Byron Trzeciak

Questions  - 
 
One of my client sites is receiving a large amount of "direct" traffic from the United States, which seems unrealistic in my opinion for a fairly new site operating within Australia. The only thing I can find that gives any indication of where it's from is that the network is set as "Google Inc". Is anybody else seeing this?
3 comments
 
Yes, the include-hostname filter will help; the exclude-referral filter won't :(
Sure hope Google Analytics is also working on a solution.
 
Migration question: If you were migrating / rearchitecting a large collection of sites - we're talking 27k pages total - what would your core SEO considerations be, over and above 301 redirecting retired and moved pages?
7 comments
 
Besides proper 301 mapping... Make sure you download the latest WMT top queries and top pages reports. Pay especially close attention to the pages that receive the most click through traffic and make sure they all migrate appropriately, monitor change up or down, and implement fixes as necessary.

Will Kennard

Questions  - 
 
I have a client who have many location based landing pages - for example they have one page .com/example-page-london, another .com/example-page-glasgow etc etc. Problem is, all these pages have exactly the same content, save for the page title and any references to the location in the text.

My instinct is to noindex them, as they are technically dupe content. But, they don't seem to be harming rankings, as the client has many page 1 results and good conversions. But I don't want them to be penalised in the future. What would you guys do?
11 comments
 
If you are using these pages just to funnel the visitor to your actual service/product page, it's a doorway. Adding some unique content that is connected with the location somehow, and placing the service/product there, should keep you safe. But Google is Google, after all ;)

Juan Pablo

Questions  - 
 
Hello hello!
Has anyone here worked with IBM/Lotus Domino in the past, or currently? The URL structure of a client's site has a number of levels (refer to the enclosed image), and the developer is telling us they are unable to rewrite these URLs to make them SEO friendly :(
I've already recommended a rewrite + 301 redirection of these URLs, and just wanted to know if you have encountered this type of issue in the past and how you solved it.

And yes +Perry Bernard, the new Search Analytics in GWT is pretty handy, especially having the ability to compare Mobile vs Desktop performance.
 
Wanna share your tools?
 I'm writing an article about the tools we're using (there will be a lecture on that subject soon as well), so I'm asking you to share your tools, or the tools you would recommend if you're using an in-house solution:
1. Ranking checker 
2. Backlink checker
3. Disavow 
4. Broken Links
5. Outreach monitoring 
6. Hidden content in source code
7. Competitor research
8. Canonicals
9. Reporting

And anything else I might have missed here that you think would be a good tool to know.
6 comments
 
First and foremost their data is remarkably accurate. 

I can analyse competitor rankings, keywords, content, social activity, PPC either locally, nationally, internationally for desktop and mobile.

They have first class reporting, offer content and site structure optimisation based on their data insight.

It allows you to spot any up and coming competitors.

And lots more, it's a fantastic tool.

And just yesterday they won "Best SEO Software Suite" at the #eusearchawards, so I'd guess I'm not alone in thinking this.
 
Feedback needed...

I'm doing an expert roundup on my blog http://www.robbierichards.com/. I want to get input from as many community members as possible.

Here is the question:

If you could only use 3 tools for KW research, which 3 would you choose?

The answer can be a simple 3 bullet list,  or more info if you want to add context.

I'm going to publish the results on Tues or Wed. Already have over 40 responses. The results are surprising so far.

Please weigh in the comments below and include a link and Twitter handle if you want me to link to your site from within the roundup. 

Cheers!
3 comments
 
SEM Rush, Searchmetrics

Kath Dawson

Questions  - 
 
HrefLang International Indexing Problem - Help?

We have a site that's launched for international targeting through subfolders, and we have so far released the UK site (/en-gb/) and the RoW site (/en/). /en-gb/ has been targeted at the UK in GWT and /en/ not targeted at anything, as per Google guidelines (since we're targeting EN speakers, but not the UK). 

The thing is, the /en/ site is trumping the /en-gb/ in UK organic search nearly a month after go-live. Anyone out there have any ideas?

thx
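One thing worth double-checking in a setup like this is that the hreflang annotations are reciprocal: each /en-gb/ page and its /en/ twin must reference each other, or Google may ignore the pairing, and an x-default can mark the RoW version as the fallback. A minimal sketch with a hypothetical domain and page path:

```html
<!-- on https://example.com/en-gb/widgets/ (hypothetical URLs) -->
<link rel="alternate" hreflang="en-gb" href="https://example.com/en-gb/widgets/" />
<link rel="alternate" hreflang="en" href="https://example.com/en/widgets/" />
<link rel="alternate" hreflang="x-default" href="https://example.com/en/widgets/" />
<!-- the /en/ page must carry the same set, pointing back at /en-gb/ -->
```

Missing return tags are one of the most common reasons the "wrong" variant keeps ranking.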
10 comments
 
Thanks for looking into this +William Harvey. You can see why we're a bit confused... 

Think we'll try the swaparound - will update this thread if it works!

Thanks again
 
Does anyone know of a Yext-style equivalent for Australia? A service that allows you to manage all your local listings from one profile? Thanks
2 comments
 
Moz local I think is coming to Oz next month