
Mark Taylor

General Tech Talk  - 
 
Does anyone have any experience or proof that Google is now favouring https sites over http?
6 comments
Ryan C.
 
Maybe for ecommerce and other websites that ask for sensitive information like your credit card.

On these websites, an encrypted connection is definitely a signal of better user experience.
 
Anyone have any experience with content syndication services on e-commerce sites? Several product manufacturers have made contracts with a third party to create rich media content for their products. This data is then distributed by scripts, or by some other form of automation, to the retailer's product page. The data is approved by the product manufacturer, and customers seem to like what they see; at least the conversions point to that conclusion.

- Clearly one problem is unique content, but almost all sites seem to have implemented this as a supplementary, not a primary, source of data.

- The other part I'm worried about is the amount of data the syndication company is requesting in return, to "better provide data based on our products". What are they really using this for? (insert paranoid behavior here)

- The data is not available as pure data. It is always pre-formatted.

This might be old news in other parts of the world, but our small country with a really small audience (language) is only now seeing the beginning of this.

Examples:
http://www.fnac.com/Sony-Cyber-shot-DSC-HX50V-Argent-WiFi-GPS/a5920770/w-4

And 

http://www.currys.co.uk/gbuk/tv-dvd-blu-ray/televisions/large-screen-tvs-32-and-over/lg-47la660v-smart-3d-47-led-tv-21262828-pdt.html

Thank you for your help and thoughts.
4 comments
 
Some of the biggest e-commerce sites in our country have dealt with this problem with a strategy similar to the one +andreas wpv mentioned. For top-selling products they have close to 10k reviews; just imagine the amount of unique content that generates. http://bit.ly/1eUo2Sp

Rob Morgan

General Tech Talk  - 
 
Hi - a backlink from a discussion page on Wikipedia, where the linking page has a domain authority of just 1/100 (as opposed to Wikipedia's 100/100) - is it worth anything?
5 comments
 
+Rob Morgan I do second +Ashley Berman Hale on this. If your link adds value to the Wikipedia page, then it's worth having a backlink from there.

We have to remind ourselves that we are not acquiring backlinks for the link authority alone, but also to drive traffic to a site. So if your backlink is a good resource... well, Wikipedia's audience may find it interesting and click to learn more!

It's a win/win/win/win situation where the Search Engines, Wikipedia, your target audience and you...win!

Erica Weatherstone

General Tech Talk  - 
 
If we do a landing page for a PPC campaign and put a different tracking number on it, will Google crawl and index that number, and will that affect the Knowledge Graph/Google Local results if Google identifies a different number on your website?  Client is unfamiliar and uncomfortable with the effectiveness of a noindex, nofollow for the landing page.
6 comments
 
Educate the client to be comfortable with noindex on page level. Or suffer risks.
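One way to make that education concrete is to show the client the check is verifiable. A minimal Python sketch (standard library only; the function name is mine) that confirms a landing page actually carries the noindex before launch:

```python
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    # Collect the directives from <meta name="robots" content="..."> tags.
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and (a.get("name") or "").lower() == "robots":
            content = a.get("content") or ""
            self.directives += [d.strip().lower() for d in content.split(",")]

def is_noindex(html):
    # True if the page's robots meta tag includes a noindex directive.
    parser = RobotsMetaParser()
    parser.feed(html)
    return "noindex" in parser.directives
```

Feeding it the fetched landing-page HTML before the campaign goes live gives the client a concrete yes/no rather than an abstract promise.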

Ben Wood

General Tech Talk  - 
 
Google now seem to be testing product images in organic results.  First spotted by +Krystian Szastok earlier this afternoon.

Ry Bacorn

General Tech Talk  - 
 
Index status & SSL in Google News
Google shared an update on the 9th (link below) where they discuss verifying your website with both http and https URLs. This is interesting because it's clear that sites are moving to SSL for a number of reasons, yet getting secure content into Google News has been difficult because of the URLs, even with both http and https verified.

Though the update doesn't specifically highlight this issue, if you read between the lines it does speak to the problem in its vague verbiage: "in order to see the index count for your secure site, you will need to add it to Webmaster Tools (e.g. https://www.example.com) and then select it from the Site Selector."

Has anyone else had issues with their secure content being indexed?

Google Index Status Update: https://support.google.com/webmasters/answer/2642366#update 
2 comments
 
Agreed. What I read between the lines is that they are making it possible for SSL sites to be added to WMT and therefore indexed in various Google products without any issues.

Chris Bolton

General Tech Talk  - 
 
Here's a fun one. My client's street address magically changed to the wrong street in the serps and on Google Maps. In Google Map maker, the address is correct. We've sent several correction requests to Google, but it has been weeks with no response.

When logging into Google+ the address is correct, but on the published page it is incorrect.

(correct address is 7220 SE Cesar Chavez Blvd. Portland, OR 97202)
8 comments
 
I have a recent case where places suddenly changed an address and will not accept the correct one. It seems they got stricter on what they will allow and have started changing addresses that don't pass.

Neeraj Kumar

General Tech Talk  - 
 
When somebody decides to make a career in Search Engine Marketing, should they pursue a course in online marketing or in Search Engine Optimization?
Are there real certificate courses available - online, distance, or full-time?
6 comments
 
hahahaha 2 Internets for you, sir.

Aaron Bradley

General Tech Talk  - 
 
pushState vs. hash bang URLs for link equity

Do pushState ("pretty") URLs for AJAX do a better job of passing on link equity than hash bang URLs (http://yourwebsite.com/#!/some/page/)?

I could - but won't - overcomplicate this question by adding my own opinion, because I don't feel confident that my own understanding of the relevant technologies is fully correct (that's why I'm asking!:).

I can say, however, that there's little in the way of a reliable answer to this question, although most SEOs come down on the side of pushState.  It would be great if someone could do a robust compare-and-contrast of these two options for AJAX URL rendering as they pertain to link equity - and better yet if this were fleshed out with examples of real-life link metrics.

Many thanks in advance for any thoughts on this often infuriatingly complex topic!
22 comments
 
I would also like to add something to this debate. +Aaron Bradley originally was asking about link equity. I think you are asking whether or not the #! method would terminate the URL, so that all states after it would attribute all positive link equity to that root URL.

If that's the case, I will add that you should remember that when you add a #!, the URL Googlebot requests transforms from #!state-identifier to ?_escaped_fragment_=state-identifier (with special characters percent-escaped).
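That transformation comes from Google's (now deprecated) AJAX crawling scheme. A rough Python sketch of the URL mapping, with the function name mine and the exact escaping rules an assumption (here everything non-alphanumeric in the fragment is percent-escaped):

```python
from urllib.parse import quote

def escaped_fragment_url(url):
    # Map a hash-bang URL to the form Googlebot requested under the
    # old AJAX crawling scheme:
    #   http://example.com/#!/some/page/
    #   -> http://example.com/?_escaped_fragment_=%2Fsome%2Fpage%2F
    if "#!" not in url:
        return url  # no hash-bang: nothing to transform
    base, fragment = url.split("#!", 1)
    separator = "&" if "?" in base else "?"
    return base + separator + "_escaped_fragment_=" + quote(fragment, safe="")
```

This makes the debate concrete: under #!, the crawler sees a query-string variant of one root URL, whereas pushState URLs are ordinary distinct paths.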

A reasonable answer would simply be that Google has long been attributing link equity to dynamic URLs (anything with a ?parameter) as unique URLs. So in my own view these URLs should be equally qualified for the same link equity using either pushState or #!.

You had a good point +Aaron Bradley , I didn't get it until reading it a few times. 

Cheers!

JR Oakes

General Tech Talk  - 
 
Question:  If you could only follow one SEO blog, what would it be and why?
7 comments
 
I visit only inbound.org & there I get everything ;-)

Andrea Moro

General Tech Talk  - 
 
In the process of re-organizing the taxonomy and "indexability" of some pages where the geo structure is key, I wonder what your opinion is on the following:

Let's assume a big country like France, but it can be UK or US.

I have the following levels:
Area (Or departments)
City
District (available only in specific circumstances)

So in Geo terms this means 
/13/Marseilles/01
or
/Hertfordshire/Cambridge/CB1

The problem a structure like this may face is that each level proposes a limited dataset of the hierarchy above it, so in the end there is a lot of duplicate data on the site.

I thought it could be appropriate to add the noindex tag to the innermost level, thus allowing only the Area and City levels to be proposed.
That way, even though Area and City definitely contain the same information, the proposed order is not conflicting and doesn't generate too many URLs.
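One way to express that rule is to key the robots directive purely on URL depth. A hypothetical Python sketch, where the depth threshold is an assumption taken from the /Area/City/District examples above:

```python
def robots_for_geo_path(path):
    # Index the Area (depth 1) and City (depth 2) levels,
    # noindex the innermost District level (depth 3 and beyond).
    segments = [seg for seg in path.strip("/").split("/") if seg]
    return "noindex,follow" if len(segments) >= 3 else "index,follow"
```

Using "noindex,follow" rather than "noindex,nofollow" keeps link equity flowing through the district pages even though they are kept out of the index.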

What's your take on this?
3 comments
 
Hey +Andy Beard, thanks for your input.
Though I appreciate what you say, the origin pages of my business entity live in a structure that is not connected to geo data, at least not in the URL.
This will prevent issues like the one you described.

I was just concerned about the huge mass of listing items across the whole structure, hence my concern to make the district level non-indexable.

Dan Manahan

General Tech Talk  - 
 
Wow, site hacked!  Anyone ever run into these guys?
During a G-SERP assessment, the #2 result for the query san diego personal injury lawyer yielded the attached pic. It must be recent; how else could it slip into the #2 spot?

Has me wondering where an SEO's responsibility starts/stops concerning a hacked site. If this was your client would you pass the buck to the webmaster or host or whomever? Would you get involved? As an SEO are you culpable for security breaches?
5 comments
 
Thanks guys, appreciate the feedback. Granted it's a hypothetical as I've never had the misfortune of a client site being hacked, but then I haven't been playing the game very long. Bound to happen sooner or later, hence my interest. 
I'm also curious how G's algo is presently unable to catch the aforementioned result. I can't imagine that's a result G wants to give users. 

Tony McCreath

General Tech Talk  - 
 
I get a lot of spam SEO email from gmail accounts trying to sell SEO. I generally ignore them. 

But I rarely get emails offering payment for links. I have just been offered $85 "to post a guest article" that "will contain a link" to a "casino website".

And it's from a real domain that represents an SEO company.

What's people's policy on dealing with this sort of thing? Ignore or report? And how would you report it?
12 comments
 
+Tony McCreath congrats on reporting it. I do agree with +Eric Wu; our industry is getting a bad reputation because of the unethical behavior of some SEO companies.

I even had a confession from one of my connections telling me (after I had shown them the benefits of SEO in $) that he always thought SEO was like a "black box"! I was kind of sad to hear him say that, but I took it as a challenge to always show how SEO is a valuable asset to a company's bottom line!

I'm always determined to apply "Smart & Ethical SEO" as a basis to all my web strategies.

Tony McCreath

General Tech Talk  - 
 
I have a client who has an SSL certificate that only works for the main domain. So if you use https://www. Chrome throws up the security warning.

I tried setting up a 301 redirect but Chrome still throws up the warning before doing the redirect (would that be a Chrome bug or feature?).

This problem is made real by the fact Google is currently sending people to the https://www. based URLs. Thus all visitors from Google get the warning.

Has anyone come across this issue and resolved it?

I've currently set up 301 redirects and have asked the client to claim all the https domains in GWT. Hopefully a Fetch+Submit will get Google to stop it using those URLs.
13 comments
 
Google has already dropped using the https in the search results so it looks like the redirect has helped there. It's still worth cleaning up the certificate problem.

Glynn Davies

General Tech Talk  - 
 
I'm looking for a proxy recommendations! I'm in the UK, and need to crawl a large commercial site via non-UK IP(s).

Country not too important, though access to more than one would be a bonus.

Hide My Ass looking good so far, so if you've any experience crawling sites through their proxies, I'd love to hear it.

Thanks!
10 comments
 
MyPrivateProxy.net is great. They offer Private & Shared proxies. Inexpensive and fast customer service.

Paulo Oliveira

General Tech Talk  - 
 
hello awesome community!

I'm doing a new website for a client: new domain, new CMS.
I should 301 redirect all the old pages to the new pages, but there are hundreds of pages to redirect. Is there any tool to automate the process?

thanks in advance for the answers!

oh and the redirects will be from something like site.com/444 to site.com/friendlyurl
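With hundreds of arbitrary id-to-slug mappings like that, a short script is often simpler than a tool: export the old/new pairs from the CMS and generate the rules. A hedged sketch that emits Apache `Redirect 301` lines (the function name is mine; an nginx map or a RewriteMap would work just as well):

```python
def build_redirect_rules(mapping):
    # Given {old_path: new_path}, emit one Apache mod_alias line per pair,
    # ready to paste into an .htaccess or vhost config.
    lines = []
    for old_path, new_path in sorted(mapping.items()):
        lines.append(f"Redirect 301 {old_path} {new_path}")
    return "\n".join(lines)
```

For example, `build_redirect_rules({"/444": "/friendlyurl"})` yields the single line `Redirect 301 /444 /friendlyurl`.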
7 comments
 
Another great idea +Menachem Rosenbaum Thanks :)

Tony McCreath

General Tech Talk  - 
 
Say you ran a CMS that managed thousands of websites, and you accidentally set the robots.txt file on all of them to disallow all pages.

Once you fixed the bug, what would be the best way to help those websites recover?
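Before pinging anyone, it's worth auditing that every site's robots.txt really is fixed. A minimal Python sketch using the standard library's parser (function name is mine):

```python
from urllib.robotparser import RobotFileParser

def is_blocked(robots_txt, url, agent="Googlebot"):
    # Parse a robots.txt body and report whether it blocks the
    # given URL for the given user agent.
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return not parser.can_fetch(agent, url)
```

Running this over each site's fetched robots.txt separates the already-recovered sites from the ones still serving the disallow-all file.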
18 comments
 
ping the SE with all pages

Anna Gawecka

General Tech Talk  - 
 
I'm wondering if hreflang attribute value signals are passed with nofollowed links. 
5 comments
 
I would also agree with what is said on the thread

Ry Bacorn

General Tech Talk  - 
 
Interstitial Impact on Organic Traffic
Getting more mobile users is huge right now, and getting them to download your app is also desirable. But at what cost - should we sacrifice organic traffic? In other words, when you serve an experience that includes an interstitial, what are the impacts on search? A few things come to mind:
1. Will we do a takeover or redirect users to the interstitial, and will the latter appear as cloaking?
2. Can we control the display of the interstitial with a cookie, and how will bots interpret this?
3. Will this impact our rank and organic traffic?
4. Will it impede the mobile experience, namely speed and exits?
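The cookie question can be sketched as server-side logic. This is a hypothetical illustration (the function, cookie name, and bot token list are all made up), and note that keying on the user agent as below is exactly the kind of differential serving that could look like cloaking, which is why the first question above matters:

```python
BOT_TOKENS = ("googlebot", "bingbot")

def should_show_interstitial(user_agent, cookies):
    # Suppress the app-download interstitial for known crawlers and
    # for visitors who have already dismissed it (tracked via a cookie).
    ua = (user_agent or "").lower()
    if any(token in ua for token in BOT_TOKENS):
        return False
    return cookies.get("interstitial_seen") != "1"
```

A safer variant drops the user-agent branch entirely and relies only on the cookie, so bots and first-time users see exactly the same page.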

Getting users to download an app is important. But will your organic traffic suffer? There are no clear answers, only speculation about what the best path is. Here is a pretty good article on what Google's stance might be (you'll have to copy and paste the link): econsultancy[dot]com/blog/62902-google-avoid-download-app-pop-ups-or-lose-mobile-search-rankings

What do you think?
4 comments
 
+Jennifer M better? :P 

JR Oakes

General Tech Talk  - 
 
Youtility:  Has anyone here read this book?  If you have, how have you applied it to SEO (i.e. working with the client to gain insight into their clients' needs)?