Profile

Brett Mové
4,115 views

Stream

Brett Mové

General Tech Talk  - 
 
Hi everyone, first-time poster here. I'm trying to get opinions on an SEO disaster and would appreciate any thoughts. We relaunched a website (e-commerce, 8 years old, 600k indexed pages, Panda problems due to thin product-page content, but no Penguin problems) on 12/14 with the following major changes:

• New IP
• New Server (Apache to Nginx)
• New design
• New chat feature
• New titles and meta descriptions sitewide
• Some new content
   o New homepage text
   o Reworded some other static page text
   o New image alt attributes
   o Added new wording to product pages (delivery rates, warranty, common accessories, new shipping images, new schema markup, new links to related product pages, as well as appropriate image alt attributes)
• New breadcrumb and local business schemas
• New sitewide links to social media sites (Facebook, G+, LinkedIn)
• New URL structure
   o Removed .html extension (301 redirect) for all product page URLs (approximately 500k URLs)
   o Changed URL structure with 301 redirect and content of catalog pages (approximately 100k URLs)
   o Removed subcategories (currently a 302 redirect)
• New information architecture 
   o Added new top nav links (links to category pages) sitewide
   o Added new top nav manufacturer links to the homepage 
   o Added breadcrumbs to all pages
   o Added category catalog pages and an "All products" catalog
   o Added related products links to homepage, category pages, manufacturer pages, and product pages
   o Added an HTML sitemap link in the top nav
   o Added rel next/prev to catalog pages
• Turned off wildcard subdomains


Our Google traffic has dropped 60% week over week since launch. There have been some 301 redirect issues, but most of those have been ironed out. My boss wants to revert to the old site (an SEO "guru" told him to do this today), but I think that's risky because Google has already crawled roughly 100k–200k URLs on the new site and 100k–200k 301 redirects from the old site.
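If it helps anyone sanity-check a similar migration: the idea is to confirm that a sample of old .html URLs answers with a single 301 straight to the new extensionless URL (no chains, no stray 302s). A rough sketch using Python's requests library; the sample URLs are made-up placeholders, not real ones:

# Spot-check that old .html URLs return one clean 301 to the new extensionless URL.
# The URL pairs below are illustrative placeholders only.
import requests

SAMPLES = {
    "https://www.example.com/widgets/blue-widget.html":
        "https://www.example.com/widgets/blue-widget",
    "https://www.example.com/catalog/gadgets-page-2.html":
        "https://www.example.com/catalog/gadgets-page-2",
}

for old_url, expected in SAMPLES.items():
    resp = requests.head(old_url, allow_redirects=False, timeout=10)
    location = resp.headers.get("Location", "")
    ok = resp.status_code == 301 and location == expected
    print(f"{'OK ' if ok else 'BAD'} {resp.status_code} {old_url} -> {location or '(no Location)'}")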

Should we revert or stay the course?  Thanks in advance.
8 comments
 
Hi +Brett Mové, good to hear. Keep working on site structure, content and quality links. Last year I restructured a large site and it took 6 months until Google 'got it'. Since then, traffic more than doubled and continues to break traffic records every month. So stick to the (white hat) strategy, even if you do not see results immediately.


Brett Mové

Shared publicly  - 
 
Anyone remember this?

Brett Mové

Shared publicly  - 
 
It's finally here!

Brett Mové

Shared publicly  - 
 
Are you kidding me?
 
With almost 3.5 acres of direct oceanfront, this Maui home is one of the largest properties on the North Shore. From the moment one drives through its gated entry leading to a tunnel of palm trees, a sense of complete serenity and privacy overcomes the senses, creating the perfect setting for relaxation and island living. The open floor plan primed for entertaining and emphasis on luxurious yet understated finishes becomes apparent. Truly a piece of art.

http://www.realtor.com/blogs/2012/02/01/rare-find-in-mauis-north-shore-listed-for-12-4-million-photos/

Brett Mové

Shared publicly  - 
 
This is awesome!

Brett Mové

Shared publicly  - 
 
Google Street View at night: 4220 Laurel Canyon Blvd, Studio City, California

I don't know if this is cool or terrible.


Brett Mové

Shared publicly  - 
 
A UI change already!?

Brett Mové

Shared publicly  - 
 
Thinking about a SOPA blackout? Read below; a rough sketch of the 503 handling follows after the post:
Pierre Far originally shared:
 
Website outages and blackouts the right way

tl;dr: Use a 503 HTTP status code but read on for important details.

Sometimes webmasters want to take their site offline for a day or so, perhaps for server maintenance or as political protest. We’re currently seeing some recommendations being made about how to do this that have a high chance of hurting how Google sees these websites and so we wanted to give you a quick how-to guide based on our current recommendations.

The most common scenario we’re seeing webmasters talk about implementing is to replace the contents on all or some of their pages with an error message (“site offline”) or a protest message. The following applies to this scenario (replacing the contents of your pages) and so please ask (details below) if you’re thinking of doing something else.

1. The most important point: Webmasters should return a 503 HTTP header for all the URLs participating in the blackout (parts of a site or the whole site). This helps in two ways:

a. It tells us it's not the "real" content on the site and won't be indexed.

b. Because of (a), even if we see the same content (e.g. the “site offline” message) on all the URLs, it won't cause duplicate content issues.

2. Googlebot's crawling rate will drop when it sees a spike in 503 headers. This is unavoidable but as long as the blackout is only a transient event, it shouldn't cause any long-term problems and the crawl rate will recover fairly quickly to the pre-blackout rate. How fast depends on the site and it should be on the order of a few days.

3. Two important notes about robots.txt:

a. As Googlebot is currently configured, it will halt all crawling of the site if the site’s robots.txt file returns a 503 status code for robots.txt. This crawling block will continue until Googlebot sees an acceptable status code for robots.txt fetches (currently 200 or 404). This is a built-in safety mechanism so that Googlebot doesn't end up crawling content it's usually blocked from reaching. So if you're blacking out only a portion of the site, be sure the robots.txt file's status code is not changed to a 503.

b. Some webmasters may be tempted to change the robots.txt file to have a “Disallow: /” in an attempt to block crawling during the blackout. Don’t block Googlebot’s crawling like this as this has a high chance of causing crawling issues for much longer than the few days expected for the crawl rate recovery.

4. Webmasters will see these errors in Webmaster Tools: it will report that we saw the blackout. Be sure to monitor the Crawl Errors section particularly closely for a couple of weeks after the blackout to ensure there aren't any unexpected lingering issues.

5. General advice: Keep it simple and don't change too many things, especially changes that take different times to take effect. Don't change the DNS settings. As mentioned above, don't change the robots.txt file contents. Also, don't alter the crawl rate setting in WMT. Keeping as many settings constant as possible before, during, and after the blackout will minimize the chances of something odd happening.

Questions? Comment below or ask in our forums: http://www.google.com/support/forum/p/Webmasters?hl=en
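
A minimal sketch of what points 1 and 3 above boil down to in practice: answer every blacked-out URL with a 503 while leaving robots.txt serving a normal 200. This is an illustration, not Google's code; the Retry-After header and the message body are my own additions (Retry-After is a common companion to a 503 but isn't mentioned in the post above):

# Illustrative-only blackout handler: every URL returns 503 except robots.txt.
from wsgiref.simple_server import make_server

ROBOTS_TXT = b"User-agent: *\nDisallow:\n"                    # unchanged, still served with 200
BLACKOUT_PAGE = b"<h1>Site offline in protest of SOPA</h1>"   # placeholder body

def app(environ, start_response):
    if environ.get("PATH_INFO") == "/robots.txt":
        # Point 3: robots.txt must NOT return 503, or Googlebot halts all crawling.
        start_response("200 OK", [("Content-Type", "text/plain")])
        return [ROBOTS_TXT]
    # Point 1: every URL participating in the blackout returns 503.
    headers = [("Content-Type", "text/html"),
               ("Retry-After", "86400")]                      # optional hint: back in ~24 hours
    start_response("503 Service Unavailable", headers)
    return [BLACKOUT_PAGE]

if __name__ == "__main__":
    make_server("", 8000, app).serve_forever()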

Brett Mové

Shared publicly  - 
 
What's going on with Google!?
Nelson Mattos originally shared:
 
We were mortified to learn that a team of people working on a Google project improperly used Mocality’s data and misrepresented our relationship with Mocality to encourage customers to create new websites. We’ve already unreservedly apologised to Mocality. We’re still investigating exactly how this happened, and as soon as we have all the facts, we’ll be taking the appropriate action with the people involved.