PageRank is the algorithm that set Google apart from the other search engines of its day, but chances are it started changing from the moment it was set loose on the world. I can't in good faith write about the PageRank of the late 90s, so instead I want to point to a different model.
Not every link on a page passes along the same weight, the same amount of PageRank, and likely not even the same amount of hypertextual relevance. We heard this from Google representatives for a few years, and even from other search engines such as Yahoo and Blekko, where we were told that some links are likely ignored completely, such as those that show up in comments on blog posts.
As this patent tells us, Google might see the anchor text of "terms of service" on a page, and automatically not send much PageRank to that page.
You'll see the name "Jeffrey Dean" listed as one of the inventors on this patent, and if you start digging through other Google patents, you'll see it frequently. He often writes about technical issues involving the planet-wide data center Google has been building, and how the machinery works as a whole. If you have a few days to spare for looking at Google patents, it wouldn't hurt to look for ones written by him. His "Research at Google" page might overwhelm you:
Jeffrey Dean - Research at Google
There has been a lot written about PageRank over the years, but if you haven't read about the Reasonable Surfer and don't understand how it transforms the random surfer model, you really should.
Here's a blog post I wrote about it that you can use as a kick start:
Google's Reasonable Surfer: How the Value of a Link May Differ Based upon Link and Document Features and User Data
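The shift the patent describes can be illustrated with a toy weighted PageRank. This is only a sketch of the idea, not Google's actual formula: the three-page graph and the per-link click weights below are invented for illustration, with a footer "terms of service" link given a much lower weight than a prominent content link.

```python
# Toy "reasonable surfer" PageRank: instead of splitting a page's rank
# evenly across its outlinks (the random surfer), each link carries a
# weight reflecting how likely a user is to click it. All pages and
# weights below are invented for illustration.

DAMPING = 0.85  # standard damping factor from the original PageRank paper

def pagerank(weighted_links, iterations=50):
    """weighted_links maps page -> {target: click_weight}."""
    pages = set(weighted_links)
    for targets in weighted_links.values():
        pages |= set(targets)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new = {p: (1 - DAMPING) / n for p in pages}
        for page, targets in weighted_links.items():
            total = sum(targets.values())
            for target, weight in targets.items():
                # Rank flows in proportion to the link's weight,
                # not 1/len(outlinks) as in the random surfer model.
                new[target] += DAMPING * rank[page] * weight / total
        rank = new
    return rank

# The home page links to an article (prominent, weight 9) and to a
# footer "terms of service" page (rarely clicked, weight 1).
graph = {
    "home": {"article": 9.0, "terms": 1.0},
    "article": {"home": 1.0},
    "terms": {"home": 1.0},
}
ranks = pagerank(graph)
```

Under the classic random surfer, the article and the terms page would split the home page's rank evenly; with the weights above, the article ends up with several times the rank of the terms page.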
Join Carrie and me as we share advice on how to build a winning social media strategy. Carrie is CEO of Likable Media, which works with a variety of enterprise clients; I met her through clients we have in common.
The live YouTube link will be shared here 15 minutes before showtime. Just say YES to the event to make sure you get it, either for live viewing or for watching later.
Share the event and/or invite others as this is a public event!
The discussion is underway!
Highlights of the Feb 14th and Feb 24th hangouts.
Panda and Content Spidering: If you are adding new content to your website daily, make sure that spidering is increasing in Webmaster Tools. If you are adding content and your spidering rate drops, it is likely a Panda quality problem.
HTTP and HTTPS: If you use both versions, you need a robots.txt file for each version, or you risk duplicate content issues.
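One interpretation of that advice looks like the following; the domain is hypothetical, and how you serve a different file per protocol depends on your server setup:

```
# Served at http://www.example.com/robots.txt
# (the protocol version you do NOT want crawled)
User-agent: *
Disallow: /

# Served at https://www.example.com/robots.txt
# (the version you want indexed; an empty Disallow allows everything)
User-agent: *
Disallow:
```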
Disavowed Links: If you remove links from your disavow file that Google has been ignoring, those links will count again, good or bad. The disavow file works like a robots.txt file: a list of instructions for Google to follow. Whether a link is suspect or good, it will get its link juice back.
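For reference, the disavow file Google Search Console accepts is a plain-text list with one entry per line: `domain:` entries disavow an entire domain, bare URLs disavow a single link, and lines starting with `#` are comments. The domains below are placeholders:

```
# Links from these domains are to be ignored entirely
domain:spammy-directory.example
domain:paid-links.example

# A single page can be disavowed by its full URL
http://blog.example/widget-roundup.html
```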
Don't Use HTML Sitemap Pages: Google treats these as low-quality pages and will not trust them; they could even be a Panda issue.
Does Google Pass Juice to Links in Files? John Mueller says that Google will read links in any file type (PDF, XLS, DOC, etc.), but will not pass link juice through them. Only proper HTML anchor-tagged links pass link juice.
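In markup terms, the distinction being drawn is between a real anchor element and anything else that merely mentions or triggers a URL; the URL below is a placeholder:

```html
<!-- A proper anchor-tagged link: carries anchor text and, per the
     summary above, is the only kind that passes link juice -->
<a href="https://example.com/guide">link building guide</a>

<!-- A bare URL inside a PDF or DOC, or a script-driven pseudo-link
     like this one, may be read but passes nothing -->
<span onclick="location.href='https://example.com/guide'">guide</span>
```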
Does Google Use CTR Tracking? Mueller denied it, but if you read between the lines you could figure out that they do. Click-through analysis, from SERP to site and back to SERP, is mentioned in the book "In the Plex". If visitors are bouncing from a page, add some fresh content, video, or links from internal pages that are ranking.
Panda: They use internal usage metrics. It comes down to how your listing looks in the SERPs and the quality of your on-page content. "You have to have the site optimized before sending traffic of any kind." This implies that they may watch paid traffic in terms of bounce rate for Panda, although they have denied it in past hangouts.
301 Redirects Can Be Ignored: A 301 is a directive for them to follow, but if you forward from A to B and they prefer A, they will ignore the page you are redirecting to.
New Site Honeymoon: Make sure your traffic is responding to your pages during the "new site honeymoon"; run A/B split tests to see whether a page performs well. This implies that the algorithm puts your site in a category for user metrics, and specifically for "sharing". Social proof is something they look at, and if your website is not getting social traffic it will disappear. Google has algorithms for ranking, for trust, and for de-ranking according to social signals.
Site Errors: 500, PHP, and MySQL errors will be Panda quality factors. In our testing, do not let errors persist for more than 48 hours.
Google Is Machine Learning: It is alive; it is SkyNet. It is equivalent to a three-year-old child.
Geo-Targeting Algorithm Works on a Best-Guess Basis: Even if you have a .com or .uk domain, your content has to reflect your geo-targeting via hreflang. Based on quality or trust, they decide whether or not you are relevant to a specific geo-location, not just by the URL. Everything is proportion-based and machine-learned; that is what they use on the fly to rank websites.
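The hreflang annotations referred to there are `<link>` elements in each page's `<head>`, with every language/region version listing all of its alternates; the domains below are hypothetical:

```html
<link rel="alternate" hreflang="en-us" href="https://example.com/page" />
<link rel="alternate" hreflang="en-gb" href="https://example.co.uk/page" />
<!-- x-default names the fallback for users matching no listed locale -->
<link rel="alternate" hreflang="x-default" href="https://example.com/page" />
```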
Share Your Disavow File Publicly: John Mueller stated that when you supply a Google spreadsheet for your disavow list, it must be set to "Public"; otherwise the webspam team cannot access it. The webspam team does not have access to your Google account backend; Google's terms and conditions prevent them from getting into your account.
Review snippets: John Mueller implied that the review snippet is a quality factor. It's a great way to share the site.
Cloned Sites: This is when you have multiple URLs with the same design and topic, just trying to rank for keywords. You cannot use clones, but you can have an exact-match domain if the site is trusted; having non-exact-match keywords pointed at the site is preferable.
Google can detect that the IP is the same, that the domains are registered by the same person, and that the design, backlinks, keywords, and content topic are the same. They will then choose what they determine to be the main site and canonicalize it for you.
Does Forwarding from a Penalized Domain to a New One Reset a Penalty? If you try to kill a domain that was hit by Penguin, create a new domain, and forward everything over, they will "help" by automatically setting a 301 redirect. They look at the old site's information and forward all of its signals, good or bad, to the new site. They keep two indexes, a historical index and the ranking index, which they compare to make a decision based on the mined data.
"They don't want spammers to get away with cloning sites," John Mueller said.
They keep a changelog and can tell whether you are maintaining the pages. They can figure out that these are all the same site and that you are trying to manipulate the SERPs.
Site Navigation Is a Ranking Factor: Internal navigation needs to be logical. For the page on your site that you want to rank highest, point your internal links at that page to "sculpt" your PageRank internally.
If you are experiencing issues with Google's many algorithms, you can visit us at http://www.gobiya.com/ for answers.
- SEO Workers, Germany: Forensic SEO & Social Semantic Web Consultant, 2006 - present
- Webnauts Net, Germany: Accessibility & Usability Consultant, 2001 - present
- Lycos Europe: Lead of Usability Testing, 2001 - 2003
- Ronacher Theater, Austria: Evening Duty Manager, 1993 - 1995
- VSG, Austria: Personnel Director, 1993 - 1995
- College of Police Sciences, Greece: Studies Director, 1991 - 1993
- WTA & Co., Greece: Chairman & Managing Director, 1986 - 1993
- InterDetectives, Greece: CEO & Chief Investigator, 1983 - 1993
- Blog Webnauts Net (current)
- Algojunkie Blog (current)
- Search Editors (current)
- SEO Watch Blog (current)
- SEO Workers (current)
- Webnauts Net (current)
- Semantic Articles (current)
- SEO Workers Forums (current)
- SitePoint (current)
- WebProWorld (current)
- SEO Workers on Google+ (current)
- Morestar (current)
- SEO Workers Labs (current)
- Article Blast (current)
- Doc Sheldon (current)
- Algohunters (current)
- rainvac.com (current)
- searcheditors.com (current)
- algohunters.com (current)
- gmail.com (current)
In 2001, I graduated from the German Academy Brueschke in Bielefeld as a Specialist for Multimedia Office Communication.
The same year, I graduated from the Academy of the German Chamber of Commerce and Industry in Bielefeld as a Web Project Manager (IHK).
Additionally, in 2003 I passed an exam at Brainbench.com in Web Design Concepts. Also, in 2003 I passed my test at Brainbench.com as a Master Web Designer for Accessibility.
In the same year, I completed an online course of study at Carleton University, Sprott School of Business, in Usability Testing.
I worked at Lycos Europe Inc. in various positions for two years:
- 04.2001 - 07.2001 Intranet Webmaster;
- 07.2001 - 03.2002 Ad Format Specialist;
- 07.2001 - 03.2002 Trainer for Web Design for Accessibility;
- 03.2002 - 04.2003 Lead of Usability Testing, supervising the usability tests for the European Web Development.
I also hold membership with the following professional organizations:
- The Guild of Accessible Designers;
- Web Standards Group;
- Internet Society (ISOC);
- Interaction Design Association (IxD).
- Florida State University: FSU Certified Webmaster, 2004
12 Types of Evergreen Content That Attract Valuable Links
Evergreen content, or content that is updated regularly and won't quickly become out of date or totally incorrect within a short time period…
How Google May Boost Search Rankings for Your Relevant Pages Using Keywo...
Imagine that Google assigns categories to every webpage or website that it visits. You can see categories like those for sites in Google’s l…
Professional Web Copywriting Can Make Your Business Stand Out - Doc Shel...
The tone used for web copywriting can make or break an article. There are several types of tones that can be used for written content and ma…
Business Names Google Places Quality Guidelines Updated
Google has updated their Google Places quality guidelines once again, this time to clarify how you can name your business within Google Place…
Why Duplicate Business Listings Are Like The Walking Dead
In the Local SEO biz, we spend a lot of time fixing duplicate business listings. Duplicate records of your business appearing throughout the…
How One SEO Consultant's Near Death Experience United The SEO Community
Dana Lookadoo is a familiar, warm, embracing and smart personality in the SEO/SEM space, and when she was involved in a major accident, the…
Google's Matt Cutts: Don't Worry About Poor Grammar In Comments
In today’s video from Google’s Matt Cutts, Matt addresses the concern over having third-party comments with poor grammar on your blog or sit…
The Future of Web Design May Be Ugly | Neuromarketing
We’ve seen a variety of disastrous web design trends over the years. Remember splash pages? All-Flash sites? Frames? We may be on the cusp o…
Heading Elements and the Folly of SEO Expert Ranking Lists
The importance of a heading element to a search engine doesn't hinge upon how it's displayed, nor upon a presumption that it is what a page…
Two Videos Of What It Would Be Like If Google Was A Real Living Person
Google, as a search engine, is incredibly helpful to most of us. But what if Google was a real person? There are two video parodies that mak…
Social Media Cartoon: Bathroom Humor | Social Snap Blog
No folks, we're not above bathroom humor. Maybe someone should publish a guide on social media usage in the bathroom. This entry was posted…
Google Has Officially Penalized Rap Genius For Link Schemes
Google has penalized Rap Genius for link schemes this morning. If you go to Google and search for [rap genius], rapgenius.com will not be fo…
C’Mon Google, Women are Entities Whose Names Might Change!
Back on November 23rd I put up a post asking people for their best holistic SEO tips, and this led to a very interesting side discussion abo…
Machine Learning Basics for Better Content & SEO Results
Learning algorithms learn from the patterns they detect, both in queries and documents, as well as the relationships between them. By unders…