This is a little unorthodox, but I ran a follow-up experiment to yesterday's blog post and thought I'd post it here. In Experiment #2, I looked at a high-flux keyword across two data centers. That raised the obvious question: "How does a low-flux keyword look across those same two data centers?"

I used the same low-flux keyword as Experiment #1 ("fun games for girls"). Tracking it every 10 minutes for 24 hours (144 data points), data center 1 changed 13.2% of the time, and data center 2 changed 6.3% of the time. Across data centers, though, the results were out of sync 59% of the time. While this is less than the 97% rate at which the high-flux keyword was out of sync, it's still substantial, and it suggests that the flux caused by differences in Google's systems/data is greater than the keyword's (and its environment's) inherent flux.
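
For anyone curious about the mechanics, here's a minimal sketch of the cross-data-center comparison. The data structures and function names are hypothetical (this isn't the actual MozCast code); each crawl is assumed to be stored as an ordered list of top-10 URLs:

```python
# Hypothetical sketch: each data center's history is a list of 144
# snapshots (one per 10-minute crawl), each snapshot an ordered list
# of the top-10 URLs.

def change_rate(snapshots):
    """Fraction of consecutive crawls where the top 10 changed at all."""
    changes = sum(
        1 for prev, curr in zip(snapshots, snapshots[1:]) if prev != curr
    )
    return changes / (len(snapshots) - 1)

def out_of_sync_rate(dc1_snapshots, dc2_snapshots):
    """Fraction of timestamps where the two data centers disagree."""
    mismatches = sum(
        1 for a, b in zip(dc1_snapshots, dc2_snapshots) if a != b
    )
    return mismatches / len(dc1_snapshots)

# Usage (with real crawl data):
# print(change_rate(dc1), change_rate(dc2), out_of_sync_rate(dc1, dc2))
```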

So, how high was the flux across data centers? Luckily, we can measure it the same way we measure MozCast temperatures. For the high-flux keyword, the average change was 137.7 deg. F - obviously substantial. For the low-flux keyword, the average change was only 59.4 deg. F.
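
I won't reproduce the exact temperature formula here, but the general idea can be sketched. This is an illustrative stand-in, not the real MozCast calculation: compute a raw rank-change score between consecutive crawls, then scale it against a long-run baseline so a typical reading lands near a reference temperature (SCALE and baseline_flux below are hypothetical calibration values; MozCast is tuned so an average day reads around 70 deg F):

```python
SCALE = 70.0  # hypothetical calibration: long-run average flux -> ~70 deg F

def raw_flux(prev_top10, curr_top10):
    """Raw rank-change score between two consecutive top-10 crawls.
    A URL that drops out of the top 10 is treated as falling to #11."""
    score = 0
    for rank, url in enumerate(prev_top10, start=1):
        new_rank = curr_top10.index(url) + 1 if url in curr_top10 else 11
        score += abs(new_rank - rank)
    return score

def temperature(snapshots, baseline_flux):
    """Average raw flux across consecutive crawl pairs, scaled so that
    baseline_flux (the long-run average) maps to SCALE degrees."""
    fluxes = [raw_flux(a, b) for a, b in zip(snapshots, snapshots[1:])]
    return (sum(fluxes) / len(fluxes)) / baseline_flux * SCALE
```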

Clearly, this is a complex picture. Even with changes across data centers, the nature of the keyword (volume, competitiveness, and ranking signals) had a strong impact on the extent and severity of the flux.
Are you looking at how often there is a change (even one URL) on page 1, or across the top pages? Can you run an experiment on how often the spots shift within positions 1-3, 4-6, 7-9, and 10-12, from page 1 only? Is there anything interesting in there? I notice that for some keywords I track, the top 3 spots are usually pretty static, but everything else shifts a lot.
The percentage is just how often there's any change. The temperature/flux shows how much change there is, but it isn't weighted by position (that turned out not to be terribly meaningful for the overall purposes of MozCast). It can sometimes be tough to find granular analyses that end up being actionable, made tougher by the fact that there are roughly infinite ways to slice the data :) I do get the sense that there's a lot more stability in the top 3 in this rapid-fire scenario, though. A lot of the flux is shuffling.
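
If anyone wants to try the position-band breakdown suggested above, a rough sketch might look like the following. The band boundaries and names are hypothetical, and since a standard page 1 only has 10 spots, the 10-12 band would only catch a partial slice:

```python
# Hypothetical sketch: per-band change rates between consecutive crawls.
BANDS = [(0, 3), (3, 6), (6, 9), (9, 12)]  # slice indices into the ranked list

def band_change_rates(snapshots):
    """For each band, fraction of consecutive crawl pairs where that
    slice of the ranking changed at all."""
    pairs = list(zip(snapshots, snapshots[1:]))
    rates = {}
    for lo, hi in BANDS:
        changed = sum(1 for prev, curr in pairs if prev[lo:hi] != curr[lo:hi])
        rates[f"{lo + 1}-{hi}"] = changed / len(pairs)
    return rates
```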

We're also looking to track domain-based flux vs. URL-based flux. I don't have the tools yet to do that quickly (on the fly), but it's on the roadmap for MozCast. For some queries, it would help separate out new content or internal SEO changes from larger, competitive changes.
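
A rough sketch of what that domain-vs-URL comparison might look like (hypothetical names; urlparse's netloc is a crude stand-in for real domain extraction, which would want something like the public suffix list):

```python
from urllib.parse import urlparse

def to_domains(top10):
    """Collapse a ranked list of URLs down to their host names."""
    return [urlparse(url).netloc for url in top10]

def url_vs_domain_change(prev_top10, curr_top10):
    """Returns (url_changed, domain_changed) for one crawl pair. A URL
    change without a domain change suggests new content or internal SEO
    changes on the same sites; a domain change suggests a larger,
    competitive shift."""
    url_changed = prev_top10 != curr_top10
    domain_changed = to_domains(prev_top10) != to_domains(curr_top10)
    return url_changed, domain_changed
```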