Trying to upgrade clang at Mozilla, I found some problems with our build not being easily reproducible and with the tools we use to compare benchmarks. I hope to write a bit more about it soon, but I wanted to share a particularly interesting fact first.

Chatting about these problems with Nick Lewycky yesterday, he mentioned that assuming a normal distribution might be an oversimplification. A computer has a minimum time in which it can perform a task, and there are many things that can cause it to be slower than that: a particular load address might cause cache lines to alias, the kernel might move the program just as it had the caches populated, etc.

I decided to try it out. The attached histogram is from 4000 runs of SunSpider's 3d-raytrace with Firefox's JS interpreter (not the JIT). Notice the different peaks. It does suggest that a good model is a base time plus multiple random problems that slow it down.
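As a rough illustration of that model, here is a toy simulation (not the actual benchmark setup; the probabilities and costs are made-up numbers purely for illustration). Each run pays a fixed base time plus small continuous jitter, and each of a few independent "problems" occasionally adds a discrete chunk of time. The resulting histogram shows several distinct peaks rather than a single normal bump:

```python
import random
from collections import Counter

def simulated_run(base_ms=1000.0, jitter_ms=5.0):
    """One simulated benchmark run: a fixed minimum time, a little
    continuous jitter, and a few independent rare slowdowns (cache
    aliasing, an untimely kernel migration, ...), each adding a
    discrete cost. All numbers here are invented for illustration."""
    t = base_ms + random.uniform(0.0, jitter_ms)
    # (probability of the problem occurring, cost in ms)
    problems = [(0.30, 40.0), (0.15, 90.0), (0.05, 200.0)]
    for p, cost in problems:
        if random.random() < p:
            t += cost
    return t

random.seed(0)
runs = [simulated_run() for _ in range(4000)]

# Bucket the runs into 25 ms bins and print a crude text histogram.
bins = Counter(int(t // 25) * 25 for t in runs)
for b in sorted(bins):
    print(f"{b:5d} ms: {'#' * (bins[b] // 20)}")
```

Note that, under this model, no run is ever faster than the base time, and the distribution is skewed to the right with one peak per common combination of problems, which matches the shape of the measured histogram better than a Gaussian would.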

With entire companies dedicated to writing benchmarks, I hope someone has studied this before. Does anyone know a reference?