Hard Drive Benchmark Patterns

seekermeister

Honorable Member
A question that has been bouncing around my head for years: when benchmarking any hard drive, the result is always the same general pattern... a gradually declining slope with regularly spaced dips along its entire length.

I just don't understand why those dips exist, because no matter what drive, OS, or benchmark tool is used, they are always there. The actual spacing in GB varies with the size of the drive, but for a given drive the dips occur at a set interval.

If those dips represent real changes in performance, what is the cause? If they aren't real changes, then what are they?
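
For concreteness, the kind of measurement I mean is a transfer-rate sweep like the one sketched below: read a fixed-size chunk at evenly spaced offsets from the start of the drive to the end and record the throughput at each point. This is a minimal Python sketch, assuming a Linux block device; the /dev/sda path is a placeholder, reading a raw device needs root, and repeated runs may be served from the page cache, so it only roughly approximates what dedicated benchmark tools do.

```python
import os
import sys
import time

# Placeholder device path -- pass your own; reading a raw device needs root.
device = sys.argv[1] if len(sys.argv) > 1 else "/dev/sda"
chunk = 8 * 1024 * 1024   # bytes read per sample (8 MiB)
samples = 64              # evenly spaced sample points across the whole disk

with open(device, "rb", buffering=0) as f:
    size = f.seek(0, os.SEEK_END)   # seeking to the end reports the device size
    step = size // samples
    for i in range(samples):
        f.seek(i * step)
        t0 = time.perf_counter()
        data = f.read(chunk)        # may return fewer bytes near the end of the device
        dt = time.perf_counter() - t0
        print(f"{i * step / 1e9:7.1f} GB  {len(data) / dt / 1e6:7.1f} MB/s")
```

Plotting that output gives the curve I described: fastest at the low offsets, declining toward the end, with dips at regular intervals along the way... exactly the shape I'm asking about.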
 