A question that has been bouncing around in my head for years: whenever I benchmark any hard drive, it produces the same general pattern... a gradually declining slope with regularly spaced dips along the entire curve.
I just don't understand why those dips exist, because it doesn't...
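To be concrete about what I'm measuring, the benchmarks I'm describing do roughly this kind of sequential-read sweep across the disk. Below is a minimal Python sketch of that idea, not any particular tool; the device path, chunk size, and sample count are placeholders I picked for illustration, and reading the raw device needs elevated privileges.

```python
import os
import time

# Minimal sketch of a sequential-read sweep across a block device.
# Assumptions: DEVICE is a placeholder path, CHUNK and SAMPLES are
# arbitrary, and OS read caching is ignored (each offset is read once,
# so its effect on a drive much larger than RAM is small).
DEVICE = "/dev/sdX"        # hypothetical block device
CHUNK = 64 * 1024 * 1024   # read 64 MiB per sample point
SAMPLES = 100              # evenly spaced points across the drive

fd = os.open(DEVICE, os.O_RDONLY)
size = os.lseek(fd, 0, os.SEEK_END)   # total capacity in bytes

for i in range(SAMPLES):
    # Evenly spaced offsets from the start to near the end of the drive.
    offset = (size - CHUNK) * i // (SAMPLES - 1)
    os.lseek(fd, offset, os.SEEK_SET)

    start = time.perf_counter()
    remaining = CHUNK
    while remaining > 0:
        data = os.read(fd, min(remaining, 1024 * 1024))
        if not data:
            break
        remaining -= len(data)
    elapsed = time.perf_counter() - start

    mbps = (CHUNK - remaining) / (1024 * 1024) / elapsed
    print(f"{offset / size:6.1%} into the drive: {mbps:7.1f} MiB/s")

os.close(fd)
```

Plotting the printed throughput against the position produces exactly the shape I'm talking about: high at the start, sloping downward toward the end, with those dips at regular intervals.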
Tags: analysis, benchmarking, capacity, data transfer, dips, disk speed, drive size, drive testing, hard drive, measurement, OS impact, patterns, performance, performance metrics, regular intervals, slope, storage, storage media, tool precision, variability