I have an external Seagate 8TB USB disk. As usual, when I have less than 10% free disk space, Win 7 gives me a warning. If this is only a data disk, is it still important to have at least 10% free space?
If it's a spinning disk, it will take a performance hit as it gets close to full, since the R/W head has to move further out to write data. Other than that, there shouldn't be any problem beyond the obvious space shortage.
First, the dimensions of the platters inside drives have not changed in years. What has changed is the number of platters and how much data the R/W head can cram into the same amount of space (density). So just because today's drives have monster capacities, that does not mean it takes longer to retrieve the data. In fact, seek times have improved greatly, and while rotation speeds have remained the same, the data density is much greater, so the R/W head can read or write much more data per rotation. Thus, access times are greatly improved with larger capacity drives.
As to those seek times, whether reading or writing, note that they really only apply to finding the first segment of a file. Unless the file is very heavily fragmented, each following segment will be read from or written to the adjacent storage location. There will be no moving further out.
Also, data is NOT saved on hard drives with everything jammed at one end of the platter. Instead, files are scattered all over the disk for the purpose of keeping each file's segments in sequential order (not fragmented). But over time, as those files are modified, they can become fragmented and scattered about even further if there is not a sufficient amount of free disk space. It is this fragmentation that slows down access.
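To make the point concrete, here is a toy model (not a real disk simulator) of why fragmentation hurts: if you treat segment locations as positions on a line, the total distance a hypothetical R/W head travels between consecutive segments is tiny for a contiguous file and huge for a scattered one. All the positions below are made-up illustration values.

```python
# Toy model: sum the distances a hypothetical R/W head moves between
# consecutive segments of a file. Positions are arbitrary illustration
# values, not real disk addresses.
def head_travel(segment_positions):
    """Total distance moved between consecutive segment positions."""
    travel = 0
    pos = segment_positions[0]
    for nxt in segment_positions[1:]:
        travel += abs(nxt - pos)
        pos = nxt
    return travel

contiguous = [100, 101, 102, 103, 104]    # adjacent storage locations
fragmented = [100, 9000, 250, 7100, 480]  # same segments, scattered

print(head_travel(contiguous))   # 4
print(head_travel(fragmented))   # 31120
```

Same file, same amount of data, but the scattered layout forces orders of magnitude more head movement, which is exactly the fragmentation penalty described above.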
Fortunately, unless you dinked with the default settings, Windows automatically defrags disks weekly (if needed) to minimize these issues.
Also, the R/W head does not "park" itself at the beginning or end of the platters after every read or write. Automatic head parking only happens when power is removed. This means the distance, and thus "seek" time, it needs to travel to reach the first file segment of the next read or write is effectively random, because it depends entirely on where the head performed its last access.
So the only performance hit comes from the fact that a shortage of free disk space potentially means more fragmentation, causing the R/W head to jump back and forth much more, not because it has to travel to the other end of the drive.
As to your question about 10%, that is just an arbitrary number. Many years ago when drives were much smaller, that was more or less a rule of thumb but today that no longer applies. Years ago, most users only had one drive and of course, it contained the OS, which needs free space to operate in.
You still need a nice chunk of free space on secondary drives to accommodate defragging tasks and temporary files. But 10% of an 8TB disk would be 800GB, and reserving that much is just ridiculously wasteful.
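The arithmetic behind that point is worth spelling out; a quick sketch (using the thread's decimal figures, 1 TB = 1000 GB, so real formatted capacity will differ a bit):

```python
# Quick arithmetic behind the "10% rule" on a large drive.
# Uses decimal units (1 TB = 1000 GB) to match the figures in the thread.
disk_gb = 8 * 1000              # 8 TB external drive
ten_percent = disk_gb * 0.10
print(ten_percent)              # 800.0 GB reserved by the 10% rule

# The flat reserves discussed in the thread, as a fraction of the disk:
for reserve in (100, 30):
    pct = reserve / disk_gb * 100
    print(f"{reserve} GB free is only {pct:.2f}% of the disk")
```

In other words, even the generous 100GB figure is only about 1.25% of an 8TB drive, which is why a fixed free-space amount makes more sense here than a percentage.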
I think reserving anything over 100GB would be wasteful and on my own drives, I like to keep at least 30GB free. If I start crowding that, it is time to free up or buy more space.
But there are many variables that must be considered. What kind of data matters. If this disk is full of family photos, music or video files, they are likely to never be "opened" for editing. That means they likely will never change size, need any temp file locations, or ever need to be moved. But if these files are document type files (Word files, spreadsheets, presentations, etc.) that will be opened, the OS always creates temporary copies when they are opened and that requires extra space. And if modified, the modified file is always saved in a new location then the old copy is deleted (creating a new hole of free space).
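The "saved in a new location, then the old copy is deleted" behavior described above resembles the common atomic-save pattern many applications use, which is exactly why editing files needs spare room. Here is a minimal sketch in Python; the file name and contents are made up for illustration:

```python
import os
import tempfile

def atomic_save(path, data):
    """Write data to a temp file in the same directory, then replace the
    original. A crash never leaves a half-written file, but the drive
    must have room for the temporary copy alongside the old one."""
    dirname = os.path.dirname(os.path.abspath(path))
    fd, tmp_path = tempfile.mkstemp(dir=dirname)
    try:
        with os.fdopen(fd, "wb") as tmp:
            tmp.write(data)
        os.replace(tmp_path, path)  # atomic rename on the same filesystem
    except Exception:
        os.remove(tmp_path)         # clean up the temp copy on failure
        raise

# Hypothetical demo in a scratch directory:
demo_dir = tempfile.mkdtemp()
doc = os.path.join(demo_dir, "report.docx")
atomic_save(doc, b"version 1")
atomic_save(doc, b"version 2")
print(open(doc, "rb").read())  # b'version 2'
```

Note the new version briefly coexists with the old one on disk, so a nearly full drive can make saves fail even when the edited file is no bigger than before.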
Thanks Digerati for your informative reply - much appreciated. The files are of the first type, so I can safely go down to 100GB free, if I understood you correctly.
Re defragging etc.: on the one hand, it could influence performance, as you point out. But a good side of fragmentation could be that the disk, as a mechanical device, is forced to access more of the physical area, preventing what might be described as "getting rusty" in the un-accessed areas. It has happened to me that parts of a disk will just work much slower. There are no bad blocks etc., and it always seemed to me that this could be the reason.
No, sorry. But it just does not work that way. I don't know what it was that happened to you, but it was nothing like unused portions getting rusty due to lack of use. We are talking about microscopic magnetic particles on the platters. Not hinges and levers or mechanical moving parts that might freeze up if not used.
Defragging does NOT mean consolidating free space - you seem to be suggesting that regular defragging might leave some portions never used. Consolidating free space is a separate process - and one that Windows' integrated defrag feature does NOT do. That is because it is not important that all files be jammed at one end. In fact, that would be counterproductive, because files are constantly being modified - even system files, via Windows Update. What is important is that files are not fragmented. Files can be scattered all over the disk without degrading performance, as long as each file's segments are together.
Also, it is important to note how defragging works. When you defrag a disk, file segments are shuffled all over the disk during the defragging process. That is, every part of the disk is used to temporarily store data, while other parts are freed up to make room for entire files.
It is like organizing your closet. You clutter up the entire room first, then put everything back in order.
Even during regular use, portions of free space are regularly used for temporary files - at least on the boot drive. This includes thousands of cookies and temporary Internet files, not to mention other temp files used by Windows and other running applications. So it is just wrong to suggest those locations are never used.
Lastly, whether data is saved in a storage location or not, the effects on that location are exactly the same. All that happens is that the magnetic field imposed by the R/W head orients the particles one way to represent a 1 and the other way to represent a 0. They don't get stuck if not "flipped", and there is no more or less of a "charge" whether data is there or not.
Thanks Digerati.
"Getting rusty" - my intention was not so much the disk surface as the accessing of that position (e.g. at 50% radius, 70% radius, etc.).
If defragging does not do what you term consolidating, then that would prevent this problem.
Is the consolidating mentioned here in the capture the same one you mention?
The disk map here, from Defraggler, seems to show that the end part of the disk is not being used. This was not after running Defraggler, by the way - just to see the map.
This problem I mentioned has happened to me a few times, by the way, as over the years I was probably only actually using a small part of the disk area.
An update on this. I actually don't defrag, for the reasons I've explained.
Still, I decided to try the Win7 defrag.
Below is the map that Defraggler gives me afterwards. It has distributed some data toward the end, but most appears to be in the first half.