Full defrag option via cmd


  1. Posts : 10
    Windows 7 Ultimate
       #1



    The defrag tool in Windows Vista and Windows 7 seems to skip file fragments larger than 64MB in order to cut down on defrag times.

    In Vista, this problem was easily "solved" by going into the command prompt and using the "-w" parameter.

    Example: defrag c: -w

    This option seems to have disappeared in Windows 7, and as such, large fragments are left unoptimized.

    Is there anything that can be done?
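A quick way to check this for yourself is to compare what defrag /? documents against what the tool actually accepts. The sketch below wraps the calls in a guard so it can be run anywhere; on Windows 7 itself, just run the defrag lines directly from an elevated command prompt (the -v verbose switch is assumed to still work):

```shell
# Guarded wrapper: runs defrag.exe if present, otherwise just echoes the
# command so the sketch is harmless on non-Windows machines.
defrag_cmd() {
  if command -v defrag >/dev/null 2>&1; then
    defrag "$@"
  else
    echo "[dry-run] defrag $*"
  fi
}

defrag_cmd '/?'      # list the switches this build documents
defrag_cmd C: -w -v  # try the legacy full-defrag switch with verbose output
```

Comparing the verbose report before and after should show whether -w is honored or silently ignored.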





  2. Posts : 5,092
    Windows 7 32 bit
       #2

    If you actually tried it and it didn't happen, I guess they killed it. In Vista I noticed that even though the -b switch was not listed when you display the command-line switches, it still defragged boot files. Figure that one out!

    Anyway, I'd recommend you search for a free tool that does a more thorough job. Since XP, almost all defrag tools use the defrag API built into Windows anyway, so most of them are pretty safe (I would look on the tool's site for verification that it uses the API; for example, MyDefrag/JkDefrag states that it does).

    I believe in MyDefrag you can set the "space hog" size, but you can check on the MyDefrag forum to see what the limits are.


  3. Posts : 10
    Windows 7 Ultimate
    Thread Starter
       #3

    Using the -b parameter still seems to invoke a boot optimization pass; however, using -w doesn't seem to do anything. Large fragments still remain.

    It also doesn't consolidate free space as much anymore. Large chunks of data are left untouched, and as a result, fragmentation is very quick to reappear on the drive.



    Thing is, I have tested PerfectDisk, Diskeeper, O&O, MyDefrag/JkDefrag, and Defraggler. The Windows defrag option, for some reason, yielded the quickest boot-up times.

    All the other defrag utilities actually slowed down my boot-up time and caused my desktop to take an unreasonably long time to load in and become usable.


  4. Posts : 5,092
    Windows 7 32 bit
       #4

    The "trick" I used to do was to run the built-in command-line defrag with the -b switch every now and then, then use a 3rd-party defrag in the interval. IOW, a boot defrag every couple of weeks, and one of the other defrags every so many days or whatnot.
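That rotation can be sketched as a scheduled task. The task name, day, and time below are placeholders of my choosing, not anything from the thread; the wrapper only echoes the command on non-Windows machines:

```shell
# Guarded wrapper: runs schtasks if present, otherwise echoes the command.
sched() {
  if command -v schtasks >/dev/null 2>&1; then
    schtasks "$@"
  else
    echo "[dry-run] schtasks $*"
  fi
}

# Every Sunday at 03:00, run the built-in boot-file defrag; the third-party
# tool is assumed to handle routine defrag on its own schedule in between.
sched /create /tn "WeeklyBootDefrag" /tr "defrag C: -b" /sc weekly /d SUN /st 03:00
```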

    Someone I knew in the business who optimized mini-computers used to tell me the ideal setup (all HD speeds being equal) was to have 1 physical HD for system/executables, 1 physical HD for virtual memory, and 1 physical HD for data. It's changed a bit due to RAID, but the idea is to have diversity of physical disks rather than one drive thrashing.

    On a single physical disk, any defrag strategy is going to favor one scenario or usage and hamper another. Engineering is usually making the best trade-off under the circumstances (a long-winded way of saying nothing's perfect).


    edit: btw, his method of defragging a HD wasn't to use a defrag program. He just did an image backup, quick formatted the drive, and restored the backup. Of course, being in a corporate environment, there were redundant backups, so it wasn't as "risky" as it sounds here.


  5. Posts : 1
    Windows 7 Home x64
       #5

    On that note...


    I was talking to a friend about this, and he linked me to the following article:
    Engineering Windows 7 : Disk Defragmentation – Background and Engineering the Windows 7 Improvements

    The key part of the article being this:
    In Windows XP, any file that is split into more than one piece is considered fragmented. Not so in Windows Vista if the fragments are large enough – the defragmentation algorithm was changed (from Windows XP) to ignore pieces of a file that are larger than 64MB. As a result, defrag in XP and defrag in Vista will report different amounts of fragmentation on a volume. So, which one is correct? Well, before the question can be answered we must understand why defrag in Vista was changed.

    In Vista, we analyzed the impact of defragmentation and determined that the most significant performance gains from defrag are when pieces of files are combined into sufficiently large chunks such that the impact of disk-seek latency is not significant relative to the latency associated with sequentially reading the file. This means that there is a point after which combining fragmented pieces of files has no discernible benefit. In fact, there are actually negative consequences of doing so. For example, for defrag to combine fragments that are 64MB or larger requires significant amounts of disk I/O, which is against the principle of minimizing I/O that we discussed earlier (since it decreases total available disk bandwidth for user initiated I/O), and puts more pressure on the system to find large, contiguous blocks of free space.

    Here is a scenario where a certain amount of fragmentation of data is just fine – doing nothing to decrease this fragmentation turns out to be the right answer!
    It's nice to know that they consider fragmentation a genuine issue worth investigating, and not just an archaic problem.
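The article's reasoning can be put into rough numbers. A back-of-envelope sketch with assumed figures (100MB/s sequential throughput and a 10ms average seek, plausible for a 2009-era desktop HDD; neither number comes from the article):

```shell
# How much does one extra seek cost relative to sequentially reading
# a fragment at the 64MB cutoff? (All figures are assumptions.)
frag_mb=64      # fragment size at the defrag cutoff
mbps=100        # assumed sequential throughput, MB/s
seek_ms=10      # assumed average seek time, ms

read_ms=$(( frag_mb * 1000 / mbps ))          # time to stream the fragment
overhead_pct=$(( seek_ms * 100 / read_ms ))   # relative cost of one more seek
echo "read: ${read_ms}ms, extra seek: ~${overhead_pct}%"
```

At roughly 1% per seek, stitching such fragments together buys almost nothing, which is exactly the trade-off the article describes.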


  6. Posts : 5,092
    Windows 7 32 bit
       #6

    The natural tendency is to think of packing everything to the front of the disk. In fact, thinking back to Win9x, I believe Windows would try to create new files after the used space on a partition to avoid breaking the file up to fit into open spaces. But then when you go to get that file, you have to seek past all the packed-in data: you have a contiguous file, but you have to take a big seek to get to it. It's been a long time since I read file system FAQs (ext2, ext3, NTFS, etc.), but for the curious, you can find them along with the rationale for the design as far as reducing fragmentation goes.

    As an example of what they were saying about big chunks not being all that bad: if you run the benchmarks in CrystalDiskMark, you see that performance takes a nosedive when you get down to the random reads and writes with the tiny chunks. Bigger chunks don't fare so poorly.
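One way to see that nosedive in rough terms (a simple operation count, not a real benchmark like CrystalDiskMark): fetching the same amount of data in tiny chunks multiplies the number of I/O operations, and each one pays its own latency.

```shell
# I/O operations needed to fetch a 16MB file at two chunk sizes; with
# per-operation latency, the 4KB case pays that fixed cost 4096 times.
filesize=$(( 16 * 1024 * 1024 ))
echo "4MB chunks: $(( filesize / (4 * 1024 * 1024) )) reads"
echo "4KB chunks: $(( filesize / 4096 )) reads"
```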


  7. Posts : 1
    Suffolk
       #7

    The /R, /W and /F legacy switches are still available in Windows 7, but they are not listed in the defrag /? output, for some reason best known to Microsoft.
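If those switches really are honored, a full, forced pass would look like the line below. The switch meanings here are taken from how XP/Vista documented them (/W full defrag with no 64MB cutoff, /F force even with low free space) and are an assumption for Windows 7; the wrapper only echoes the command off-Windows:

```shell
# Guarded wrapper: runs defrag.exe if present, otherwise echoes the command.
defrag_cmd() {
  if command -v defrag >/dev/null 2>&1; then
    defrag "$@"
  else
    echo "[dry-run] defrag $*"
  fi
}

# /W full defrag, /F force despite low free space, -v verbose report --
# switch behavior on Windows 7 is unverified, per the post above.
defrag_cmd C: /W /F -v
```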


  8. Posts : 687
    Microsoft Windows 10 Professional / Windows 7 Professional
       #8

    MilesAhead said:
    Someone I knew in the business who optimized mini-computers used to tell me the ideal setup (all HD speeds being equal) was to have 1 physical HD for system/executables, 1 physical HD for virtual memory, and 1 physical HD for data. It's changed a bit due to RAID, but the idea is to have diversity of physical disks rather than one drive thrashing.
    I have the same configuration; I've been using it for like 10 years and never had a single issue.

    MilesAhead said:
    his method of defragging a HD wasn't to use a defrag program. He just did an image backup, quick formatted the drive, and restored the backup. Of course, being in a corporate environment, there were redundant backups, so it wasn't as "risky" as it sounds here.
    Lol, exactly my defrag method back in the WinXP and Symantec Ghost days.


 

Windows 7 Forums is an independent web site and has not been authorized, sponsored, or otherwise approved by Microsoft Corporation. "Windows 7" and related materials are trademarks of Microsoft Corp.

© Designer Media Ltd