Extremely long shut down & hibernate


  1. Posts : 2,039
    Several, including Windows 7 x64 Ultimate
       #31

    Be careful with Puran defrag. It messed up a couple of drives on a few machines very badly - so badly that they would not boot or run properly. I don't know exactly what those people had set up, of course; they just said they had used Puran defrag. I have never used it myself.

    You might like to have a look at this; it gives you a lot of control over file placement and so on.

    UltimateDefrag - The Defragger For Power Users - About

    Regards....Mike Connor


  2. Posts : 2,039
    Several, including Windows 7 x64 Ultimate
       #32

    PS. You are doubtless well aware of this, but it occurred to me so I thought I would post it anyway: if your disc really is 34% fragmented, it is being subjected to massive wear and tear, and you will soon start getting a lot of read and write errors. This gets worse very quickly, especially if you are running things like databases with lots of reads/writes to small data blocks, and the drive will eventually fail.

    If you can possibly manage it, it is better to put such software on its own disc.

    Regards....Mike Connor


  3. Posts : 17,545
    Windows 10 Pro x64 EN-GB
    Thread Starter
       #33

    I cannot understand such a big fragmentation percentage. According to Puran, it was mostly because of two files: the VDI file of the VMware VM where I installed Windows 8 M1 last weekend, and the second VHD I created for that W8 VM as drive D:. I made a mistake when installing W8 and selected fixed size VHDs instead of dynamically expanding ones.
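
    For reference, creating a dynamically expanding VHD by hand would look roughly like this as a diskpart script (the path and size below are only examples, not the ones I actually used):

    Code:
        rem create-vhd.txt - run with: diskpart /s create-vhd.txt
        rem path and size are examples; maximum is given in MB (35840 MB is about 35 GB)
        create vdisk file="D:\VMs\W8-data.vhd" maximum=35840 type=expandable
        attach vdisk

    The first command creates an expandable virtual disk that only grows as data is written to it; the second attaches it so it can be initialised and formatted.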

    I normally defrag and maintain my rigs regularly, but it seems this one VM and its two 35 GB VHDs were spread all around my system.

    I have no fears about testing Puran; my last full system image is from yesterday morning.

    Kari


  4. Posts : 2,039
    Several, including Windows 7 x64 Ultimate
       #34

    OK. One last point which may be relevant: some defragmentation software automatically skips files above a certain size, and some won't defrag large files at all.

    Problems often arise when these files are first written, because they are laid down in a completely fragmented state to begin with, which exacerbates the problem even further.

    Anyway, it will be interesting to hear the results of this.

    Regards....Mike Connor


  5. Posts : 2,528
    Windows 10 Pro x64
       #35

    True - Windows' own defrag.exe will skip large file fragments by default, and depending on how large the fragments are, they can be less of a detriment (in Windows 7) than they used to be.
    Disk Defragmentation
    In Windows XP, any file that is split into more than one piece is considered fragmented. Not so in Windows Vista if the fragments are large enough – the defragmentation algorithm was changed (from Windows XP) to ignore pieces of a file that are larger than 64MB. As a result, defrag in XP and defrag in Vista will report different amounts of fragmentation on a volume.

    So, which one is correct? Well, before the question can be answered we must understand why defrag in Vista was changed. In Vista, we analyzed the impact of defragmentation and determined that the most significant performance gains from defrag are when pieces of files are combined into sufficiently large chunks such that the impact of disk-seek latency is not significant relative to the latency associated with sequentially reading the file. This means that there is a point after which combining fragmented pieces of files has no discernible benefit.

    In fact, there are actually negative consequences of doing so. For example, for defrag to combine fragments that are 64MB or larger requires significant amounts of disk I/O, which is against the principle of minimizing I/O that we discussed earlier (since it decreases total available disk bandwidth for user initiated I/O), and puts more pressure on the system to find large, contiguous blocks of free space. Here is a scenario where a certain amount of fragmentation of data is just fine – doing nothing to decrease this fragmentation turns out to be the right answer!
    However, this doesn't address problems where a lot of large fragments cause the file system layout to be suboptimal. I guess the decision was made that someone who defrags regularly probably won't hit this issue, and a "regular" user having a 35 - 40GB file (or several of them) on the same volume is probably a corner scenario that doesn't make much sense to address (especially when products like Defraggler, Puran, O&O, Auslogics, etc. address these sorts of scenarios, in different ways of course).

    Defrag -w still exists, but no longer works for defragmenting large files, and -b (for boot file defrag) doesn't appear to be documented. Not sure how long these will live, but it appears -w still "works" (it just does a no-op) for backwards compatibility with scripts written for Vista; it no longer defragments pieces larger than 64MB.
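
    If you want to see what the in-box tool actually reports on a volume before reaching for a third-party product, a quick analysis from an elevated command prompt looks something like this (the drive letter is just an example):

    Code:
        rem -a analyzes only and prints a fragmentation report; nothing is moved
        rem -v adds the verbose fragmentation statistics to the output
        defrag C: -a -v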


  6. Posts : 2,039
    Several, including Windows 7 x64 Ultimate
       #36

    Indeed. For some special-purpose file placement optimisation scenarios this works pretty well:

    Disktrix - Two Amazing Defragmentation Software Titles - UltimateDefrag And DefragExpress

    and in the right circumstances can give very significant performance gains.

    But personally I think this is pointless for "normal" users. For most "normal" users the Windows defragmentation software is adequate.

    Some scenarios like this are becoming fairly widespread though, as larger numbers of users are working with much larger files (movies, high-resolution graphics, etc.), often coupled with poor general maintenance. Doubtless if this becomes more of a problem, Microsoft will address it.

    I think virtualisation will be playing more and more of a role here as well, along with an increasing "cloud" orientation.

    Regards....Mike Connor


  7. Posts : 2,528
    Windows 10 Pro x64
       #37

    If you want, you can also use Contig from Sysinternals to try to remove fragmentation from specific files.
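
    For example, to check and then defragment a single large file such as a VHD (the path is just an example):

    Code:
        rem -a only analyzes the file and reports its fragment count; nothing is moved
        contig -a "D:\VMs\Win8.vhd"
        rem run without -a to actually defragment the file; -v shows what Contig did
        contig -v "D:\VMs\Win8.vhd"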


  8. Posts : 212
    Windows 7 Ultimate x64 (Retail)
       #38

    I am also seeing a very long shutdown time on my system now.
    Two events took place around the time this started: I updated my BIOS, and installed SP1.
    I ran Cluberti's diagnostics, and everything active appears to be shutting down properly and within a reasonable time (it just doesn't add up to the time that passes before the computer finally shuts down or restarts).

    I am guessing that the BIOS upgrade may have a lot to do with the delay I am experiencing.
    I am not sure what it could be doing to cause this, though.

    Note that I did shorten (via the registry) the process kill time from 12 seconds down to 1 second, just to see if that helps.
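
    For anyone wanting to try the same tweak - assuming the 12-second value in question is WaitToKillServiceTimeout, which defaults to 12000 milliseconds on Windows 7 - it can be changed from an elevated command prompt like this:

    Code:
        rem Assumes the timeout being changed is WaitToKillServiceTimeout (value is in milliseconds)
        reg add "HKLM\SYSTEM\CurrentControlSet\Control" /v WaitToKillServiceTimeout /t REG_SZ /d 1000 /f

    Setting it back to 12000 the same way restores the default.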


  9. Posts : 2,528
    Windows 10 Pro x64
       #39

    If your shutdown takes a long time but the trace doesn't account for that time (and shows no matching delay), the problem is happening after Windows has unloaded. The trace captures all of the ETW events in Windows, so if the data doesn't match what you actually observe, the likelihood is that you are correct - your BIOS upgrade is going to be the root cause.
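
    For reference, a shutdown ETW trace of the sort described above can be collected with xbootmgr from the Windows Performance Toolkit - this is just one way to gather it, and the result path is only an example:

    Code:
        rem Records an ETW trace of the next shutdown and writes the results to C:\TEMP
        xbootmgr -trace shutdown -noPrepReboot -traceFlags BASE+CSWITCH+DRIVERS+POWER -resultPath C:\TEMP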


 