Just for laughs...
Many of the non-default defraggers can slow the system down by destroying prefetch optimisation.
As Antman said, the actual definition of "fragmented" is in the eye of the beholder. It's possible to choose any arbitrary definition for what constitutes the best possible layout. You could have small files first, then medium, then large, or perhaps all EXEs, then all DLLs, then all TXT... clearly some schemes will be better in certain ways but worse in others.
What the OS "prefetch" mechanism does is to watch what's happening during the boot phase and during the first 10 seconds of an app's startup. It records information along these lines:
- While starting Notepad, we first read all of Notepad.exe.
- Then we accessed the second half of NTDLL.dll.
- Then we read from 25% to 50% of Comctl32.dll.
- Then we finished by reading from 65% to 75% of Advapi32.dll.
(Obviously, just an example.) Later, the in-built defragger comes along and examines that recorded prefetch info. It may conclude that a lot of apps currently on the system do similar things with those particular files, so that it makes sense to reorganise their storage on-disk to make those fragments contiguous. The disk head(s) can then read them in one swoop without having to incur the penalty of extra seek time and rotational delay to move between the various fragments.
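To make the idea concrete, here's a toy sketch in Python. This is purely illustrative - it is not the real prefetch file format or the real defragger's algorithm, just a model of the reasoning above: lay the traced fragments out in access order and the head never has to seek mid-trace, while an alphabetically "pretty" per-file layout forces a seek at every jump between files.

```python
# Toy model of a recorded startup trace: (file, start %, end %) of each
# fragment, in the order the OS saw them being read. Percentages are
# the example figures from the post, not real data.
trace = [
    ("Notepad.exe",   0, 100),   # all of Notepad.exe
    ("NTDLL.dll",    50, 100),   # second half of NTDLL.dll
    ("Comctl32.dll", 25,  50),   # 25%..50% of Comctl32.dll
    ("Advapi32.dll", 65,  75),   # 65%..75% of Advapi32.dll
]

def prefetch_layout(trace):
    """Place each traced fragment contiguously, in access order."""
    return list(trace)

def count_seeks(layout, trace):
    """Count head moves: a seek happens whenever the next fragment in
    the trace is not the one physically following the current one."""
    position = {fragment: i for i, fragment in enumerate(layout)}
    seeks = 0
    previous = None
    for fragment in trace:
        if previous is not None and position[fragment] != position[previous] + 1:
            seeks += 1
        previous = fragment
    return seeks

# The prefetch-aware layout replays the whole trace with zero seeks.
print(count_seeks(prefetch_layout(trace), trace))   # 0

# A naive "pretty" layout (fragments grouped alphabetically by file)
# makes the startup trace jump around: one seek per file change.
print(count_seeks(sorted(trace), trace))            # 3
```

Same four fragments on disk either way - the only difference is the ordering, which is exactly why the "insane-looking" layout wins for this particular trace.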
To an outsider, that disk layout may look completely insane and highly fragmented - "why the HECK would you have a quarter of Comctl32.dll sandwiched between a half of NTDLL.dll and a tenth of Advapi32.dll?!?" If that outsider is a disk defragmenter which doesn't understand prefetch - and many of them don't - it may be tempted to rearrange the file chunks so they all look "pretty" again, whatever its definition of "pretty" may be.
You end up with a dialog box that says "fragmentation: 0%" and a system which performs slightly more slowly.