In (I hope) plain English
Caching comes into its own when you've been using an application for a while (or when the system has learned how you usually use your computer).
What it does is save often-used data into an especially fast memory area (not in RAM but in the disk controller's OWN memory) so that if you need to access this data, the disk controller can supply it straight from the cache instead of having to create a set of I/O instructions to find the correct sector on disk, position the drive heads and do the data transfer - this applies to both READ and WRITE.
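The idea can be sketched in a few lines. This is a minimal illustration, not how a real controller is implemented: the `read_sector_from_disk` function is a made-up stand-in for the slow physical path (seek, rotate, transfer), and the tiny capacity is just for demonstration.

```python
from collections import OrderedDict

CACHE_CAPACITY = 4  # tiny for illustration; real controller caches hold megabytes

cache = OrderedDict()  # sector number -> data, kept in least-recently-used order

def read_sector_from_disk(sector):
    # Hypothetical placeholder for the expensive path: seek, position, transfer.
    return f"data-for-sector-{sector}"

def cached_read(sector):
    if sector in cache:
        cache.move_to_end(sector)         # mark as recently used
        return cache[sector]              # fast path: no disk I/O at all
    data = read_sector_from_disk(sector)  # slow path: real I/O
    cache[sector] = data
    if len(cache) > CACHE_CAPACITY:
        cache.popitem(last=False)         # evict the least recently used sector
    return data
```

Repeated reads of the same sector hit the fast path; when the cache fills up, the sector you touched longest ago is thrown out to make room.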
On a new computer the system will need to do some form of learning before the cache is used effectively.
Some better algorithms even do a "Data Pre-fetch", which reads data before you actually need it - based on your previous usage of the computer -- it does this whilst the computer is idle, so it won't degrade performance at all.
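One of the simplest pre-fetch heuristics is sequential read-ahead: after you read sector N, the next request will very often be for N+1, so fetch it early. The sketch below is illustrative only (the function names are made up, and a real controller would do the speculative read while the bus is idle):

```python
prefetch_cache = {}  # sectors fetched ahead of an actual request

def read_sector_from_disk(sector):
    # Hypothetical stand-in for the slow physical read.
    return f"data-for-sector-{sector}"

def read_with_prefetch(sector):
    if sector in prefetch_cache:
        # Hit: the data was already read during idle time, so this costs nothing.
        return prefetch_cache.pop(sector)
    data = read_sector_from_disk(sector)
    # Speculatively read the NEXT sector too, betting on sequential access.
    prefetch_cache[sector + 1] = read_sector_from_disk(sector + 1)
    return data
```

If the guess is right, the follow-up read is served instantly; if it's wrong, the only cost was work done while the drive was idle anyway.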
Successful caching algorithms (very tricky to write) can ENORMOUSLY speed up a computer - especially when databases are being accessed with similar queries.
You'll also find that a disk with a larger cache costs quite a bit more than a similar-capacity drive with a smaller cache or none at all.
(This is why you should also "Quiesce" (safely remove / eject) a USB external drive before unplugging it -- the system needs to flush any remaining / pending writes from the cache back to the drive first).
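You can see the same idea at the software level with Python's real flush calls: a write() returning doesn't mean the bytes are on the platters - they may still be sitting in a buffer. Flushing pushes them down each layer:

```python
import os
import tempfile

# Create a throwaway file to demonstrate the flush chain.
fd, path = tempfile.mkstemp()
os.close(fd)

with open(path, "w") as f:
    f.write("pending data")
    f.flush()              # push Python's own buffer down to the OS
    os.fsync(f.fileno())   # ask the OS to push its buffers out to the device

# Only after fsync (or a proper "safely remove") is it reasonable to assume
# the bytes have actually left the caches and reached the drive.
data_on_disk = open(path).read()
os.remove(path)
```

"Safely remove" does essentially this for every pending write on the drive, which is why yanking the cable early can lose data that the application believes it already saved.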