Been explained numerous times before. It's all down to marketing and how the term "Giga" is interpreted. Most HDD manufacturers market drives using the decimal version of Giga, which is 10^9 bytes. The computer, on the other hand, works in binary, and its version of Giga is 1024^3 (i.e. 2^30) bytes.
Let's do the sums.
As marketed: 120 x 10^9 = 120,000,000,000 bytes
What the computer sees: 120,000,000,000 / 1024^3 ≈ 111.76 GB, which is why the drive shows up smaller than the box says.
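If you want to check the same sum for other drive sizes, here's a quick Python sketch of the conversion (the 120 GB figure is just the example from this thread):

```python
marketed_gb = 120                      # size as printed on the box (decimal GB)
marketed_bytes = marketed_gb * 10**9   # 120,000,000,000 bytes

binary_giga = 1024**3                  # what the OS calls a "GB" (2^30 bytes)
reported_gb = marketed_bytes / binary_giga

print(f"{marketed_bytes:,} bytes = {reported_gb:.2f} GB as the computer sees it")
# -> 120,000,000,000 bytes = 111.76 GB as the computer sees it
```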