Not an issue, just something I would like to understand. Have been testing backups to cloud which of course will benefit from higher UL speeds.
The test PC uses a dedicated line with a theoretical maximum UL speed of 6.2 MBps (50 Mbps). Only the test PC is connected to this line; all other network usage is denied or disabled, and the line is used for nothing but this test.
To measure UL speed, I start uploading a 30 GB test backup to a cloud server. A counter starts measuring once the speed hits 5.8 MBps, then records the actual speed every 20 seconds.
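For reference, the sampling method described above can be sketched as a small script. This is only an illustration of the measurement logic, not the actual tool used: `byte_counts` stands in for cumulative bytes-sent readings taken every 20 seconds, and the 5.8 MBps trigger and 20-second interval are the values from the test setup (note that 50 Mbps / 8 = 6.25 MB/s, so 6.2 MBps is close to the line's raw ceiling):

```python
MBPS_TRIGGER = 5.8      # start logging once speed first reaches this (MB/s)
SAMPLE_INTERVAL = 20    # seconds between samples, as in the test above

def throughput_mbytes(bytes_delta, seconds):
    """Throughput in MB/s (1 MB = 1e6 bytes) over one sampling window."""
    return bytes_delta / seconds / 1e6

def sample_speeds(byte_counts, interval=SAMPLE_INTERVAL, trigger=MBPS_TRIGGER):
    """byte_counts: cumulative bytes-sent readings, one per `interval` seconds.

    Returns the per-interval speeds (MB/s), starting from the first
    interval whose speed reaches `trigger` (before that, the counter
    is considered not yet started)."""
    speeds = [throughput_mbytes(b2 - b1, interval)
              for b1, b2 in zip(byte_counts, byte_counts[1:])]
    for i, speed in enumerate(speeds):
        if speed >= trigger:
            return speeds[i:]
    return []
```

For example, cumulative readings of 0, 120e6 and 160e6 bytes would yield 6.0 MB/s for the first window (which trips the 5.8 trigger) and 2.0 MB/s for the second.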
This shows averages after about 20 tests:
Horizontal values show elapsed time in minutes. Variation across all tests is at most 10% in either direction; the pattern is that after about 3 minutes the UL speed settles at around 2 MBps and stays there. Results are similar with XP Pro, Windows 7 and Windows 8 Enterprise editions, so it does not appear to be an OS-related thing.
I am very interested to hear opinions on this: why is there over 60% overhead on a dedicated line?