It's not so much an issue with the capacity of their system as with the way the updates are packaged. I pulled down the 2.6 MB update list file... then the downloads bounced between 200 kbit/s and 1 Mbit/s as it limped along (I monitor it to get a better picture of how long it will actually take).

It's an issue with the difference between sending 2000 small files in rapid succession versus one zip file containing all of them as a single download. All the overhead from the extra acknowledgements when starting and stopping each of those little files bogs everything down. Couple that with some screwy bandwidth-throttling policies and you have a nasty mix of things working against you.
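A rough back-of-envelope sketch shows why the per-file overhead dominates. The file count and the 1 Mbit/s cap come from the post; the total payload size and the per-file start/stop cost are assumed numbers for illustration only:

```python
def transfer_time(total_bytes, n_files, bandwidth_bps, per_file_overhead_s):
    """Total time = raw payload transfer + per-file start/stop overhead."""
    return total_bytes / bandwidth_bps + n_files * per_file_overhead_s

TOTAL_BYTES = 200 * 1024 * 1024   # ~200 MB of update data (assumed, not from the post)
BANDWIDTH = 1_000_000 / 8         # 1 Mbit/s in bytes/s, matching the observed cap
OVERHEAD_S = 0.5                  # ~0.5 s per file for request/ack round trips (assumed)

many_small = transfer_time(TOTAL_BYTES, 1700, BANDWIDTH, OVERHEAD_S)
one_zip = transfer_time(TOTAL_BYTES, 1, BANDWIDTH, OVERHEAD_S)

print(f"1700 small files: {many_small / 60:.0f} min")
print(f"one archive:      {one_zip / 60:.0f} min")
```

Even with a modest half-second of overhead per file, the 1700-file version spends an extra ~14 minutes doing nothing but starting and stopping transfers, on top of the same payload time the single archive needs.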

So, basically what we've got is hundreds of thousands of users spamming the network for 1700 tiny files, stopping and starting every couple of seconds, injecting a crapload of lag into the process and possibly tripping the service providers' throttling as well.

What they need to do is fix the download method first, then look into whether there are actual bandwidth issues.