Yeah, with no speed controls whatsoever. One might want to limit the downstream, not to mention the upstream, which can be a lot smaller in comparison (for example, I have an 8/1 Mbit connection; tiny upload rates are still quite common here). But of course, other software exists that can be used for throttling.
On another note, this time around was the first time I looked more closely into the update process as it was going on. From the looks of it, it downloads a single compressed file, decompresses it into its designated location, and then downloads the next file.
This is probably why it needs to re-check all the files every time the update is interrupted (as if it doesn't know where it left off). Re-downloading the patch-info is probably a kind of fail-safe.
I'm not quite sure why they would opt for downloading individual files instead of one package, which would then be decompressed all at once and the files applied to the installation.
If they were to patch the files as soon as they are downloaded and decompressed, the update would require less disk space overall (the uncompressed size of this particular update is a bit over 840 MiB). But since it patches the files only after everything is downloaded and decompressed, that is not what is happening (I can't tell whether it does on the consoles, though).
Perhaps they could not implement any other kind of resume feature (e.g., with one single file, the download would have to start from zero if it were interrupted at any point).
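For what it's worth, the per-file flow I'm imagining could be sketched roughly like this. To be clear, all the names and the patch-info shape here are my own guesses for illustration, not anything taken from the actual updater:

```python
import gzip
import hashlib
import os

def apply_update(patch_info, install_dir, fetch):
    """Hypothetical per-file update loop.

    patch_info: list of dicts with "name" (remote compressed file),
                "path" (destination in the install), and "sha256"
                (hash of the decompressed contents) -- my assumed format.
    fetch(name) -> compressed bytes; stands in for the real downloader.
    """
    for entry in patch_info:
        dest = os.path.join(install_dir, entry["path"])
        # Resume check: if the file already exists and its hash matches,
        # skip it. This would be the "re-check every file" pass seen
        # after an interrupted update -- each completed file is a free
        # resume point, which a single big archive wouldn't give you.
        if os.path.exists(dest):
            with open(dest, "rb") as f:
                if hashlib.sha256(f.read()).hexdigest() == entry["sha256"]:
                    continue
        # Otherwise download and decompress this one file, then move on.
        data = gzip.decompress(fetch(entry["name"]))
        with open(dest, "wb") as f:
            f.write(data)
```

Run once, it downloads everything; run again after an interruption, it only re-hashes what's on disk and fetches the files that are missing or broken, which matches the behaviour I saw.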
I'd guess it is the way it is more than likely because of the PS2.
That is, if I'm understanding it all right anyway...


