How can we speed up rpm downloads?

John Summerfield debian at
Tue Jun 17 02:24:19 UTC 2008

Patrick O'Callaghan wrote:
> On Mon, 2008-06-16 at 08:08 +0800, John Summerfield wrote:
>> I don't think either downloads in parallel, and if your internet 
>> connection is running at its rated speed, that is likely the 
>> bottleneck, so running two, three or more downloads in parallel 
>> will serve only to choke yourself. And waste server resources.
> apt-get does run several downloads in parallel. This makes sense when
> some servers can only serve data at a rate lower than the connection
> bandwidth, which does happen particularly with high-traffic sites.

I use apt-get regularly (I also have some Debian systems), and the 
progress meter doesn't reflect parallel downloads.

If a remote (free!) server is already overloaded, adding to its stress 
doesn't seem very sensible. It doesn't take a very large increase in 
requests for a service to go from "very busy but coping" to "thrashing." 
Just take a look at supermarket queues and think how well they are run, 
and how they might be run better (from the customer's POV). I've rarely 
seen an idle checkout operator.

If an operator can (on average) serve one customer per minute (and the 
times don't vary much), and customers arrive at one per minute, there 
won't be much of a queue. However, if customers arrive every 55 seconds, 
it won't take long for the queue to go out the door, so to speak.
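The checkout arithmetic above can be sketched in a few lines. This is a hypothetical illustration, not anything from apt-get itself; the function name and numbers are mine:

```python
# Sketch of the supermarket-queue argument: customers arrive every
# `arrival_interval` seconds, and the operator takes `service_time`
# seconds per customer. Any excess of arrivals over completions
# accumulates as a queue.
def queue_length_after(minutes, arrival_interval, service_time):
    seconds = minutes * 60
    arrived = seconds // arrival_interval   # customers who have joined
    served = seconds // service_time        # customers already served
    return max(0, arrived - served)

# One arrival per minute, one served per minute: no queue builds up.
print(queue_length_after(60, 60, 60))    # 0
# Arrivals every 55 s against 60 s service: the queue grows steadily.
print(queue_length_after(60, 55, 60))    # 5 after an hour
print(queue_length_after(480, 55, 60))   # 43 after a full day
```

The point is that a small gap between arrival rate and service rate produces an unbounded queue, not a slightly longer one.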

If the bottleneck isn't the server, but the network, then IP is designed 
to discard packets when overloaded, and TCP manages this by detecting 
discarded packets and requesting they be resent. An IP network is quite 
resilient, but it can be flooded and this is what use of parallel 
downloads does.




More information about the fedora-test-list mailing list