Mark H Weaver writes:

> For these reasons, I'm inclined to think that parallel downloads is the
> wrong approach.  If a single download process is not making efficient
> use of the available bandwidth, I'd be more inclined to look carefully
> at why it's failing to do so.  For example, I'm not sure if this is the
> case (and don't have time to look right now), but if the current code
> waits until a NAR has finished downloading before asking for the next
> one, that's an issue that could be fixed by use of HTTP pipelining,
> without multiplying the memory usage.

I think so too.

More generally, the Guix daemon's jobs fall into two categories: downloads and builds.  These two categories demand almost complementary hardware resources on the local machine.

With that in mind, we could have two pipelines: one for builds and one for downloads.  This, I think, is general enough that it could be enabled by default and improve performance for everyone, regardless of Internet bandwidth or CPU power.

Cheers!

-- 
Pierre Neidhardt
https://ambrevar.xyz/
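P.S.  To make the two-pipeline idea concrete, here is a minimal sketch (not Guix code; the job names and pool sizes are made up for illustration): each category gets its own worker pool, so network-bound downloads never hold up CPU-bound builds, and vice versa.

```python
# Illustrative sketch only: two independent worker pools, one for
# network-bound downloads and one for CPU-bound builds.
import concurrent.futures
import time

def download(nar):
    # Stand-in for fetching a substitute over the network.
    time.sleep(0.01)
    return f"downloaded {nar}"

def build(drv):
    # Stand-in for a local, CPU-bound build.
    time.sleep(0.01)
    return f"built {drv}"

def run(jobs):
    # Split jobs into the two categories, then run each category in
    # its own pool so the pipelines proceed independently.
    downloads = [j for kind, j in jobs if kind == "download"]
    builds = [j for kind, j in jobs if kind == "build"]
    with concurrent.futures.ThreadPoolExecutor() as dl_pool, \
         concurrent.futures.ThreadPoolExecutor() as build_pool:
        dl_futures = [dl_pool.submit(download, j) for j in downloads]
        build_futures = [build_pool.submit(build, j) for j in builds]
        return [f.result() for f in dl_futures + build_futures]
```

The point is only the separation of the two queues; in the real daemon the download pipeline could additionally keep its HTTP connection busy (pipelining) while the build pipeline saturates the CPUs.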