Would it be useful for me to write a script to collect performance
data for the various stages? Like downloading, computing derivations,
building the profile, etc.?
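Something along these lines, perhaps. A rough sketch only: the stage
split is my guess, since guix does not report per-stage timings
itself, and the log format is arbitrary:

    #!/bin/sh
    # Rough sketch: time each command and append the result to a log.
    # The stage boundaries are my guesses, not anything guix reports.

    log=guix-timings.log

    stage () {
        name=$1; shift
        start=$(date +%s)
        "$@"
        printf '%s: %ss\n' "$name" $(( $(date +%s) - start )) >> "$log"
    }

    # Whole pull: download + indexing + authentication + derivations
    # + build.
    stage "guix pull (fresh)" guix pull

    # Immediately after, most of the time should be derivation
    # computation.
    stage "guix pull (no-op)" guix pull

    # Fresh shell: downloading substitutes and building the profile.
    stage "guix shell emacs" guix shell emacs -- true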
Suhail Singh <suhailsingh247@gmail.com> writes:
> Daniel Littlewood <danielittlewood@gmail.com> writes:
>
>> guix pull ("38k new commits"): 21m45s
>> guix pull immediately after: 2m25s
>> guix shell emacs (fresh): 1m49s
>> ...
>>
>> nix-channel --update: 0m23s
>> nix shell -p emacs (fresh): 0m24s
>
> Those are some interesting comparisons. Is the reason guix pull
> takes so long compared to updating nix-channel primarily the
> authentication of commits? Or something else?

As far as the local machine's computations go, authenticating the
commits is clearly not the bottleneck. On every machine I have tried,
indexing the received git objects locally takes much longer than
authenticating the commits. On my X60, when I pull for the first time
after deleting the cache, the indexing step alone takes more than 40
minutes.
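For anyone who wants to reproduce that split, something like the
following should do. A sketch only: the cache path is the default on
my system, and the channel introduction commit and fingerprint are
the ones from the manual, so double-check them:

    # Cold-cache clone, which includes the indexing ("resolving
    # deltas") step:
    rm -rf ~/.cache/guix/checkouts
    time git clone https://git.savannah.gnu.org/git/guix.git /tmp/guix

    # Commit authentication alone, using the channel introduction
    # from the manual:
    cd /tmp/guix
    time guix git authenticate \
        9edb3f66fd807b096b48283debdcddccfea34bad \
        "BBB0 2DDF 2CEA F6A8 0D1D  E643 A2A0 6DF2 A33A 54FA"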
The 2m25s Daniel saw for his second guix pull must have been spent
mostly on computing the Guix derivation. I guess that time is largely
incompressible. Not that I know a lot about this, but from reading
the output it is clear that, on every run, guix pull has to compute
the whole derivation for the latest commit of every channel.
Apparently, even in a pull where Guix determines that it has nothing
to do, this step is required before Guix can make that
determination...
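If someone wants to measure that derivation-computation step on its
own, timing a no-op dry run should isolate it reasonably well,
assuming I read the manual correctly and --dry-run still computes the
derivations before deciding there is nothing to build:

    # Run right after a pull that is already up to date; the time
    # should go almost entirely to computing the channel derivations.
    time guix pull --dry-run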