From a computational perspective, downloading tarballs is much simpler than fetching from Git. But Git offers so many advantages, and computing has become so inexpensive, that it has become very common to use Git instead. Recent Git implementations have optimized the serving of specific Git commits, as opposed to sending the entire Git history. That means you can fetch a single revision of a huge repository using a small amount of bandwidth.

For the specific case of `guix pull`, the Git server is hosted at Savannah, which does not use one of these optimized Git implementations, so fetching from it is relatively slow and expensive. Additionally, Guix's authentication mechanism requires fetching many Git revisions in order to verify the chain of trusted revisions (Git commits). This requirement to fetch many Git revisions, combined with the unoptimized Git server at Savannah, means that `guix pull` may be slower than comparable actions on other distros, especially the first time, or if you haven't pulled in a while.

On Sun, Apr 21, 2024, at 18:31, Adam wrote:
> Hi guix!
> Recently I used nixos on one of my machines. And I noticed people there use tar balls for fetching package definitions. And It worked much faster for me.
> That was surprising and I decided to write this letter.
> Is git the right tool for getting new package definitions? What if git commits history will become enormous?
> As I see, first guix pull running too long for a lot of people.
> Probably there are ways to cache all of this.
> Anyway, I'm just curious about it. If there are already answers for my question, I would like to read them.
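The single-revision fetch mentioned above can be sketched with a shallow clone. This is a minimal local demonstration using a throwaway repository (the paths are hypothetical); against a real server, the server must support shallow fetches for this to actually save bandwidth:

```shell
# Minimal sketch: a shallow clone transfers only the requested revision,
# not the whole history. Uses a throwaway local repo (hypothetical paths).
set -e
tmp=$(mktemp -d)
git init -q "$tmp/origin"
git -C "$tmp/origin" -c user.email=a@example.org -c user.name=a \
    commit -q --allow-empty -m 'first commit'
git -C "$tmp/origin" -c user.email=a@example.org -c user.name=a \
    commit -q --allow-empty -m 'second commit'
# --depth 1 fetches only the latest commit, however long the history is.
# (The file:// URL matters: plain local paths ignore --depth.)
git clone -q --depth 1 "file://$tmp/origin" "$tmp/shallow"
git -C "$tmp/shallow" rev-list --count HEAD   # prints 1
```

The point of the sketch is that the transfer cost tracks the depth you ask for, not the total age of the repository, which is why an optimized server can hand out one revision of a huge repo cheaply.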