unofficial mirror of guix-science@gnu.org 
From: Efraim Flashner <efraim@flashner.co.il>
To: "Ludovic Courtès" <ludovic.courtes@inria.fr>
Cc: guix-science@gnu.org
Subject: Re: GC strategy on clusters
Date: Thu, 1 Apr 2021 13:45:37 +0300
Message-ID: <YGWkUTW/SCfH/Zph@3900XT>
In-Reply-To: <87v9969uml.fsf@inria.fr>


On Thu, Apr 01, 2021 at 12:17:22PM +0200, Ludovic Courtès wrote:
> Hi there!
> 
> Recently the Guix head node of our cluster at Inria was getting short on
> disk space, despite running ‘guix gc -F20G’ (or similar) twice a day.
> 
> Turns out that some users had accumulated many profile generations and
> that was getting in the way.  So we kindly asked them to run:
> 
>   guix package --delete-generations=4m
> 
> or similar, which was enough to free more space.

I feel like 4-6 months should be plenty for anything active. Even if it
were run automatically for them, it wouldn't remove the current generation.
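For what an automatic run could look like, here is a hedged sketch (the
four-month cutoff is just the value from above, and the echo is a dry-run
stand-in; a real cron job would execute the command instead):

```shell
# Sketch of a per-user cleanup command, suitable for a cron job or systemd
# timer.  --delete-generations never removes the current generation, so the
# profile keeps working.  GUIX is overridable so the sketch can be dry-run.
GUIX=${GUIX:-guix}
cmd="$GUIX package --delete-generations=4m"
echo "would run: $cmd"   # replace this echo with: eval "$cmd"
```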

> We’re now considering setting up automatic user notification by email,
> as is commonly done for disk quotas, asking them to remove old
> generations.  That way, users remain in control and choose what GC roots
> or generations they want to remove.
> 
> How do people on this list deal with that?

I like the idea of asking people to remove old generations. It's not
something we've come up against yet. It doesn't feel that different
from reminding them that their $HOME is for code and smaller things
while the storage space is for their large data collections.

> Longer term, I think Guix should automatically delete old generations
> and instead store the channels + manifest to reproduce them, when
> possible.
> 

This seems to help a bit less when we run into issues like SSL tests
failing because the rebuild date is wrong (too new), or when sources go
missing.

How much storage, and how many people, are you working with? Our initial
multi-user system has 188GB for /gnu and I think 30-40 people, and some
people have profiles going back almost 3 years. Not many people have
multiple profiles, and the experiments we tried with shared profiles in
/usr/local/guix-profiles don't see much use or get updated frequently.

I guess I'm not really sure if it's a technology problem or a people
problem. Figuring out whether someone is the only one pulling in a copy
of glibc-2.25 is doable, but how many copies of diffoscope is too many?
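The "doable" part could go through 'guix gc --referrers', which lists the
store items referring to a given item. A hedged sketch (the store path is
made up, and the echo is a dry-run stand-in; substitute a real /gnu/store
entry and drop the echo):

```shell
# Sketch: ask the store which items refer to a given glibc, then walk those
# referrers back to user profiles by hand.  The path below is hypothetical;
# find a real one with e.g.:  ls -d /gnu/store/*-glibc-2.25
GUIX=${GUIX:-guix}
item="/gnu/store/example-hash-glibc-2.25"      # hypothetical store path
echo "would run: $GUIX gc --referrers $item"   # drop the echo to run it
```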

On a practical note, 'guix package --list-profiles' run as root currently
lists everyone's profiles, which makes it easier to see who has older
profiles hanging around.
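A sweep built on that could look like the following sketch (run as root;
the loop silently does nothing if guix isn't on PATH, and GUIX is
overridable for testing):

```shell
# List everyone's profiles, then print each profile's generations so the
# stale ones stand out.  Errors from a missing guix are discarded so the
# sketch degrades to no output rather than failing.
GUIX=${GUIX:-guix}
$GUIX package --list-profiles 2>/dev/null | while read -r profile; do
    echo "== $profile"
    $GUIX package -p "$profile" --list-generations 2>/dev/null
done
```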

-- 
Efraim Flashner   <efraim@flashner.co.il>   אפרים פלשנר
GPG key = A28B F40C 3E55 1372 662D  14F7 41AA E7DC CA3D 8351
Confidentiality cannot be guaranteed on emails sent or received unencrypted


Thread overview: 5+ messages
2021-04-01 10:17 Ludovic Courtès
2021-04-01 10:45 ` Efraim Flashner [this message]
2021-04-01 12:36   ` Ludovic Courtès
2021-04-01 13:29     ` zimoun
2021-04-01 13:43       ` Ludovic Courtès
