unofficial mirror of guix-devel@gnu.org 
From: zimoun <zimon.toutoune@gmail.com>
To: Christopher Baines <mail@cbaines.net>, guix-devel@gnu.org
Subject: Re: Mid-December update on bordeaux.guix.gnu.org
Date: Wed, 15 Dec 2021 23:49:35 +0100	[thread overview]
Message-ID: <86r1adiiv4.fsf@gmail.com> (raw)
In-Reply-To: <874k79zs29.fsf@cbaines.net>

Hi Chris,

Thanks for the update, and for all the work. :-)


On Wed, 15 Dec 2021 at 16:48, Christopher Baines <mail@cbaines.net> wrote:

> In summary, the space issue I mentioned in the previous update has
> effectively been addressed. All the paused agents are now unpaused and
> builds are happening again.

The timing was almost perfect. ;-)


Well, as discussed in September, one concern I have is about “long-term
storage” – where neither “long-term” nor “storage” is well defined.

Do you think that Bordeaux could run

   <https://git.savannah.gnu.org/cgit/guix.git/tree/etc/source-manifest.scm>

?  Having redundancy for all origins would avoid breakage.  For
instance, while Berlin was down yesterday morning, “guix pull” was
broken because of the missing ’datefudge’ package, whose source had
disappeared upstream.
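For what it is worth, here is a minimal sketch of what such a job could
look like on bordeaux, assuming a local checkout of guix.git; the
checkout path and the wrapper function name are hypothetical, not part
of any existing setup:

```shell
# Hypothetical helper: build every origin listed in etc/source-manifest.scm
# of a Guix checkout, continuing past individual failures.
build_all_sources() {
    guix build --manifest="$1/etc/source-manifest.scm" --keep-going
}

# Only invoke guix when it is actually available on this machine.
if command -v guix >/dev/null 2>&1; then
    build_all_sources /srv/guix-checkout   # hypothetical checkout path
fi
```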

Today, Guix is well covered for packages using ’git-fetch’, but not for
all the other methods.  The situation is improving towards a complete
fallback using Software Heritage via Disarchive, but it is not ready
yet [1]. :-)
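As a side note, one can already check per package whether its source is
archived at Software Heritage, using the standard ‘archival’ lint
checker; a small sketch (the wrapper name is mine):

```shell
# Report whether the sources of the given packages are archived at
# Software Heritage, via the ‘archival’ checker of ‘guix lint’.
check_swh_archival() {
    guix lint --checkers=archival "$@"
}

# Only invoke guix when it is actually available on this machine.
if command -v guix >/dev/null 2>&1; then
    check_swh_archival hello   # example package
fi
```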

This redundancy for all sources appears vitally important to me,
because if Berlin is totally lost for whatever reason, it is game over
for preserving Guix – well, short of recovering pieces scattered across
people’s stores.

Put differently, in terms of capacity and priority, it appears to me
worse if 0.01% of the sources (or even just one) are missing than if
some substitutes are missing: I can always burn some CPU locally, but I
cannot recreate lost source code. :-)

1: <https://ngyro.com/pog-reports/2021-12-06/>


> In addition to lakefront, I've also added a 6TB hard drive to hatysa,
> the HoneyComb LX2 machine that I host. Like lakefront, it's busy
> downloading the nars from bayfront. This will act as a backup in case
> lakefront is lost.

Cool!  Thanks.


> In general this is an important step in being more flexible where the
> nars are stored. There's still a reliance on storing pretty much all the
> nars on a single machine, but which machine has this role is more
> flexible now. I think this architecture also makes it easier to break
> the "all nars on a single machine" restriction in the future as well.

IIUC the design, if the proxy server is lost, it should be easy to
replace, right?

I remember discussions about CDNs [2,3,4,5,6].  I do not know whether a
CDN would solve this issue, but from my understanding it would at least
improve delivery performance.  It appears to me worth a try.


2: <https://lists.gnu.org/archive/html/guix-devel/2016-03/msg00312.html>
3: <https://yhetil.org/guix/KBlEbLsxWu2Kmv5RvS2dHXDleGAyyz9WEA0T6wQ1QArc0mjkol-1W5Vv66D9oauvQ5l6WYTaJ86Ckxjc8YS_2pn2NN1M_L8RJUsIBmmFeqE=@protonmail.ch/>
4: <https://yhetil.org/guix/87tvju6145.fsf@gnu.org/>
5: <https://lists.gnu.org/archive/html/guix-devel/2018-12/msg00192.html>
6: <https://yhetil.org/guix/87d0my1380.fsf@gmail.com/>


> Going forward, it would be good to have an additional full backup of the
> nars that bayfront can serve things from, to provide more
> redundancy. I'm hoping the nar-herder will also enable setting up
> geographically distributed mirrors, which will hopefully improve
> redundancy further, and maybe performance of fetching nars too.

To me, a first general question about backup coordination is to define
a retention window for each kind of data:

 - sources: forever, until the complete fallback to SWH is robust;
 - the substitutes needed to run “guix time-machine --commit=<> -- help”
   for any commit reachable by an inferior: forever;
 - other package substitutes: some rule to be defined.
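To make the proposal concrete, here is a toy sketch of such retention
rules; the 365-day figure for plain package substitutes is purely an
illustrative placeholder, not a proposal:

```shell
# Toy retention policy for the three kinds of data discussed above.
# Prints "forever" or a number of days; the 365 value is a placeholder.
retention_days() {
    case "$1" in
        source)                  echo forever ;;  # until the SWH fallback is robust
        time-machine-substitute) echo forever ;;  # keep time-machine working
        package-substitute)      echo 365 ;;      # placeholder rule, to be decided
        *) echo "unknown kind: $1" >&2; return 1 ;;
    esac
}
```

For instance, `retention_days source` prints “forever”.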


Thanks for taking care of the redundancy and reliability of the CI.


Cheers,
simon



Thread overview: 18+ messages
2021-12-15 16:48 Mid-December update on bordeaux.guix.gnu.org Christopher Baines
2021-12-15 22:49 ` zimoun [this message]
2021-12-16  0:20   ` Christopher Baines
2021-12-16 11:05     ` zimoun
2021-12-16 12:48       ` Christopher Baines
2021-12-16 14:25         ` Andreas Enge
2021-12-21  9:53     ` Redundancy for source code and Disarchive Ludovic Courtès
2021-12-17  9:00 ` Mid-December update on bordeaux.guix.gnu.org Andreas Enge
2021-12-17  9:03   ` Andreas Enge
2021-12-20 22:07 ` Ludovic Courtès
2021-12-20 22:52   ` extend ’guix archive’? zimoun
2021-12-21  5:50     ` Jack Hill
2021-12-21 10:49       ` zimoun
2022-02-04 12:48       ` Christopher Baines
2021-12-21  9:39     ` Ludovic Courtès
2021-12-21 10:32       ` zimoun
2022-02-04 12:36     ` Christopher Baines
2022-01-06 13:26   ` Mid-December update on bordeaux.guix.gnu.org Christopher Baines
