From mboxrd@z Thu Jan 1 00:00:00 1970
From: zimoun
Subject: Re: New build system: copy-build-system
Date: Mon, 27 Jan 2020 17:17:10 +0100
Message-ID:
References: <87sgk2dkuv.fsf@ambrevar.xyz> <87pnf5arh1.fsf@ambrevar.xyz> <87k15dapxk.fsf@ambrevar.xyz> <875zgwc2zd.fsf@ambrevar.xyz>
Mime-Version: 1.0
Content-Type: text/plain; charset="UTF-8"
In-Reply-To: <875zgwc2zd.fsf@ambrevar.xyz>
List-Id: "Development of GNU Guix and the GNU System distribution."
To: Pierre Neidhardt
Cc: Guix Devel

On Mon, 27 Jan 2020 at 16:51, Pierre Neidhardt wrote:
>
> zimoun writes:
>
> > And for example, the 'copy-build-system' could:
> >
> > - fetch the data from an archive, such as http://data.astrometry.net,
> >   IPFS, or Zenodo, or
> > - fetch the result of /gnu/store/-name-version from substitutes
>
> Build systems don't fetch data.

I know (isolated blabla). :-)
Is "the package using the 'copy-build-system'" better wording? ;-)

I do not know exactly how a derivation is computed, nor what is used to
compute the hash recorded in the store.  Ok, first I have to do my
homework.
:-)  But one does not want to hash several tens of GB.

Anyway, I have the feeling that what you are proposing could, with small
tweaks, improve the situation for "large" data sets.  I do not know...

Cheers,
simon

ps: Well, maybe we should work in the same office... it would be easier
and faster than exchanging all these emails, and I know yours has an
awesome view :-D
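
pps: For what it is worth, here is a rough sketch of what I imagine a
data package on top of 'copy-build-system' could look like, assuming the
#:install-plan interface from your patch; the package name, URL, hash,
and install plan below are all made up for illustration:

```scheme
;; Hypothetical sketch of a "large data" package using copy-build-system.
;; All names, the URL, and the hash are placeholders; the relevant modules
;; ((guix packages), (guix download), (guix build-system copy),
;; (guix licenses), ...) are assumed to be in scope.
(define-public astrometry-data-sketch
  (package
    (name "astrometry-data-sketch")
    (version "1.0")
    (source
     (origin
       ;; The data is fetched by the fixed-output origin, not by the
       ;; build system: the origin's sha256 is what pins the content.
       (method url-fetch)
       (uri "https://example.org/astrometry-data-1.0.tar.gz")
       (sha256
        (base32 "0000000000000000000000000000000000000000000000000000"))))
    (build-system copy-build-system)
    (arguments
     ;; Nothing is built; the whole unpacked tree is copied under share/.
     '(#:install-plan '(("." "share/astrometry-data/"))))
    (home-page "https://example.org")
    (synopsis "Placeholder data set")
    (description "Hypothetical example of packaging a large data set.")
    (license cc0)))
```

If such a package has a substitute, the tens-of-GB download would come
from the substitute server rather than being rebuilt locally, which is
the situation I was hoping to improve.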