From: Ihor Radchenko <yantar92@posteo.net>
Subject: Re: [NonGNU ELPA] New package: llm
To: chad
Cc: emacs-tangents@gnu.org, Jim Porter, ahyatt@gmail.com, rms@gnu.org
Date: Fri, 01 Sep 2023 09:53:22 +0000
Message-ID: <87pm32rzhp.fsf@localhost>
References: <87v8d0iqa5.fsf@posteo.net> <87cyz3vaws.fsf@localhost>

chad writes:

> For large AI models specifically: there are many users for whom it is
> not practical to _actually_ recreate the model from scratch everywhere
> they might want to use it. It is important for computing freedom that
> such recreations be *possible*, but it would be very limiting to insist
> that everyone who wants to use such services actually do so, in a
> manner that seems to me very similar to insisting that every potential
> Emacs user compile their own. In this case there is the extra wrinkle
> that recreating the currently-most-interesting large language models
> involves both _gigantic_ amounts of resources and a fairly large amount
> of not-directly-reproducible randomness. It might be worth further
> consideration.

Let me refer to another message by RMS:

>> > While I certainly appreciate the effort people are making to produce
>> > LLMs that are more open than OpenAI (a low bar), I'm not sure if
>> > providing several gigabytes of model weights in binary format is
>> > really providing the *source*. It's true that you can still edit
>> > these models in a sense by fine-tuning them, but you could say the
>> > same thing about a project that only provided the generated output
>> > from GNU Bison, instead of the original input to Bison.
>>
>> I don't think that is valid.
>> Bison processing is very different from training a neural net.
>> Incremental retraining of a trained neural net
>> is the same kind of processing as the original training -- except
>> that you use other data and it produces a neural net
>> that is trained differently.
>>
>> My conclusion is that the trained neural net is effectively a kind of
>> source code.  So we don't need to demand the "original training data"
>> as part of a package's source code.  That data does not have to be
>> free, published, or available.
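As an aside, here is a minimal sketch of the point RMS makes: the
incremental retraining (fine-tuning) step runs the very same update
procedure as the original training; only the starting weights and the
data differ. The toy Python/numpy code below, including the `train`
helper and the linear model, is purely illustrative and not taken from
any actual LLM codebase.

  import numpy as np

  def train(weights, data, lr=0.01, epochs=10):
      # One generic SGD loop on a toy linear model. The same procedure
      # serves for training from scratch and for fine-tuning; nothing
      # here depends on where `weights` came from.
      for _ in range(epochs):
          for x, y in data:
              pred = weights @ x             # toy linear model
              grad = 2 * (pred - y) * x      # gradient of squared error
              weights = weights - lr * grad  # identical update either way
      return weights

  rng = np.random.default_rng(0)
  original_data = [(rng.normal(size=3), rng.normal()) for _ in range(100)]
  new_data = [(rng.normal(size=3), rng.normal()) for _ in range(20)]

  w_random = rng.normal(size=3)
  w_trained = train(w_random, original_data)  # "original training"
  w_finetuned = train(w_trained, new_data)    # "incremental retraining"

Nothing in `train` distinguishes the two calls; that is the sense in
which a trained net is modified by the same kind of processing that
produced it in the first place.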
-- 
Ihor Radchenko // yantar92,
Org mode contributor,
Learn more about Org mode at <https://orgmode.org/>.
Support Org development at <https://liberapay.com/org-mode>,
or support my work at <https://liberapay.com/yantar92>