From: Andrew Hyatt <ahyatt@gmail.com>
Newsgroups: gmane.emacs.devel
Subject: Re: [NonGNU ELPA] New package: llm
Date: Tue, 8 Aug 2023 11:09:25 -0400
To: Philip Kaludercic <philipk@posteo.net>
Cc: Emacs-Devel devel <emacs-devel@gnu.org>
In-Reply-To: <871qgexf7j.fsf@posteo.net>
On Tue, Aug 8, 2023 at 1:42 AM Philip Kaludercic <philipk@posteo.net> wrote:

> Andrew Hyatt <ahyatt@gmail.com> writes:
>
> > Hi everyone,
> >
> > I've created a new package called llm, for the purpose of abstracting the
> > interface to various large language model providers.  There are many LLM
> > packages already, but it would be wasteful for all of them to try to be
> > compatible with the range of LLM provider APIs (local LLMs such as Llama 2,
> > API providers such as OpenAI and Google Cloud's Vertex).  This package
> > attempts to solve this problem by defining generic functions which can then
> > be implemented by different LLM providers.  I have started with just two:
> > OpenAI and Vertex.  Llama 2 would be a next choice, but I don't yet have
> > it working on my system.  In addition, I'm starting with just two pieces of
> > core functionality: chat and embeddings.  Extending to async is probably
> > something that I will do next.
>
> Llama was the model that could be executed locally, and the other two
> are "real" services, right?

That's correct.

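To give a concrete sense of the shape of the abstraction, here is a minimal
sketch of the generic-function approach.  The names below are simplified and
invented for illustration; the actual definitions live in llm.el in the
repository and may differ:

;; -*- lexical-binding: t; -*-
;; Illustrative only: these names are made up for this sketch and are
;; not the package's actual API.
(require 'cl-lib)

;; The abstract interface: generic functions that dispatch on a
;; provider object.
(cl-defgeneric my-llm-chat (provider prompt)
  "Send PROMPT to PROVIDER and return its response as a string.")

(cl-defgeneric my-llm-embedding (provider text)
  "Return an embedding vector for TEXT computed by PROVIDER.")

;; A provider is just a struct carrying whatever its backend needs
;; (API keys, endpoints, model names, and so on).
(cl-defstruct my-llm-echo-provider
  (prefix "echo: "))

;; Each backend implements the interface with `cl-defmethod'.  This toy
;; provider echoes the prompt instead of calling a real service.
(cl-defmethod my-llm-chat ((provider my-llm-echo-provider) prompt)
  (concat (my-llm-echo-provider-prefix provider) prompt))

(cl-defmethod my-llm-embedding ((_provider my-llm-echo-provider) text)
  ;; A real backend would call its HTTP API here; return a dummy
  ;; fixed-length vector so the sketch stays self-contained.
  (make-vector 8 (float (length text))))

;; Client code only ever calls the generics, so switching from one
;; provider to another means constructing a different provider object
;; and nothing else:
;;   (my-llm-chat (make-my-llm-echo-provider) "Hello, Emacs!")
;;   ;; => "echo: Hello, Emacs!"

The async variant mentioned above would most likely just be another generic
function taking a callback, implemented per provider in the same way.
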
> > You can see the code at https://github.com/ahyatt/llm.
> >
> > I prefer that this is NonGNU, because I suspect people would like to
> > contribute interfaces to different LLMs, and not all of them will have FSF
> > papers.
>
> I cannot estimate how important or not LLMs will be in the future, but it
> might be worth having something like this in the core, at some point.
> Considering that a module seems to come in at around 150-200 lines, and
> the relative infrequency of new models (at least to my understanding), I
> don't know if the "advantage" of accepting contributions from people who
> haven't signed the CA has that much weight, as opposed to the general
> benefit that all users may enjoy from having the technology integrated
> into Emacs itself, in a way that other packages (and perhaps even the
> core help system) could profit from it.

That seems reasonable.  I don't have a strong opinion here, so if others
want to see this in GNU ELPA instead, I'm happy to do that.

> > Your thoughts would be appreciated, thank you!