From mboxrd@z Thu Jan 1 00:00:00 1970
From: Andrew Hyatt <ahyatt@gmail.com>
Newsgroups: gmane.emacs.devel
Subject: Re: [NonGNU ELPA] New package: llm
Date: Tue, 19 Sep 2023 14:19:30 -0400
References: <87v8c69l5f.fsf@posteo.net>
To: Philip Kaludercic <philipk@posteo.net>
Cc: Stefan Kangas <stefankangas@gmail.com>, emacs-devel@gnu.org, jporterbugs@gmail.com, rms@gnu.org
Xref: news.gmane.io gmane.emacs.devel:310790

On Tue, Sep 19, 2023 at 12:34 PM Philip Kaludercic <philipk@posteo.net> wrote:

> Andrew Hyatt <ahyatt@gmail.com> writes:
>
> > I've submitted the configuration for llm and set up the branch from my
> > repository last Friday.  However, I'm still not seeing this package
> > being reflected in GNU ELPA's package archive.  I followed the
> > instructions, but perhaps there's some step that I've missed, or it is
> > only periodically rebuilt?
>
> Did you try to run make build/llm?  I get this error:

I did build it, and it seemed to work.  I'm not sure what I'm doing
differently, but I appreciate the patch, which I'll apply later today.
Thank you for your help.

> --8<---------------cut here---------------start------------->8---
> $ make build/llm
> emacs --batch -l /home/philip/.../elpa/admin/elpa-admin.el      \
>          -f elpaa-batch-make-one-package llm
> Cloning branch llm:
> Preparing worktree (new branch 'externals/llm')
> branch 'externals/llm' set up to track 'origin/externals/llm'.
> HEAD is now at 39ae6fc794 Assign copyright to FSF, in preparation of
> inclusion to GNU ELPA
>
> Debugger entered--Lisp error: (search-failed ";;; llm.el ends here")
> ...
> --8<---------------cut here---------------end--------------->8---
>
> In other words the footer is missing.
I have prepared a patch that would address that and a few other
checkdoc issues:

> diff --git a/llm.el b/llm.el
> index 11b508cb36..08f07b65ca 100644
> --- a/llm.el
> +++ b/llm.el
> @@ -23,9 +23,9 @@
>  
>  ;;; Commentary:
>  ;; This file defines a generic interface for LLMs (large language models), and
> -;; functionality they can provide. Not all LLMs will support all of these, but
> +;; functionality they can provide.  Not all LLMs will support all of these, but
>  ;; programs that want to integrate with LLMs can code against the interface, and
> -;; users can then choose the LLMs they want to use. It's advisable to have the
> +;; users can then choose the LLMs they want to use.  It's advisable to have the
>  ;; possibility of using multiple LLMs when that make sense for different
>  ;; functionality.
>  ;;
> @@ -50,7 +50,7 @@
>  
>  (defun llm--warn-on-nonfree (name tos)
>    "Issue a warning if `llm-warn-on-nonfree' is non-nil.
> -NAME is the human readable name of the LLM (e.g 'Open AI').
> +NAME is the human readable name of the LLM (e.g \"Open AI\").
>  
>  TOS is the URL of the terms of service for the LLM.
>  
> @@ -72,7 +72,7 @@ EXAMPLES is a list of conses, where the car is an example
>  inputs, and cdr is the corresponding example outputs.  This is optional.
>  
>  INTERACTIONS is a list message sent by either the llm or the
> -user. It is a list of `llm-chat-prompt-interaction' objects. This
> +user.  It is a list of `llm-chat-prompt-interaction' objects.  This
>  is required.
>  
>  TEMPERATURE is a floating point number with a minimum of 0, and
> @@ -80,8 +80,7 @@ maximum of 1, which controls how predictable the result is, with
>  0 being the most predicatable, and 1 being the most creative.
>  This is not required.
>  
> -MAX-TOKENS is the maximum number of tokens to generate.  This is optional.
> -"
> +MAX-TOKENS is the maximum number of tokens to generate.  This is optional."
>    context examples interactions temperature max-tokens)
>  
>  (cl-defstruct llm-chat-prompt-interaction
> @@ -102,19 +101,20 @@ This should be a cons of the name of the LLM, and the URL of the
>  terms of service.
>  
>  If the LLM is free and has no restrictions on use, this should
> -return nil. Since this function already returns nil, there is no
> +return nil.  Since this function already returns nil, there is no
>  need to override it."
>    (ignore provider)
>    nil)
>  
>  (cl-defgeneric llm-chat (provider prompt)
>    "Return a response to PROMPT from PROVIDER.
> -PROMPT is a `llm-chat-prompt'. The response is a string."
> +PROMPT is a `llm-chat-prompt'.  The response is a string."
>    (ignore provider prompt)
>    (signal 'not-implemented nil))
>  
>  (cl-defmethod llm-chat ((_ (eql nil)) _)
> -  (error "LLM provider was nil.  Please set the provider in the application you are using."))
> +  "Catch trivial configuration mistake."
> +  (error "LLM provider was nil.  Please set the provider in the application you are using"))
>  
>  (cl-defmethod llm-chat :before (provider _)
>    "Issue a warning if the LLM is non-free."
> @@ -130,7 +130,8 @@ ERROR-CALLBACK receives the error response."
>    (signal 'not-implemented nil))
>  
>  (cl-defmethod llm-chat-async ((_ (eql nil)) _ _ _)
> -  (error "LLM provider was nil.  Please set the provider in the application you are using."))
> +  "Catch trivial configuration mistake."
> +  (error "LLM provider was nil.  Please set the provider in the application you are using"))
>  
>  (cl-defmethod llm-chat-async :before (provider _ _ _)
>    "Issue a warning if the LLM is non-free."
> @@ -143,7 +144,8 @@ ERROR-CALLBACK receives the error response."
>    (signal 'not-implemented nil))
>  
>  (cl-defmethod llm-embedding ((_ (eql nil)) _)
> -  (error "LLM provider was nil.  Please set the provider in the application you are using."))
> +  "Catch trivial configuration mistake."
> +  (error "LLM provider was nil.  Please set the provider in the application you are using"))
>  
>  (cl-defmethod llm-embedding :before (provider _)
>    "Issue a warning if the LLM is non-free."
> @@ -159,7 +161,8 @@ error signal and a string message."
>    (signal 'not-implemented nil))
>  
>  (cl-defmethod llm-embedding-async ((_ (eql nil)) _ _ _)
> -  (error "LLM provider was nil.  Please set the provider in the application you are using."))
> +  "Catch trivial configuration mistake."
> +  (error "LLM provider was nil.  Please set the provider in the application you are using"))
>  
>  (cl-defmethod llm-embedding-async :before (provider _ _ _)
>    "Issue a warning if the LLM is non-free."
> @@ -169,7 +172,7 @@ error signal and a string message."
>  (cl-defgeneric llm-count-tokens (provider string)
>    "Return the number of tokens in STRING from PROVIDER.
>  This may be an estimate if the LLM does not provide an exact
> -count. Different providers might tokenize things in different
> +count.  Different providers might tokenize things in different
>  ways."
>      (ignore provider)
>      (with-temp-buffer
> @@ -199,3 +202,4 @@ This should only be used for logging or debugging."
>              "")))
>  
>  (provide 'llm)
> +;;; llm.el ends here
>
> >
> > On Tue, Sep 12, 2023 at 11:05 AM Stefan Kangas <stefankangas@gmail.com>
> > wrote:
> >
> >> Andrew Hyatt <ahyatt@gmail.com> writes:
> >>
> >> > Another question is whether this should be one package or many.  The
> >> > many-package option would have the llm and llm-fake package in the
> >> > main llm package, with a package for all llm clients, such as
> >> > llm-openai and llm-vertex (which are the two options I have now).
> >> > If someone has an opinion on this, please let me know.
> >>
> >> It's easier for users if it's just one package.
> >>
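
[Editor's note: the `search-failed` error in the build log comes from the
build script looking for the conventional Emacs Lisp library footer, which
the patch above restores.  As a point of reference, here is a minimal
sketch of the header/footer skeleton that checkdoc and the GNU ELPA build
expect; the file name `hello.el` and its contents are only illustrative,
not part of the llm package:

;;; hello.el --- Short description of the library  -*- lexical-binding: t -*-

;;; Commentary:

;; The footer below must match the file name exactly.  Checkdoc also
;; wants two spaces after each sentence end in comments and docstrings,
;; which is what most of the whitespace changes in the patch fix.

;;; Code:

(defun hello-greet (name)
  "Return a greeting for NAME."
  (format "Hello, %s" name))

(provide 'hello)
;;; hello.el ends here

Omitting that final `;;; hello.el ends here` line reproduces the
(search-failed ...) error shown earlier in the thread.]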