From: Andrew Hyatt <ahyatt@gmail.com>
To: Philip Kaludercic <philipk@posteo.net>
Cc: Stefan Kangas <stefankangas@gmail.com>, emacs-devel@gnu.org, jporterbugs@gmail.com, rms@gnu.org
Newsgroups: gmane.emacs.devel
Subject: Re: [NonGNU ELPA] New package: llm
Date: Tue, 19 Sep 2023 14:19:30 -0400
Message-ID: <CAM6wYYJ6y-J-Pgd4A5DYx6TOFkchsiQzfxEm4y549jU6EtCG=A@mail.gmail.com>
In-Reply-To: <87v8c69l5f.fsf@posteo.net>
Archived-At: <http://permalink.gmane.org/gmane.emacs.devel/310790>

On Tue, Sep 19, 2023 at 12:34 PM Philip Kaludercic <philipk@posteo.net> wrote:

> Andrew Hyatt <ahyatt@gmail.com> writes:
>
> > I've submitted the configuration for llm and set up the branch from my
> > repository last Friday.  However, I'm still not seeing this package being
> > reflected in GNU ELPA's package archive.  I followed the instructions, but
> > perhaps there's some step that I've missed, or it is only periodically
> > rebuilt?
>
> Did you try to run make build/llm?  I get this error:

I did build it, and it seemed to work.  I'm not sure what I'm doing
differently, but I appreciate the patch, which I'll apply later today.
Thank you for your help.

> --8<---------------cut here---------------start------------->8---
> $ make build/llm
> emacs --batch -l /home/philip/.../elpa/admin/elpa-admin.el      \
>          -f elpaa-batch-make-one-package llm
> Cloning branch llm:
> Preparing worktree (new branch 'externals/llm')
> branch 'externals/llm' set up to track 'origin/externals/llm'.
> HEAD is now at 39ae6fc794 Assign copyright to FSF, in preparation of
> inclusion to GNU ELPA
>
> Debugger entered--Lisp error: (search-failed ";;; llm.el ends here")
> ...
> --8<---------------cut here---------------end--------------->8---
>
> In other words the footer is missing.  I have prepared a patch that
> would address that and a few other checkdoc issues:
>
> diff --git a/llm.el b/llm.el
> index 11b508cb36..08f07b65ca 100644
> --- a/llm.el
> +++ b/llm.el
> @@ -23,9 +23,9 @@
>
>  ;;; Commentary:
>  ;; This file defines a generic interface for LLMs (large language models), and
> -;; functionality they can provide. Not all LLMs will support all of these, but
> +;; functionality they can provide.  Not all LLMs will support all of these, but
>  ;; programs that want to integrate with LLMs can code against the interface, and
> -;; users can then choose the LLMs they want to use. It's advisable to have the
> +;; users can then choose the LLMs they want to use.  It's advisable to have the
>  ;; possibility of using multiple LLMs when that make sense for different
>  ;; functionality.
>  ;;
> @@ -50,7 +50,7 @@
>
>  (defun llm--warn-on-nonfree (name tos)
>    "Issue a warning if `llm-warn-on-nonfree' is non-nil.
> -NAME is the human readable name of the LLM (e.g 'Open AI').
> +NAME is the human readable name of the LLM (e.g \"Open AI\").
>
>  TOS is the URL of the terms of service for the LLM.
>
> @@ -72,7 +72,7 @@ EXAMPLES is a list of conses, where the car is an example
>  inputs, and cdr is the corresponding example outputs.  This is optional.
>
>  INTERACTIONS is a list message sent by either the llm or the
> -user. It is a list of `llm-chat-prompt-interaction' objects. This
> +user.  It is a list of `llm-chat-prompt-interaction' objects.  This
>  is required.
>
>  TEMPERATURE is a floating point number with a minimum of 0, and
> @@ -80,8 +80,7 @@ maximum of 1, which controls how predictable the result is, with
>  0 being the most predicatable, and 1 being the most creative.
>  This is not required.
>
> -MAX-TOKENS is the maximum number of tokens to generate.  This is optional.
> -"
> +MAX-TOKENS is the maximum number of tokens to generate.  This is optional."
>    context examples interactions temperature max-tokens)
>
>  (cl-defstruct llm-chat-prompt-interaction
> @@ -102,19 +101,20 @@ This should be a cons of the name of the LLM, and the URL of the
>  terms of service.
>
>  If the LLM is free and has no restrictions on use, this should
> -return nil. Since this function already returns nil, there is no
> +return nil.  Since this function already returns nil, there is no
>  need to override it."
>    (ignore provider)
>    nil)
>
>  (cl-defgeneric llm-chat (provider prompt)
>    "Return a response to PROMPT from PROVIDER.
> -PROMPT is a `llm-chat-prompt'. The response is a string."
> +PROMPT is a `llm-chat-prompt'.  The response is a string."
>    (ignore provider prompt)
>    (signal 'not-implemented nil))
>
>  (cl-defmethod llm-chat ((_ (eql nil)) _)
> -  (error "LLM provider was nil.  Please set the provider in the application you are using."))
> +  "Catch trivial configuration mistake."
> +  (error "LLM provider was nil.  Please set the provider in the application you are using"))
>
>  (cl-defmethod llm-chat :before (provider _)
>    "Issue a warning if the LLM is non-free."
> @@ -130,7 +130,8 @@ ERROR-CALLBACK receives the error response."
>    (signal 'not-implemented nil))
>
>  (cl-defmethod llm-chat-async ((_ (eql nil)) _ _ _)
> -  (error "LLM provider was nil.  Please set the provider in the application you are using."))
> +  "Catch trivial configuration mistake."
> +  (error "LLM provider was nil.  Please set the provider in the application you are using"))
>
>  (cl-defmethod llm-chat-async :before (provider _ _ _)
>    "Issue a warning if the LLM is non-free."
> @@ -143,7 +144,8 @@ ERROR-CALLBACK receives the error response."
>    (signal 'not-implemented nil))
>
>  (cl-defmethod llm-embedding ((_ (eql nil)) _)
> -  (error "LLM provider was nil.  Please set the provider in the application you are using."))
> +  "Catch trivial configuration mistake."
> +  (error "LLM provider was nil.  Please set the provider in the application you are using"))
>
>  (cl-defmethod llm-embedding :before (provider _)
>    "Issue a warning if the LLM is non-free."
> @@ -159,7 +161,8 @@ error signal and a string message."
>    (signal 'not-implemented nil))
>
>  (cl-defmethod llm-embedding-async ((_ (eql nil)) _ _ _)
> -  (error "LLM provider was nil.  Please set the provider in the application you are using."))
> +  "Catch trivial configuration mistake."
> +  (error "LLM provider was nil.  Please set the provider in the application you are using"))
>
>  (cl-defmethod llm-embedding-async :before (provider _ _ _)
>    "Issue a warning if the LLM is non-free."
> @@ -169,7 +172,7 @@ error signal and a string message."
>  (cl-defgeneric llm-count-tokens (provider string)
>    "Return the number of tokens in STRING from PROVIDER.
>  This may be an estimate if the LLM does not provide an exact
> -count. Different providers might tokenize things in different
> +count.  Different providers might tokenize things in different
>  ways."
>    (ignore provider)
>    (with-temp-buffer
> @@ -199,3 +202,4 @@ This should only be used for logging or debugging."
>            "")))
>
>  (provide 'llm)
> +;;; llm.el ends here
>
>
> >
> > On Tue, Sep 12, 2023 at 11:05 AM Stefan Kangas <stefankangas@gmail.com>
> > wrote:
> >
> >> Andrew Hyatt <ahyatt@gmail.com> writes:
> >>
> >> > Another question is whether this should be one package or many.  The
> >> > many-package option would have the llm and llm-fake package in the main
> >> llm
> >> > package, with a package for all llm clients, such as llm-openai and
> >> > llm-vertex (which are the two options I have now).  If someone has an
> >> > opinion on this, please let me know.
> >>
> >> It's easier for users if it's just one package.
> >>
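
For anyone reading along who hasn't opened llm.el, here is a rough sketch of
what coding against the generic interface looks like from an application's
side.  The make-llm-chat-prompt constructor and its slots follow from the
cl-defstruct in the patch above; the :role/:content slots of
llm-chat-prompt-interaction and the make-llm-openai constructor with a :key
argument are assumptions for illustration only, so check the actual provider
packages before copying this.

--8<---------------cut here---------------start------------->8---
;; Illustrative sketch only: the llm-chat-prompt-interaction slot names
;; and the make-llm-openai constructor are assumed, not taken from the
;; patch above.
(require 'llm)
(require 'llm-openai)  ; could equally be llm-vertex or another provider

(defvar my-llm-provider (make-llm-openai :key (getenv "OPENAI_API_KEY"))
  "Provider object; keeping this configurable lets users swap LLMs
without touching the calling code.")

(defun my-llm-ask (question)
  "Send QUESTION to `my-llm-provider' and return the reply as a string."
  (llm-chat my-llm-provider
            (make-llm-chat-prompt
             :interactions (list (make-llm-chat-prompt-interaction
                                  :role 'user :content question))
             ;; Per the docstring: 0 is most predictable, 1 most creative.
             :temperature 0.2)))
--8<---------------cut here---------------end--------------->8---

Since llm-chat-async and llm-embedding follow the same shape, an application
written this way only needs the provider object to change when switching
LLMs, and the nil-provider methods in the patch give a clear error when an
application forgets to set one.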