From mboxrd@z Thu Jan 1 00:00:00 1970
From: Philip Kaludercic
Newsgroups: gmane.emacs.devel
Subject: Re: [NonGNU ELPA] New package: llm
Date: Tue, 19 Sep 2023 16:34:52 +0000
Message-ID: <87v8c69l5f.fsf@posteo.net>
To: Andrew Hyatt
Cc: Stefan Kangas, rms@gnu.org, jporterbugs@gmail.com, emacs-devel@gnu.org
In-Reply-To: (Andrew Hyatt's message of "Tue, 19 Sep 2023 12:26:28 -0400")

Andrew Hyatt writes:

> I've submitted the configuration for llm and set up the branch from my
> repository last Friday. However, I'm still not seeing this package being
> reflected in GNU ELPA's package archive. I followed the instructions, but
> perhaps there's some step that I've missed, or it is only periodically
> rebuilt?

Did you try to run make build/llm?  I get this error:

--8<---------------cut here---------------start------------->8---
$ make build/llm
emacs --batch -l /home/philip/.../elpa/admin/elpa-admin.el \
    -f elpaa-batch-make-one-package llm
Cloning branch llm:
Preparing worktree (new branch 'externals/llm')
branch 'externals/llm' set up to track 'origin/externals/llm'.
HEAD is now at 39ae6fc794 Assign copyright to FSF, in preparation of inclusion to GNU ELPA
Debugger entered--Lisp error: (search-failed ";;; llm.el ends here")
...
--8<---------------cut here---------------end--------------->8---

In other words, the footer is missing.  I have prepared a patch that
would address that and a few other checkdoc issues:

diff --git a/llm.el b/llm.el
index 11b508cb36..08f07b65ca 100644
--- a/llm.el
+++ b/llm.el
@@ -23,9 +23,9 @@
 ;;; Commentary:
 
 ;; This file defines a generic interface for LLMs (large language models), and
-;; functionality they can provide. Not all LLMs will support all of these, but
+;; functionality they can provide.  Not all LLMs will support all of these, but
 ;; programs that want to integrate with LLMs can code against the interface, and
-;; users can then choose the LLMs they want to use. It's advisable to have the
+;; users can then choose the LLMs they want to use.  It's advisable to have the
 ;; possibility of using multiple LLMs when that make sense for different
 ;; functionality.
 ;;
@@ -50,7 +50,7 @@
 (defun llm--warn-on-nonfree (name tos)
   "Issue a warning if `llm-warn-on-nonfree' is non-nil.
 
-NAME is the human readable name of the LLM (e.g 'Open AI').
+NAME is the human readable name of the LLM (e.g \"Open AI\").
 
 TOS is the URL of the terms of service for the LLM.
 
@@ -72,7 +72,7 @@ EXAMPLES is a list of conses, where the car is an example
 inputs, and cdr is the corresponding example outputs. This is optional.
 
 INTERACTIONS is a list message sent by either the llm or the
-user. It is a list of `llm-chat-prompt-interaction' objects. This
+user.  It is a list of `llm-chat-prompt-interaction' objects.  This
 is required.
 
 TEMPERATURE is a floating point number with a minimum of 0, and
@@ -80,8 +80,7 @@ maximum of 1, which controls how predictable the result is, with
 0 being the most predicatable, and 1 being the most creative. This
 is not required.
 
-MAX-TOKENS is the maximum number of tokens to generate. This is optional.
-"
+MAX-TOKENS is the maximum number of tokens to generate. This is optional."
   context examples interactions temperature max-tokens)
 
 (cl-defstruct llm-chat-prompt-interaction
@@ -102,19 +101,20 @@ This should be a cons of the name of the LLM, and the URL of the
 terms of service.
 
 If the LLM is free and has no restrictions on use, this should
-return nil. Since this function already returns nil, there is no
+return nil.  Since this function already returns nil, there is no
 need to override it."
   (ignore provider)
   nil)
 
 (cl-defgeneric llm-chat (provider prompt)
   "Return a response to PROMPT from PROVIDER.
-PROMPT is a `llm-chat-prompt'. The response is a string."
+PROMPT is a `llm-chat-prompt'.  The response is a string."
   (ignore provider prompt)
   (signal 'not-implemented nil))
 
 (cl-defmethod llm-chat ((_ (eql nil)) _)
-  (error "LLM provider was nil. Please set the provider in the application you are using."))
+  "Catch trivial configuration mistake."
+  (error "LLM provider was nil. Please set the provider in the application you are using"))
 
 (cl-defmethod llm-chat :before (provider _)
   "Issue a warning if the LLM is non-free."
@@ -130,7 +130,8 @@ ERROR-CALLBACK receives the error response."
   (signal 'not-implemented nil))
 
 (cl-defmethod llm-chat-async ((_ (eql nil)) _ _ _)
-  (error "LLM provider was nil. Please set the provider in the application you are using."))
+  "Catch trivial configuration mistake."
+  (error "LLM provider was nil. Please set the provider in the application you are using"))
 
 (cl-defmethod llm-chat-async :before (provider _ _ _)
   "Issue a warning if the LLM is non-free."
@@ -143,7 +144,8 @@ ERROR-CALLBACK receives the error response."
   (signal 'not-implemented nil))
 
 (cl-defmethod llm-embedding ((_ (eql nil)) _)
-  (error "LLM provider was nil. Please set the provider in the application you are using."))
+  "Catch trivial configuration mistake."
+  (error "LLM provider was nil. Please set the provider in the application you are using"))
 
 (cl-defmethod llm-embedding :before (provider _)
   "Issue a warning if the LLM is non-free."
@@ -159,7 +161,8 @@ error signal and a string message."
   (signal 'not-implemented nil))
 
 (cl-defmethod llm-embedding-async ((_ (eql nil)) _ _ _)
-  (error "LLM provider was nil. Please set the provider in the application you are using."))
+  "Catch trivial configuration mistake."
+  (error "LLM provider was nil. Please set the provider in the application you are using"))
 
 (cl-defmethod llm-embedding-async :before (provider _ _ _)
   "Issue a warning if the LLM is non-free."
@@ -169,7 +172,7 @@ error signal and a string message."
 (cl-defgeneric llm-count-tokens (provider string)
   "Return the number of tokens in STRING from PROVIDER.
 This may be an estimate if the LLM does not provide an exact
-count. Different providers might tokenize things in different
+count.  Different providers might tokenize things in different
 ways."
   (ignore provider)
   (with-temp-buffer
@@ -199,3 +202,4 @@ This should only be used for logging or debugging."
     "")))
 
 (provide 'llm)
+;;; llm.el ends here

>
> On Tue, Sep 12, 2023 at 11:05 AM Stefan Kangas
> wrote:
>
>> Andrew Hyatt writes:
>>
>> > Another question is whether this should be one package or many. The
>> > many-package option would have the llm and llm-fake package in the main
>> > llm package, with a package for all llm clients, such as llm-openai and
>> > llm-vertex (which are the two options I have now). If someone has an
>> > opinion on this, please let me know.
>>
>> It's easier for users if it's just one package.
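
For anyone else tripping over the same build failure: the error above is raised because the archive build searches the library for the canonical ";;; NAME ends here" terminating comment. Below is a rough shell sketch of that check, useful before pushing; it is only an approximation (the real check is done with search-forward inside elpa-admin.el, not with grep), and the stub file it creates is purely illustrative.

```shell
#!/bin/sh
# Sketch: verify an Emacs Lisp file carries the canonical footer that the
# GNU ELPA build expects. The stub llm.el here is a made-up example that,
# like the package as submitted, lacks the footer.
f=llm.el
cat > "$f" <<'EOF'
;;; llm.el --- Example stub
(provide 'llm)
EOF

base=$(basename "$f")
if grep -q ";;; $base ends here" "$f"; then
  echo "footer present"
else
  # This branch corresponds to the (search-failed ...) error in the build.
  echo "footer missing"
fi
```

Appending the single line `;;; llm.el ends here` after `(provide 'llm)` is what the last hunk of the patch does, and it makes the check pass.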