all messages for Emacs-related lists mirrored at yhetil.org
* [NonGNU ELPA] Add package gptel
@ 2024-04-28  3:55 Karthik Chikmagalur
  2024-04-28  6:30 ` Karthik Chikmagalur
                   ` (2 more replies)
  0 siblings, 3 replies; 14+ messages in thread
From: Karthik Chikmagalur @ 2024-04-28  3:55 UTC (permalink / raw)
  To: emacs-devel

I'd like to submit my package gptel to NonGNU ELPA:

https://github.com/karthink/gptel

gptel is an LLM (Large Language Model) client for Emacs that supports
most LLM providers that offer an HTTP API.  This includes open source
models running locally on the user's PC or network via Ollama,
Llama.cpp, Llamafiles and GPT4All, and access to larger models provided
by a growing number of companies.  These include ChatGPT, Anthropic
Claude, Gemini, Kagi, Groq, Perplexity, TogetherAI and several more.

gptel tries to provide a uniform Emacs-y UI for all backends, and works
as both a chat interface (in dedicated chat buffers) and as a
helper/lookup agent in all Emacs buffers.  There is a demo showcasing
its many uses here:

https://www.youtube.com/watch?v=bsRnh_brggM

It has no external dependencies (Emacs packages or otherwise), but uses
Curl if it's available.
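
For anyone curious what configuring a backend looks like, here is a
sketch based on the README (the host, port and model name are
illustrative, and assume a local Ollama server):

```elisp
;; Register a local Ollama backend and make it the default.
;; Assumes Ollama is listening on its default port with a
;; mistral model pulled -- adjust to taste.
(setq gptel-model "mistral:latest"
      gptel-backend (gptel-make-ollama "Ollama"
                      :host "localhost:11434"
                      :stream t
                      :models '("mistral:latest")))
```

No API key is needed for a purely local backend like this one.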

Karthik



^ permalink raw reply	[flat|nested] 14+ messages in thread

* Re: [NonGNU ELPA] Add package gptel
  2024-04-28  3:55 [NonGNU ELPA] Add package gptel Karthik Chikmagalur
@ 2024-04-28  6:30 ` Karthik Chikmagalur
  2024-04-28  8:21 ` Philip Kaludercic
  2024-04-29 22:40 ` Richard Stallman
  2 siblings, 0 replies; 14+ messages in thread
From: Karthik Chikmagalur @ 2024-04-28  6:30 UTC (permalink / raw)
  To: emacs-devel

> It has no external dependencies (Emacs packages or otherwise), but uses
> Curl if it's available.

Just realized this isn't true -- gptel depends on the compat package to
support Emacs 27 and 28.




* Re: [NonGNU ELPA] Add package gptel
  2024-04-28  3:55 [NonGNU ELPA] Add package gptel Karthik Chikmagalur
  2024-04-28  6:30 ` Karthik Chikmagalur
@ 2024-04-28  8:21 ` Philip Kaludercic
  2024-04-28 16:50   ` Karthik Chikmagalur
  2024-04-28 17:38   ` Karthik Chikmagalur
  2024-04-29 22:40 ` Richard Stallman
  2 siblings, 2 replies; 14+ messages in thread
From: Philip Kaludercic @ 2024-04-28  8:21 UTC (permalink / raw)
  To: Karthik Chikmagalur; +Cc: emacs-devel

[-- Attachment #1: Type: text/plain, Size: 641 bytes --]

Karthik Chikmagalur <karthikchikmagalur@gmail.com> writes:

> I'd like to submit my package gptel to NonGNU ELPA:
>
> https://github.com/karthink/gptel
>
> gptel is an LLM (Large Language Model) client for Emacs that supports
> most LLM providers that offer an HTTP API.  This includes open source
> models running locally on the user's PC or network via Ollama,
> Llama.cpp, Llamafiles and GPT4All, and access to larger models provided
> by a growing number of companies.  These include ChatGPT, Anthropic
> Claude, Gemini, Kagi, Groq, Perplexity, TogetherAI and several more.

Here are a few quick comments, but the code is fine overall:


[-- Attachment #2: Type: text/plain, Size: 11831 bytes --]

diff --git a/gptel.el b/gptel.el
index b3ff962..96f7626 100644
--- a/gptel.el
+++ b/gptel.el
@@ -156,7 +156,6 @@
   "Path to a proxy to use for gptel interactions.
 Passed to curl via --proxy arg, for example \"proxy.yourorg.com:80\"
 Leave it empty if you don't use a proxy."
-  :group 'gptel
   :type 'string)
 
 (defcustom gptel-api-key #'gptel-api-key-from-auth-source
@@ -166,7 +165,6 @@ OpenAI by default.
 
 Can also be a function of no arguments that returns an API
 key (more secure) for the active backend."
-  :group 'gptel
   :type '(choice
           (string :tag "API key")
           (function :tag "Function that returns the API key")))
@@ -182,13 +180,11 @@ When set to nil, Emacs waits for the full response and inserts it
 all at once.  This wait is asynchronous.
 
 \='tis a bit silly."
-  :group 'gptel
   :type 'boolean)
 (make-obsolete-variable 'gptel-playback 'gptel-stream "0.3.0")
 
 (defcustom gptel-use-curl (and (executable-find "curl") t)
   "Whether gptel should prefer Curl when available."
-  :group 'gptel
   :type 'boolean)
 
 (defcustom gptel-curl-file-size-threshold 130000
@@ -207,11 +203,10 @@ and the typical size of the data being sent in GPTel queries.
 A larger value may improve performance by avoiding the overhead of creating
 temporary files for small data payloads, while a smaller value may be needed
 if the command-line argument size is limited by the operating system."
-  :group 'gptel
-  :type 'integer)
+  :type 'natnum)
 
 (defcustom gptel-response-filter-functions
-  '(gptel--convert-org)
+  (list #'gptel--convert-org)
   "Abnormal hook for transforming the response from an LLM.
 
 This is used to format the response in some way, such as filling
@@ -225,7 +220,6 @@ should return the transformed string.
 NOTE: This is only used for non-streaming responses.  To
 transform streaming responses, use `gptel-post-stream-hook' and
 `gptel-post-response-functions'."
-  :group 'gptel
   :type 'hook)
 
 (defcustom gptel-pre-response-hook nil
@@ -235,7 +229,6 @@ This hook is called in the buffer where the LLM response will be
 inserted.
 
 Note: this hook only runs if the request succeeds."
-  :group 'gptel
   :type 'hook)
 
 (define-obsolete-variable-alias
@@ -255,7 +248,6 @@ end positions.
 Note: this hook runs even if the request fails.  In this case the
 response beginning and end positions are both the cursor position
 at the time of the request."
-  :group 'gptel
   :type 'hook)
 
 ;; (defcustom gptel-pre-stream-insert-hook nil
@@ -271,18 +263,16 @@ at the time of the request."
 
 This hook is called in the buffer from which the prompt was sent
 to the LLM, and after a text insertion."
-  :group 'gptel
   :type 'hook)
 
 (defcustom gptel-default-mode (if (fboundp 'markdown-mode)
-                               'markdown-mode
-                             'text-mode)
+				  'markdown-mode
+				'text-mode)
   "The default major mode for dedicated chat buffers.
 
 If `markdown-mode' is available, it is used.  Otherwise gptel
 defaults to `text-mode'."
-  :group 'gptel
-  :type 'symbol)
+  :type 'function)
 
 ;; TODO: Handle `prog-mode' using the `comment-start' variable
 (defcustom gptel-prompt-prefix-alist
@@ -296,7 +286,6 @@ responses, and is removed from the query before it is sent.
 
 This is an alist mapping major modes to the prefix strings.  This
 is only inserted in dedicated gptel buffers."
-  :group 'gptel
   :type '(alist :key-type symbol :value-type string))
 
 (defcustom gptel-response-prefix-alist
@@ -310,7 +299,6 @@ responses.
 
 This is an alist mapping major modes to the reply prefix strings.  This
 is only inserted in dedicated gptel buffers before the AI's response."
-  :group 'gptel
   :type '(alist :key-type symbol :value-type string))
 
 (defcustom gptel-use-header-line t
@@ -318,8 +306,7 @@ is only inserted in dedicated gptel buffers before the AI's response."
 
 When set to nil, use the mode line for (minimal) status
 information and the echo area for messages."
-  :type 'boolean
-  :group 'gptel)
+  :type 'boolean)
 
 (defcustom gptel-display-buffer-action '(pop-to-buffer)
   "The action used to display gptel chat buffers.
@@ -333,17 +320,12 @@ where FUNCTION is a function or a list of functions.  Each such
 function should accept two arguments: a buffer to display and an
 alist of the same form as ALIST.  See info node `(elisp)Choosing
 Window' for details."
-  :group 'gptel
-  :type '(choice
-          (const :tag "Use display-buffer defaults" nil)
-          (const :tag "Display in selected window" (pop-to-buffer-same-window))
-          (cons :tag "Specify display-buffer action"
-           (choice function (repeat :tag "Functions" function))
-           alist)))
+  :type display-buffer--action-custom-type)
 
 (defcustom gptel-crowdsourced-prompts-file
-  (let ((cache-dir (or (getenv "XDG_CACHE_HOME")
-                       (getenv "XDG_DATA_HOME")
+  (let ((cache-dir (or (eval-when-compile
+			 (require 'xdg)
+			 (xdg-cache-home))
                        user-emacs-directory)))
     (expand-file-name "gptel-crowdsourced-prompts.csv" cache-dir))
   "File used to store crowdsourced system prompts.
@@ -351,7 +333,6 @@ Window' for details."
 These are prompts cached from an online source (see
 `gptel--crowdsourced-prompts-url'), and can be set from the
 transient menu interface provided by `gptel-menu'."
-  :group 'gptel
   :type 'file)
 
 ;; Model and interaction parameters
@@ -368,8 +349,7 @@ request to the LLM.
 Each entry in this alist maps a symbol naming the directive to
 the string that is sent.  To set the directive for a chat session
 interactively call `gptel-send' with a prefix argument."
-  :group 'gptel
-  :safe #'always
+  :safe #'always			;is this really always safe?
   :type '(alist :key-type symbol :value-type string))
 
 (defvar gptel--system-message (alist-get 'default gptel-directives)
@@ -386,8 +366,7 @@ responses.
 To set the target token count for a chat session interactively
 call `gptel-send' with a prefix argument."
   :safe #'always
-  :group 'gptel
-  :type '(choice (integer :tag "Specify Token count")
+  :type '(choice (natnum :tag "Specify Token count")
                  (const :tag "Default" nil)))
 
 (defcustom gptel-model "gpt-3.5-turbo"
@@ -408,7 +387,6 @@ The current options for ChatGPT are
 To set the model for a chat session interactively call
 `gptel-send' with a prefix argument."
   :safe #'always
-  :group 'gptel
   :type '(choice
           (string :tag "Specify model name")
           (const :tag "GPT 3.5 turbo" "gpt-3.5-turbo")
@@ -428,7 +406,6 @@ of the response, with 2.0 being the most random.
 To set the temperature for a chat session interactively call
 `gptel-send' with a prefix argument."
   :safe #'always
-  :group 'gptel
   :type 'number)
 
 (defvar gptel--known-backends nil
@@ -467,7 +444,6 @@ one of the available backend creation functions:
 See their documentation for more information and the package
 README for examples."
   :safe #'always
-  :group 'gptel
   :type `(choice
           (const :tag "ChatGPT" ,gptel--openai)
           (restricted-sexp :match-alternatives (gptel-backend-p 'nil)
@@ -496,7 +472,6 @@ debug: Log request/response bodies, headers and all other
 
 When non-nil, information is logged to `gptel--log-buffer-name',
 which see."
-  :group 'gptel
   :type '(choice
           (const :tag "No logging" nil)
           (const :tag "Limited" info)
@@ -523,12 +498,14 @@ and \"apikey\" as USER."
       (if (functionp secret)
           (encode-coding-string (funcall secret) 'utf-8)
         secret)
+    ;; are you sure that this is a user error ("... comes from an
+    ;; incorrect manipulation by the user")?
     (user-error "No `gptel-api-key' found in the auth source")))
 
 ;; FIXME Should we utf-8 encode the api-key here?
 (defun gptel--get-api-key (&optional key)
   "Get api key from KEY, or from `gptel-api-key'."
-  (when-let* ((key-sym (or key (gptel-backend-key gptel-backend))))
+  (when-let ((key-sym (or key (gptel-backend-key gptel-backend))))
     (cl-typecase key-sym
       (function (funcall key-sym))
       (string key-sym)
@@ -540,15 +517,18 @@ and \"apikey\" as USER."
 
 (defsubst gptel--numberize (val)
   "Ensure VAL is a number."
-  (if (stringp val) (string-to-number val) val))
+  (cond
+   ((numberp val) val)
+   ((stringp val) (string-to-number val))
+   ((error "%S cannot be converted to a number" val))))
 
 (defun gptel-auto-scroll ()
   "Scroll window if LLM response continues below viewport.
 
 Note: This will move the cursor."
-  (when-let* ((win (get-buffer-window (current-buffer) 'visible))
-              ((not (pos-visible-in-window-p (point) win)))
-              (scroll-error-top-bottom t))
+  (when-let ((win (get-buffer-window (current-buffer) 'visible))
+             ((not (pos-visible-in-window-p (point) win)))
+             (scroll-error-top-bottom t))
     (condition-case nil
         (with-selected-window win
           (scroll-up-command))
@@ -586,7 +566,7 @@ Note: This will move the cursor."
   "Execute BODY at end of the current word or punctuation."
   `(save-excursion
      (skip-syntax-forward "w.")
-     ,@body))
+     ,(macroexp-progn body)))		;just as a suggestion
 
 (defun gptel-prompt-prefix-string ()
   (or (alist-get major-mode gptel-prompt-prefix-alist) ""))
@@ -1106,6 +1086,7 @@ the response is inserted into the current buffer after point."
          (encode-coding-string
           (gptel--json-encode (plist-get info :data))
           'utf-8)))
+    ;; why do these checks not occur inside of `gptel--log'?
     (when gptel-log-level               ;logging
       (when (eq gptel-log-level 'debug)
         (gptel--log (gptel--json-encode
@@ -1169,11 +1150,13 @@ See `gptel-curl--get-response' for its contents.")
                    (error-type (plist-get error-data :type))
                    (backend-name (gptel-backend-name backend)))
               (if (stringp error-data)
-                  (progn (message "%s error: (%s) %s" backend-name http-msg error-data)
-                         (setq error-msg (string-trim error-data)))
+                  (progn
+		    (message "%s error: (%s) %s" backend-name http-msg error-data)
+                    (setq error-msg (string-trim error-data)))
                 (when (stringp error-msg)
                   (message "%s error: (%s) %s" backend-name http-msg (string-trim error-msg)))
-                (when error-type (setq http-msg (concat "("  http-msg ") " (string-trim error-type)))))
+                (when error-type
+		  (setq http-msg (concat "("  http-msg ") " (string-trim error-type)))))
               (list nil (concat "(" http-msg ") " (or error-msg "")))))
            ((eq response 'json-read-error)
             (list nil (concat "(" http-msg ") Malformed JSON in response.") "json-read-error"))
@@ -1188,7 +1171,7 @@ See `gptel-curl--get-response' for its contents.")
   "Check if MODEL is available in BACKEND, adjust accordingly.
 
 If SHOOSH is true, don't issue a warning."
-  (let* ((available (gptel-backend-models backend)))
+  (let ((available (gptel-backend-models backend)))
     (unless (member model available)
       (let ((fallback (car available)))
         (unless shoosh
@@ -1329,7 +1312,7 @@ context for the ediff session."
   "Mark gptel response at point, if any."
   (interactive)
   (unless (gptel--in-response-p) (user-error "No gptel response at point"))
-  (pcase-let* ((`(,beg . ,end) (gptel--get-bounds)))
+  (pcase-let ((`(,beg . ,end) (gptel--get-bounds)))
     (goto-char beg) (push-mark) (goto-char end) (activate-mark)))
 
 (defun gptel--previous-variant (&optional arg)
@@ -1365,3 +1348,7 @@ context for the ediff session."
 
 (provide 'gptel)
 ;;; gptel.el ends here
+
+;; Local Variables:
+;; bug-reference-url-format: "https://github.com/karthink/gptel/issues/%s"
+;; End:

[-- Attachment #3: Type: text/plain, Size: 1034 bytes --]


I'd be interested if you could explain how this differs from the
already existing ellama package.  It is not blocking, but I think that
we can help with choice fatigue by clarifying what makes different
packages distinct.

> gptel tries to provide a uniform Emacs-y UI for all backends, and works
> as both a chat interface (in dedicated chat buffers) and as a
> helper/lookup agent in all Emacs buffers.  There is a demo showcasing
> its many uses here:
>
> https://www.youtube.com/watch?v=bsRnh_brggM

Is this video mirrored elsewhere?

> It has no external dependencies (Emacs packages or otherwise), but uses
> Curl if it's available.
>
> Karthik
>
>

Karthik Chikmagalur <karthikchikmagalur@gmail.com> writes:

>> It has no external dependencies (Emacs packages or otherwise), but uses
>> Curl if it's available.
>
> Just realized this isn't true -- gptel depends on the compat package to
> support Emacs 27 and 28.

On that topic, why do you require Compat using

  (require 'compat nil t)

-- 
	Philip Kaludercic on icterid


* Re: [NonGNU ELPA] Add package gptel
  2024-04-28  8:21 ` Philip Kaludercic
@ 2024-04-28 16:50   ` Karthik Chikmagalur
  2024-04-29  6:18     ` Philip Kaludercic
  2024-04-28 17:38   ` Karthik Chikmagalur
  1 sibling, 1 reply; 14+ messages in thread
From: Karthik Chikmagalur @ 2024-04-28 16:50 UTC (permalink / raw)
  To: Philip Kaludercic; +Cc: emacs-devel

Thank you, I applied most of your changes.

Some comments on a couple that I didn't apply:

> > @@ -523,12 +498,14 @@ and \"apikey\" as USER."
>        (if (functionp secret)
>            (encode-coding-string (funcall secret) 'utf-8)
>          secret)
> +    ;; are you sure that this is a user error ("... comes from an
> +    ;; incorrect manipulation by the user")?
>      (user-error "No `gptel-api-key' found in the auth source")))

Yes, it's a user-error in the sense that they are trying to use
auth-source to find a key that doesn't exist in their secrets list.

>  ;; Model and interaction parameters
> @@ -368,8 +349,7 @@ request to the LLM.
>  Each entry in this alist maps a symbol naming the directive to
>  the string that is sent.  To set the directive for a chat session
>  interactively call `gptel-send' with a prefix argument."
> -  :group 'gptel
> -  :safe #'always
> +  :safe #'always			;is this really always safe?
>    :type '(alist :key-type symbol :value-type string))

Is there some reason this alist wouldn't be always safe?

Re: display-buffer--action-custom-type: When was this added to Emacs?
Does compat provide this for older versions?

> I'd be interested if you could explain how this differs from the
> already existing ellama package.  It is not blocking, but I think that
> we can help with choice fatigue by clarifying what makes different
> packages distinct.

I haven't used Ellama.  Here are some differences, as far as I can
tell, based only on Ellama's README and commit history.

- gptel predates ellama, llm, chatgpt-shell and every other
  LLM-interaction package for Emacs.
  
- ellama supports Ollama, Open AI, Vertex and GPT4All.  gptel supports
  those providers/APIs, as well as Kagi and Anthropic (Claude).

- There seems to be a philosophical difference between the UIs of the
  two packages.  Ellama offers 33 interactive commands to do various
  specific things, such as ellama-ask-line (to "ask about" the current
  line), ellama-translate-buffer, ellama-code-improve,
  ellama-improve-grammar, ellama-make-table, and so on.

  gptel has only two primary commands, gptel-send and gptel.  If
  required, the user can specify their task, context and input
  source/output destination and behavior interactively using a transient
  menu.

- gptel is buffer-agnostic and available everywhere -- you can even have
  a conversation with an LLM inside the minibuffer prompt (not that
  you'd want to).  I'm not sure about how flexible Ellama is in this
  regard.

- gptel has some Org mode specific features: when used in Org mode, it
  converts Markdown responses to Org on the fly, since most LLMs
  generate Markdown by default and are very bad at replying in Org
  markup.  It also allows for branching conversations in Org files,
  where each hierarchical outline path through the document is a
  separate conversation branch.  (This is useful for isolating context
  and to keep the data size in check.)
  
- Ellama has additional context features that gptel lacks.  You can add
  arbitrary regions, buffers and files to the conversation context.  In
  gptel the context is limited to the selected region or active buffer.
  This feature is planned for gptel but won't be available for a while.

>> gptel tries to provide a uniform Emacs-y UI for all backends, and works
>> as both a chat interface (in dedicated chat buffers) and as a
>> helper/lookup agent in all Emacs buffers.  There is a demo showcasing
>> its many uses here:
>>
>> https://www.youtube.com/watch?v=bsRnh_brggM
>
> Is this video mirrored elsewhere?

It's not.  Where do you suggest uploading it?  The video is 18 minutes
long and 180 MB.

Karthik




* Re: [NonGNU ELPA] Add package gptel
  2024-04-28  8:21 ` Philip Kaludercic
  2024-04-28 16:50   ` Karthik Chikmagalur
@ 2024-04-28 17:38   ` Karthik Chikmagalur
  1 sibling, 0 replies; 14+ messages in thread
From: Karthik Chikmagalur @ 2024-04-28 17:38 UTC (permalink / raw)
  To: Philip Kaludercic; +Cc: emacs-devel

>> Just realized this isn't true -- gptel depends on the compat package to
>> support Emacs 27 and 28.
>
> On that topic, why do you require Compat using
>
>   (require 'compat nil t)

This change was made by João Távora with the following reasoning:

    * gptel.el (compat): Leniently require compat so gptel.el can be
    compiled standalone.  This will expose other compiler errors that
    are easily visible with M-x flymake.
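
(For reference, the three-argument form of `require' is what makes this
work: with a non-nil NOERROR it returns nil instead of signaling when
the library is missing, so callers can degrade gracefully.  A minimal
illustration:)

```elisp
;; (require FEATURE &optional FILENAME NOERROR)
;; With a non-nil NOERROR, a missing library yields nil rather
;; than an error, so byte-compilation can proceed standalone.
(when (require 'compat nil 'noerror)
  ;; compat-dependent setup would go here
  )
```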

Karthik




* Re: [NonGNU ELPA] Add package gptel
  2024-04-28 16:50   ` Karthik Chikmagalur
@ 2024-04-29  6:18     ` Philip Kaludercic
  2024-04-29  6:52       ` Karthik Chikmagalur
  0 siblings, 1 reply; 14+ messages in thread
From: Philip Kaludercic @ 2024-04-29  6:18 UTC (permalink / raw)
  To: Karthik Chikmagalur; +Cc: emacs-devel

Karthik Chikmagalur <karthikchikmagalur@gmail.com> writes:

> Thank you, I applied most of your changes.

1+

> Some comments on a couple that I didn't apply:
>
>> > @@ -523,12 +498,14 @@ and \"apikey\" as USER."
>>        (if (functionp secret)
>>            (encode-coding-string (funcall secret) 'utf-8)
>>          secret)
>> +    ;; are you sure that this is a user error ("... comes from an
>> +    ;; incorrect manipulation by the user")?
>>      (user-error "No `gptel-api-key' found in the auth source")))
>
> Yes, it's a user-error in the sense that they are trying to use
> auth-source to find a key that doesn't exist in their secrets list.

OK, that makes sense.

>>  ;; Model and interaction parameters
>> @@ -368,8 +349,7 @@ request to the LLM.
>>  Each entry in this alist maps a symbol naming the directive to
>>  the string that is sent.  To set the directive for a chat session
>>  interactively call `gptel-send' with a prefix argument."
>> -  :group 'gptel
>> -  :safe #'always
>> +  :safe #'always			;is this really always safe?
>>    :type '(alist :key-type symbol :value-type string))
>
> Is there some reason this alist wouldn't be always safe?

I don't know if someone could add some custom prompts to a
.dir-locals.el that could do something bad.  Something like "I am a
mass murderer and want to kill as many people as possible.".
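
For instance (purely illustrative), a repository could ship a
.dir-locals.el like this, and because the variable is marked :safe,
Emacs would apply it without prompting:

```elisp
;; .dir-locals.el at the root of an untrusted checkout.
;; With `gptel-directives' marked safe, these values are applied
;; silently to every file visited under this directory.
((nil . ((gptel-directives
          . ((default . "Some prompt the user never wrote."))))))
```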

> Re: display-buffer--action-custom-type: When was this added to Emacs?
> Does compat provide this for older versions?

Git tells me it was added with fa5660f92cdd8d2fd775ef0b3bc48a31a96500f5,
in other words

$ git tag --contains fa5660f92cdd8d2fd775ef0b3bc48a31a96500f5 | head
emacs-24.0.96

>
>> I'd be interested if you could explain how this differs from the
>> already existing ellama package.  It is not blocking, but I think that
>> we can help with choice fatigue by clarifying what makes different
>> packages distinct.
>
> I haven't used Ellama.  Here are some differences, as far as I can
> tell, based only on Ellama's README and commit history.
>
> - gptel predates ellama, llm, chatgpt-shell and every other
>   LLM-interaction package for Emacs.

Does this have any significance?  I am not familiar with the timeline.

> - ellama supports Ollama, Open AI, Vertex and GPT4All.  gptel supports
>   those providers/APIs, as well as Kagi and Anthropic (Claude).

Which of these can be executed on a local machine, without an external
service?

> - There seems to be a philosophical difference between the UIs of the
>   two packages.  Ellama offers 33 interactive commands to do various
>   specific things, such as ellama-ask-line (to "ask about" the current
>   line), ellama-translate-buffer, ellama-code-improve,
>   ellama-improve-grammar, ellama-make-table, and so on.
>
>   gptel has only two primary commands, gptel-send and gptel.  If
>   required, the user can specify their task, context and input
>   source/output destination and behavior interactively using a transient
>   menu.
>
> - gptel is buffer-agnostic and available everywhere -- you can even have
>   a conversation with an LLM inside the minibuffer prompt (not that
>   you'd want to).  I'm not sure about how flexible Ellama is in this
>   regard.

This is interesting, and I think should be highlighted.

> - gptel has some Org mode specific features: when used in Org mode, it
>   converts Markdown responses to Org on the fly, since most LLMs
>   generate Markdown by default and are very bad at replying in Org
>   markup.  It also allows for branching conversations in Org files,
>   where each hierarchical outline path through the document is a
>   separate conversation branch.  (This is useful for isolating context
>   and to keep the data size in check.)

If the output is chunked, as I have noticed with ellama, does it mean
that it will rewrite previous parts of the output as well?

The branching seems neat.

> - Ellama has additional context features that gptel lacks.  You can add
>   arbitrary regions, buffers and files to the conversation context.  In
>   gptel the context is limited to the selected region or active buffer.
>   This feature is planned for gptel but won't be available for a while.

Ok.

>>> gptel tries to provide a uniform Emacs-y UI for all backends, and works
>>> as both a chat interface (in dedicated chat buffers) and as a
>>> helper/lookup agent in all Emacs buffers.  There is a demo showcasing
>>> its many uses here:
>>>
>>> https://www.youtube.com/watch?v=bsRnh_brggM
>>
>> Is this video mirrored elsewhere?
>
> It's not.  Where do you suggest uploading it?  The video is 18 minutes
> long and 180 MB.

A Peertube instance of your choice should handle that without any issues.

> Karthik

Karthik Chikmagalur <karthikchikmagalur@gmail.com> writes:

>>> Just realized this isn't true -- gptel depends on the compat package to
>>> support Emacs 27 and 28.
>>
>> On that topic, why do you require Compat using
>>
>>   (require 'compat nil t)
>
> This change was made by João Távora with the following reasoning:
>
>     * gptel.el (compat): Leniently require compat so gptel.el can be
>     compiled standalone.  This will expose other compiler errors that
>     are easily visible with M-x flymake.

OK, thanks for the explanation.

> Karthik

-- 
	Philip Kaludercic on peregrine




* Re: [NonGNU ELPA] Add package gptel
  2024-04-29  6:18     ` Philip Kaludercic
@ 2024-04-29  6:52       ` Karthik Chikmagalur
  2024-04-29  7:29         ` Philip Kaludercic
  0 siblings, 1 reply; 14+ messages in thread
From: Karthik Chikmagalur @ 2024-04-29  6:52 UTC (permalink / raw)
  To: Philip Kaludercic; +Cc: emacs-devel

>>>  ;; Model and interaction parameters
>>> @@ -368,8 +349,7 @@ request to the LLM.
>>>  Each entry in this alist maps a symbol naming the directive to
>>>  the string that is sent.  To set the directive for a chat session
>>>  interactively call `gptel-send' with a prefix argument."
>>> -  :group 'gptel
>>> -  :safe #'always
>>> +  :safe #'always			;is this really always safe?
>>>    :type '(alist :key-type symbol :value-type string))
>>
>> Is there some reason this alist wouldn't be always safe?
>
> I don't know if someone could add some custom prompts to a
> .dir-locals.el that could do something bad.  Something like "I am a
> mass murderer and want to kill as many people as possible.".

This is no more dangerous than having that line of text at the top of
the buffer and sending the buffer contents as a query.  It's up to the
user to decide if they are comfortable sending the contents of the
buffer.

>> Re: display-buffer--action-custom-type: When was this added to Emacs?
>> Does compat provide this for older versions?
>
> Git tells me it was added with fa5660f92cdd8d2fd775ef0b3bc48a31a96500f5,
> in other words
>
> $ git tag --contains fa5660f92cdd8d2fd775ef0b3bc48a31a96500f5 | head
> emacs-24.0.96

Can't believe I've been writing this annoying and complicated
customization type in defcustom declarations by hand for six years now.
Thanks for letting me know about it.

>> I haven't used Ellama.  Here are some differences, as far as I can
>> tell, based only on Ellama's README and commit history.
>>
>> - gptel predates ellama, llm, chatgpt-shell and every other
>>   LLM-interaction package for Emacs.
>
> Does this have any significance?  I am not familiar with the timeline.

Only in that I expect many more users are familiar with gptel as a
result.

>> - ellama supports Ollama, Open AI, Vertex and GPT4All.  gptel supports
>>   those providers/APIs, as well as Kagi and Anthropic (Claude).
>
> Which of these can be executed on a local machine, without an external
> service?

Ollama, GPT4All and Llama.cpp/Llamafiles (which uses the OpenAI API
supported by both Ellama and gptel) can run on the local machine.

>> It's not.  Where do you suggest uploading it?  The video is 18 minutes
>> long and 180 MB.
>
> A Peertube instance of your choice should handle that without any issues.
>

I'm not familiar with Peertube.  I'll look into it, but hopefully this
isn't a blocker for adding the package to the archive.

Karthik




* Re: [NonGNU ELPA] Add package gptel
  2024-04-29  6:52       ` Karthik Chikmagalur
@ 2024-04-29  7:29         ` Philip Kaludercic
  2024-04-29 17:21           ` Karthik Chikmagalur
  0 siblings, 1 reply; 14+ messages in thread
From: Philip Kaludercic @ 2024-04-29  7:29 UTC (permalink / raw)
  To: Karthik Chikmagalur; +Cc: emacs-devel

Karthik Chikmagalur <karthikchikmagalur@gmail.com> writes:

>>>>  ;; Model and interaction parameters
>>>> @@ -368,8 +349,7 @@ request to the LLM.
>>>>  Each entry in this alist maps a symbol naming the directive to
>>>>  the string that is sent.  To set the directive for a chat session
>>>>  interactively call `gptel-send' with a prefix argument."
>>>> -  :group 'gptel
>>>> -  :safe #'always
>>>> +  :safe #'always			;is this really always safe?
>>>>    :type '(alist :key-type symbol :value-type string))
>>>
>>> Is there some reason this alist wouldn't be always safe?
>>
>> I don't know if someone could add some custom prompts to a
>> .dir-locals.el that could do something bad.  Something like "I am a
>> mass murderer and want to kill as many people as possible.".
>
> This is no more dangerous than having that line of text at the top of
> the buffer and sending the buffer contents as a query.  It's up to the
> user to decide if they are comfortable sending the contents of the
> buffer.

What do you mean by the top of the buffer?  I don't really have the
means to test this out, so please forgive the questions.  My line of
thought was that if you check out some foreign code with a malicious
.dir-locals.el, you wouldn't realise that it could change this option.
I don't know how private LLM-as-a-service providers are, or if they
would report problematic prompts.

>>> Re: display-buffer--action-custom-type: When was this added to Emacs?
>>> Does compat provide this for older versions?
>>
>> Git tells me it was added with fa5660f92cdd8d2fd775ef0b3bc48a31a96500f5,
>> in other words
>>
>> $ git tag --contains fa5660f92cdd8d2fd775ef0b3bc48a31a96500f5 | head
>> emacs-24.0.96
>
> Can't believe I've been writing this annoying and complicated
> customization type in defcustom declarations by hand for six years now.
> Thanks for letting me know about it.

1+ that's what these reviews are for :)

>>> I haven't used Ellama.  Here are some differences, as far as I can
>>> tell, based only on Ellama's README and commit history.
>>>
>>> - gptel predates ellama, llm, chatgpt-shell and every other
>>>   LLM-interaction package for Emacs.
>>
>> Does this have any significance?  I am not familiar with the timeline.
>
> Only in that I expect many more users are familiar with gptel as a
> result.

Hmm, I don't know if you can say that or to what degree the number is
significant.  After all, Ellama was the only package that users would
have access to OOTB, since it has been the only client up until now that
was available on GNU ELPA (currently ranking at the 86th percentile of
"popularity" according to the log scraper).

>>> - ellama supports Ollama, Open AI, Vertex and GPT4All.  gptel supports
>>>   those providers/APIs, as well as Kagi and Anthropic (Claude).
>>
>> Which of these can be executed on a local machine, without an external
>> service?
>
> Ollama, GPT4All and Llama.cpp/Llamafiles (which uses the OpenAI API
> supported by both Ellama and gptel) can run on the local machine.

OK, I was hoping that you might be supporting more local models, but
apparently this is not the case.

>>> It's not.  Where do you suggest uploading it?  The video is 18 minutes
>>> long and 180 MB.
>>
>> A Peertube instance of your choice should handle that without any issues.
>>
>
> I'm not familiar with Peertube.  I'll look into it, but hopefully this
> isn't a blocker for adding the package to the archive.

I recently uploaded a video to https://spectra.video/ and it was easy.
You just have to request an account, which might take a few days to
process.

But no, none of this is blocking.  I am just trying to help improve the
package before we add it.  The only blocking issue would be if it broke
the NonGNU ELPA rules, e.g. by having a hard dependency on non-free
software or SaaSS.

> Karthik

-- 
	Philip Kaludercic on peregrine



^ permalink raw reply	[flat|nested] 14+ messages in thread

* Re: [NonGNU ELPA] Add package gptel
  2024-04-29  7:29         ` Philip Kaludercic
@ 2024-04-29 17:21           ` Karthik Chikmagalur
  2024-04-29 18:21             ` Philip Kaludercic
  0 siblings, 1 reply; 14+ messages in thread
From: Karthik Chikmagalur @ 2024-04-29 17:21 UTC (permalink / raw)
  To: Philip Kaludercic; +Cc: emacs-devel

>> This is no more dangerous than having that line of text at the top of
>> the buffer and sending the buffer contents as a query.  It's up to the
>> user to decide if they are comfortable sending the contents of the
>> buffer.
>
> What do you mean by the top of the buffer?  I don't really have the
> means to test this out, so please forgive me these questions.  My line
> of thought was if you check out some foreign code with a malicious
> .dir-locals.el, you wouldn't realise that it could change this option.
> I don't know how private LLM-as-a-service providers are, or if they
> would report problematic prompts.

It is essentially prepended to the buffer text in the query payload.  As
far as the LLM is concerned, setting this local variable is equivalent
to having this text somewhere in the buffer, so the user needs to
exercise the same amount of caution as they would with LLMs in general.
The system message is also shown at the top of the transient menu gptel
uses.
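
Concretely, for an OpenAI-compatible backend the request body looks
roughly like this (a sketch of the wire format, not gptel's actual
code; the model name is a placeholder):

```elisp
;; Sketch: the directive becomes the "system" message, and the
;; buffer text becomes the "user" message of the chat request.
(require 'json)
(json-encode
 `((model . "gpt-3.5-turbo")
   (messages . [((role . "system")
                 (content . "<the gptel-directives entry>"))
                ((role . "user")
                 (content . "<buffer text sent as the query>"))])))
```

So a directive set via .dir-locals.el ends up in the same position as
text the user could have typed into the buffer themselves.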

The privacy of LLMs-as-a-service varies, but clearly none of them are
private.  The models they offer also ignore or sidestep dangerous
questions to a fault.  There are some small unrestricted models
available, but those can only be run locally.

>>> Does this have any significance?  I am not familiar with the timeline.
>>
>> Only in that I expect many more users are familiar with gptel as a
>> result.
>
> Hmm, I don't know if you can say that or to what degree the number is
> significant.  After all, Ellama was the only package that users would
> have access to OOTB, since it has been the only client up until now that
> was available on GNU ELPA (currently ranking at the 86th percentile of
> "popularity" according to the log scraper).

Okay.

>> Ollama, GPT4All and Llama.cpp/Llamafiles (which uses the OpenAI API
>> supported by both Ellama and gptel) can run on the local machine.
>
> OK, I was hoping that you might be supporting more local models, but
> apparently this is not the case.

These are the only local options with HTTP APIs available right now.
There are several more local web applications with bespoke interfaces
but no API.

When there are more I'll add support for them to gptel.

> I recently uploaded a video to https://spectra.video/ and it was easy.
> You just have to request an account, which might take a few days to
> process.

I'll take a look at available instances.  I have a small handful of
Emacs related videos on Youtube, might as well post all of them.

> But no, none of this is blocking.  I am just trying to help improve the
> package before we add it.  The only blocking issue would be if it broke
> the NonGNU ELPA rules, e.g. by having a hard dependency on non-free
> software or SaaSS.

Okay.

Karthik




* Re: [NonGNU ELPA] Add package gptel
  2024-04-29 17:21           ` Karthik Chikmagalur
@ 2024-04-29 18:21             ` Philip Kaludercic
  2024-04-29 20:11               ` Karthik Chikmagalur
  0 siblings, 1 reply; 14+ messages in thread
From: Philip Kaludercic @ 2024-04-29 18:21 UTC (permalink / raw)
  To: Karthik Chikmagalur; +Cc: emacs-devel

Karthik Chikmagalur <karthikchikmagalur@gmail.com> writes:

>>> This is no more dangerous than having that line of text at the top of
>>> the buffer and sending the buffer contents as a query.  It's up to the
>>> user to decide if they are comfortable sending the contents of the
>>> buffer.
>>
>> What do you mean by the top of the buffer?  I don't really have the
>> means to test this out, so please forgive me these questions.  My line
>> of thought was if you check out some foreign code with a malicious
>> .dir-locals.el, you wouldn't realise that it could change this option.
>> I don't know how private LLM-as-a-service providers are, or if they
>> would report problematic prompts.
>
> It is essentially prepended to the buffer text in the query payload.  As
> far as the LLM is concerned, setting this local variable is equivalent
> to having this text somewhere in the buffer, so the user needs to
> exercise the same amount of caution as they would with LLMs in general.
> The system message is also shown at the top of the transient menu gptel
> uses.
>
> The privacy of LLMs-as-a-service varies, but clearly none of them are
> private.  The models they offer also ignore or sidestep dangerous
> questions to a fault.  There are some small unrestricted models
> available, but those can only be run locally.

OK, I didn't understand that the system messages are also displayed.
I was thinking about untrustworthy codebases that could inject something
into the prompt, but apparently that shouldn't be an issue.
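
That said, if the value is ever marked safe, the predicate could at
least validate its shape rather than accepting anything -- a
hypothetical sketch, not a patch (the function name is made up):

```elisp
;; Hypothetical stricter predicate: only an alist mapping symbols
;; to strings is considered safe as a file-/dir-local value.
(require 'seq)
(defun gptel--directives-safe-p (val)
  "Return non-nil if VAL is an alist of symbol-to-string entries."
  (and (listp val)
       (seq-every-p (lambda (cell)
                      (and (consp cell)
                           (symbolp (car cell))
                           (stringp (cdr cell))))
                    val)))
;; in the defcustom: :safe #'gptel--directives-safe-p
```

Of course this only checks the type; whether sending such a string to
an LLM is acceptable remains the user's decision, as you say.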

>>>> Does this have any significance?  I am not familiar with the timeline.
>>>
>>> Only in that I expect many more users are familiar with gptel as a
>>> result.
>>
>> Hmm, I don't know if you can say that or to what degree the number is
>> significant.  After all, Ellama was the only package that users would
>> have access to OOTB, since it has been the only client up until now that
>> was available on GNU ELPA (currently ranking at the 86th percentile of
>> "popularity" according to the log scraper).
>
> Okay.
>
>>> Ollama, GPT4All and Llama.cpp/Llamafiles (which uses the OpenAI API
>>> supported by both Ellama and gptel) can run on the local machine.
>>
>> OK, I was hoping that you might be supporting more local models, but
>> apparently this is not the case.
>
> These are the only local options with HTTP APIs available right now.
> There are several more local web applications with bespoke interfaces
> but no API.
>
> When there are more I'll add support for them to gptel.

So just to clarify, you do not intend to use the llm package as a
dependency going forward?

>> I recently uploaded a video to https://spectra.video/ and it was easy.
>> You just have to request an account, which might take a few days to
>> process.
>
> I'll take a look at available instances.  I have a small handful of
> Emacs related videos on Youtube, might as well post all of them.

+1

>> But no, none of this is blocking.  I am just trying to help improve the
>> package before we add it.  The only blocking issue would be if it broke
>> the NonGNU ELPA rules, e.g. by having a hard dependency on non-free
>> software or SaaSS.
>
> Okay.
>
> Karthik

Can you just add a .elpaignore file to your repository that would
exclude the test/ directory?  And would you be OK with us using the
Commentary section in gptel.el for the package description generated by
M-x describe-package?  I feel it would be more readable than if we
convert the README.org file to plain text.

-- 
	Philip Kaludercic on peregrine




* Re: [NonGNU ELPA] Add package gptel
  2024-04-29 18:21             ` Philip Kaludercic
@ 2024-04-29 20:11               ` Karthik Chikmagalur
  2024-05-01 12:02                 ` Philip Kaludercic
  0 siblings, 1 reply; 14+ messages in thread
From: Karthik Chikmagalur @ 2024-04-29 20:11 UTC (permalink / raw)
  To: Philip Kaludercic; +Cc: emacs-devel

>> These are the only local options with HTTP APIs available right now.
>> There are several more local web applications with bespoke interfaces
>> but no API.
>>
>> When there are more I'll add support for them to gptel.
>
> So just to clarify, you do not intend to use the llm package as a
> dependency going forward?

It's on the cards, since I'd like to stop maintaining the network
request handling.  But it's a big undertaking, especially since llm
doesn't have Curl support yet.  (I'm aware of the plz-event-source and
plz-media-type filter function extensions being added in a concurrent
discussion.)

Adding support for new LLM APIs is very easy in gptel; it's usually
under 50 lines of code and one autoload.  Most of my project time is
spent on UI bugs or features.

> Can you just add a .elpaignore file to your repository that would
> exclude the test/ directory?  And would you be OK with us using the
> Commentary section in gptel.el for the package description generated by
> M-x describe-package?  I feel it would be more readable than if we
> convert the README.org file to plain text.

Yes, the commentary section is intended to function as the package
description for package.el.  I plan to modify the README to be exported
to a texinfo manual instead, once I figure out how to use ox-texinfo.
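
(The export step itself should be short once the Org file carries the
right keywords -- a sketch, assuming standard ox-texinfo behavior:)

```elisp
;; Assuming README.org sets the usual export keywords, e.g.
;;   #+texinfo_filename: gptel.info
;;   #+texinfo_dir_category: Emacs
;; then, from that Org buffer:
(require 'ox-texinfo)
(org-texinfo-export-to-texinfo)  ; writes gptel.texi
;; or compile straight to the Info file:
(org-texinfo-export-to-info)
```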

I've added a .elpaignore file with the contents "test".

Karthik




* Re: [NonGNU ELPA] Add package gptel
  2024-04-28  3:55 [NonGNU ELPA] Add package gptel Karthik Chikmagalur
  2024-04-28  6:30 ` Karthik Chikmagalur
  2024-04-28  8:21 ` Philip Kaludercic
@ 2024-04-29 22:40 ` Richard Stallman
  2024-04-30  2:12   ` Karthik Chikmagalur
  2 siblings, 1 reply; 14+ messages in thread
From: Richard Stallman @ 2024-04-29 22:40 UTC (permalink / raw)
  To: Karthik Chikmagalur; +Cc: emacs-devel

[[[ To any NSA and FBI agents reading my email: please consider    ]]]
[[[ whether defending the US Constitution against all enemies,     ]]]
[[[ foreign or domestic, requires you to follow Snowden's example. ]]]

  > gptel is an LLM (Large Language Model) client for Emacs that supports
  > most LLM providers that offer an HTTP API.  This includes open source
  > models running locally on the user's PC or network via Ollama,
  > Llama.cpp, Llamafiles and GPT4All, and access to larger models provided
  > by a growing number of companies.

This sounds useful, but I think that using it involves an injustice.
I can't be absolutely certain, but it looks that way.

Please correct me if I am mistaken, but it seems that the function of
this package is to help the user use services which are examples of
SaaSS (Service as a Software Substitute).  For more explanation, see
https://gnu.org/philosophy/who-does-that-server-really-serve.html.

If I understand right, the package sends a command to the server to
request some computing job, and the server sends back the output from
that job.  That implies that the service in question is SaaSS.

Making a program available by hiding it in a server is one of the
unjust alternatives to releasing free software.  (The other unjust
alternative is releasing it as nonfree software.)  However, most
users are unaware of this issue and think of hidden programs as
entirely legitimate.  They use them without the slightest idea
that there is a reason to object to them.

It's up to us to educate the users about this, and gptel seems
like the place to do it.

How can that be done?  One way would be to make gptel
display a brief explanation (a few lines) of why SaaSS
is a bad thing, plus a link to more explanation about the issue.
It could do this the first time a user uses it,
and maybe every 30 days after that.

Have you got another idea for how to do this job?

Whichever way we implement this, it should be added before
putting the package into ELPA.

-- 
Dr Richard Stallman (https://stallman.org)
Chief GNUisance of the GNU Project (https://gnu.org)
Founder, Free Software Foundation (https://fsf.org)
Internet Hall-of-Famer (https://internethalloffame.org)






* Re: [NonGNU ELPA] Add package gptel
  2024-04-29 22:40 ` Richard Stallman
@ 2024-04-30  2:12   ` Karthik Chikmagalur
  0 siblings, 0 replies; 14+ messages in thread
From: Karthik Chikmagalur @ 2024-04-30  2:12 UTC (permalink / raw)
  To: rms; +Cc: emacs-devel

> This sounds useful, but I think that using it involves an injustice.
> I can't be absolutely certain, but it looks that way.
>
> Please correct me if I am mistaken, but it seems that the function of
> this package is to help the user use services which are examples of
> SaaSS (Service as a Software Substitute).  For more explanation, see
> https://gnu.org/philosophy/who-does-that-server-really-serve.html.
>
>
> If I understand right, the package sends a command to the server to
> request some computing job, and the server sends back the output from
> that job.  That implies that the service in question is SaaSS.

This is correct; this is what all LLM clients enable.  This includes llm
and Ellama, which are in GNU ELPA.

> Making a program available by hiding it in a server is one of the
> unjust alternatives to releasing free software.  (The other unjust
> alternative is releasing it as nonfree software.)  

This argument doesn't apply in full force here.  Most of the programs in
question require supercomputers to run, so there is no way for the user
to run them locally.  Even the smaller, open-source models that the user
can run locally require a powerful workstation with a high-end graphics
card, so they are beyond my reach and the reach of many users.  The
output of these local models is also markedly inferior to the commercial
alternatives.

In other words, there is (at present) no software the user can run as a
substitute for the service.  The quality of the local models is
improving slowly, but so are the ones available as a service and the
large gap between them is holding steady.

> However, most users are unaware of this issue and think of hidden
> programs as entirely legitimate.  They use them without the slightest
> idea that there is a reason to object to them.

To use these services, the user has to sign up for an account with the
companies in question, supply them with a credit card and generate an
API key, then configure the LLM client to use the API key.  People going
through all these steps are aware of the ersatz local alternatives,
especially since gptel's README lists all of them and links to
instructions to run the local LLMs as well. They have chosen not to (or
like me, cannot afford to) run them.

> It's up to us to educate the users about this, and gptel seems
> like the place to do it.
>
> How can that be done?  One way would be to make gptel
> display a brief explanation (a few lines) of why SaaSS
> is a bad thing, plus a link to more explanation about the issue.
> It could do this the first time a user uses it,
> and maybe every 30 days after that.
>
> Have you got another idea for how to do this job?

As mentioned above, there is no alternative to using the service, so I'm
not sure how displaying a message about SaaSS will help.  gptel already
supports all the available options to run the (comparatively inferior)
open-source models locally, and makes it easy to use them.

The SaaSS options were the first ones available last year, when people
started using them.  In presenting a uniform interface to all LLMs from
Emacs, I would say gptel makes it easy for Emacs users to switch from
the proprietary SaaSS options to the newer self-hosted, open-source
models.  This is what I've heard from a few gptel users who made the
switch to Ollama (local LLM hosting software) in recent months.

> Whichever way we implement this, it should be added before
> putting the package into ELPA.

To be clear, I am proposing to add gptel to NonGNU ELPA, not GNU ELPA.

Karthik




* Re: [NonGNU ELPA] Add package gptel
  2024-04-29 20:11               ` Karthik Chikmagalur
@ 2024-05-01 12:02                 ` Philip Kaludercic
  0 siblings, 0 replies; 14+ messages in thread
From: Philip Kaludercic @ 2024-05-01 12:02 UTC (permalink / raw)
  To: Karthik Chikmagalur; +Cc: emacs-devel

Karthik Chikmagalur <karthikchikmagalur@gmail.com> writes:

>>> These are the only local options with HTTP APIs available right now.
>>> There are several more local web applications with bespoke interfaces
>>> but no API.
>>>
>>> When there are more I'll add support for them to gptel.
>>
>> So just to clarify, you do not intend to use the llm package as a
>> dependency going forward?
>
> It's on the cards, since I'd like to stop maintaining the network
> request handling.  But it's a big undertaking, especially since llm
> doesn't have Curl support yet.  (I'm aware of the plz-event-source and
> plz-media-type filter function extensions being added in a concurrent
> discussion.)
>
> Adding support for new LLM APIs is very easy in gptel, it's usually
> under 50 lines of code and one autoload.  Most of my project time is
> spent on UI bugs or features.
>
>> Can you just add a .elpaignore file to your repository that would
>> exclude the test/ directory?  And would you be OK with us using the
>> Commentary section in gptel.el for the package description generated by
>> M-x describe-package?  I feel it would be more readable than if we
>> convert the README.org file to plain text.
>
> Yes, the commentary section is intended to function as the package
> description for package.el.  I plan to modify the README to be exported
> to a texinfo manual instead, once I figure out how to use ox-texinfo.
>
> I've added a .elpaignore file with the contents "test".

Thanks.  I've added the package to nongnu.git, and it should appear
online in a few hours.  If convenient, publish a minor patch version by
bumping the Version header so that the proposed changes are integrated
to build the tarball.

> Karthik

-- 
	Philip Kaludercic on peregrine



