From: Karthik Chikmagalur <karthikchikmagalur@gmail.com>
To: Philip Kaludercic <philipk@posteo.net>
Cc: emacs-devel@gnu.org
Subject: Re: [NonGNU ELPA] Add package gptel
Date: Sun, 28 Apr 2024 09:50:23 -0700
Message-ID: <871q6pa0ts.fsf@gmail.com>
In-Reply-To: <87ttjlsxro.fsf@posteo.net>

Thank you, I applied most of your changes.

Some comments on a couple that I didn't apply:

> > @@ -523,12 +498,14 @@ and \"apikey\" as USER."
>        (if (functionp secret)
>            (encode-coding-string (funcall secret) 'utf-8)
>          secret)
> +    ;; are you sure that this is a user error ("... comes from an
> +    ;; incorrect manipulation by the user")?
>      (user-error "No `gptel-api-key' found in the auth source")))

Yes, it's a user-error in the sense that the user is trying to use
auth-source to look up a key that doesn't exist in their secrets list.
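
For reference, a sketch of what that lookup expects, assuming the
default ChatGPT backend (whose host should be api.openai.com) and the
"apikey" user from the docstring: an ~/.authinfo entry along the
lines of

  machine api.openai.com login apikey password sk-your-key-here

The user-error is only signalled when no matching entry exists, which
is a gap in the user's configuration rather than a bug in gptel.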

>  ;; Model and interaction parameters
> @@ -368,8 +349,7 @@ request to the LLM.
>  Each entry in this alist maps a symbol naming the directive to
>  the string that is sent.  To set the directive for a chat session
>  interactively call `gptel-send' with a prefix argument."
> -  :group 'gptel
> -  :safe #'always
> +  :safe #'always			;is this really always safe?
>    :type '(alist :key-type symbol :value-type string))

Is there some reason this alist wouldn't always be safe?
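
If it is a concern, a stricter predicate could check the shape of the
value instead of using #'always -- a rough sketch, not what gptel
currently does:

  :safe (lambda (val)
          (and (listp val)
               (seq-every-p (lambda (cell)
                              (and (consp cell)
                                   (symbolp (car cell))
                                   (stringp (cdr cell))))
                            val)))

That would limit file-local values to symbol/string pairs, matching
the declared :type.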

Re: display-buffer--action-custom-type: When was this added to Emacs?
Does compat provide this for older versions?

> I'd be interested if you could explain what the difference is to the
> already existing ellama package?  It is not blocking, but I think that
> we can help with choice fatigue by clarifying what makes different
> packages distinct.

I haven't used Ellama.  Here are some differences as far as I can
tell, based only on Ellama's README and commit history.

- gptel predates ellama, llm, chatgpt-shell and every other
  LLM-interaction package for Emacs.
  
- ellama supports Ollama, OpenAI, Vertex and GPT4All.  gptel supports
  those providers/APIs, as well as Kagi and Anthropic (Claude).

- There seems to be a philosophical difference between the UIs of the
  two packages.  Ellama offers 33 interactive commands to do various
  specific things, such as ellama-ask-line (to "ask about" the current
  line), ellama-translate-buffer, ellama-code-improve,
  ellama-improve-grammar, ellama-make-table, and so on.

  gptel has only two primary commands, gptel-send and gptel.  If
  required, the user can interactively specify the task, context,
  input source/output destination and behavior from a transient menu
  (see the sketch after this list).

- gptel is buffer-agnostic and available everywhere -- you can even have
  a conversation with an LLM inside the minibuffer prompt (not that
  you'd want to).  I'm not sure how flexible Ellama is in this
  regard.

- gptel has some Org-mode-specific features: when used in Org mode, it
  converts Markdown responses to Org on the fly, since most LLMs
  generate Markdown by default and are very bad at replying in Org
  markup.  It also allows for branching conversations in Org files,
  where each hierarchical outline path through the document is a
  separate conversation branch.  (This is useful for isolating context
  and keeping the data size in check.)
  
- Ellama has additional context features that gptel lacks.  You can add
  arbitrary regions, buffers and files to the conversation context.  In
  gptel the context is limited to the selected region or active buffer.
  This feature is planned for gptel but won't be available for a while.
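
To make the two-command workflow above concrete, here is a minimal
setup sketch; the keybinding is only an example, not a gptel default:

  ;; Illustrative gptel setup; the binding is chosen arbitrarily.
  (require 'gptel)
  (global-set-key (kbd "C-c RET") #'gptel-send)

  ;; M-x gptel        start or switch to a dedicated chat buffer
  ;; gptel-send       send the region, or the buffer up to point
  ;; C-u gptel-send   open the transient menu to choose the directive,
  ;;                  context and output destination before sending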

>> gptel tries to provide a uniform Emacs-y UI for all backends, and works
>> as both a chat interface (in dedicated chat buffers) and as a
>> helper/lookup agent in all Emacs buffers.  There is a demo showcasing
>> its many uses here:
>>
>> https://www.youtube.com/watch?v=bsRnh_brggM
>
> Is this video mirrored elsewhere?

It's not.  Where do you suggest uploading it?  The video is 18 minutes
long and 180 MB.

Karthik


