all messages for Emacs-related lists mirrored at yhetil.org
 help / color / mirror / code / Atom feed
From: Philip Kaludercic <philipk@posteo.net>
To: Karthik Chikmagalur <karthikchikmagalur@gmail.com>
Cc: emacs-devel@gnu.org
Subject: Re: [NonGNU ELPA] Add package gptel
Date: Mon, 29 Apr 2024 18:21:16 +0000	[thread overview]
Message-ID: <87r0eokp2b.fsf@posteo.net> (raw)
In-Reply-To: <87frv484p8.fsf@gmail.com> (Karthik Chikmagalur's message of "Mon, 29 Apr 2024 10:21:55 -0700")

Karthik Chikmagalur <karthikchikmagalur@gmail.com> writes:

>>> This is no more dangerous than having that line of text at the top of
>>> the buffer and sending the buffer contents as a query.  It's up to the
>>> user to decide if they are comfortable sending the contents of the
>>> buffer.
>>
>> What do you mean by the top of the buffer?  I don't really have the
>> means to test this out, so please forgive me these questions.  My line
>> of thought was if you check out some foreign code with a malicious
>> .dir-locals.el, you wouldn't realise that it could change this option.
>> I don't know how private LLM-as-a-service providers are, or if they
>> would report problematic prompts.
>
> It is essentially prepended to the buffer text in the query payload.  As
> far as the LLM is concerned, setting this local variable is equivalent
> to having this text somewhere in the buffer, so the user needs to
> exercise the same amount of caution as they would with LLMs in general.
> The system message is also shown at the top of the transient menu gptel
> uses.
>
> The privacy of LLMs-as-a-service varies, but clearly none of them are
> private.  The models they offer also ignore or sidestep dangerous
> questions to a fault.  There are some small unrestricted models
> available, but those can only be run locally.

OK, I didn't understand that the system messages are also displayed.
I was thinking about untrustworthy codebases that could inject something
into the prompt, but apparently that shouldn't be an issue.

>>>> Does this have any significance?  I am not familiar with the timeline.
>>>
>>> Only in that I expect many more users are familiar with gptel as a
>>> result.
>>
>> Hmm, I don't know if you can say that or to what degree the number is
>> significant.  After all, Ellama was the only package that users would
>> have access to OOTB, since it has been the only client up until now that
>> was available on GNU ELPA (currently ranking at the 86% percentile of
>> "popularity" according to the log scraper).
>
> Okay.
>
>>> Ollama, GPT4All and Llama.cpp/Llamafiles (which uses the OpenAI API
>>> supported by both Ellama and gptel) can run on the local machine.
>>
>> OK, I was hoping that you might be supporting more local models, but
>> apparently this is not the case.
>
> These are the only local options with HTTP APIs available right now.
> There are several more local web applications with bespoke interfaces
> but no API.
>
> When there are more I'll add support for them to gptel.

So just to clarify, you do not intend to use the llm package as a
dependency going forward?

>> I recently uploaded a video to https://spectra.video/ and it was easy.
>> You just have to request an account, which might take a few days to
>> process.
>
> I'll take a look at available instances.  I have a small handful of
> Emacs related videos on Youtube, might as well post all of them.

1+

>> But no, none of this is blocking.  I am just trying to help improve the
>> package before we add it.  The only blocking issue would be if it broke
>> the NonGNU ELPA rules, e.g. by having a hard dependency on non-free
>> software or SaaSS.
>
> Okay.
>
> Karthik

Can you just add a .elpaignore file to your repository that would
exclude the test/ directory?  And would you be OK with us using the
Commentary section in gptel.el for the package description generated by
M-x describe-package?  I feel it would be more readable than if we
convert the README.org file to plain text.

-- 
	Philip Kaludercic on peregrine



  reply	other threads:[~2024-04-29 18:21 UTC|newest]

Thread overview: 14+ messages / expand[flat|nested]  mbox.gz  Atom feed  top
2024-04-28  3:55 [NonGNU ELPA] Add package gptel Karthik Chikmagalur
2024-04-28  6:30 ` Karthik Chikmagalur
2024-04-28  8:21 ` Philip Kaludercic
2024-04-28 16:50   ` Karthik Chikmagalur
2024-04-29  6:18     ` Philip Kaludercic
2024-04-29  6:52       ` Karthik Chikmagalur
2024-04-29  7:29         ` Philip Kaludercic
2024-04-29 17:21           ` Karthik Chikmagalur
2024-04-29 18:21             ` Philip Kaludercic [this message]
2024-04-29 20:11               ` Karthik Chikmagalur
2024-05-01 12:02                 ` Philip Kaludercic
2024-04-28 17:38   ` Karthik Chikmagalur
2024-04-29 22:40 ` Richard Stallman
2024-04-30  2:12   ` Karthik Chikmagalur

Reply instructions:

You may reply publicly to this message via plain-text email
using any one of the following methods:

* Save the following mbox file, import it into your mail client,
  and reply-to-all from there: mbox

  Avoid top-posting and favor interleaved quoting:
  https://en.wikipedia.org/wiki/Posting_style#Interleaved_style

* Reply using the --to, --cc, and --in-reply-to
  switches of git-send-email(1):

  git send-email \
    --in-reply-to=87r0eokp2b.fsf@posteo.net \
    --to=philipk@posteo.net \
    --cc=emacs-devel@gnu.org \
    --cc=karthikchikmagalur@gmail.com \
    /path/to/YOUR_REPLY

  https://kernel.org/pub/software/scm/git/docs/git-send-email.html

* If your mail client supports setting the In-Reply-To header
  via mailto: links, try the mailto: link
Be sure your reply has a Subject: header at the top and a blank line before the message body.
Code repositories for project(s) associated with this external index

	https://git.savannah.gnu.org/cgit/emacs.git
	https://git.savannah.gnu.org/cgit/emacs/org-mode.git

This is an external index of several public inboxes,
see mirroring instructions on how to clone and mirror
all data and code used by this external index.