From: Karthik Chikmagalur <karthikchikmagalur@gmail.com>
To: Philip Kaludercic <philipk@posteo.net>
Cc: emacs-devel@gnu.org
Subject: Re: [NonGNU ELPA] Add package gptel
Date: Mon, 29 Apr 2024 10:21:55 -0700
Message-ID: <87frv484p8.fsf@gmail.com>
In-Reply-To: <87h6fkmxse.fsf@posteo.net>

>> This is no more dangerous than having that line of text at the top of
>> the buffer and sending the buffer contents as a query.  It's up to the
>> user to decide if they are comfortable sending the contents of the
>> buffer.
>
> What do you mean by the top of the buffer?  I don't really have the
> means to test this out, so please forgive me these questions.  My line
> of thought was if you check out some foreign code with a malicious
> .dir-locals.el, you wouldn't realise that it could change this option.
> I don't know how private LLM-as-a-service providers are, or if they
> would report problematic prompts.

The system message is essentially prepended to the buffer text in the
query payload.  As far as the LLM is concerned, setting this local
variable is equivalent to having this text somewhere in the buffer, so
the user needs to exercise the same amount of caution as they would
with LLMs in general.  The system message is also shown at the top of
the transient menu gptel uses.
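
To illustrate (a rough sketch, not gptel's actual request-building
code; the model name and message contents are placeholders), an
OpenAI-style chat payload simply places the system message ahead of
the user/buffer text:

  ;; Illustrative only: build an OpenAI-style chat payload in which
  ;; the system message precedes the buffer text.
  (require 'json)

  (json-encode
   `((model . "gpt-4")                          ; placeholder model name
     (messages
      . [((role . "system")                     ; the system message
          (content . "You are a helpful assistant."))
         ((role . "user")                       ; the buffer contents
          (content . ,(buffer-string)))])))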

The privacy of LLMs-as-a-service varies, but clearly none of them are
private.  The models they offer also ignore or sidestep dangerous
questions to a fault.  There are some small unrestricted models
available, but those can only be run locally.

>>> Does this have any significance?  I am not familiar with the timeline.
>>
>> Only in that I expect many more users are familiar with gptel as a
>> result.
>
> Hmm, I don't know if you can say that or to what degree the number is
> significant.  After all, Ellama was the only package that users would
> have access to OOTB, since it has been the only client up until now that
> was available on GNU ELPA (currently ranking at the 86th percentile of
> "popularity" according to the log scraper).

Okay.

>> Ollama, GPT4All and Llama.cpp/Llamafiles (which use the OpenAI API
>> supported by both Ellama and gptel) can run on the local machine.
>
> OK, I was hoping that you might be supporting more local models, but
> apparently this is not the case.

These are the only local options with HTTP APIs available right now.
There are several more local web applications with bespoke interfaces
but no API.

When there are more, I'll add support for them to gptel.
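
For instance, here is a minimal, synchronous sketch using plain
url.el, assuming a llama.cpp server or Llamafile listening on its
default port 8080 with the OpenAI-compatible /v1/chat/completions
endpoint (this is not how gptel itself issues requests):

  ;; Illustrative only: POST a chat request to a local
  ;; OpenAI-compatible server and parse the JSON response.
  (require 'json)
  (require 'url)
  (require 'url-http)

  (let ((url-request-method "POST")
        (url-request-extra-headers '(("Content-Type" . "application/json")))
        (url-request-data
         (json-encode
          '((model . "local")   ; placeholder; many local servers ignore this
            (messages . [((role . "user")
                          (content . "Hello from Emacs"))])))))
    (with-current-buffer
        (url-retrieve-synchronously
         "http://localhost:8080/v1/chat/completions")
      (goto-char url-http-end-of-headers)
      (json-read)))

Ollama exposes a similar OpenAI-compatible route on its own default
port, so the same sketch should apply with a different base URL.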

> I recently uploaded a video to https://spectra.video/ and it was easy.
> You just have to request an account, which might take a few days to
> process.

I'll take a look at available instances.  I have a small handful of
Emacs-related videos on YouTube; I might as well post all of them.

> But no, none of this is blocking.  I am just trying to help improve the
> package before we add it.  The only blocking issue would be if it broke
> the NonGNU ELPA rules, e.g. by having a hard dependency on non-free
> software or SaaSS.

Okay.

Karthik



