unofficial mirror of guile-user@gnu.org 
From: Andy Tai <atai@atai.org>
To: guile-user@gnu.org
Subject: announce: guile_llama_cpp 0.1 release
Date: Sun, 2 Jun 2024 21:45:28 -0700	[thread overview]
Message-ID: <CAJsg1E9MLWkctoqORR=q5FKEJ_7i75JRUd2JvR2GwkEF3P9Qbg@mail.gmail.com> (raw)

# guile_llama_cpp

GNU Guile binding for llama.cpp

This is version 0.1, Copyright 2024 Li-Cheng (Andy) Tai, atai@atai.org.
Available at https://codeberg.org/atai/guile_llama_cpp/releases/download/0.1/guile_llama_cpp-0.1.tar.gz


Guile_llama_cpp wraps the llama.cpp API so that llama.cpp can be
accessed from Guile scripts and programs, much as llama-cpp-python
allows the use of llama.cpp in Python programs.

Currently a simple Guile script is provided to allow simple "chat"
with an LLM in gguf format.

## setup and build

guile_llama_cpp is written in GNU Guile and C++ and requires

SWIG 4.0 or later, GNU Guile 3.0, and llama.cpp (obviously)

installed on your system.

From sources, guile_llama_cpp can be built via the usual GNU convention,

export LLAMA_CFLAGS=-I<llama_install_dir>/include
export LLAMA_LIBS="-L<llama_install_dir>/lib -lllama"

./configure --prefix=<install dir>
make
make install

Once llama.cpp provides pkg-config support in the future, the first
two "export" lines can be omitted.
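
For a concrete example, the steps above might look like the following. The paths `/opt/llama.cpp` and `/usr/local` are hypothetical; substitute wherever llama.cpp is installed on your system and your preferred install prefix:

```shell
# Tell configure where to find llama.cpp headers and libraries
# (needed until llama.cpp ships pkg-config support)
export LLAMA_CFLAGS=-I/opt/llama.cpp/include
export LLAMA_LIBS="-L/opt/llama.cpp/lib -lllama"

# The usual GNU build sequence
./configure --prefix=/usr/local
make
make install
```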

If you are running GNU Guix on your system, you can get a shell with
all needed dependencies set up with

guix shell -D -f guix.scm

and then use the usual

configure && make && make install

commands to build.

## run

To use guile_llama_cpp to chat with an LLM (Large Language Model), you
need to first download an LLM in gguf format.
See instructions on the web such as
https://stackoverflow.com/questions/67595500/how-to-download-a-model-from-huggingface

As an example, using the "smaller" LLM "Phi-3-mini" from Microsoft, we
would first download the model in gguf format via wget:

wget https://huggingface.co/microsoft/Phi-3-mini-4k-instruct-gguf/resolve/main/Phi-3-mini-4k-instruct-q4.gguf

Then you can chat with it from the build directory:

./pre-inst-env simple.scm "What are the planets?" Phi-3-mini-4k-instruct-q4.gguf

The general form of a chat with a model is to invoke the script
scripts/simple.scm as

simple.scm prompt_text model_file_path

In the build directory, prepend the command with

./pre-inst-env

as it sets up the needed paths and environment variables for proper
Guile invocation.
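
Putting the steps above together, a complete session from the build directory might look like this (using the Phi-3-mini model file downloaded earlier; the model name and prompt are just examples):

```shell
# Download a small instruct model in gguf format (one-time step)
wget https://huggingface.co/microsoft/Phi-3-mini-4k-instruct-gguf/resolve/main/Phi-3-mini-4k-instruct-q4.gguf

# Ask the model a question; ./pre-inst-env sets up the Guile
# load paths so the uninstalled bindings are found
./pre-inst-env scripts/simple.scm "What are the planets?" Phi-3-mini-4k-instruct-q4.gguf
```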

Currently, the supported chat is limited; you will see the replies
from the LLM cut off after a sentence or so.
This output-length issue will be addressed in future releases.

## roadmap

* support for continuous chat, with long replies
* support for exposing the LLM as a web endpoint, using a web server
  built in Guile, so the LLM can be reached via a web interface,
  allowing chat with remote users
* support for embedding LLMs in Guile programs for scenarios like
  LLM-driven software agents

## license

Copyright 2024 Li-Cheng (Andy) Tai
atai@atai.org

This program is licensed under the GNU Lesser General Public License, version 3
or later, as published by the Free Software Foundation. See the license
text in the file COPYING.

guile_llama_cpp is distributed in the hope that it will be useful, but
WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU Lesser General
Public License for more details.

Hopefully this program is useful.



Thread overview: 2+ messages
2024-06-03  4:45 Andy Tai [this message]
2024-06-03  5:39 ` announce: guile_llama_cpp 0.1 release Nala Ginrut
