unofficial mirror of guix-patches@gnu.org 
* [bug#70175] [PATCH] gnu: llama-cpp: support OpenBLAS for faster prompt processing
@ 2024-04-04  3:46 John Fremlin via Guix-patches via
  2024-04-05 11:35 ` bug#70175: " Christopher Baines
  0 siblings, 1 reply; 2+ messages in thread
From: John Fremlin via Guix-patches via @ 2024-04-04  3:46 UTC (permalink / raw)
  To: 70175; +Cc: John Fremlin

OpenBLAS is recommended by https://github.com/ggerganov/llama.cpp

Change-Id: Iaf6f22252da13e2d6f503992878b35b0da7de0aa
---
 gnu/packages/machine-learning.scm | 5 ++++-
 1 file changed, 4 insertions(+), 1 deletion(-)

diff --git a/gnu/packages/machine-learning.scm b/gnu/packages/machine-learning.scm
index 225bff0ca2..ea3674ce3e 100644
--- a/gnu/packages/machine-learning.scm
+++ b/gnu/packages/machine-learning.scm
@@ -542,6 +542,8 @@ (define-public llama-cpp
       (build-system cmake-build-system)
       (arguments
        (list
+        #:configure-flags
+        '(list "-DLLAMA_BLAS=ON" "-DLLAMA_BLAS_VENDOR=OpenBLAS")
         #:modules '((ice-9 textual-ports)
                     (guix build utils)
                     ((guix build python-build-system) #:prefix python:)
@@ -576,8 +578,9 @@ (define-public llama-cpp
               (lambda _
                 (copy-file "bin/main" (string-append #$output "/bin/llama")))))))
       (inputs (list python))
+      (native-inputs (list pkg-config))
       (propagated-inputs
-       (list python-numpy python-pytorch python-sentencepiece))
+       (list python-numpy python-pytorch python-sentencepiece openblas))
       (home-page "https://github.com/ggerganov/llama.cpp")
       (synopsis "Port of Facebook's LLaMA model in C/C++")
       (description "This package provides a port to Facebook's LLaMA collection

base-commit: 1441a205b1ebb610ecfae945b5770734cbe8478c
-- 
2.41.0
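For readers building llama.cpp outside Guix, the two configure flags added by this patch correspond to a plain CMake invocation against an OpenBLAS install (a sketch under the assumption that OpenBLAS development headers are available; flag names are as used by upstream llama.cpp at the time of this patch):

```shell
# Configure llama.cpp with OpenBLAS as the BLAS backend,
# mirroring the #:configure-flags added in the patch above.
cmake -B build \
      -DLLAMA_BLAS=ON \
      -DLLAMA_BLAS_VENDOR=OpenBLAS
cmake --build build
```

Within Guix itself the same flags are passed through the cmake-build-system, so no manual invocation is needed; `guix build llama-cpp` picks them up.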


* bug#70175: [PATCH] gnu: llama-cpp: support OpenBLAS for faster prompt processing
  2024-04-04  3:46 [bug#70175] [PATCH] gnu: llama-cpp: support OpenBLAS for faster prompt processing John Fremlin via Guix-patches via
@ 2024-04-05 11:35 ` Christopher Baines
  0 siblings, 0 replies; 2+ messages in thread
From: Christopher Baines @ 2024-04-05 11:35 UTC (permalink / raw)
  To: John Fremlin via Guix-patches via; +Cc: John Fremlin, 70175-done

[-- Attachment #1: Type: text/plain, Size: 426 bytes --]

John Fremlin via Guix-patches via <guix-patches@gnu.org> writes:

> OpenBLAS is recommended by https://github.com/ggerganov/llama.cpp
>
> Change-Id: Iaf6f22252da13e2d6f503992878b35b0da7de0aa
> ---
>  gnu/packages/machine-learning.scm | 5 ++++-
>  1 file changed, 4 insertions(+), 1 deletion(-)

Looks good to me, I tweaked the commit message a bit and pushed this to
master as d8a63bbcee616f224c10462dbfb117ec009c50d8.

Chris

[-- Attachment #2: signature.asc --]
[-- Type: application/pgp-signature, Size: 987 bytes --]



Code repositories for project(s) associated with this public inbox

	https://git.savannah.gnu.org/cgit/guix.git
