* Guidelines for pre-trained ML model weight binaries (Was re: Where should we put machine learning model parameters?)
@ 2023-04-03 18:07 Ryan Prior
  2023-04-03 20:48 ` Nicolas Graves via Development of GNU Guix and the GNU System distribution.
  2023-04-06  8:42 ` Simon Tournier
  0 siblings, 2 replies; 21+ messages in thread
From: Ryan Prior @ 2023-04-03 18:07 UTC (permalink / raw)
  To: Nicolas Graves, licensing@fsf.org; +Cc: guix-devel


Hi there FSF Licensing! (CC: Guix devel, Nicolas Graves) This morning I read through the FSDG to see whether it gives any guidance on when machine learning model weights are appropriate for inclusion in a free system. It does not seem to offer much.

Many ML models are advertised as "open source", including the LLaMA model that Nicolas (quoted below) is interested in adding to Guix. However, according to what I can find in Meta's announcement (https://ai.facebook.com/blog/large-language-model-llama-meta-ai/) and the project's documentation (https://github.com/facebookresearch/llama/blob/main/MODEL_CARD.md), the model itself is not covered by the GPLv3 but rather by "a noncommercial license focused on research use cases." After 20 minutes of searching I cannot find the full text of this license anywhere; perhaps others have better ideas on how to find it, or perhaps the Meta team would provide a copy if we ask.

Free systems will face a growing incentive to include trained models in their distributions to support use cases like automatic live transcription of audio, recognition of objects in photos and video, and natural-language help and documentation features. I hope we can update the FSDG to help ensure that any such inclusion fully meets the requirements of freedom for all our users.

Cheers,
Ryan


------- Original Message -------
On Monday, April 3rd, 2023 at 4:48 PM, Nicolas Graves via "Development of GNU Guix and the GNU System distribution." <guix-devel@gnu.org> wrote:


> Hi Guix!
> 
> I've recently contributed a few tools that make some OSS machine
> learning programs usable from Guix, namely nerd-dictation for dictation
> and llama-cpp as a conversational bot.
> 
> In the first case, I would also like to contribute the parameters of
> some localized models so that they can be used more easily through
> Guix. I already raised this question when submitting those patches,
> without getting a clear answer.
> 
> In the case of nerd-dictation, the model parameters that can be used
> are listed here: https://alphacephei.com/vosk/models
> 
> One caveat is that hosting substitutes for all these models would take
> a lot of space on the servers, a burden which brings no benefit because
> no real build step is needed (apart from unzipping). In that case, we
> can use the #:substitutable? #f flag. You can find an example of some
> of these packages here:
> https://git.sr.ht/~ngraves/dotfiles/tree/main/item/packages.scm
> 
> So my question is: should we add this type of model as packages in
> Guix? If yes, where should we put them? In machine-learning.scm? In a
> new file machine-learning-models.scm (such a file would never need new
> modules, and it might avoid some confusion between the tools and the
> parameters needed to use them)?
> 
> 
> --
> Best regards,
> Nicolas Graves
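
As a rough illustration of the approach described in the quoted message,
a package for one of the Vosk models could look roughly like the sketch
below. The module name, version, URL, hash, install path, and license are
illustrative placeholders rather than an existing Guix package; only the
#:substitutable? #f idea and the model listing URL come from Nicolas's
message.

(define-module (gnu packages machine-learning-models)  ;hypothetical module
  #:use-module (guix packages)
  #:use-module (guix download)
  #:use-module (guix build-system copy)
  #:use-module ((guix licenses) #:prefix license:)
  #:use-module (gnu packages compression))

(define-public vosk-model-small-en-us
  (package
    (name "vosk-model-small-en-us")
    (version "0.15")
    (source (origin
              (method url-fetch)
              ;; Assumed download URL; check the listing at
              ;; https://alphacephei.com/vosk/models for the real one.
              (uri (string-append "https://alphacephei.com/vosk/models/"
                                  "vosk-model-small-en-us-" version ".zip"))
              ;; Placeholder hash; replace with the output of 'guix download'.
              (sha256
               (base32
                "0000000000000000000000000000000000000000000000000000"))))
    (build-system copy-build-system)
    (native-inputs (list unzip))        ;needed to unpack the .zip source
    (arguments
     ;; There is no build step: the unpacked model files are copied as-is.
     ;; Marking the package non-substitutable keeps the large output off
     ;; the substitute servers, as suggested above.
     `(#:substitutable? #f
       #:install-plan '(("." "share/vosk/models/vosk-model-small-en-us"))))
    (home-page "https://alphacephei.com/vosk/models")
    (synopsis "Small English speech-recognition model for Vosk")
    (description "Pre-trained model weights for offline English speech
recognition with Vosk and nerd-dictation.")
    ;; Placeholder; verify the actual license of each model before packaging.
    (license license:asl2.0)))

Keeping such definitions in a dedicated machine-learning-models.scm, as
suggested above, would also keep these large, build-free data packages
separate from the tools that consume them.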


* Guidelines for pre-trained ML model weight binaries (Was re: Where should we put machine learning model parameters?)
@ 2023-04-07  5:50 Nathan Dehnel
  2023-04-07  9:42 ` Simon Tournier
  0 siblings, 1 reply; 21+ messages in thread
From: Nathan Dehnel @ 2023-04-07  5:50 UTC (permalink / raw)
  To: rprior, guix-devel

I am uncomfortable with including ML models without their training
data available. It is possible to hide backdoors in them.
https://www.quantamagazine.org/cryptographers-show-how-to-hide-invisible-backdoors-in-ai-20230302/



Thread overview: 21+ messages
2023-04-03 18:07 Guidelines for pre-trained ML model weight binaries (Was re: Where should we put machine learning model parameters?) Ryan Prior
2023-04-03 20:48 ` Nicolas Graves via Development of GNU Guix and the GNU System distribution.
2023-04-03 21:18   ` Jack Hill
2023-04-06  8:42 ` Simon Tournier
2023-04-06 13:41   ` Kyle
2023-04-06 14:53     ` Simon Tournier
2023-05-13  4:13   ` 宋文武
2023-05-15 11:18     ` Simon Tournier
2023-05-26 15:37       ` Ludovic Courtès
2023-05-29  3:57         ` zamfofex
2023-05-30 13:15         ` Simon Tournier
2023-07-02 19:51           ` Ludovic Courtès
2023-07-03  9:39             ` Simon Tournier
2023-07-04 13:05               ` zamfofex
2023-07-04 20:03                 ` Vagrant Cascadian
  -- strict thread matches above, loose matches on Subject: below --
2023-04-07  5:50 Nathan Dehnel
2023-04-07  9:42 ` Simon Tournier
2023-04-08 10:21   ` Nathan Dehnel
2023-04-11  8:37     ` Simon Tournier
2023-04-11 12:41       ` Nathan Dehnel
2023-04-12  9:32         ` Csepp
