Subject: bug#31293: [PATCH] gnu: Add python-autograd
From: ludo@gnu.org (Ludovic Courtès)
Date: Sat, 28 Apr 2018 23:33:20 +0200
In-Reply-To: (Fis Trivial's message of "Sat, 28 Apr 2018 03:47:03 +0000")
Message-ID: <87fu3f819b.fsf@gnu.org>
To: Fis Trivial
Cc: 31293-done@debbugs.gnu.org

Fis Trivial skribis:

> * gnu/packages/machine-learning.scm (python-autograd, python2-autograd): New
> variables.

Applied with the changes below.  Thank you!

Ludo’.

diff --git a/gnu/packages/machine-learning.scm b/gnu/packages/machine-learning.scm
index e2b113a70..d3af35e17 100644
--- a/gnu/packages/machine-learning.scm
+++ b/gnu/packages/machine-learning.scm
@@ -719,10 +719,10 @@ mining and data analysis.")
          (replace 'check
            (lambda _ (invoke "py.test" "-v"))))))
-    (synopsis "Efficiently computes derivatives of numpy code")
+    (synopsis "Efficiently computes derivatives of NumPy code")
     (description "Autograd can automatically differentiate native Python and
-Numpy code.  It can handle a large subset of Python's features, including loops
-, ifs, recursion and closures, and it can even take derivatives of derivatives
+NumPy code.  It can handle a large subset of Python's features, including loops,
+ifs, recursion and closures, and it can even take derivatives of derivatives
 of derivatives.  It supports reverse-mode differentiation (a.k.a.
 backpropagation), which means it can efficiently take gradients of
 scalar-valued functions with respect to array-valued arguments, as well as
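
As a point of reference for the package description amended above, here is a minimal sketch of the kind of code Autograd differentiates.  The `grad` function and the `autograd.numpy` wrapper are the library's documented entry points; the example function, variable names, and inputs below are illustrative only.

# Minimal sketch of Autograd's reverse-mode differentiation.
# `grad` and `autograd.numpy` come from the autograd library;
# the function and inputs are illustrative assumptions.
import autograd.numpy as np   # drop-in NumPy wrapper that records operations
from autograd import grad

def tanh_sum(x):
    """Scalar-valued function of an array-valued argument."""
    return np.sum(np.tanh(x))

# grad() returns an ordinary Python callable that computes the gradient
# of tanh_sum with respect to its array argument.
gradient = grad(tanh_sum)

x = np.array([0.0, 1.0, 2.0])
print(gradient(x))            # same shape as x: d(tanh_sum)/dx_i

# "Derivatives of derivatives": grad can be applied to its own output.
def tanh_scalar(x):
    return np.tanh(x)

d_tanh = grad(tanh_scalar)
dd_tanh = grad(d_tanh)
print(dd_tanh(1.0))           # second derivative of tanh at 1.0

Because `grad` returns a plain callable, it composes freely with itself, which is what the description means by taking derivatives of derivatives of derivatives.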