Hi Ludo,


thanks for the review and for proof-reading :-)

I'll send an updated patch in a few days; for now I'm waiting for some more feedback on this:


On 06.10.2016 at 23:02, Ludovic Courtès wrote:

+@code{tests_require}) go into @code{native-inputs}. Examples are
+@emph{setuptools}, @emph{pytest}, @emph{mock}, and @emph{nose}. Of
+course if any of these packages is required at run-time, it needs to be
+set in @code{propagated-inputs}.
I’m not entirely convinced that this is an improvement of what “package
Reference” says.  In particular, it describes ‘native-inputs’ as having
nothing to do with cross-compilation.  OTOH, it has the advantage of
providing concrete instructions to someone focusing on Python.

Thoughts?

My aim was to provide concrete instructions for those who want to package Python software. But I will think about how to add a pointer to cross-compilation.
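
To make this concrete, here is a minimal setup.py sketch (the package name and the requirements are made up for illustration), annotated with the Guix input field each requirement list would normally map to, following the rule quoted above:

    # Hypothetical setup.py for an imaginary package "foo"; the comments
    # note which Guix input field each list usually maps to.
    from setuptools import setup

    setup(
        name="foo",
        version="0.1",
        packages=["foo"],
        # needed only at build time         -> native-inputs
        setup_requires=["setuptools_scm"],
        # needed only to run the test suite -> native-inputs
        tests_require=["pytest", "mock"],
        # needed at run time                -> propagated-inputs
        install_requires=["requests"],
    )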

From a Python programmer's perspective, "cross-compilation" is a non-issue in most cases. Most Python packages are pure Python; they are compiled into byte-code and are thus platform-independent (much like .jar files). This is why most Python packages are marked as "noarch" in other Linux distributions. As a Python programmer I think in terms of build time, maybe test time, and run time.
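
Just to illustrate what "compiled into byte-code" means here, a small sketch (the file name is made up):

    # Byte-compile a (made-up) pure-Python module.  The resulting .pyc
    # contains CPython byte-code, not machine code, so the same file
    # works on x86_64, ARM, etc.; it only depends on the Python version.
    import py_compile
    py_compile.compile("foo.py")  # writes __pycache__/foo.cpython-XY.pyc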

The only exception to this are Python packages containing C code (called "extension modules"). But I have to admit that I have no clue about cross-compiling these. (I cannot even find good introductions on this on the web.)
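
For reference, this is roughly what such an extension module looks like in setup.py (names are made up); the C source gets compiled into a shared object for the build machine, which is exactly what makes cross-compiling non-trivial:

    # Minimal sketch of a package with a C extension module (names made up).
    from setuptools import setup, Extension

    setup(
        name="spam",
        version="0.1",
        ext_modules=[Extension("spam", sources=["spam.c"])],
    )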

The official Python packaging guide [1] does not really talk about cross-compilation, only about "Cross-compiling on Windows".

[1] https://docs.python.org/3/distutils/builtdist.html
-- 
Regards
Hartmut Goebel

| Hartmut Goebel          | h.goebel@crazy-compilers.com               |
| www.crazy-compilers.com | compilers which you thought are impossible |