unofficial mirror of guix-patches@gnu.org 
From: "Maurice Brémond" <Maurice.Bremond@inria.fr>
To: zimoun <zimon.toutoune@gmail.com>
Cc: 39588@debbugs.gnu.org, "Ludovic Courtès" <ludo@gnu.org>
Subject: [bug#39588] gnu: Add mpich, scalapack-mpich, mumps-mpich, pt-scotch-mpich, python-mpi4py-mpich
Date: Mon, 19 Oct 2020 15:46:20 +0200	[thread overview]
Message-ID: <87lfg2pbv7.fsf@inria.fr> (raw)
In-Reply-To: <CAJ3okZ1hnYkzwJ88=k6OvJ3TdU9uU_+DPFC-AqBDCDuaJc1J_g@mail.gmail.com> (zimoun's message of "Fri, 16 Oct 2020 13:46:16 +0200")

[-- Attachment #1: Type: text/plain, Size: 1085 bytes --]


Hello,

A build of mumps-openmpi with mpich fails:

guix time-machine -- build mumps-openmpi --with-input=openmpi=mpich

[...]
  mpirun -n 3 ./test_scotch_dgraph_check data/bump.grf
  Invalid error code (-2) (error ring index 127 invalid)
  INTERNAL ERROR: invalid error code fffffffe (Ring Index out of range) in MPID_nem_tcp_init:373
  Invalid error code (-2) (error ring index 127 invalid)
  INTERNAL ERROR: invalid error code fffffffe (Ring Index out of range) in MPID_nem_tcp_init:373
  Invalid error code (-2) (error ring index 127 invalid)
  INTERNAL ERROR: invalid error code fffffffe (Ring Index out of range) in MPID_nem_tcp_init:373
  Fatal error in PMPI_Init: Other MPI error, error stack:
  MPIR_Init_thread(586)..............: 
  MPID_Init(224).....................: channel initialization failed
  MPIDI_CH3_Init(105)................: 
  MPID_nem_init(324).................: 
  MPID_nem_tcp_init(175).............: 
  MPID_nem_tcp_get_business_card(401): 
  MPID_nem_tcp_init(373).............: gethostbyname failed, localhost (errno 0)


This is what Ludo reproduced:

[-- Attachment #2: test case --]
[-- Type: text/plain, Size: 5137 bytes --]

From: Ludovic Courtès <ludo@gnu.org>
Subject: Re: [bug#39588] gnu: Add mpich, scalapack-mpich, mumps-mpich, pt-scotch-mpich, python-mpi4py-mpich
To: Maurice Brémond <Maurice.Bremond@inria.fr>
Cc: 39588@debbugs.gnu.org,  zimoun <zimon.toutoune@gmail.com>
Date: Fri, 21 Feb 2020 12:32:44 +0100

Hi,

I actually managed to reproduce it with a minimal test case (attached):

$ guix build -f mpich-test.scm
substitute: updating substitutes from 'https://ci.guix.gnu.org'... 100.0%
The following derivation will be built:
   /gnu/store/rgr7wnxbgxnp6s96zcnb4ryn3rqfcl7b-mpi-init.drv
building /gnu/store/rgr7wnxbgxnp6s96zcnb4ryn3rqfcl7b-mpi-init.drv...
/gnu/store/pkbg6kllx5xb8vb6kwrwm7qm4rnpmhia-mpich-3.3.2/bin/mpicc: line 215: expr: command not found
/gnu/store/pkbg6kllx5xb8vb6kwrwm7qm4rnpmhia-mpich-3.3.2/bin/mpicc: line 215: expr: command not found
/gnu/store/pkbg6kllx5xb8vb6kwrwm7qm4rnpmhia-mpich-3.3.2/bin/mpicc: line 215: expr: command not found
/gnu/store/pkbg6kllx5xb8vb6kwrwm7qm4rnpmhia-mpich-3.3.2/bin/mpicc: line 215: expr: command not found
/gnu/store/pkbg6kllx5xb8vb6kwrwm7qm4rnpmhia-mpich-3.3.2/bin/mpicc: line 215: expr: command not found
Invalid error code (-2) (error ring index 127 invalid)
INTERNAL ERROR: invalid error code fffffffe (Ring Index out of range) in MPID_nem_tcp_init:373
Invalid error code (-2) (error ring index 127 invalid)
INTERNAL ERROR: invalid error code fffffffe (Ring Index out of range) in MPID_nem_tcp_init:373
Fatal error in PMPI_Init: Other MPI error, error stack:
MPIR_Init_thread(586)..............: 
MPID_Init(224).....................: channel initialization failed
MPIDI_CH3_Init(105)................: 
MPID_nem_init(324).................: 
MPID_nem_tcp_init(175).............: 
MPID_nem_tcp_get_business_card(401): 
MPID_nem_tcp_init(373).............: gethostbyname failed, localhost (errno 0)
Backtrace:
           1 (primitive-load "/gnu/store/iykxzg1n018sigd4c23kx1c4ngz…")
In guix/build/utils.scm:
    652:6  0 (invoke _ . _)

guix/build/utils.scm:652:6: In procedure invoke:
Throw to key `srfi-34' with args `(#<condition &invoke-error [program: "mpiexec" arguments: ("-np" "2" "/gnu/store/8i1dci1wxd6c0q6a2cz4kgb8adfk8rrz-mpi-init") exit-status: 15 term-signal: #f stop-signal: #f] 7ffff6022f40>)'.
builder for `/gnu/store/rgr7wnxbgxnp6s96zcnb4ryn3rqfcl7b-mpi-init.drv' failed with exit code 1
build of /gnu/store/rgr7wnxbgxnp6s96zcnb4ryn3rqfcl7b-mpi-init.drv failed
View build log at '/var/log/guix/drvs/rg/r7wnxbgxnp6s96zcnb4ryn3rqfcl7b-mpi-init.drv.bz2'.
guix build: error: build of `/gnu/store/rgr7wnxbgxnp6s96zcnb4ryn3rqfcl7b-mpi-init.drv' failed


The same program outside the container works just fine:

$ guix environment --ad-hoc mpich -- mpiexec -np 2 "/gnu/store/8i1dci1wxd6c0q6a2cz4kgb8adfk8rrz-mpi-init"
np = 2, rank = 0
np = 2, rank = 1


‘MPL_get_sockaddr’ uses ‘getaddrinfo’ for host name lookup.
Interestingly, ‘getaddrinfo’ fails in the build environment when passed
the flags that ‘MPL_get_sockaddr’ uses:

(computed-file "getaddrinfo"
               #~(pk #$output
                     (getaddrinfo "localhost" #f
                                  (logior AI_ADDRCONFIG AI_V4MAPPED)
                                  AF_INET
                                  SOCK_STREAM
                                  IPPROTO_TCP)))

However, if you comment out AF_INET, SOCK_STREAM, and IPPROTO_TCP, it works.

Now we need to see why the ‘ai_family’ hint is causing trouble in
glibc, and perhaps in parallel try to work around it in MPICH…
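For what it's worth, the same lookup can be reproduced outside Guile with a
small standalone C program (a hypothetical reproducer, not code from MPICH
itself) that performs ‘getaddrinfo’ once with the hints ‘MPL_get_sockaddr’
uses, and once with the family/type/protocol hints left unset:

```c
/* Hypothetical reproducer: call getaddrinfo ("localhost", ...) twice,
   first with the AI_ADDRCONFIG | AI_V4MAPPED flags plus the
   AF_INET/SOCK_STREAM/IPPROTO_TCP hints that MPL_get_sockaddr uses,
   then with no family, type, or protocol hints at all.  */
#include <stdio.h>
#include <string.h>
#include <sys/types.h>
#include <sys/socket.h>
#include <netdb.h>
#include <netinet/in.h>

static int
lookup (int flags, int family, int socktype, int protocol)
{
  struct addrinfo hints, *res = NULL;
  memset (&hints, 0, sizeof hints);
  hints.ai_flags = flags;
  hints.ai_family = family;
  hints.ai_socktype = socktype;
  hints.ai_protocol = protocol;

  int err = getaddrinfo ("localhost", NULL, &hints, &res);
  if (err != 0)
    printf ("getaddrinfo failed: %s\n", gai_strerror (err));
  else
    {
      printf ("getaddrinfo succeeded\n");
      freeaddrinfo (res);
    }
  return err;
}

int
main (void)
{
  /* The hints MPL_get_sockaddr passes.  */
  lookup (AI_ADDRCONFIG | AI_V4MAPPED, AF_INET, SOCK_STREAM, IPPROTO_TCP);
  /* Same call with the family/type/protocol hints left out.  */
  lookup (AI_ADDRCONFIG | AI_V4MAPPED, AF_UNSPEC, 0, 0);
  return 0;
}
```

In a normal environment both lookups succeed; in the isolated build
environment, the hinted call is the one expected to fail, matching the
behavior of the Scheme snippet above.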

Ludo’.

PS: I’ll be mostly away from keyboard in the coming days.

(use-modules (guix) (gnu))

(define code
  (plain-file "mpi.c" "
#include <assert.h>
#include <stdio.h>
#include <mpi.h>

int main (int argc, char *argv[]) {
  int err, np, rank;
  err = MPI_Init (&argc, &argv);
  assert (err == 0);
  err = MPI_Comm_size(MPI_COMM_WORLD, &np);
  assert (err == 0);
  err = MPI_Comm_rank(MPI_COMM_WORLD, &rank);
  assert (err == 0);
  printf (\"np = %i, rank = %i\\n\", np, rank);
  return 0;
} "))

(define toolchain (specification->package "gcc-toolchain"))
(define mpich (specification->package "mpich"))

(computed-file "mpi-init"
               (with-imported-modules '((guix build utils))
                 #~(begin
                     (use-modules (guix build utils))

                     (setenv "PATH"
                             (string-append #$(file-append toolchain "/bin") ":"
                                            #$(file-append mpich "/bin")))
                     (setenv "CPATH" #$(file-append mpich "/include"))
                     (setenv "LIBRARY_PATH"
                             (string-append #$(file-append mpich "/lib") ":"
                                            #$(file-append toolchain "/lib")))
                     (invoke "mpicc" "-o" #$output "-Wall" "-g"
                             #$code)

                     ;; Run the MPI code in the build environment.
                     (invoke "mpiexec" "-np" "2" #$output))))


[-- Attachment #3: Type: text/plain, Size: 685 bytes --]



Note that the build is OK with the raw mpich patch:
guix time-machine --commit=398ec3c1e265a3f89ed07987f33b264db82e4080 -- time-machine --url=https://gitlab.inria.fr/bremond/guix.git --branch=add-mpich -- build mumps-openmpi --with-input=openmpi=mpich

I tried a build with the same hwloc as the one embedded in mpich
(commit f7b08df258c2e7d04ca2035ddd55a1de91f806d4, the HEAD used for hwloc in mpich), but the result is the same:

guix time-machine --commit=398ec3c1e265a3f89ed07987f33b264db82e4080 -- time-machine --url=https://gitlab.inria.fr/bremond/guix.git --branch=test-mpich -- build mumps-openmpi --with-input=openmpi=mpich

(why a two-step time-machine is needed is another question...)


Maurice


Thread overview: 28+ messages
2020-02-13 10:44 [bug#39588] gnu: Add mpich, scalapack-mpich, mumps-mpich, pt-scotch-mpich, python-mpi4py-mpich Maurice Brémond
2020-02-17 17:26 ` Ludovic Courtès
2020-02-17 18:20   ` zimoun
2020-02-20  9:08     ` Ludovic Courtès
2020-02-20 10:23       ` zimoun
2020-02-21  8:03         ` Ludovic Courtès
2020-02-21  8:40           ` zimoun
2020-02-25 16:41         ` zimoun
2020-10-15 19:50       ` zimoun
2020-10-16  9:32         ` Ludovic Courtès
2020-10-16 11:46           ` zimoun
2020-10-19 13:46             ` Maurice Brémond [this message]
2020-10-20 20:55               ` Ludovic Courtès
2020-10-23  9:33                 ` Maurice Brémond
2020-10-23 15:26                   ` Ludovic Courtès
2020-10-23 17:04                     ` Maurice Brémond
2020-11-02 14:02                       ` bug#39588: " Ludovic Courtès
2020-10-21 14:43               ` [bug#39588] (off-topic) double time-machine explanations zimoun
2020-10-23  8:41                 ` Maurice Brémond
2020-02-18 17:58   ` [bug#39588] gnu: Add mpich, scalapack-mpich, mumps-mpich, pt-scotch-mpich, python-mpi4py-mpich Maurice Brémond
2020-02-18 18:22     ` zimoun
2020-02-19 11:45       ` Maurice Brémond
2020-02-19 12:11         ` zimoun
2020-02-19 13:34     ` zimoun
2020-02-21  9:01       ` Maurice Brémond
2020-02-20  9:38     ` Ludovic Courtès
2020-02-21  8:46       ` Maurice Brémond
2020-02-21 11:32         ` Ludovic Courtès

Code repositories for project(s) associated with this public inbox

	https://git.savannah.gnu.org/cgit/guix.git
