Description
I am building an MPI Fortran program using a pretty basic configuration. I have tried this build on three systems, and on two of them this problem occurs (system info and fpm.toml below). By all indications, the build succeeds. However, when you try to run the executable under MPI, mpi_init just does nothing and the MPI environment is never initialized.
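For reference, any program of roughly this shape shows the symptom (a minimal free-form sketch, not the actual emu source; the program name is made up):

```fortran
! mpi_check.f90 -- hypothetical minimal reproducer, not the real emu code
program mpi_check
  use mpi
  implicit none
  integer :: ierr, rank, nprocs

  ! on the affected builds this call appears to do nothing,
  ! so the MPI environment never comes up
  call mpi_init(ierr)
  call mpi_comm_rank(MPI_COMM_WORLD, rank, ierr)
  call mpi_comm_size(MPI_COMM_WORLD, nprocs, ierr)

  ! a correctly linked build under "mpiexec -n 4" prints four distinct ranks
  print *, 'rank', rank, 'of', nprocs

  call mpi_finalize(ierr)
end program mpi_check
```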
```
$ fpm build --flag '-ffree-line-length-none'
+ which mpiexec
/usr/bin/mpiexec
emuinit.F done.
[...]
core_loop.F done.
libemu.a done.
emu done.
[100%] Project compiled successfully.
```
Looking at the library dependencies of the executable produced by fpm shows that the MPI libraries are missing:
```
$ ldd build/gfortran_735EAD39D1DF8EB7/app/emu
        linux-vdso.so.1 (0x00007ffcfed4e000)
        libgfortran.so.5 => /lib/x86_64-linux-gnu/libgfortran.so.5 (0x00007f7e6cd09000)
        libm.so.6 => /lib/x86_64-linux-gnu/libm.so.6 (0x00007f7e6cc22000)
        libgcc_s.so.1 => /lib/x86_64-linux-gnu/libgcc_s.so.1 (0x00007f7e6cc02000)
        libc.so.6 => /lib/x86_64-linux-gnu/libc.so.6 (0x00007f7e6c9d8000)
        libquadmath.so.0 => /lib/x86_64-linux-gnu/libquadmath.so.0 (0x00007f7e6c990000)
        /lib64/ld-linux-x86-64.so.2 (0x00007f7e6dbf9000)
```
Building the same code in the same environment with make produces an executable with the MPI libraries linked:
```
$ ldd bin/emu
        linux-vdso.so.1 (0x00007ffeed9ad000)
        libmpi_mpifh.so.40 => /lib/x86_64-linux-gnu/libmpi_mpifh.so.40 (0x00007f55001ea000)
        libgfortran.so.5 => /lib/x86_64-linux-gnu/libgfortran.so.5 (0x00007f54fff0f000)
        libm.so.6 => /lib/x86_64-linux-gnu/libm.so.6 (0x00007f54ffe28000)
        libgcc_s.so.1 => /lib/x86_64-linux-gnu/libgcc_s.so.1 (0x00007f54ffe08000)
        libc.so.6 => /lib/x86_64-linux-gnu/libc.so.6 (0x00007f54ffbde000)
        libmpi.so.40 => /lib/x86_64-linux-gnu/libmpi.so.40 (0x00007f54ffaa7000)
        libopen-pal.so.40 => /lib/x86_64-linux-gnu/libopen-pal.so.40 (0x00007f54ff9f4000)
        libquadmath.so.0 => /lib/x86_64-linux-gnu/libquadmath.so.0 (0x00007f54ff9ac000)
        /lib64/ld-linux-x86-64.so.2 (0x00007f5500628000)
        libopen-rte.so.40 => /lib/x86_64-linux-gnu/libopen-rte.so.40 (0x00007f54ff8ef000)
        libhwloc.so.15 => /lib/x86_64-linux-gnu/libhwloc.so.15 (0x00007f54ff891000)
        libevent_core-2.1.so.7 => /lib/x86_64-linux-gnu/libevent_core-2.1.so.7 (0x00007f54ff85c000)
        libevent_pthreads-2.1.so.7 => /lib/x86_64-linux-gnu/libevent_pthreads-2.1.so.7 (0x00007f54ff857000)
        libz.so.1 => /lib/x86_64-linux-gnu/libz.so.1 (0x00007f54ff83b000)
        libudev.so.1 => /lib/x86_64-linux-gnu/libudev.so.1 (0x00007f54ff811000)
```
Expected Behaviour
The build should not report success when the MPI libraries fail to link (or whatever the true underlying problem is). Ideally, the MPI libraries would always link correctly, but if something prevents that, fpm should report an error rather than silently produce a broken executable.
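Until then, a runtime guard can at least make the failure loud. Here is a hedged sketch using the standard MPI_Initialized query (whether it catches this exact failure depends on what the unlinked mpi_init actually resolves to):

```fortran
! mpi_guard.f90 -- hypothetical runtime guard, not part of the original program
program mpi_guard
  use mpi
  implicit none
  integer :: ierr
  logical :: initialized

  call mpi_init(ierr)

  ! MPI_Initialized is a standard query; if mpi_init silently
  ! did nothing, this should come back .false.
  call mpi_initialized(initialized, ierr)
  if (.not. initialized) then
     write (*, '(a)') 'FATAL: mpi_init did not initialize the MPI environment'
     stop 1
  end if

  call mpi_finalize(ierr)
end program mpi_guard
```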
Version of fpm
0.9.0
Platform and Architecture
Ubuntu 22.04
Additional Information
Good system:
- gfortran 9.3.1 and 4.8.5
- Intel MPI 2021.2.0
- CentOS 7.9.2009
Bad system 1:
- gfortran 9.5.0
- OpenMPI 4.1.2
- Ubuntu 22.04
Bad system 2:
- gfortran 9.4.0
- Intel MPI 2021.2.0
- Linux Mint 17.1
Bad system 2 is really old, so I was ready to blame that until the same thing happened on a separate, up-to-date system.
Here is the fpm.toml used in all cases:
```toml
name = "emu"

[[executable]]
name = "emu"
source-dir = "src"
main = "emu.F"

[dependencies]
mpi = "*"

[fortran]
implicit-typing = true
implicit-external = true

[preprocess]
[preprocess.cpp]
suffixes = ["F"]
macros = ["EMU_MPI"]
```
Just in case, I also tested without the preprocessing, which had no effect.
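For context, the EMU_MPI macro guards the MPI-specific paths in the .F sources, roughly along these lines (a hypothetical sketch; the subroutine name is made up and the actual sources are not shown here):

```fortran
C     hypothetical sketch of an EMU_MPI guard in a fixed-form .F file
      subroutine emu_startup(myrank, ierr)
      implicit none
      integer myrank, ierr
#ifdef EMU_MPI
      include 'mpif.h'
C     MPI build: initialize and fetch this process's rank
      call mpi_init(ierr)
      call mpi_comm_rank(MPI_COMM_WORLD, myrank, ierr)
#else
C     serial fallback when the macro is not defined
      myrank = 0
      ierr = 0
#endif
      end
```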