Scientific Linux 7.4: undefined symbol: ompi_mpi_logical8 error

I am trying to run a Python package that uses MPI to run tasks in parallel.

As per the package's guidelines, I installed MPICH 4.2.0, followed by mpi4py.

However, when I run their example command, I get the following traceback:

Traceback (most recent call last):
  File "sGC_parameter_estimation.py", line 8, in <module>
    from mpi4py.futures import MPIPoolExecutor
  File "/home/people/hkaya/.local/lib/python3.6/site-packages/mpi4py/futures/__init__.py", line 20, in <module>
    from .pool import MPIPoolExecutor
  File "/home/people/hkaya/.local/lib/python3.6/site-packages/mpi4py/futures/pool.py", line 14, in <module>
    from . import _lib
  File "/home/people/hkaya/.local/lib/python3.6/site-packages/mpi4py/futures/_lib.py", line 20, in <module>
    from .. import MPI
ImportError: /home/people/hkaya/.local/lib/python3.6/site-packages/mpi4py/MPI.cpython-36m-x86_64-linux-gnu.so: undefined symbol: ompi_mpi_logical8

When I run ldd on that .so file, here is the output:

ldd /home/people/hkaya/.local/lib/python3.6/site-packages/mpi4py/MPI.cpython-36m-x86_64-linux-gnu.so
    linux-vdso.so.1 =>  (0x00007ffefd45f000)
    libdl.so.2 => /lib64/libdl.so.2 (0x00002b8515af4000)
    libpython3.6m.so.1.0 => /lib64/libpython3.6m.so.1.0 (0x00002b8515cf9000)
    libpthread.so.0 => /lib64/libpthread.so.0 (0x00002b8516221000)
    libc.so.6 => /lib64/libc.so.6 (0x00002b851643d000)
    /lib64/ld-linux-x86-64.so.2 (0x000055806dd65000)
    libutil.so.1 => /lib64/libutil.so.1 (0x00002b8516801000)
    libm.so.6 => /lib64/libm.so.6 (0x00002b8516a04000)

Running objdump -tT on it gives a very long symbol list, in which ompi_mpi_logical8 appears with an "UND" (undefined) tag.
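
Since the ompi_ prefix belongs to Open MPI, I suspect this particular mpi4py build was compiled against Open MPI rather than against my MPICH install. I believe something along these lines should show which MPI compiler wrapper the imported mpi4py was built with (just a sketch, using mpi4py's get_config helper; it does not load the failing MPI extension itself):

    python3 -c "import mpi4py; print(mpi4py.get_config())"

If the reported mpicc is an Open MPI wrapper rather than /home/people/hkaya/mpich-install/bin/mpicc, that would explain the undefined Open MPI symbol.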

I also tried reinstalling mpi4py as suggested here, which did not resolve the issue. In fact, I noticed that mpi4py was installed into a different directory (/usr/local/lib64/python3.6/site-packages) from the one the package is trying to import MPI from.
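
In case it matters, this is roughly how I would try to force a clean source rebuild of mpi4py against my own MPICH prefix (a sketch only; it assumes pip and the MPICC environment variable that mpi4py's build respects, with the prefix taken from my configure output below):

    MPICC=/home/people/hkaya/mpich-install/bin/mpicc \
        python3 -m pip install --user --force-reinstall --no-cache-dir --no-binary=mpi4py mpi4py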

So I am guessing that there is a clash between two mpi4py installations, but I cannot say for sure.
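
To see which copy Python actually picks up (and in which order it searches), I assume a quick check like this would tell me (sketch):

    python3 -c "import mpi4py, sys; print(mpi4py.__file__); print('\n'.join(sys.path))"

If the copy under ~/.local/lib/python3.6/site-packages comes first on sys.path, that would be the one raising the ImportError, rather than the one in /usr/local/lib64/python3.6/site-packages.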

mpiexec --version gives two different outputs depending on whether I run it as root or as my own user.

As root:

mpiexec (OpenRTE) 4.1.2

Report bugs to http://www.open-mpi.org/community/help/

As my own user (from my home directory):

HYDRA build details:
    Version:                                 4.2.0rc2
    Release Date:                            Tue Jan 16 14:53:48 CST 2024
    CC:                                      gcc -std=gnu99
    Configure options:                       '--disable-option-checking' '--prefix=/home/people/hkaya/mpich-install' '--with-hwloc=embedded' '--cache-file=/dev/null' '--srcdir=/home/people/hkaya/Downloads/mpich-4.2.0rc2/src/pm/hydra' 'CC=gcc -std=gnu99' 'CFLAGS= -O2' 'LDFLAGS=' 'LIBS=' 'CPPFLAGS= -DNETMOD_INLINE=netmod_inline_ofi -I/temp/hkaya/mpich-4.1.2/src/mpl/include -I/home/people/hkaya/Downloads/mpich-4.2.0rc2/src/mpl/include -I/home/people/hkaya/Downloads/mpich-4.2.0rc2/modules/json-c -I/temp/hkaya/mpich-4.1.2/modules/json-c -I/home/people/hkaya/Downloads/mpich-4.2.0rc2/modules/hwloc/include -I/temp/hkaya/mpich-4.1.2/modules/hwloc/include -D_REENTRANT -I/temp/hkaya/mpich-4.1.2/src/mpi/romio/include -I/home/people/hkaya/Downloads/mpich-4.2.0rc2/src/pmi/include -I/temp/hkaya/mpich-4.1.2/src/pmi/include -I/temp/hkaya/mpich-4.1.2/modules/yaksa/src/frontend/include -I/home/people/hkaya/Downloads/mpich-4.2.0rc2/modules/yaksa/src/frontend/include -I/temp/hkaya/mpich-4.1.2/modules/libfabric/include -I/home/people/hkaya/Downloads/mpich-4.2.0rc2/modules/libfabric/include'
    Process Manager:                         pmi
    Launchers available:                     ssh rsh fork slurm ll lsf sge manual persist
    Topology libraries available:            hwloc
    Resource management kernels available:   user slurm ll lsf sge pbs cobalt
    Demux engines available:                 poll select
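
So root apparently picks up Open MPI 4.1.2, while my own user picks up the MPICH 4.2.0rc2 Hydra launcher from ~/mpich-install. I assume checking which toolchain each account sees first would look something like this (sketch; -show is the MPICH wrapper flag, Open MPI's equivalent is --showme):

    which mpiexec mpicc
    mpicc -show    # MPICH wrapper: prints the underlying compile/link command
    echo $PATH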

So, my question is: how can I resolve this conflict?

Thanks in advance
