[QE-users] Installation verification

Mahmood Naderan mahmood.nt at gmail.com
Mon Apr 29 15:06:39 CEST 2019


Hi,
I want to check whether I have built QE correctly with OpenMPI 4.0. I ran
the following commands and got this error:

[root at rocks7 q-e-qe-6.4]# ./configure
--prefix=/share/apps/softwares/q-e-qe-6.4
MPIF90=/share/apps/softwares/openmpi-4.0.1/bin/mpif90
...
[root at rocks7 q-e-qe-6.4]# make all
...
[mahmood at rocks7 job]$ /share/apps/softwares/openmpi-4.0.1/bin/mpirun
/share/apps/softwares/q-e-qe-6.4/bin/pw.x
--------------------------------------------------------------------------
As of version 3.0.0, the "sm" BTL is no longer available in Open MPI.

Efficient, high-speed same-node shared memory communication support in
Open MPI is available in the "vader" BTL.  To use the vader BTL, you
can re-run your job with:

    mpirun --mca btl vader,self,... your_mpi_application
--------------------------------------------------------------------------
--------------------------------------------------------------------------
A requested component was not found, or was unable to be opened.  This
means that this component is either not installed or is unable to be
used on your system (e.g., sometimes this means that shared libraries
that the component requires are unable to be found/loaded).  Note that
Open MPI stopped checking at the first component that it did not find.

Host:      rocks7.jupiterclusterscu.com
Framework: btl
Component: sm
--------------------------------------------------------------------------
--------------------------------------------------------------------------
It looks like MPI_INIT failed for some reason; your parallel process is
likely to abort.  There are many reasons that a parallel process can
fail during MPI_INIT; some of which are due to configuration or environment
problems.  This failure appears to be an internal failure; here's some
additional information (which may only be relevant to an Open MPI
developer):

  mca_bml_base_open() failed
  --> Returned "Not found" (-13) instead of "Success" (0)
--------------------------------------------------------------------------
[rocks7:19796] *** An error occurred in MPI_Init
[rocks7:19796] *** reported by process [85917697,3255307776955514882]
[rocks7:19796] *** on a NULL communicator
[rocks7:19796] *** Unknown error
[rocks7:19796] *** MPI_ERRORS_ARE_FATAL (processes in this communicator
will now abort,
[rocks7:19796] ***    and potentially your MPI job)
[rocks7.jupiterclusterscu.com:19784] PMIX ERROR: UNREACHABLE in file
server/pmix_server.c at line 2079
[... the PMIX ERROR message above was repeated 31 times in total ...]
[rocks7.jupiterclusterscu.com:19784] 31 more processes have sent help
message help-mpi-btl-sm.txt / btl sm is dead
[rocks7.jupiterclusterscu.com:19784] Set MCA parameter
"orte_base_help_aggregate" to 0 to see all help / error messages
[rocks7.jupiterclusterscu.com:19784] 8 more processes have sent help
message help-mca-base.txt / find-available:not-valid
[rocks7.jupiterclusterscu.com:19784] 4 more processes have sent help
message help-mpi-runtime.txt / mpi_init:startup:internal-failure
[rocks7.jupiterclusterscu.com:19784] 3 more processes have sent help
message help-mpi-errors.txt / mpi_errors_are_fatal unknown handle



Is that normal, or does pw.x need an input file? I also tried with a sample
input file but got the same error.
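For reference, the first help box in the log suggests selecting the vader
BTL explicitly. A minimal sketch of that workaround, applied to the paths
above (the trailing "..." in the original hint, i.e. any extra BTLs such as
tcp, is left out here and may be needed on a multi-node run):

```shell
# Sketch only: force the same-node shared-memory transport to vader
# instead of the removed "sm" BTL, as the Open MPI help text suggests.
/share/apps/softwares/openmpi-4.0.1/bin/mpirun \
    --mca btl vader,self \
    /share/apps/softwares/q-e-qe-6.4/bin/pw.x
```

Note that pw.x still expects an input file on stdin (e.g. `< scf.in`); run
without one it will stop, but it should get past MPI_Init first.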


Regards,
Mahmood