[Q-e-developers] MPI problems with ELPA
Gabriele Sclauzero
gabriele.sclauzero at epfl.ch
Tue Oct 22 17:35:32 CEST 2013
Dear all,
I've recently started using ELPA in place of Scalapack for large-scale calculations and I indeed see a very good performance improvement.
Unfortunately, I've found a recurring problem when running relax calculations (though I believe it is not due to the relaxation itself). The program crashes because of some MPI-related issue. The error message from the system looks as follows:
Abort(1) on node 1732 (rank 1732 in comm 1140850688): Fatal error in PMPI_Comm_split: Other MPI error, error stack:
PMPI_Comm_split(474).................: MPI_Comm_split(comm=0xc4000004, color=2, key=27, new_comm=0x1fffff7478) failed
PMPI_Comm_split(456).................:
MPIR_Comm_split_impl(228)............:
MPIR_Get_contextid_sparse_group(1071): Too many communicators
The crash happens during the Davidson diagonalization after a few ionic cycles of the relaxation (after roughly 200 Davidson diagonalizations). It happens both with v.5.0.3 and with the latest SVN revision. If I compile with Scalapack in place of ELPA, both versions work fine (but are slower...).
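For context, here is my guess at the mechanism (not something I have verified in the ELPA sources): MPICH-based stacks, like the one on BG/Q, draw communicator context IDs from a fixed pool of about 2048, so if a communicator is created with MPI_Comm_split at every diagonalization and never freed, the pool runs dry after roughly that many calls, which would match the ~200 diagonalizations times several splits each. A minimal sketch of that leak pattern and its fix (all names here are illustrative, not ELPA code):

```c
/* Sketch of the suspected communicator leak; compile with mpicc. */
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int rank;
    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    /* Each iteration mimics one diagonalization that sets up its own
     * row communicator for the eigensolver. */
    for (int iter = 0; iter < 5000; ++iter) {
        MPI_Comm row_comm;
        MPI_Comm_split(MPI_COMM_WORLD, rank / 2, rank, &row_comm);

        /* ... use row_comm inside the eigensolver ... */

        /* Without this call, MPICH exhausts its context IDs after
         * ~2048 splits and aborts with "Too many communicators". */
        MPI_Comm_free(&row_comm);
    }

    if (rank == 0)
        printf("no communicator leak\n");
    MPI_Finalize();
    return 0;
}
```

If this is indeed the pattern, the fix would be an MPI_Comm_free on the ELPA row/column communicators after each diagonalization rather than only at program exit.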
Compilation details (see also attached make.sys):
BG/Q machine, XLF 14.1, XLC 12.1, ESSL 5.1, Scalapack 2.0.2
./configure --enable-openmp --with-elpa
The calculation was run on 256 nodes with the following command line:
runjob -n 2048 -p 8 --envs OMP_NUM_THREADS=4 --cwd [...] : /home/sclauzer/Codes/espresso/5.0.3_ELPA/bin/pw.x -nband 1 -npool 1 -ndiag 1024 -ntg 4 -in [...]
The system is quite large, a slab with ~1400 atoms and ~3000 bands. I don't know whether the problem would show up for a smaller system or on fewer nodes, but I can try to provide a smaller example to make the problem easier to investigate. Hopefully, this is something simple that the ELPA/Scalapack and BG/Q experts among you can spot at a glance.
Best,
Gabriele
-------------- next part --------------
A non-text attachment was scrubbed...
Name: make.sys
Type: application/octet-stream
Size: 5099 bytes
Desc: not available
URL: <http://lists.quantum-espresso.org/pipermail/developers/attachments/20131022/b9bdc35b/attachment.obj>