[QE-users] MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD with errorcode 103.
sui xiong tay
taysuixiong at hotmail.com
Thu Jun 30 21:55:13 CEST 2022
Dear QE team and users,
I am using QE version 6.6.0 on an HPC cluster with one Rome CPU @ 2.0 GHz per node (32 cores per node). I am currently looking into the structure attached to this email. However, the run aborted with the error shown below:
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
with errorcode 103.
NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
No CRASH file was produced, yet the code did not appear to run at all.
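The only diagnostic I have is the MPI error code. As I understand it, QE's error handler prints a banner to standard output before invoking MPI_ABORT with the routine's error code, so the underlying routine and message should be greppable from the job log. The snippet below illustrates this on a fabricated sample banner (the routine name and message are hypothetical placeholders, not my actual output):

```shell
# Fabricated sample of the banner QE prints before calling MPI_ABORT;
# "some_routine" and the message are hypothetical placeholders.
cat > qe_out_sample.txt <<'EOF'
 %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
     Error in routine some_routine (103):
     illustrative error message
 %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
EOF

# Pull the banner line and the message that follows it out of the log:
grep -A 1 "Error in routine" qe_out_sample.txt
```

Running the same grep against the real job output (for me, the slurm .out file) would show which routine raised error 103, but in my case the log contains only the MPI_ABORT message above.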
Thank you for your help!
Shaun Tay.
-------------- next part --------------
A non-text attachment was scrubbed...
Name: R-3m
Type: application/octet-stream
Size: 1535 bytes
Desc: R-3m
URL: <http://lists.quantum-espresso.org/pipermail/users/attachments/20220630/9b3fa233/attachment.obj>
-------------- next part --------------
A non-text attachment was scrubbed...
Name: slurm-18318670.out
Type: application/octet-stream
Size: 401 bytes
Desc: slurm-18318670.out
URL: <http://lists.quantum-espresso.org/pipermail/users/attachments/20220630/9b3fa233/attachment-0001.obj>
More information about the users mailing list