Dear PWscf users,

I am getting an MPI error during a PWCOND run at a particular point. The standard output gives:

MPI_Allreduce: invalid communicator: Unknown error 2064 (rank 0, comm 16)
Rank (0, MPI_COMM_WORLD) : Call stack within LAM:
Rank (0, MPI_COMM_WORLD) : -MPI_Allreduce()
Rank (0, MPI_COMM_WORLD) : -main()
------------------------------------------------------
One of the processes started by mpirun has exited with a nonzero exit........
...
..
..
PID 16077 failed on node n1 (192.168.0.12) with exit status 1.
-----------------------------

I am running espresso-4.0, installed with LAM/MPI 7.1.4 and ifort 10.1, on dual-CPU quad-core Xeon machines. There are 2 machines running in parallel, so 4 CPUs (16 cores) are available in total. However, this calculation was run on 4 cores only. The scf and relax calculations do not crash. One more thing to note: I recently added RAM to both machines, and after the upgrade I did not reinstall or recompile any software, including LAM/MPI.
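(For reference, a minimal MPI_Allreduce test along the following lines can help check whether the LAM/MPI installation itself still works across both nodes after the hardware change. This is only a sketch; the file name, executable name, and launch command are illustrative, compiled with mpicc and started with mpirun.)

/* allreduce_test.c: minimal LAM/MPI sanity check.
 * Each rank contributes its rank number and MPI_Allreduce
 * sums the values over MPI_COMM_WORLD.
 *
 * Compile: mpicc allreduce_test.c -o allreduce_test
 * Run:     mpirun -np 4 ./allreduce_test
 */
#include <stdio.h>
#include <mpi.h>

int main(int argc, char **argv)
{
    int rank, size, sum;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    /* Sum all rank numbers; the expected result is size*(size-1)/2 */
    MPI_Allreduce(&rank, &sum, 1, MPI_INT, MPI_SUM, MPI_COMM_WORLD);

    if (rank == 0)
        printf("Allreduce over %d ranks gave %d (expected %d)\n",
               size, sum, size * (size - 1) / 2);

    MPI_Finalize();
    return 0;
}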
Note: searching the forum archives, I found a post by Derek:
http://www.democritos.it/pipermail/pw_forum/2005-November/003219.html
which indicates this might be a problem related to MPI or def.h.

Thank you.

Regards,
Sagar Ambavale
PhD Student
The M.S. Uni. of Baroda
India