[Pw_forum] problem with parallel ph.x

Ezad Shojaee ezadshojaee at hotmail.com
Fri Jul 27 10:45:18 CEST 2007


Hi,

I have attached the output file of ph.x (a calculation of the IFCs on a cubic grid). You can see what I was trying to describe in the nscf calculation part at the 2nd point of the grid, where some eigenvalues are zero. What's the matter? I also get this error message later:

MPI_Recv: message truncated (rank 0, comm 9)
Rank (0, MPI_COMM_WORLD): Call stack within LAM:
Rank (0, MPI_COMM_WORLD):  - MPI_Recv()
Rank (0, MPI_COMM_WORLD):  - main()

These are our specs:
  1- machine = AMD Opteron dual-core, 64-bit
  2- OS = CentOS 4.4, 64-bit
  3- compiler = Intel Fortran 9.1
  4- MPI = LAM/MPI 7.1.3
  5- espresso = 3.2

I also get different data depending on the number of pools (npools)! I should say that ph.x runs correctly in serial mode, and I am able to run the scf calculation in parallel mode (with the same results as the serial run). I hope this is enough for debugging (the input is well known from the info above).
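For readers trying to reproduce this, a minimal sketch of the kind of invocations being compared (the input file name, process count, and pool count here are assumptions, not taken from the attached output):

```shell
# Serial run: reported to work correctly.
ph.x < ph.in > ph.serial.out

# Parallel run under LAM/MPI with k-point pools; the poster reports
# that the results change with the number of pools, and that the run
# later aborts with "MPI_Recv: message truncated".
mpirun -np 4 ph.x -npool 2 < ph.in > ph.parallel.out
```

Comparing outputs from different -npool values against the serial run is a quick way to show whether the discrepancy tracks the pool parallelization, as the report suggests.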
-------------- next part --------------
An embedded and charset-unspecified text was scrubbed...
Name: ph.out
URL: <http://lists.quantum-espresso.org/pipermail/users/attachments/20070727/0f1a727e/attachment.ksh>

