[Pw_forum] phcg restart problem

weeliat owl1sg at yahoo.com
Wed Jan 8 21:12:46 CET 2014


Hello,

Does anyone have any insight into this restart problem in phcg v5.0.2, which generated the MPI error mentioned in my earlier thread?

If not, I will probably have to try ph.x or use VASP instead.  Any suggestions would be very much appreciated.

Thanks,

wee liat
Carnegie Mellon University




On Saturday, January 4, 2014 12:56 PM, weeliat <owl1sg at yahoo.com> wrote:
 
Dear All,

Thanks for the reply.  I have since compiled v5.0.2 but got the same error upon restart.

The initial output from the phcg run is shown below; I stopped the run prematurely in order to test the restart capability:


     Program PHCG v.5.0.2 (svn rev. 9392) starts on  3Jan2014 at 13:20:23 

     This program is part of the open-source Quantum ESPRESSO suite
     for quantum simulation of materials; please cite
         "P. Giannozzi et al., J. Phys.:Condens. Matter 21 395502 (2009);
          URL http://www.quantum-espresso.org", 
     in publications or presentations arising from this work. More details at
     http://www.quantum-espresso.org/quote.php

     Parallel version (MPI), running on    32 processors
     R & G space division:  proc/nbgrp/npool/nimage =      32

   Info: using nr1, nr2, nr3 values from input

   Info: using nr1s, nr2s, nr3s values from input

     IMPORTANT: XC functional enforced from input :
     Exchange-correlation      =  SLA  PZ   NOGX NOGC ( 1 1 0 0 0)
     EXX-fraction              =        0.00
     Any further DFT definition will be discarded
     Please, verify this is what you really want

               file H.pz-vbc.UPF: wavefunction(s)  1S renormalized
 
     Parallelization info
     --------------------
     sticks:   dense  smooth     PW     G-vecs:    dense   smooth      PW
     Min        1118    1118    280               159290   159290   19920
     Max        1120    1120    282               159300   159300   19926
     Sum       35781   35781   8969              5097465  5097465  637483
     Tot       17891   17891   4485
 

 ***  Starting Conjugate Gradient minimization         ***
 ***  pol. #   1 : 138 iterations
 ***  pol. #   2 : 138 iterations
 ***  pol. #   3 : 136 iterations

ATOMIC_POSITIONS
Se       0.000244440   0.000055869   0.221532863
Co      -0.000378387   0.106617176   0.320298820
Se       0.142919561   0.083743738   0.333072366
P        0.010001498   0.214824211   0.249814145
     ... (edited)

 ***  Starting Conjugate Gradient minimization         ***
     d2ion: alpha =   0.50
 ***  mode #   1 : using asr
 ***  mode #   2 : using asr
 ***  mode #   3 : using asr
 ***  mode #   4 : 126 iterations
 ***  mode #   5 : 124 iterations
 ***  mode #   6 : 126 iterations
 ***  mode #   7 : 118 iterations
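
For completeness, the restart is driven purely by the input file: the second run uses the same &inputph namelist as the first, with only the recover flag changed. A minimal sketch of what I mean is below; the prefix, outdir and fildyn values are placeholders rather than my actual ones, and I am assuming phcg.x honors recover in &inputph the same way ph.x does:

     &inputph
        prefix  = 'mysystem'
        outdir  = './tmp/'
        fildyn  = 'matdyn'
        recover = .true.
     /

Apart from recover = .true., everything is identical to the first input, so that phcg.x can find the wavefunctions and partial results written before the job was stopped.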


Everything in the initial run seems fine. The output from the restarted phcg.x run shows this:


     Program PHCG v.5.0.2 (svn rev. 9392) starts on  4Jan2014 at 12:29:52 

     This program is part of the open-source Quantum ESPRESSO suite
     for quantum simulation of materials; please cite
         "P. Giannozzi et al., J. Phys.:Condens. Matter 21 395502 (2009);
          URL http://www.quantum-espresso.org", 
     in publications or presentations arising from this work. More details at
     http://www.quantum-espresso.org/quote.php

     Parallel version (MPI), running on    32 processors
     R & G space division:  proc/nbgrp/npool/nimage =      32

   Info: using nr1, nr2, nr3 values from input

   Info: using nr1s, nr2s, nr3s values from input

     IMPORTANT: XC functional enforced from input :
     Exchange-correlation      =  SLA  PZ   NOGX NOGC ( 1 1 0 0 0)
     EXX-fraction              =        0.00
     Any further DFT definition will be discarded
     Please, verify this is what you really want

               file H.pz-vbc.UPF: wavefunction(s)  1S renormalized
 
     Parallelization info
     --------------------
     sticks:   dense  smooth     PW     G-vecs:    dense   smooth      PW
     Min        1118    1118    280               159290   159290   19920
     Max        1120    1120    282               159300   159300   19926
     Sum       35781   35781   8969              5097465  5097465  637483
     Tot       17891   17891   4485
 

The output stops here with an error message in the PBS log:

An error occurred in MPI_Allreduce
on communicator MPI COMMUNICATOR 9 SPLIT FROM 7
MPI_ERR_TRUNCATE: message truncated
MPI_ERRORS_ARE_FATAL (your MPI job will now abort)

I am not sure whether this error is due to my compilers or MPI software (which seems less probable, since the initial run is fine) or whether there is something wrong with the restart function in phcg.
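
From what I understand of MPI, MPI_ERR_TRUNCATE in a collective usually means that one rank posts the operation with a smaller count than another, so an incoming message no longer fits in its receive buffer. A toy Fortran illustration of the kind of mismatch I mean (this is not from the QE sources, just a standalone example) is:

     program allreduce_mismatch
        use mpi
        implicit none
        integer :: ierr, rank, n
        real(8) :: sendbuf(4), recvbuf(4)
        call MPI_Init(ierr)
        call MPI_Comm_rank(MPI_COMM_WORLD, rank, ierr)
        sendbuf = 1.0d0
        ! rank 0 reduces 4 elements, every other rank only 2: the counts
        ! disagree across the communicator and, depending on the MPI
        ! implementation, the error surfaces as MPI_ERR_TRUNCATE
        if (rank == 0) then
           n = 4
        else
           n = 2
        end if
        call MPI_Allreduce(sendbuf, recvbuf, n, MPI_DOUBLE_PRECISION, &
                           MPI_SUM, MPI_COMM_WORLD, ierr)
        call MPI_Finalize(ierr)
     end program allreduce_mismatch

If that reading of the error is right, it would point to the recover path handing MPI_Allreduce arrays of a size different from the fresh run, rather than to the compilers or the MPI library themselves.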


Thanks,

wee liat

