[QE-developers] Problem with parallel phonon calculation in quantum-espresso/6.2.1
Armin Taheri
armin.taheri at utoronto.ca
Fri May 4 19:07:32 CEST 2018
Hello,
I would like to report a problem I am running into with quantum-espresso/6.2.1. I used the following commands to run a parallel phonon calculation:
1. mpirun -np 40 pw.x -nk 2 < scf.in > scf.out
2. mpirun -np 320 ph.x -ni 8 -nk 2 < ph.in > ph.out
3. mpirun -np 40 ph.x -nk 2 < ph.in > ph.out
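(For reference, in command 2 the -ni 8 flag splits the 320 MPI processes into 8 images of 40 processes each, and -nk 2 further divides each image into 2 k-point pools, i.e. 320 / 8 / 2 = 20 processes per pool for the R & G space division.)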
For the last run (command 3) I set recover=.true. in ph.in; a rough sketch of that input is included after the error output below. However, I am getting the following error in the ph.out output:
Program PHONON v.6.2 (svn rev. 14038) starts on 4May2018 at 12:40:33
This program is part of the open-source Quantum ESPRESSO suite
for quantum simulation of materials; please cite
"P. Giannozzi et al., J. Phys.:Condens. Matter 21 395502 (2009);
"P. Giannozzi et al., J. Phys.:Condens. Matter 29 465901 (2017);
URL http://www.quantum-espresso.org",
in publications or presentations arising from this work. More details at
http://www.quantum-espresso.org/quote
Parallel version (MPI & OpenMP), running on 80 processor cores
Number of MPI processes: 40
Threads/MPI process: 2
MPI processes distributed on 1 nodes
K-points division: npool = 2
R & G space division: proc/nbgrp/npool/nimage = 20
Reading data from directory:
./_ph0/beta-arsenene.save/
IMPORTANT: XC functional enforced from input :
Exchange-correlation = PBE ( 1 4 3 4 0 0)
Any further DFT definition will be discarded
Please, verify this is what you really want
Parallelization info
--------------------
sticks:   dense  smooth     PW     G-vecs:    dense   smooth      PW
Min          56      45     14                 8416     6017    1088
Max          57      47     15                 8431     6025    1109
Sum        1135     913    295               168399   120441   21889
negative rho (up, down): 2.474E-05 0.000E+00
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
Error in routine phq_readin (1):
pw.x run with a different number of processors. Use wf_collect=.true.
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
stopping ...
[the same error block is repeated three more times in ph.out]
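For context, a minimal ph.in for this kind of recover run would look roughly like the sketch below; the title line, prefix, fildyn and q-point mesh are placeholders, not my exact settings:

    phonons of beta-arsenene (recover run)      ! placeholder title line
    &inputph
       prefix  = 'beta-arsenene'      ! placeholder; must match the pw.x prefix
       outdir  = './'
       fildyn  = 'beta-arsenene.dyn'  ! placeholder dynamical-matrix file name
       tr2_ph  = 1.0d-14
       ldisp   = .true.
       nq1 = 4, nq2 = 4, nq3 = 1      ! placeholder q-point mesh
       recover = .true.               ! restart from the partial results of command 2
    /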
I do not know why I am getting this error, since the number of processors for pw.x and ph.x is the same (commands 1 and 3). I would really appreciate it if you could let me know whether this is a bug or whether I am doing something wrong. Below are also the first lines of the pw.x output, showing the number of processors:
Program PWSCF v.6.2 (svn rev. 14038) starts on 3May2018 at 15:31:57
This program is part of the open-source Quantum ESPRESSO suite
for quantum simulation of materials; please cite
"P. Giannozzi et al., J. Phys.:Condens. Matter 21 395502 (2009);
"P. Giannozzi et al., J. Phys.:Condens. Matter 29 465901 (2017);
URL http://www.quantum-espresso.org",
in publications or presentations arising from this work. More details at
http://www.quantum-espresso.org/quote
Parallel version (MPI & OpenMP), running on 80 processor cores
Number of MPI processes: 40
Threads/MPI process: 2
MPI processes distributed on 1 nodes
K-points division: npool = 2
R & G space division: proc/nbgrp/npool/nimage = 20
Waiting for input...
Reading input from standard input
Current dimensions of program PWSCF are:
Max number of different atomic species (ntypx) = 10
Max number of k-points (npk) = 40000
Max angular momentum in pseudopotentials (lmaxx) = 3
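In case it is relevant, the workaround the error message points to would presumably be rerunning pw.x with wf_collect=.true. in the &control namelist of scf.in, along the lines of the sketch below (prefix, outdir and pseudo_dir are placeholders, not my actual values; the remaining namelists and cards would be unchanged):

    &control
       calculation = 'scf'
       prefix      = 'beta-arsenene'  ! placeholder
       outdir      = './'
       pseudo_dir  = './pseudo/'      ! placeholder
       wf_collect  = .true.           ! write wavefunctions in a portable, processor-independent format
    /

But since commands 1 and 3 use the same number of processors, pools and images, I do not understand why this should be needed.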
Best regards,
- Armin