[QE-users] error in postprocessing with dos.x and projwfc.x QE 7.4.1
giuseppe.mattioli at mlib.ism.cnr.it
Mon Dec 22 14:02:20 CET 2025
Dear users and developers
I've completed, apparently without errors, a huge calculation with QE
7.4.1, which exited with the proper closing message:
PWSCF : 20d15h10m CPU 22d 8h42m WALL
This run was terminated on: 6:47: 8 22Dec2025
=------------------------------------------------------------------------------=
JOB DONE.
=------------------------------------------------------------------------------=
Now I want to post-process the results with dos.x and projwfc.x, as
I've done many times before. I haven't tried pp.x yet.
However, dos.x and projwfc.x both stop and complain.
This is the output of dos.x:
Program DOS v.7.4.1 starts on 22Dec2025 at 13:57:17
This program is part of the open-source Quantum ESPRESSO suite
for quantum simulation of materials; please cite
"P. Giannozzi et al., J. Phys.:Condens. Matter 21 395502
(2009);
"P. Giannozzi et al., J. Phys.:Condens. Matter 29 465901
(2017);
"P. Giannozzi et al., J. Chem. Phys. 152 154105 (2020);
URL http://www.quantum-espresso.org",
in publications or presentations arising from this work. More
details at
http://www.quantum-espresso.org/quote
Parallel version (MPI), running on 48 processors
MPI processes distributed on 1 nodes
R & G space division: proc/nbgrp/npool/nimage = 48
877109 MiB available memory on the printing compute node when the
environment starts
Reading xml data from directory:
/scratch/giuseppe/ag+mgo+pc/ag+mgo+cupc/run/bck_tmp/ag+mgo+cupc-otop20-vdwdf2ahbr-4l-esm-aocc.save/
Message from routine qes_read:parallel_infoType:
error reading nprocs
Message from routine qes_read:parallel_infoType:
error reading nthreads
Message from routine qes_read:parallel_infoType:
error reading ntasks
Message from routine qes_read:parallel_infoType:
error reading nbgrp
Message from routine qes_read:parallel_infoType:
error reading npool
Message from routine qes_read:parallel_infoType:
error reading ndiag
Message from routine qexsd_readschema :
error in parallel_info of xsd data file
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
Error in routine read_xml_file (3):
fatal error reading xml file
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
stopping ...
and this is the analogous output of projwfc.x:
Program PROJWFC v.7.4.1 starts on 22Dec2025 at 13:57:19
This program is part of the open-source Quantum ESPRESSO suite
for quantum simulation of materials; please cite
"P. Giannozzi et al., J. Phys.:Condens. Matter 21 395502
(2009);
"P. Giannozzi et al., J. Phys.:Condens. Matter 29 465901
(2017);
"P. Giannozzi et al., J. Chem. Phys. 152 154105 (2020);
URL http://www.quantum-espresso.org",
in publications or presentations arising from this work. More
details at
http://www.quantum-espresso.org/quote
Parallel version (MPI), running on 48 processors
MPI processes distributed on 1 nodes
R & G space division: proc/nbgrp/npool/nimage = 48
877037 MiB available memory on the printing compute node when the
environment starts
Reading xml data from directory:
/scratch/giuseppe/ag+mgo+pc/ag+mgo+cupc/run/bck_tmp/ag+mgo+cupc-otop20-vdwdf2ahbr-4l-esm-aocc.save/
Message from routine qes_read:parallel_infoType:
error reading nprocs
Message from routine qes_read:parallel_infoType:
error reading nthreads
Message from routine qes_read:parallel_infoType:
error reading ntasks
Message from routine qes_read:parallel_infoType:
error reading nbgrp
Message from routine qes_read:parallel_infoType:
error reading npool
Message from routine qes_read:parallel_infoType:
error reading ndiag
Message from routine qexsd_readschema :
error in parallel_info of xsd data file
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
Error in routine read_xml_file (3):
fatal error reading xml file
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
stopping ...
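As a first diagnostic step (my own sketch, not an official QE tool), one could check whether the parallel_info element of the data-file-schema.xml file in the .save directory is present and contains the integer fields the error messages complain about. The field names below are taken directly from the qes_read messages above; the file path in the usage comment is illustrative.

```python
# Hypothetical sketch: verify that <parallel_info> in QE's data-file-schema.xml
# contains the integer fields dos.x/projwfc.x failed to read (names taken from
# the qes_read error messages above).
import xml.etree.ElementTree as ET

EXPECTED = ["nprocs", "nthreads", "ntasks", "nbgrp", "npool", "ndiag"]

def check_parallel_info(xml_text):
    """Return the list of expected fields that are missing or non-integer."""
    root = ET.fromstring(xml_text)
    info = root.find(".//parallel_info")
    if info is None:
        return EXPECTED[:]  # the whole element is absent
    bad = []
    for tag in EXPECTED:
        node = info.find(tag)
        # flag the field if the tag is missing, empty, or not an integer
        if node is None or node.text is None or not node.text.strip().lstrip("-").isdigit():
            bad.append(tag)
    return bad

if __name__ == "__main__":
    # Example usage against the real file (path is illustrative):
    # with open("prefix.save/data-file-schema.xml") as f:
    #     print(check_parallel_info(f.read()))
    sample = "<qes><parallel_info><nprocs>48</nprocs></parallel_info></qes>"
    print(check_parallel_info(sample))
```

If the script reports missing or malformed fields, the XML header was likely truncated or corrupted when the run ended, which would explain why the post-processing codes abort while the wavefunction and charge-density files may still be intact.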
What can I do? I'm a bit nervous because the main calculation ran for 22
days...
Thank you in advance for your attention and for any hints and tips.
Giuseppe