[Pw_forum] dos.x issues
Wilbert James Futalan
wilbert.james.futalan at gmail.com
Sun Nov 12 06:41:38 CET 2017
Good day Paolo,
Thanks for responding. dos.x does allow parallelization. To prove my point,
and as a workaround to this problem that has been bothering me for quite
some time now, I ran it on the same cluster but using version 5.4.0.
(I also tried a single-thread, single-processor run, as suggested in
http://qe-forge.org/pipermail/pw_forum/2015-September/107940.html; a sketch
of that invocation is below.)
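For reference, the serial run just pins both MPI and OpenMP to one task; a
minimal sketch, assuming a bash shell and placeholder input/output file names:

  export OMP_NUM_THREADS=1                        # one OpenMP thread
  mpirun -np 1 dos.x < dos.in > dos.serial.out    # one MPI process

(dos.x reads its namelist from standard input, so no extra flags are needed.)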
Program DOS v.5.4.0 starts on 12Nov2017 at 14:22:57
This program is part of the open-source Quantum ESPRESSO suite
for quantum simulation of materials; please cite
"P. Giannozzi et al., J. Phys.:Condens. Matter 21 395502 (2009);
URL http://www.quantum-espresso.org",
in publications or presentations arising from this work. More details at
http://www.quantum-espresso.org/quote
Parallel version (MPI & OpenMP), running on 12 processor cores
Number of MPI processes: 4
Threads/MPI process: 3
R & G space division: proc/nbgrp/npool/nimage = 4
Info: using nr1, nr2, nr3 values from input
Info: using nr1, nr2, nr3 values from input
IMPORTANT: XC functional enforced from input :
Exchange-correlation = PBE ( 1 4 3 4 0 0)
Any further DFT definition will be discarded
Please, verify this is what you really want
file C.pbe-n-kjpaw_psl.0.1.UPF: wavefunction(s) 2P renormalized
Parallelization info
--------------------
sticks:   dense  smooth     PW     G-vecs:    dense   smooth      PW
Min             268     108     37                 5896     1489      284
Max             269     109     38                 5899     1509      285
Sum            1075     433    151                23589     5985     1139
Check: negative/imaginary core charge= -0.000004 0.000000
Gaussian broadening (default values): ngauss,degauss= 0 0.003675
DOS : 1.05s CPU 0.82s WALL
This run was terminated on: 14:22:57 12Nov2017
=------------------------------------------------------------------------------=
JOB DONE.
=------------------------------------------------------------------------------=
But this workaround still does not address the root cause of the
problem. I would like to use the newest version of QE as much as
possible, but this issue really gets on my nerves. Until it works
properly, I may have to stick to older releases.
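For completeness, the dos.x input involved is just a &dos namelist; a minimal
sketch (prefix, outdir, and the energy window are placeholders, and
ngauss/degauss are left at the defaults echoed in the output above):

  &dos
     prefix = 'mysystem'      ! placeholder: must match the pw.x prefix
     outdir = './tmp'         ! placeholder: pw.x scratch directory
     fildos = 'mysystem.dos'  ! output file for the DOS
     Emin = -10.0, Emax = 10.0, DeltaE = 0.01
  /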
Date: Fri, 10 Nov 2017 16:14:25 +0100
From: Paolo Giannozzi <p.giannozzi at gmail.com>
Subject: Re: [Pw_forum] dos.x issues
To: PWSCF Forum <pw_forum at pwscf.org>
I would first of all try to run it on a single processor. I don't think
dos.x is parallelized.
Paolo
On Fri, Nov 10, 2017 at 1:47 PM, Wilbert James Futalan
<wilbert.james.futalan at gmail.com> wrote:
> Hi, I'm having some trouble running dos.x on our cluster. I tried running the very same pw.x (scf and nscf) and dos.x jobs on my own personal computer, and they both seem to work just fine. Running on the cluster, however, gives this error:
>
> Parallel version (MPI & OpenMP), running on 12 processor cores
> Number of MPI processes: 4
> Threads/MPI process: 3
> R & G space division: proc/nbgrp/npool/nimage = 4
>
> Info: using nr1, nr2, nr3 values from input
>
> Info: using nr1, nr2, nr3 values from input
> rank 0 in job 1 piv01_49932 caused collective abort of all ranks
> exit status of rank 0: return code 174
>
> What do you think could be wrong? I'm using QE 6.1 on the cluster but 5.4 on my computer.
>
> Thanks.
>
>
>
> --
> Wilbert James C. Futalan
> Research Fellow I
> Laboratory of Electrochemical Engineering
> Department of Chemical Engineering
> University of the Philippines - Diliman
>
--
Paolo Giannozzi, Dip. Scienze Matematiche Informatiche e Fisiche,
Univ. Udine, via delle Scienze 208, 33100 Udine, Italy
Phone +39-0432-558216, fax +39-0432-558222
--
Engr. Wilbert James C. Futalan
Research Fellow I
Laboratory of Electrochemical Engineering
Department of Chemical Engineering
University of the Philippines - Diliman