[QE-users] High k-point density memory estimates
Eric Glen Suter
esuter at uga.edu
Tue Oct 22 18:10:17 CEST 2019
Hi all,
I'm running jobs with a relatively high number of k-points in QE 6.2. For a while I was using the dynamical memory estimate printed by the pw.x executable as a guidepost for requesting memory on a cluster, and this worked well when I had a relatively small number of k-points. But it turns out that for these dense grids I need significantly more memory than the estimate indicates: the estimate says a few hundred MB, while the actual resources used are on the order of several tens of GB.
I find in the user manual an estimate of the number of double-precision complex floating-point numbers that would be needed,

    O = m*M*N + P*N + p*N1*N2*N3 + q*Nr1*Nr2*Nr3

and this seems to give an estimate on the order of what the output file says.
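For concreteness, this is the rough back-of-the-envelope calculation I've been doing with that formula. The prefactors m, p, q and the example numbers are just placeholders I picked for illustration, not values taken from the manual or from an actual run:

    # Rough memory estimate from the user-guide formula, in bytes.
    # Inputs would come from the pw.x output of a comparable run:
    #   M          - number of plane waves per k-point
    #   N          - number of Kohn-Sham states
    #   P          - number of plane waves for the charge density
    #   n1, n2, n3 - smooth FFT grid dimensions
    #   r1, r2, r3 - dense FFT grid dimensions
    # m, p, q are the small prefactors in the formula; the defaults
    # below are placeholders, not values from the manual.

    BYTES_PER_COMPLEX = 16  # one double-precision complex number

    def estimate_bytes(M, N, P, n1, n2, n3, r1, r2, r3, m=3, p=2, q=2):
        """Evaluate O = m*M*N + P*N + p*n1*n2*n3 + q*r1*r2*r3 in complex doubles."""
        words = m * M * N + P * N + p * n1 * n2 * n3 + q * r1 * r2 * r3
        return words * BYTES_PER_COMPLEX

    if __name__ == "__main__":
        # Made-up example numbers, only to show the order of magnitude:
        est = estimate_bytes(M=40000, N=200, P=300000,
                             n1=96, n2=96, n3=96,
                             r1=144, r2=144, r3=144)
        print(f"estimated dynamical RAM per process: {est / 1024**3:.2f} GB")

This kind of estimate comes out close to what pw.x prints, but nowhere near what the dense-k-point jobs actually use.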
Is there something I'm missing that goes into determining how much memory a job should take? Mostly I'd like to know whether there's a good way to predict how much memory future jobs will need, so I can make a more educated guess when I request memory.
Any insights you can offer are greatly appreciated.
Best regards,
Eric Suter
----------------------------
PhD Candidate, Dept. of Physics and Astronomy
Center for Simulational Physics
University of Georgia
----------------------------
email: esuter at uga.edu
phone: 912-856-3071