[QE-users] Error in routine ggen too many g-vectors
Paolo Giannozzi
p.giannozzi at gmail.com
Fri Apr 27 21:31:07 CEST 2018
This is what I get from the latest version (1 proc, but the result is
the same on 64):
G-vector sticks info
--------------------
sticks:    dense  smooth      PW    G-vecs:    dense   smooth      PW
Sum         1417    1159     349              230125   165041   27287
So 226332 G-vectors is definitely not correct. I vaguely remember that a
similar problem was fixed some time ago.
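As a rough sanity check, the expected dense-grid count can be estimated from
the cell volume and ecutrho using the free-electron sphere formula
N ~ V * Ecut^(3/2) / (6*pi^2) in Rydberg atomic units; the short sketch below
uses the lattice parameters and cutoff from the input further down and is only
an order-of-magnitude estimate, not what ggen actually computes:

import math

bohr = 0.529177                            # Angstrom per bohr
a, c = 3.82 / bohr, 20.0 / bohr            # hexagonal lattice constants (bohr)
volume = math.sqrt(3.0) / 2.0 * a**2 * c   # hexagonal cell volume (bohr^3)

ecutrho = 400.0                            # density cutoff (Ry)
kmax = math.sqrt(ecutrho)                  # bohr^-1 (Rydberg units: E = k^2)

# plane waves inside the sphere |G| < kmax
n_est = volume * kmax**3 / (6.0 * math.pi**2)
print(f"estimated dense-grid G-vectors: {n_est:.0f}")   # roughly 230000

That lands close to the 230125 above and well above the 226331/226332 of the
failing run.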
Paolo
On Fri, Apr 27, 2018 at 5:21 PM, Thomas Brumme <thomas.brumme at uni-leipzig.de> wrote:
> Dear Sohail Ahmad,
>
> I don't know the origin of the "too many g-vectors"
> error, but I see several errors in your input.
>
> - you want to calculate a 2-dimensional system, so why do
> you need k-points in the z direction? You should actually see
> no dispersion at all in this direction if you set up your
> system correctly.
>
> - as mentioned in the dipole example for PP and in all papers
> using such a correction, the dipole needs to be in the vacuum
> region, i.e., there should be no interaction between the dipole
> and the system. Your system starts at the origin and the dipole
> region runs from 0.9 to 1.1 (i.e., it wraps around to 0.1) - it is
> not only too close, it lies within the system (see the short check
> after this list).
>
> - if your system does not converge within - let's say - 300
> electronic steps, you have a real problem; 900 makes no sense.
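>
> A quick way to see the overlap mentioned above, using the numbers from the
> input below (the fractional-coordinate bookkeeping is only an illustration,
> not how pw.x checks this):
>
> c = 20.0                                    # cell height (Angstrom)
> z_atoms = [0.0, 1.282624049, -1.282624049,  # slab z coordinates (Angstrom)
>            5.0, 6.282624049, 3.717375951]
> emaxpos, eopreg = 0.9, 0.2                  # from the &system namelist
>
> frac = sorted((z / c) % 1.0 for z in z_atoms)
> lo, hi = emaxpos, (emaxpos + eopreg) % 1.0  # window 0.9 -> 0.1, wraps through z = 0
>
> print("atoms at fractional z:", [round(f, 3) for f in frac])
> print(f"dipole window: {lo:.2f} -> {hi:.2f} (wraps around the cell boundary)")
> # the window wraps, so "inside" means above lo or below hi
> print("atoms inside window:", [round(f, 3) for f in frac if f >= lo or f < hi])
>
> Three of the six atoms fall inside the window, so the sawtooth sits on top of
> the slab; it should be moved into the empty region (roughly fractional z
> between 0.35 and 0.9 here), or the slab shifted to the middle of the cell.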
>
> Kind regards
>
> Thomas
>
> On 27.04.2018 16:46, Sohail Ahmad wrote:
>
> Dear QE users and experts
>
> I am trying to relax a bilayer of PdSe2 after introducing an electric
> field, but every time I get the following error. I am using QE 6.0. It
> runs perfectly on another machine.
> %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
> Error in routine ggen 1 (226332):
> too many g-vectors
> %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
>
> Input/output/batchfile are attached
> --------------------------------------------------------------------------------
> &control
> calculation = 'relax',
> restart_mode = 'from_scratch',
> pseudo_dir= '/home/sohail/scratch/pseudo',
> outdir='./OUT',
> prefix='PdSe2bAAel004',
> tefield = .true.,
> dipfield = .true.,
> etot_conv_thr = 1.0d-5,
> forc_conv_thr = 1.0d-4,
> /
> &system
> ibrav = 4, a = 3.82, b = 3.82, c = 20.0, cosAB = -0.5, cosAC = 0.0, cosBC = 0.0,
> nat = 6, ntyp = 2,
> ecutwfc = 80,
> ecutrho = 400,
> nbnd = 40,
> occupations = 'smearing', smearing = 'gaussian', degauss = 0.001,
> edir = 3,
> eamp = 0.004,
> emaxpos = 0.9,
> eopreg = 0.2,
> /
> &electrons
> mixing_beta = 0.3,
> conv_thr = 1.0d-9,
> electron_maxstep = 900,
> /
> &ions
> ion_dynamics = 'bfgs',
> /
> ATOMIC_SPECIES
> Pd 106.42 Pd.pbe-mt_fhi.UPF
> Se 78.96 Se.pbe-mt_fhi.UPF
> ATOMIC_POSITIONS angstrom
> Pd 0.000000000 0.000000000 0.000000000
> Se -1.908913627 1.102119324 1.282624049
> Se 1.908913627 -1.102119324 -1.282624049
> Pd 0.000000000 0.000000000 5.000000000
> Se -1.908913627 1.102119324 6.282624049
> Se 1.908913627 -1.102119324 3.717375951
> K_POINTS AUTOMATIC
> 16 16 4 0 0 0
> --------------------------------------------------------------------------------
> #!/bin/bash
> #SBATCH --nodes=4
> #SBATCH --tasks-per-node=16
> #SBATCH --time=24:00:00
> #SBATCH --job-name=test
> #SBATCH --mem-per-cpu=10000
>
> echo "Nodes I am on:"
> echo $SLURM_JOB_NODELIST
> echo "Current working directory is `pwd`"
> echo "Running on `hostname`"
>
> module load quantumespresso/6.0
>
> echo "Starting run at: `date`"
> srun pw.x < PdSe2bAAel004.rx.in > PdSe2bAAel004.rx.out
>
> echo "Finished run at: `date`"
>
> --------------------------------------------------------------------------------
>
> Program PWSCF v.6.0 (svn rev. 13079) starts on 27Apr2018 at 7:15:21
>
> This program is part of the open-source Quantum ESPRESSO suite
> for quantum simulation of materials; please cite
> "P. Giannozzi et al., J. Phys.:Condens. Matter 21 395502 (2009);
> URL http://www.quantum-espresso.org",
> in publications or presentations arising from this work. More details
> at
> http://www.quantum-espresso.org/quote
>
> Parallel version (MPI), running on 64 processors
> R & G space division: proc/nbgrp/npool/nimage = 64
> Waiting for input...
> Reading input from standard input
>
> Current dimensions of program PWSCF are:
> Max number of different atomic species (ntypx) = 10
> Max number of k-points (npk) = 40000
> Max angular momentum in pseudopotentials (lmaxx) = 3
> Presently no symmetry can be used with electric field
>
> file Pd.pbe-mt_fhi.UPF: wavefunction(s) 4f renormalized
> file Se.pbe-mt_fhi.UPF: wavefunction(s) 4f renormalized
>
> Subspace diagonalization in iterative solution of the eigenvalue problem:
> one sub-group per band group will be used
> scalapack distributed-memory algorithm (size of sub-group: 5*5 procs)
>
> Message from routine setup:
> no reason to have ecutrho>4*ecutwfc
>
> Parallelization info
> --------------------
> sticks:    dense  smooth      PW    G-vecs:    dense   smooth      PW
> Min           22      18       5                3524     2574     417
> Max           23      19       6                3555     2586     438
> Sum         1417    1159     349              226331   165041   27287
> [...]
> number of k points= 516 gaussian smearing, width (Ry)= 0.0010
>
> Number of k-points >= 100: set verbosity='high' to print them.
>
> Dense grid: 226331 G-vectors FFT dimensions: ( 45, 45, 243)
>
> Smooth grid: 165041 G-vectors FFT dimensions: ( 45, 45, 216)
>
> Estimated max dynamical RAM per process > 104.83Mb
>
> Estimated total allocated dynamical RAM > 6709.07Mb
>
> %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
> Error in routine ggen 1 (226332):
> too many g-vectors
> %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
>
> stopping ...
>
> ---------------------------------------------------------------------
>
> *Sohail Ahmad*
> Department of Physics
> King Khalid University
> Abha, Saudi Arabia
> --------------------------------------------------------------------
>
>
>
> --
> Dr. rer. nat. Thomas Brumme
> Wilhelm-Ostwald-Institute for Physical and Theoretical Chemistry
> Leipzig University
> Phillipp-Rosenthal-Strasse 31
> 04103 Leipzig
>
> Tel: +49 (0)341 97 36456
>
> email: thomas.brumme at uni-leipzig.de
>
>
> _______________________________________________
> users mailing list
> users at lists.quantum-espresso.org
> https://lists.quantum-espresso.org/mailman/listinfo/users
>
--
Paolo Giannozzi, Dip. Scienze Matematiche Informatiche e Fisiche,
Univ. Udine, via delle Scienze 208, 33100 Udine, Italy
Phone +39-0432-558216, fax +39-0432-558222