[Pw_forum] GIPAW memory problem?
Aleksander Jaworski
aleksander.jaworski at mmk.su.se
Wed May 21 18:27:29 CEST 2014
Dear Davide,
Using the SVN versions of QE and GIPAW, my glass NMR inputs finally pass the
initialization step, but then I get an error:
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
Error in routine cdiaghg (118):
S matrix not positive definite
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
I have checked with the '-ndiag 1' flag and the result is the same.
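For reference, a typical invocation forcing serial subspace diagonalization looks like this (process count and input/output file names are hypothetical):

```shell
# '-ndiag 1' restricts the subspace diagonalization to a single process;
# replace the file names and process count with your own.
mpirun -np 8 gipaw.x -ndiag 1 -in glass_nmr.in > glass_nmr.out
```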
Do you have any ideas on how to proceed further?
It seems that I'm really unlucky with my system and/or machine :) :)
greetings,
Aleksander
Full NMR output below:
Program QE v.5.1rc2 starts on 20May2014 at 7: 3:26
This program is part of the open-source Quantum ESPRESSO suite
for quantum simulation of materials; please cite
"P. Giannozzi et al., J. Phys.:Condens. Matter 21 395502 (2009);
URL http://www.quantum-espresso.org",
in publications or presentations arising from this work. More details
at
http://www.quantum-espresso.org/quote
Parallel version (MPI), running on 1 processors
***** This is GIPAW svn revision unknown *****
Parallelizing q-star over 1 images
Message from routine gipaw_readin:
*** isolve is obsolete, use diagonalization instead ***
Message from routine gipaw_readin:
*** iverbosity is obsolete, use verbosity instead ***
Info: using nr1, nr2, nr3 values from input
Info: using nr1s, nr2s, nr3s values from input
IMPORTANT: XC functional enforced from input :
Exchange-correlation = SLA PW PBX PBC ( 1 4 3 4 0)
Any further DFT definition will be discarded
Please, verify this is what you really want
file Sc.pbe-spn-kjpaw_psl.0.2.3.UPF: wavefunction(s) 3P 3D
renormalized
WARNING: atomic wfc # 3 for atom type 2 has zero norm
WARNING: atomic wfc # 4 for atom type 2 has zero norm
WARNING: atomic wfc # 3 for atom type 3 has zero norm
WARNING: atomic wfc # 4 for atom type 3 has zero norm
G-vector sticks info
--------------------
sticks: dense smooth PW G-vecs: dense smooth PW
Sum 8109 3249 885 550217 138985 20341
Subspace diagonalization in iterative solution of the eigenvalue
problem:
a serial algorithm will be used
GIPAW projectors -----------------------------------------------
atom= Sc l=0 rc= 1.3000 rs= 0.8667
atom= Sc l=0 rc= 1.3000 rs= 0.8667
atom= Sc l=1 rc= 1.5000 rs= 1.0000
atom= Sc l=1 rc= 1.8000 rs= 1.2000
atom= Sc l=2 rc= 1.6000 rs= 1.0667
atom= Sc l=2 rc= 1.6000 rs= 1.0667
projs nearly linearly dependent: l=2 n1,n2= 1, 2 s= 0.99858295
atom= Al l=0 rc= 2.0000 rs= 1.3333
atom= Al l=1 rc= 2.0000 rs= 1.3333
atom= Al l=0 rc= 2.0000 rs= 1.3333
atom= Al l=1 rc= 2.0000 rs= 1.3333
atom= Al l=2 rc= 2.0000 rs= 1.3333
projs nearly linearly dependent: l=0 n1,n2= 1, 2 s= 0.99761080
projs nearly linearly dependent: l=1 n1,n2= 1, 2 s= 0.99948100
atom= Si l=0 rc= 2.0000 rs= 1.3333
atom= Si l=1 rc= 2.0000 rs= 1.3333
atom= Si l=0 rc= 2.0000 rs= 1.3333
atom= Si l=1 rc= 2.0000 rs= 1.3333
atom= Si l=2 rc= 2.0000 rs= 1.3333
projs nearly linearly dependent: l=0 n1,n2= 1, 2 s= 0.99373142
projs nearly linearly dependent: l=1 n1,n2= 1, 2 s= 0.99879796
atom= O l=0 rc= 1.3000 rs= 0.8667
atom= O l=0 rc= 1.3000 rs= 0.8667
atom= O l=1 rc= 1.3000 rs= 0.8667
atom= O l=1 rc= 1.3000 rs= 0.8667
projs nearly linearly dependent: l=1 n1,n2= 1, 2 s= 0.99505596
-----------------------------------------------------------------
alpha_pv= 96.5280 eV
GIPAW job: nmr
NMR macroscopic correction: yes
0.6667 0.0000 0.0000
0.0000 0.6667 0.0000
0.0000 0.0000 0.6667
Largest allocated arrays est. size (Mb) dimensions
KS wavefunctions at k 26.58 Mb ( 17422, 100)
KS wavefunctions at k+q 26.58 Mb ( 17422, 100)
First-order wavefunctions 265.84 Mb ( 17422, 100, 10)
First-order wavefunct (US) 159.50 Mb ( 17422, 100, 6)
Charge/spin density 2.85 Mb ( 373248, 1)
Induced current 25.63 Mb ( 373248, 3,3,1)
Induced magnetic field 25.63 Mb ( 373248, 3,3,1)
NL pseudopotentials 71.78 Mb ( 17422, 270)
GIPAW NL terms 95.70 Mb ( 17422, 360)
Computing the magnetic susceptibility isolve=1 ethr=
0.1000E-13
k-point # 1 of 36 pool # 1 cpu time: 39.1
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
Error in routine cdiaghg (118):
S matrix not positive definite
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
stopping ...
On Wed, 14 May 2014 13:55:57 +0200, Davide Ceresoli
<davide.ceresoli at istm.cnr.it> wrote:
> Dear Aleksander,
> I've updated 'configure'. Do 'svn update' in the qe-gipaw directory.
> I've tested with Intel XE2011, Intel XE2013 and gfortran-4.8, and with
> both openmpi-1.6.x and mpich-3.x.
>
> Let us know!
>
> Best,
> Davide
>
>
>
> On 05/13/2014 08:41 PM, Aleksander Jaworski wrote:
>> Thank you very much Davide for your reply and suggestions.
>>
>> I'm using gfortran and mpif90 compilers.
>>
>> I have tried to run gipaw.x with '-ndiag 1', but it is not changing
>> anything.
>>
>> I have compiled the SVN version of QE and then, using
>>
>> 'svn checkout svn://cvs.qe-forge.org/scmrepos/svn/qe-gipaw/trunk
>> qe-gipaw'
>>
>> downloaded the SVN version of GIPAW. But when I try to compile it, an
>> error occurs:
>>
>>
>> checking quantum-Espresso version... 5.1rc2
>> configure: error: Cannot compile against this version of
>> quantum-Espresso
>>
>> How could I work around that?
>>
>> To be honest, I'm not fully familiar with what being I/O bound means,
>> and I don't know how to identify it as a bottleneck.
>>
>> Thanks for checking my inputs. The fact that they run on your machine
>> means that something is fishy with my installation.
>>
>> best,
>> Aleksander
>>
>>
>>
>> On Tue, 13 May 2014 14:01:43 +0200, Davide Ceresoli
>> <davide.ceresoli at istm.cnr.it> wrote:
>>> Dear Aleksander,
>>> the GIPAW behavior you reported is clearly odd. The initialization
>>> phase should last a few seconds and should not consume much memory.
>>>
>>> I've just tested your input files and they work on my machine.
>>> In your case, GIPAW appears to be stuck while reading the charge
>>> density or in check_para_diag routine. Could you have an I/O problem?
>>> Which compiler/MPI are you using?
>>>
>>> Could you try gipaw with '-ndiag 1'? or, could you try the SVN
>>> version (both QE and GIPAW)?
>>>
>>> Let me know if nothing works.
>>>
>>> Best,
>>> Davide
>>>
>>>
>>>
>>>
>>> On 05/12/2014 05:18 PM, Aleksander Jaworski wrote:
>>>> Dear QE users and developers,
>>>>
>>>>
>>>> I'm experiencing issues with running gipaw.x on glass structures
>>>> which have been created from classical MD simulation trajectories.
>>>> On the glass, pw.x runs smoothly and converges properly. gipaw.x
>>>> starts, but shows very strange behavior in terms of memory handling
>>>> and never terminates the job.
>>> _______________________________________________
>>> Pw_forum mailing list
>>> Pw_forum at pwscf.org
>>> http://pwscf.org/mailman/listinfo/pw_forum
>>