[Pw_forum] Cholesky error in higher number of bands

Sitangshu Bhattacharya sitangshu at iiita.ac.in
Mon Apr 17 05:59:05 CEST 2017


Dear Kanak,

Sometimes this error is also generated by the parallel setup itself. Can you
try switching off parallelism and re-running your QE calculation? If the
error disappears, then the parallelization has probably not been set up
correctly. Are you using Open MPI?
Some time back this error showed up in my case as well; it went away after a
proper installation of Open MPI.
Hope this works now!
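As a concrete sketch of the two suggestions in this thread (running without
parallelism, and the earlier switch to "cg"), the relevant &ELECTRONS
fragment of a pw.x input would look something like this; the convergence
threshold is only an illustrative value:

```
&ELECTRONS
   diagonalization = 'cg'    ! default is 'david'; 'cg' is slower but often more robust
   conv_thr        = 1.0d-8  ! illustrative value only, adjust to your system
/
```

To rule out an MPI problem, first run pw.x serially on the same input (the
filename here is hypothetical): pw.x -in scf.in > scf.out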

With regards,

On Sun, Apr 16, 2017 at 6:16 PM, Kanak Datta <kanak at umich.edu> wrote:

> Dear Dr. Bhattacharya
>
> I have tried cg diagonalization already. It did not work.
>
> Thanks
> Kanak
>
> _______________________________________
> Kanak Datta
> Graduate Student
> Electrical  Engineering and Computer Science
> University of Michigan, Ann Arbor
>
> On Sun, Apr 16, 2017 at 8:24 AM, Sitangshu Bhattacharya <
> sitangshu at iiita.ac.in> wrote:
>
>> Dear Kanak,
>>
>> This is a diagonalization error. My experience is this: you may switch
>> from "david" to "cg" in your relax/scf/nscf QE input file and try again;
>> "cg" is slower but more stable and robust. You may also change the
>> pseudopotential if this doesn't work, but that should hardly be required.
>>
>> Regards,
>> Sitangshu
>>
>> On Sun, Apr 16, 2017 at 5:06 PM, Kanak Datta <kanak at umich.edu> wrote:
>>
>>>
>>> Dear researchers
>>>
>>> I want to perform a GW calculation using the BerkeleyGW package. To
>>> begin with, however, I need a bands calculation with a large number of
>>> bands using pw.x in QE. I am currently working on an hBN monolayer.
>>> However, when I use more than 1000 bands in the bands calculation I get
>>> the following error:
>>>
>>>      Error in routine  cdiaghg (983):
>>>       problems computing cholesky
>>>
>>> I have seen that the same error occurs for other materials when the
>>> number of bands is very large. Has anyone faced this error before? I
>>> would be very grateful if someone could suggest a way around this problem.
>>>
>>> Thanks in advance
>>>
>>> Kanak Datta
>>> University of Michigan, Ann Arbor
>>>
>>>
>>> _______________________________________________
>>> Pw_forum mailing list
>>> Pw_forum at pwscf.org
>>> http://pwscf.org/mailman/listinfo/pw_forum
>>>
>>
>



-- 
**********************************************
Sitangshu Bhattacharya (সিতাংশু ভট্টাচার্য), Ph.D
Assistant Professor,
Room No. 2221, CC-1,
Nanoscale Electro-Thermal Laboratory,
Department of Electrical and Communication Engineering,
Indian Institute of Information Technology-Allahabad
Uttar Pradesh 211 012
India
Telephone: 91-532-2922000 Extn.: 2131
Web-page: http://profile.iiita.ac.in/sitangshu/
Institute: http://www.iiita.ac.in/