[Pw_forum] Speeding up the calculations

Mehmet Topsakal metokal at gmail.com
Mon Aug 16 12:33:09 CEST 2010


I need to clarify this situation after Huiqun's response:

"Siesta+PHON: 520 sec at 8CPU" was for two displacement calculations in total.
If you have 16 cpu, you can
run 2 displacement calculation at the same time. This reduces 520 to 260.

After this post, you and I learned that it is also possible to reduce the 328
seconds (PW+PH) further with q-point splitting; a sketch of such a split input
is given below.

You can examine my files at http://unam.bilkent.edu.tr/mt2/random/pw_forum/
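
As a reference, here is a minimal sketch of how such a split ph.x input could
look (the prefix, file names and the 4x4x1 q-grid below are placeholders, not
the actual values used in my files):

  phonons of graphene, first piece of the q-point grid
   &inputph
      prefix   = 'graphene'       ! placeholder: must match the pw.x prefix
      outdir   = './tmp'
      fildyn   = 'graphene.dyn'
      amass(1) = 12.0107          ! carbon
      ldisp    = .true.
      nq1 = 4, nq2 = 4, nq3 = 1   ! placeholder q-grid
      start_q  = 1                ! first q-point of this piece
      last_q   = 2                ! last q-point of this piece
   /

Further inputs with start_q/last_q covering the remaining irreducible q-points
of the grid can run on other CPUs at the same time (in practice each piece in
its own scratch directory); afterwards the .dyn files are collected and
q2r.x/matdyn.x are run as usual.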


On Mon, Aug 16, 2010 at 11:45 AM, Huiqun Zhou <hqzhou at nju.edu.cn> wrote:

>  Mehmet,
>
> I hope you are the same person who posted a very good tutorial on phonon
> calculations via the small displacement method (SDM) and DFPT at the Siesta
> forum.
>
> Interestingly, I noticed that in your tutorial DFPT (PW+PH: 328 sec on 8 CPUs)
> was much faster than SDM (Siesta+PHON: 520 sec on 8 CPUs; VASP+PHON: 1674 sec
> on 16 CPUs) for the case of graphene, at least by the criterion of obtaining
> identical phonon dispersion relations. So your current recommendation means
> that was just an individual case for you, right?
>
> My feeling (no concrete numbers) with QE is that for previous versions of QE
> (4.0 or earlier?) PW+PH was somewhat slower than PW+PHON, but the situation is
> reversed with recent versions of QE.
>
> If you have further examples, please share with us.
>
> Thank you in advance!
>
> huiqun zhou
> @earth sciences, nanjing university, china
>
>
> ----- Original Message -----
> *From:* Mehmet Topsakal <metokal at gmail.com>
> *To:* PWSCF Forum <pw_forum at pwscf.org>
> *Sent:* Sunday, August 15, 2010 3:52 PM
> *Subject:* Re: [Pw_forum] Speeding up the calculations
>
> Dear Bipul,
>
> You can try the Small Displacement Method (see
> http://www.homepages.ucl.ac.uk/~ucfbdxa/ ), which is based on a different
> idea than DFPT.
> It requires you to make two displacements in a supercell and to calculate the
> forces with a single scf run for each; these scf runs can be executed at the
> same time. For graphene, for example, you need to calculate the forces in a
> 5x5x1 supercell containing 50 carbon atoms for the two displacements, as
> sketched below.
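>
> A rough sketch (the input and output file names here are hypothetical) of
> running the two force calculations side by side on 16 CPUs:
>
>   # pw.x scf runs for the two displaced supercells, 8 CPUs each
>   # (tprnfor=.true. must be set in &control so that forces are written)
>   mpirun -np 8 pw.x < disp.1.scf.in > disp.1.scf.out &
>   mpirun -np 8 pw.x < disp.2.scf.in > disp.2.scf.out &
>   wait
>   # the forces from the two outputs are then extracted and passed to PHON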
>
> BUT you cannot observe the LO-TO splitting in polar materials (such as a
> honeycomb ZnO monolayer) when using SDM.
>
> Regards.
>
> On Sat, Aug 14, 2010 at 9:44 PM, Bipul Rakshit <bipulrr at gmail.com> wrote:
>
>> Hi PWSCF user,
>> I am mainly doing phonon calculations, which are generally time consuming
>> (even in parallel). I just want to know if there is any possible way to
>> speed up the calculations, e.g. by installing certain libraries or by using
>> some appropriate methods.
>>
>> The machine has many nodes, and each node has 8/16 processors. Some other
>> details of the machine are as follows:
>>
>>  Intel(R) Xeon(R) CPU  E5345  @ 2.33GHz, 64 bit
>> 4GB RAM
>>
>> compilers: mpif90 and icc
>>
>> Thanks
>> --
>> Bipul Rakshit
>> Research Fellow
>> S N Bose Centre for Basic Sciences,
>> Salt Lake,
>> Kolkata 700 098
>> India
>>
>
>
> --
>
> Mehmet Topsakal  (Ph.D. Student)
> UNAM-Institute of Materials Science and Nanotechnology.
> Bilkent University. 06800 Bilkent, Ankara/Türkiye
> Tel: 0090 312 290 3527 ; Fax: 0090 312 266 4365
> UNAM-web  : www.nano.org.tr
>


-- 

Mehmet Topsakal  (Ph.D. Student)
UNAM-Institute of Materials Science and Nanotechnology.
Bilkent University. 06800 Bilkent, Ankara/Türkiye
Tel: 0090 312 290 3527 ; Fax: 0090 312 266 4365
UNAM-web  : www.nano.org.tr