[QE-users] QE Parallelization for Phonon Dispersion calculations

Kiran Yadav kiranyadav0816 at gmail.com
Tue Nov 24 10:27:20 CET 2020


Dear Lorenzo,
Is there a parallelization scheme that would let me set up images so that the
20 dyn files are distributed unequally over images with equal numbers of
processors, or distributed equally over images with unequal numbers of
processors?

In my case, as far as I can tell, dyn1-dyn10 do not take much time; most of
the time is spent generating dyn11-dyn20. So if I could run dyn1-dyn10 on one
image and distribute the remaining dyn11-dyn20 over 3 images, the CPU walltime
could be reduced. Is it possible to do something like that?
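
For concreteness, the kind of split I have in mind would look roughly like the
two runs sketched below, using the start_q / last_q variables of ph.x. This is
untested, and the prefix, outdir names and processor counts are only
placeholders; I am also assuming each chunk needs its own copy of the scf
outdir so the scratch files do not clash.

# Chunk 1: the "cheap" q-points 1-10 on a smaller allocation
cat > ph_q01-10.in << EOF
phonons, q-points 1 to 10
&inputph
   prefix  = 'myprefix'
   outdir  = './tmp_q01-10'
   fildyn  = 'dyn'
   ldisp   = .true.
   nq1 = 6, nq2 = 6, nq3 = 6
   start_q = 1
   last_q  = 10
/
EOF
mpirun -np 64 ph.x -nk 4 -input ph_q01-10.in > ph_q01-10.out

# Chunk 2: the expensive q-points 11-20 on a larger allocation,
# further split into 3 images
cat > ph_q11-20.in << EOF
phonons, q-points 11 to 20
&inputph
   prefix  = 'myprefix'
   outdir  = './tmp_q11-20'
   fildyn  = 'dyn'
   ldisp   = .true.
   nq1 = 6, nq2 = 6, nq3 = 6
   start_q = 11
   last_q  = 20
/
EOF
mpirun -np 192 ph.x -ni 3 -nk 4 -input ph_q11-20.in > ph_q11-20.out

Once both chunks finish, I would collect all the dyn files in one directory
before running q2r.x.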

I tried parallelization on 256 processors (#PBS -l select=16:ncpus=16) for
10 hours with the following command:
time -p mpirun -np $PBS_NTASKS ph.x -ni 4 -nk 4 -nt 4 -nd 16 -input ph.in >
ph.out
In this case the 20 dyn files were distributed over 4 images, i.e. 5 dyn files
per image, with 256/4 = 64 processors per image.
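
For reference, this is how I understand the decomposition implied by those
flags (please correct me if I have misread the manual):

# 256 MPI processes in total:
#   -ni 4   -> 4 images,         256 / 4 = 64 processes per image
#   -nk 4   -> 4 k-point pools,   64 / 4 = 16 processes per pool
#   -nt 4   -> 4 FFT task groups within each pool
#   -nd 16  -> 16 processes (a 4x4 grid) for parallel subspace diagonalization
time -p mpirun -np $PBS_NTASKS ph.x -ni 4 -nk 4 -nt 4 -nd 16 -input ph.in > ph.out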

Thanks & Regards,
Kiran Yadav
Research Scholar
Electronic Materials Laboratory (TX-200G)
Dept. of Materials Science & Engineering
Indian Institute of Technology, Delhi




On Tue, Nov 24, 2020 at 1:19 PM Lorenzo Paulatto <paulatz at gmail.com> wrote:

>
> > I have been trying to calculate the phonon dispersion with a 6*6*6 nq grid.
> > It generates dyn0 plus 20 other dynamical matrix files, but the time taken
> > to complete each dynamical matrix file is different. I ran these phonon
> > dispersion calculations using parallelization, but couldn't optimize it
> > correctly.
>
> This is normal: different q-points have different symmetries, so the
> code has to use different numbers of k-points.
>
>
> regards
>
>