[Pw_forum] parallel scaling in PWSCF

Andrea Ferretti ferretti.andrea at unimore.it
Wed Apr 26 22:28:21 CEST 2006


Hi everybody,

I am currently running a copper surface with 140 Cu atoms + a molecule.
The system has 1642 electrons and, because it is metallic, the calculation
is performed with 985 bands and only a few k-points (4).
Since each Cu atom contributes 11 electrons, I have a huge number of bands
in a (relatively) small cell, and therefore a (relatively) low number of PWs
with respect to nbnd.
Looking at the dimension of wfc, memory should not be a problem in
principle, even if, due to the unusual dimensions of the system, the
non-scalable memory (i.e. the part replicated on every process) is quite
large, around 1 GB.
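
To put the wfc-size remark above in numbers, here is a rough
back-of-the-envelope estimate; the number of plane waves per k-point is a
purely assumed value, since neither the cutoff nor the cell size is given
here.

    # Rough estimate of the wavefunction (wfc) array size for the system
    # described above.  nbnd and the k-point count come from the post; npw is
    # an assumed (hypothetical) number of plane waves per k-point.
    nbnd = 985            # number of bands
    nks = 4               # number of k-points
    npw = 120_000         # assumed plane waves per k-point
    complex_bytes = 16    # double-precision complex coefficient

    per_kpoint = npw * nbnd * complex_bytes
    total = per_kpoint * nks
    print(f"wfc per k-point: {per_kpoint / 2**30:.2f} GiB")
    print(f"all k-points:    {total / 2**30:.2f} GiB")
    print(f"per process on 64 procs: {total / 64 / 2**20:.0f} MiB")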

On an IBM SP5 machine I observed a severe limit in the scaling when going
from 32 to 64 procs, with both espresso 2.1.x and espresso 3.0
(anyway, I did succeed in performing a "relax" calculation for the system!).

As far as I know, this problem might be connected to a serial part of the
diagonalization, which has been parallelized in the current CVS version
(as already pointed out by Axel).
I am now testing the CVS version on my system and will let you know the
results as soon as possible.
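
A toy Amdahl's-law estimate shows why a serial step such as that
diagonalization can stall scaling right around 32-64 processes; the 3%
serial fraction below is an arbitrary assumption, not a measured value for
pw.x.

    # Toy Amdahl's-law illustration: a fixed serial fraction of the work caps
    # the parallel speedup.  The 3% value is an assumption, not a measurement.
    def speedup(p, serial_fraction=0.03):
        return 1.0 / (serial_fraction + (1.0 - serial_fraction) / p)

    for p in (16, 32, 64, 128):
        print(f"{p:4d} procs -> speedup {speedup(p):5.1f}")
    # with a 3% serial part, going from 32 to 64 procs gains only ~35%
    # instead of 2x, and 64 -> 128 gains even less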

cheers
andrea  


-- 
Andrea Ferretti
Dipartimento di Fisica, Universita' di Modena e Reggio Emilia
Natl. Res. Center S3 INFM-CNR  ( http://s3.infm.it )
Via Campi 213/A I-41100 Modena, Italy
Tel:     +39 059 2055283
Fax:     +39 059 374794
E-mail:  ferretti.andrea at unimore.it
URL:     http://www.nanoscience.unimo.it

Please, if possible, don't  send me MS Word or PowerPoint attachments
Why? See:  http://www.gnu.org/philosophy/no-word-attachments.html


On Wed, 26 Apr 2006, Fernando A Reboredo wrote:

> Nichols,
> The largest things I have run with pwscf are cobalt clusters with adsorbates
> on the surface: 55 Co atoms, 12 CO molecules, and 12 au of vacuum around
> them, at 45 Ry ecut, ~560 electrons, spin-polarized, with PBE. Those
> calculations ran on 48 nodes / 90 processors. I did not study the scaling
> carefully, but it was reasonably good provided the number of processors was
> a divisor of the FFT grid dimension.
>  Fernando Reboredo
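
Fernando's remark about the processor count dividing the FFT grid refers to
the plane-by-plane (slab) distribution of the real-space FFT grid in pw.x:
each process gets a contiguous chunk of the nr3 planes, so an uneven split
leaves some processes idle part of the time. A minimal sketch with an
assumed nr3 is below.

    # Minimal sketch of slab-wise FFT-grid distribution: each process takes a
    # contiguous block of the nr3 planes along the third grid dimension.
    # nr3 = 180 is an assumed value, not taken from the posts above.
    nr3 = 180
    for nproc in (45, 60, 64, 128):
        base, rest = divmod(nr3, nproc)
        planes = [base + (1 if i < rest else 0) for i in range(nproc)]
        # the busiest process sets the pace for the whole FFT
        imbalance = max(planes) / (nr3 / nproc)
        print(f"{nproc:3d} procs: {min(planes)}-{max(planes)} planes each, "
              f"imbalance ~{imbalance:.2f}x")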
> 
> ----- Original Message ----- 
> From: "Axel Kohlmeyer" <akohlmey at cmm.upenn.edu>
> To: <pw_forum at pwscf.org>
> Sent: Wednesday, April 26, 2006 1:38 PM
> Subject: Re: [Pw_forum] parallel scaling in PWSCF
> 
> 
> On 4/26/06, Nichols A. Romero <naromero at gmail.com> wrote:
> > Hi,
> 
> hi.
> 
> > Has anyone performed any very-large scale DFT calculations on PWSCF
> > using over 64 processors and over 1000 atoms? Does anyone know what
> > its current limits (system sizes and processors) are on parallel
> > computing environments with fast interconnects?
> 
> I've recently run a comparatively large job (272 atoms, 560 electrons,
> 4x4x4 k-points) across 768 processors on a Cray XT3. I had to hack some
> constants to make it work. However, the scaling is not (yet) so good, and
> depending on what kind of atoms you want to put into your system (i.e.
> whether or not it is metallic), you may be better off with gamma-point-only
> sampling and Car-Parrinello dynamics. I ran the same system as above with a
> Car-Parrinello code (one that is not in Quantum ESPRESSO), and it would
> scale out at 512 nodes for gamma point only on an IBM BG/L. Even though I
> had to use a much smaller time step, I got much more trajectory that way.
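
For reference, pw.x can spread k-points over pools of processes, with the
plane-wave and FFT data distributed within each pool; a toy sketch of how a
768-process job like the one above might be split is below. The npool values
are assumptions, and symmetry reduction of the 4x4x4 mesh is ignored.

    # Toy illustration of pw.x-style two-level parallelism: k-points go to
    # "pools" of processes, plane waves / FFT planes are split within a pool.
    # Everything except the 768-process total and the 4x4x4 mesh is assumed.
    total_procs = 768
    n_kpoints = 4 * 4 * 4            # full mesh, symmetry reduction ignored
    for npool in (8, 16, 32, 64):
        procs_per_pool = total_procs // npool
        kpts_per_pool = -(-n_kpoints // npool)   # ceiling division
        print(f"npool={npool:3d}: {procs_per_pool:3d} procs/pool, "
              f"{kpts_per_pool:2d} k-points/pool")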
> 
> Judging from the CVS commit messages, efforts to optimize the Quantum
> ESPRESSO codes for that kind of machine with large numbers of nodes are
> underway. Note that for systems as large as that you might run into the
> scaling limitations of plane-wave DFT itself, so for a system that big one
> of the 'linear scaling' DFT codes could be a more promising alternative.
> 
> best regards,
>      axel.
> 
> 
> >
> > Thanks,
> > --
> > Nichols A. Romero, Ph.D.
> > 1613 Denise Dr. Apt. D
> > Forest Hill, MD 21050
> > 443-567-8328 (C)
> > 410-306-0709 (O)
> > _______________________________________________
> > Pw_forum mailing list
> > Pw_forum at pwscf.org
> > http://www.democritos.it/mailman/listinfo/pw_forum
> >
> >
> 
> 
> --
> =======================================================================
> Axel Kohlmeyer   akohlmey at cmm.chem.upenn.edu   http://www.cmm.upenn.edu
>   Center for Molecular Modeling   --   University of Pennsylvania
> Department of Chemistry, 231 S.34th Street, Philadelphia, PA 19104-6323
> tel: 1-215-898-1582,  fax: 1-215-573-6233,  office-tel: 1-215-898-5425
> =======================================================================
> If you make something idiot-proof, the universe creates a better idiot.
> _______________________________________________
> Pw_forum mailing list
> Pw_forum at pwscf.org
> http://www.democritos.it/mailman/listinfo/pw_forum 
> 


