[Pw_forum] phonon calculation for large system

Gabriele Sclauzero sclauzer at sissa.it
Thu Apr 30 12:47:09 CEST 2009


Huiqun Zhou wrote:
> Gabriele,
> 
> I just downloaded the CVS version two days ago, but found no INPUT_PH
> in Doc. Could you please post one, or let me know how to generate it?

try:
cvs update -d doc-def

You should then find a directory doc-def inside your CVS tree, containing the 
template files for automatic generation of the documentation.
From inside doc-def, type
make
but this probably won't work unless you have all the needed Tcl libraries installed.
In any case, you can read the INPUT_PH.def file directly, searching for start_q, 
start_irr, etc.
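
For reference, a split phonon run using those keywords might look roughly like 
the following. This is only an illustrative sketch: the prefix, outdir and 
fildyn values are hypothetical, and you should check INPUT_PH.def for the 
exact syntax and defaults.

```
&inputph
   prefix    = 'mysystem'   ! hypothetical prefix, must match the scf run
   outdir    = './tmp/'     ! hypothetical scratch directory
   fildyn    = 'mysystem.dyn'
   ldisp     = .true.
   nq1 = 2, nq2 = 2, nq3 = 2
   start_q   = 1            ! first q point handled by this job
   last_q    = 1            ! last q point handled by this job
   start_irr = 1            ! first irreducible representation
   last_irr  = 3            ! last irreducible representation
/
```

Each job then computes only its assigned q points / irreducible 
representations, and the partial results have to be collected afterwards to 
build the full dynamical matrices.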

GS

> 
> Huiqun Zhou
> @Earth Sciences, Nanjing University, China
> 
> 
> ----- Original Message ----- 
> From: "Gabriele Sclauzero" <sclauzer at sissa.it>
> To: "PWSCF Forum" <pw_forum at pwscf.org>
> Sent: Thursday, April 30, 2009 3:46 PM
> Subject: Re: [Pw_forum] phonon calculation for large system
> 
> 
>>
>> zq wu wrote:
>>> Dear pwscf users,
>>>
>>> I need to do a phonon calculation for a large system (several hundred atoms
>>> per cell). Since I can use lots of processors, I plan to submit many
>>> jobs, each of which will calculate one phonon mode. But I do not know
>>> whether it is possible to obtain the final dynamical matrix from these
>>> separate runs, and how to do it. Can anyone help me?
>>>
>>> Currently, phonon modes are calculated one by one, in sequence. Can we do
>>> some parallelization here? Can we divide the processors into several groups
>>> and let each group take care of one or more modes, in a way similar to
>>> k-point parallelization? I think the parallel efficiency of mode
>>> parallelization should be close to that of k-point parallelization, since
>>> the phonon mode calculations are independent of each other. Mode
>>> parallelization would be very useful for systems with large numbers of
>>> atoms. Does anyone have an idea how hard it would be to implement?
>> If I understood correctly, I think this kind of parallelization has already
>> been exploited in the latest CVS versions, in order to run ph.x on the GRID.
>> I don't know if all the related problems have been fixed at this stage
>> (have they, Paolo?), but you can try downloading the CVS version and using
>> the keywords start_q, last_q, start_irr, last_irr (see INPUT_PH) to split
>> your job into many independent jobs. Sorry, I don't know the details, but
>> at least now you're aware that this possibility exists (though not
>> extensively tested, I think, so use _with_ _care_ and help debugging,
>> please).
>>
>> HTH
>>
>> GS
>>
>>
>>> Thanks
>>>
>>> Zhongqing
>>>
>>> CACS  University of Southern California
>>>
>>>
>>>
>>> _______________________________________________
>>> Pw_forum mailing list
>>> Pw_forum at pwscf.org
>>> http://www.democritos.it/mailman/listinfo/pw_forum
>> -- 
>>
>>
>> o ------------------------------------------------ o
>> | Gabriele Sclauzero, PhD Student                  |
>> | c/o:   SISSA & CNR-INFM Democritos,              |
>> |        via Beirut 2-4, 34014 Trieste (Italy)     |
>> | email: sclauzer at sissa.it                         |
>> | phone: +39 040 3787 511                          |
>> | skype: gurlonotturno                             |
>> o ------------------------------------------------ o
>>
> 
> 

