[Pw_forum] How to run gamma-point calculations with high efficiency

vega lew vegalew at hotmail.com
Sat Jun 28 16:39:45 CEST 2008


Dear all,

Thanks to the help of Axel Kohlmeyer and Paolo Giannozzi, I can now run calculations with 30-40 k-points very efficiently.

I compiled Q-E on my cluster with version 10.1.015 of the Intel compilers, and it works correctly. The cluster now runs structural relaxations with 30-40 k-points very fast. However, on this cluster, which consists of 5 machines with one quad-core CPU each, I have to use 20 pools to reach the highest CPU usage (above 90% most of the time, although it fluctuates; the 'sar' command reports about 70% on average). Following Axel's advice, I set the environment variable OMP_NUM_THREADS=1, and the CPU usage is then the same on all 5 machines. The calculation finishes quickly. If I use 10 or 5 pools, the CPU usage does not get that high. Is this as it should be?
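For reference, this is roughly how I launch the k-point runs at the moment; the hostfile and input/output file names below are only placeholders for my setup, and I assume an OpenMPI-style mpirun:

    export OMP_NUM_THREADS=1                     # one OpenMP thread per MPI process
    mpirun -np 20 --hostfile hosts \
        pw.x -npool 20 -in relax.in > relax.out  # 20 MPI processes, one k-point pool each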

After testing the lattice optimizations, another question arises. I need to optimize a surface structure at the Gamma point only, because the system contains ~80 atoms (in my research area such systems are normally optimized at the Gamma point only). But with a Gamma-point-only calculation I cannot use many pools, so the CPU usage drops again, to about 20%. How can I run this calculation with high CPU usage? My current setup is sketched below.
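For completeness, this is how the Gamma-point job is set up at the moment (the file names are placeholders for my actual input). In the pw.x input I use the Gamma-only k-point card, so the Gamma-specific real-arithmetic algorithms are used:

    K_POINTS gamma

and I launch it without pools, so only the plane-wave (R- and G-space) parallelization is active:

    export OMP_NUM_THREADS=1
    mpirun -np 20 --hostfile hosts pw.x -in surface.in > surface.out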

Thank you for reading.

regards, 

Vega Lew
Ph.D. Candidate in Chemical Engineering
State Key Laboratory of Materials-oriented Chemical Engineering
College of Chemistry and Chemical Engineering
Nanjing University of Technology, 210009, Nanjing, Jiangsu, China

