[Pw_forum] pw.x running but nothing happens

Bipul Rakshit bipulrr at gmail.com
Sun Sep 6 09:03:29 CEST 2009


Dear Wangqj,
The same thing happens to me.
Since you are using a large number of wavefunctions, note that although
the output shows the job running on 8 processors, if the parallel
installation is not proper the calculation may in fact be running on
only 1 processor.
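
A quick way to check this while the job is running is to count the pw.x
processes that are actually active on the node. A minimal sketch in Python
(assuming pgrep is available and the executable is named pw.x):

    import subprocess

    # Count pw.x processes currently running on this node.
    # 8 busy processes is what you expect for an 8-way MPI run;
    # finding only 1 suggests the parallel launch is not working.
    out = subprocess.run(["pgrep", "-c", "pw.x"],
                         capture_output=True, text=True)
    print(out.stdout.strip() or "0", "pw.x processes found on this node")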

So you had better check the parallel installation with a small job: run
it with different numbers of processors and see whether the wall time
decreases as the number of processors increases.
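
For example, a minimal timing sketch in Python (assuming a standard mpirun
on your PATH and a small test input called small_scf.in; adjust the names
to whatever your cluster uses):

    import subprocess, time

    # Run the same small pw.x job with 1, 2, 4 and 8 MPI processes
    # and print the wall time of each run.
    for n in (1, 2, 4, 8):
        t0 = time.time()
        subprocess.run(f"mpirun -np {n} pw.x < small_scf.in > out.{n}proc",
                       shell=True, check=True)
        print(f"{n} processor(s): {time.time() - t0:.1f} s")
    # If the time does not drop as n grows, the parallel build or launch
    # is suspect.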

cheers

2009/9/6 wangqj1 <wangqj1 at 126.com>

>
> Dear pwscf users
> When I run vc-relax on the computing cluster, I use one node which has
> 8 CPUs. The output file is as follows:
>
> Program PWSCF     v.4.0.1  starts ...
>      Today is  6Sep2009 at  7:49:30
>      Parallel version (MPI)
>      Number of processors in use:       8
>      R & G space division:  proc/pool =    8
>      For Norm-Conserving or Ultrasoft (Vanderbilt) Pseudopotentials or PAW
> .....................................................................
>      Initial potential from superposition of free atoms
>      starting charge  435.99565, renormalised to  436.00000
>      Starting wfc are  254 atomic +    8 random wfc
>
> After one day it is still like this: no iteration has completed, and no
> error has turned up. There is no error in the input file, because I have
> tested it on another computer which has 4 CPUs, and there it runs well.
> I can't find the reason for this; any help will be appreciated.
> Best Regards
> Q.J.Wang
> XiangTan University
>
>
>
> _______________________________________________
> Pw_forum mailing list
> Pw_forum at pwscf.org
> http://www.democritos.it/mailman/listinfo/pw_forum
>
>


-- 
Dr. Bipul Rakshit
Research Associate,
S N Bose Centre for Basic Sciences,
Salt Lake,
Kolkata 700 098
India