[Pw_forum] The -np option and OpenMPI.

Andre Martinotto almartinotto at gmail.com
Sat Aug 28 18:19:09 CEST 2010


Dear Joaquin,

I suggest you also run some tests using the -ntg parallelization level (
http://www.quantum-espresso.org/user_guide/node18.html).

For example:

/opt/openmpi/bin/mpirun -np 64 /opt/espresso-4.1.2/bin/pw.x -npool 8 -ntg 8
< input.in
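
(If I remember correctly, -ntg splits the FFT work of each pool into task
groups, so the number of processors per pool, 8 in the example above,
should be divisible by the -ntg value.)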

I got good performance using the -ntg parallelization level.
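
If it helps, here is a minimal PBS sketch around that command (the job
name, node counts, walltime, and output file name are placeholders for
your system):

#!/bin/bash
#PBS -N pw-test
#PBS -l nodes=8:ppn=8
#PBS -l walltime=24:00:00

cd $PBS_O_WORKDIR
# 8 nodes x 8 cores = 64 MPI processes; -npool and -ntg are read by pw.x
/opt/openmpi/bin/mpirun -np 64 /opt/espresso-4.1.2/bin/pw.x \
    -npool 8 -ntg 8 < input.in > output.out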

Best regards,
Andre

On Sat, Aug 28, 2010 at 11:10 AM, joaquin peralta <jperaltac at gmail.com> wrote:

> Dear André
>
> I'm embarrassed; it was my mistake all along. I have scripts that
> generate these PBS files, and the option was always misplaced for
> OpenMPI: I was using -np instead of -npool.
>
> I will try again to check the performance.
>
> Thank you so much
>
> Joaquin
>
> On Sat, Aug 28, 2010 at 8:42 AM, Andre Martinotto
> <almartinotto at gmail.com> wrote:
> > Dear Joaquin,
> >
> > In principle, the compilation process appears to be correct.
> >
> > What is your execution command? Are you using -npool 8? The number of
> > processes (-np) is 64 in both cases, but in the second case I believe
> > that you are using -npool 8.
> >
> > For example, something like:
> > /opt/openmpi/bin/mpirun -np 64 /opt/espresso-4.1.2/bin/pw.x -npool 8 <
> > entrada.in
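> >
> > Note that -np is an option of mpirun that sets the total number of MPI
> > processes, while -npool is read by pw.x itself and divides those
> > processes into pools for the k-point parallelization. With -npool 8 the
> > output banner should report "K-points division: npool = 8", as in your
> > second run.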
> >
> >
> > Best regards,
> > André Luis Martinotto
> >
> > Andre Martinotto
> > Email: almartinotto at gmail.com
> > Computing Department
> > Universidade de Caxias do Sul
> > Caxias do Sul - RS, Brazil
> >
> > On Sat, Aug 28, 2010 at 3:05 AM, joaquin peralta <jperaltac at gmail.com>
> > wrote:
> >>
> >> Dear Forum
> >>
> >> I compiled OpenMPI 1.4 and Quantum ESPRESSO a couple of days ago;
> >> however, when I sent the job to the queue system, the nodes show that
> >> the command was executed with the "-np 8" option, but the output does not:
> >>
> >>      Parallel version (MPI), running on    64 processors
> >>      R & G space division:  proc/pool =   64
> >>
> >> and with OpenMPI it shows me a different status:
> >>
> >>   Parallel version (MPI), running on   64 processors
> >>   K-points division:     npool     =    8
> >>   R & G space division:  proc/pool =   8
> >>
> >> I'm a little confused; I don't understand what I did wrong in the
> >> compilation procedure of OpenMPI 1.4 or ESPRESSO.
> >>
> >> OpenMPI Settings
> >>
> >> ./configure --prefix=/local/openmpi --disable-dlopen
> >> --enable-mpirun-prefix-by-default --enable-static --enable-mpi-threads
> >> --with-valgrind --without-slurm --with-tm --without-xgrid
> >> --without-loadleveler --without-elan --without-gm --without-mx
> >> --with-udapl --without-psm CC=icc CXX=icpc F77=ifort FC=ifort
> >>
> >> And the Quantum ESPRESSO compilation settings are located here:
> >>
> >> http://www.lpmd.cl/jperalta/uploads/Site/make-ompi.sys
> >>
> >> I really appreciate any help, because the difference in calculation
> >> time when using -np is huge for my cases.
> >>
> >> Joaquin Peralta
> >> Materials Science and Engineering
> >> Iowa State University
> >>
> >> --
> >> ----------------------------------------------------
> >> Group of NanoMaterials
> >> ----------------------------------------------------
> >> http://www.gnm.cl
> >> ----------------------------------------------------
> >> Joaquín Andrés Peralta Camposano
> >> ----------------------------------------------------
> >> http://www.lpmd.cl/jperalta
> >>
> >> In a world without frontiers,
> >> who needs Gates and Win.
> >
> >
> >
> >
>
>
>
> --
> ----------------------------------------------------
> Group of NanoMaterials
> ----------------------------------------------------
> http://www.gnm.cl
> ----------------------------------------------------
> Joaquín Andrés Peralta Camposano
> ----------------------------------------------------
> http://www.lpmd.cl/jperalta
>
> In a world without frontiers,
> who needs Gates and Win.
>

