[QE-users] subspace diagonalization in QE-6.4.1
Manoel VF Barrionuevo
manoelvfb at gmail.com
Fri Nov 22 01:25:41 CET 2019
Dear community.
I'm a newbie with Quantum ESPRESSO, and I'm stuck in a situation that
is a little confusing to me. Long story short: I recently compiled
QE-6.4.1, and as far as I can tell everything is fine, but when I run
it the output reports that the subspace diagonalization is serial
rather than parallel.
So, that being said, here are some details of what I have done so far:
After I downloaded the source and applied the patch, I loaded the
necessary modules (in my case: intel2019, mkl2019, and intelmpi2019;
OpenMPI 2.1.5 is already the default OpenMPI in my working
environment), and I set the environment variables below within the
qe-641 directory:
ESPRESSO_ROOT=$(pwd)
export SCALAPACK_LIBS="-L${MKLROOT}/lib/intel64 -lmkl_scalapack_lp64
-lmkl_blacs_intelmpi_lp64 -lmkl_intel_lp64 -lmkl_core -lmkl_intel_thread
-lpthread -lm"
export LAPACK_LIBS="-L${MKLROOT}/lib/intel64 -lmkl_scalapack_lp64
-lmkl_blacs_intelmpi_lp64 -lmkl_intel_lp64 -lmkl_core
-lmkl_intel_thread -lpthread -lm"
export BLAS_LIBS="-L${MKLROOT}/lib/intel64 -lmkl_scalapack_lp64
-lmkl_blacs_intelmpi_lp64 -lmkl_intel_lp64 -lmkl_core -lmkl_intel_thread
-lpthread -lm"
export CC="icc -D_Float128=__float128"
export FC=ifort
export F77=ifort
export MPIF90=mpiifort
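A typo in one of these link lines (e.g. "-libmkl_..." instead of
"-lmkl_...") can make configure silently fall back to its internal
libraries, so before running configure it may be worth confirming that
every library named above really exists under ${MKLROOT}/lib/intel64.
This is just a sketch; the helper name check_mkl_libs is mine, and the
lib/intel64 layout assumes a standard MKL installation:

```shell
# Sanity check (a sketch): report which of the MKL libraries referenced
# in the link lines above are present under $root/lib/intel64.
# Usage: check_mkl_libs "$MKLROOT"
check_mkl_libs() {
  root=$1
  missing=""
  for lib in mkl_scalapack_lp64 mkl_blacs_intelmpi_lp64 mkl_intel_lp64 \
             mkl_core mkl_intel_thread; do
    # The glob matches both static (.a) and shared (.so) variants.
    ls "$root/lib/intel64/lib${lib}."* >/dev/null 2>&1 \
      || missing="$missing lib${lib}"
  done
  if [ -n "$missing" ]; then
    echo "missing:$missing"
    return 1
  fi
  echo "all MKL libraries found under $root"
}
```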
Then I ran configure as:
./configure --prefix=${ESPRESSO_ROOT} --enable-parallel --enable-openmp
--with-scalapack=intel
Finally I got a make.inc that looks like this (I only changed CFLAGS
and FFLAGS, by adding -xCORE-AVX512):
IFLAGS = -I$(TOPDIR)/include -I$(TOPDIR)/FoX/finclude
-I$(TOPDIR)/S3DE/iotk/include/ -I${MKLROOT}/include
[...]
DFLAGS = -D__DFTI -D__MPI -D__SCALAPACK
[...]
MPIF90 = mpiifort
F90 = ifort
CC = icc -D_Float128=__float128
F77 = ifort
[...]
CFLAGS = -O3 -xCORE-AVX512 $(DFLAGS) $(IFLAGS)
F90FLAGS = $(FFLAGS) -nomodule -qopenmp -fpp $(FDFLAGS)
$(CUDA_F90FLAGS) $(IFLAGS) $(MODFLAGS)
FFLAGS = -O2 -xCORE-AVX512 -assume byterecl -g -traceback
-qopenmp -I${MKLROOT}/include/fftw
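One quick way to confirm that configure really enabled the distributed
diagonalization at build time is to check make.inc for the
-D__SCALAPACK define (it does appear in the DFLAGS above). A minimal
sketch; the helper name scalapack_enabled is mine:

```shell
# Returns success (exit 0) if the given make.inc contains the
# ScaLAPACK preprocessor define, failure otherwise.
scalapack_enabled() {
  grep -q -- "-D__SCALAPACK" "$1"
}

# Example, run from the QE source tree:
# scalapack_enabled make.inc && echo "ScaLAPACK enabled at build time"
```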
After that I ran 'make all' and 'make install', and everything finished
without any error messages.
I then prepared the following script to run a calculation:
export I_MPI_HYDRA_BOOTSTRAP_EXEC=/opt/pbs/bin/pbs_tmrsh
export I_MPI_HYDRA_BOOTSTRAP=rsh
export FI_PROVIDER=tcp
export OMP_NUM_THREADS=1
export MKL_NUM_THREADS=1
export KMP_AFFINITY=scatter,granularity=fine,1
nk=V
ni=W
nb=X
nd=Y
nt=Z
N=`echo "$ni*$nk*$nb" | bc -l`
CMD_QE=/opt/sw/qe-641/bin
mpiexec.hydra -n $N $CMD_QE/pw.x -npool $nk -ndiag $nd -ntg $nt -ni $ni
-nb $nb < $CALC_DIR/input.in >& $CALC_DIR/output.out
Everything I run using this script completes fine, but the question
remains: why is the subspace diagonalization serial?
I have played around a lot with the -nk, -nb, -nd, and -nt values,
without success.
I also have QE-6.1.0 installed, built with almost the same approach as
described here, and there everything works fine; that is, the subspace
diagonalization runs in parallel.
The reason I would like to move to QE-6.4.1 is the Grimme D3 method it
implements (and because I would like to try the GPU version in the near
future, for learning purposes).
I apologize for the long question, and I would be glad for any help in
understanding what is missing and where I am going wrong. If I have
omitted any information that would be useful, please just name it and I
will be happy to clarify as much as I can.
Thanks,
--
UNICAMP Materials Simulation Lab
___________________________________________________
PhD. Student Manoel Victor Frutuoso Barrionuevo
Institute of Chemistry - University of Campinas
Campinas - 13083-970 - São Paulo - Brasil
umsl.iqm.unicamp.br
+55(17)-99723-3966