[QE-users] Recompilation of QE 6.7 with Cygwin 64 - result does not work ...
Michal Husak
Michal.Husak at vscht.cz
Fri Apr 16 15:28:56 CEST 2021
Hi
I recompiled QE 6.7 under Cygwin 64 with gfortran (Windows 7).
Compilation and installation completed without any error.
Then I tried to process a simple L-alanine test input which works
fine with the Windows 7 QE 6.4 binaries.
The run aborts with the error message shown below:
Error in routine good_fft_order (1):
invalid np
Any idea what is wrong?
Alternatively, could anybody offer a QE 6.7 binary compiled for Microsoft MPI
and linked against the XC library?
(I need a working SCAN functional.)
Michal
Michal at Krtek /cygdrive/f/qe_projects/test_cygwin64/LALA_space_group
$ pw.x < LALA_scf.in
Program PWSCF v.6.7MaX starts on 16Apr2021 at 15:19:45
This program is part of the open-source Quantum ESPRESSO suite
for quantum simulation of materials; please cite
"P. Giannozzi et al., J. Phys.:Condens. Matter 21 395502 (2009);
"P. Giannozzi et al., J. Phys.:Condens. Matter 29 465901 (2017);
URL http://www.quantum-espresso.org",
in publications or presentations arising from this work. More details at
http://www.quantum-espresso.org/quote
Parallel version (MPI), running on 1 processors
MPI processes distributed on 1 nodes
Waiting for input...
Reading input from standard input
Current dimensions of program PWSCF are:
Max number of different atomic species (ntypx) = 10
Max number of k-points (npk) = 40000
Max angular momentum in pseudopotentials (lmaxx) = 3
file C.pbe-n-kjpaw_psl.1.0.0.UPF: wavefunction(s) 2S 2P renormalized
file O.pbe-n-kjpaw_psl.0.1.UPF: wavefunction(s) 2P renormalized
Subspace diagonalization in iterative solution of the eigenvalue problem:
a serial algorithm will be used
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
Error in routine good_fft_order (1):
invalid np
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
stopping ...
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
with errorcode 1.
NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
--------------------------------------------------------------------------
Michal at Krtek /cygdrive/f/qe_projects/test_cygwin64/LALA_space_group
$