[Wannier] Crash showing direct lattice mismatch
Christoph Wolf
wolf.christoph at qns.science
Fri Jun 8 07:07:30 CEST 2018
You can see there is a tiny mismatch of about 0.004 (rlatt = 0.9957 vs. at = 1.0).
Increase the number of digits in your lattice vectors; do not copy them from
nscf.out, as the values printed there are rounded. It works best if you take
the values directly from your scf or nscf input file!
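
For example, a quick way to see where the digits diverge is to compare the
real_lattice block that wannier90 writes into the .nnkp file against the
vectors in your scf input. Here is a minimal Python sketch (the seedname
'wannier.nnkp' and the graphene-like vectors are placeholders for your own
values; it assumes your input cell is given in Angstrom, the unit used in
the .nnkp file):

    import numpy as np

    def read_nnkp_lattice(path):
        # Extract the 3x3 lattice (in Angstrom) that wannier90 writes
        # between 'begin real_lattice' and 'end real_lattice' in the
        # .nnkp file.
        lines = open(path).read().splitlines()
        start = next(i for i, l in enumerate(lines)
                     if l.strip() == 'begin real_lattice') + 1
        return np.array([[float(x) for x in lines[start + i].split()]
                         for i in range(3)])

    # Lattice vectors copied verbatim from the scf/nscf *input* file
    # (hypothetical graphene-like numbers -- substitute your own).
    at_input = np.array([[ 2.46772414,  0.00000000,  0.0],
                         [-1.23386207,  2.13711907,  0.0],
                         [ 0.00000000,  0.00000000, 15.0]])

    rlatt = read_nnkp_lattice('wannier.nnkp')
    print('max |rlatt - at_input| =', np.abs(rlatt - at_input).max())

pw2wannier90 makes essentially this comparison internally (in units of alat)
with a tight tolerance, so even rounding in the fifth or sixth decimal place
can trigger the error.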
Best,
Chris
On Fri, Jun 8, 2018 at 1:56 PM, Anindya Bose <anindya at iiita.ac.in> wrote:
> Dear Wannier90 experts, I am getting the error while running the code
>
> Program PW2WANNIER v.6.2 (svn rev. 14038) starts on 2Jun2018 at 17:44:24
>
> This program is part of the open-source Quantum ESPRESSO suite
> for quantum simulation of materials; please cite
> "P. Giannozzi et al., J. Phys.:Condens. Matter 21 395502 (2009);
> "P. Giannozzi et al., J. Phys.:Condens. Matter 29 465901 (2017);
> URL http://www.quantum-espresso.org",
> in publications or presentations arising from this work. More details
> at
> http://www.quantum-espresso.org/quote
>
> Parallel version (MPI), running on 4 processors
>
> MPI processes distributed on 1 nodes
> R & G space division: proc/nbgrp/npool/nimage = 4
>
> Reading nscf_save data
>
> Reading data from directory:
> ./Graphene.save/
>
> IMPORTANT: XC functional enforced from input :
> Exchange-correlation = PZ ( 1 1 0 0 0 0)
> Any further DFT definition will be discarded
> Please, verify this is what you really want
>
>
> Parallelization info
> --------------------
> sticks:   dense  smooth     PW     G-vecs:    dense   smooth      PW
> Min         118     118     49               43266    43266   11501
> Max         119     119     50               43335    43335   11556
> Sum         475     475    199              173243   173243   46137
>
>
> Spin CASE ( default = unpolarized )
>
> Wannier mode is: standalone
>
> -----------------
> *** Reading nnkp
> -----------------
>
> Checking info from wannier.nnkp file
>
> Something wrong!
> rlatt(i,j) = 0.99566756459498174 at(i,j)= 1.0000000000000000
>
> %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
> Error in routine pw2wannier90 (4):
> Direct lattice mismatch
> %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
>
> stopping ...
> --------------------------------------------------------------------------
> MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
> with errorcode 1.
>
> NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
> You may or may not see output from other processes, depending on
> exactly when Open MPI kills them.
> --------------------------------------------------------------------------
>
> I have enclosed all my input and output files with this mail. Please help
> me to resolve this issue.
>
> Thanks and regards,
> Anindya Bose
--
Postdoctoral Researcher
Center for Quantum Nanoscience, Institute for Basic Science
Ewha Womans University, Seoul, South Korea