[QE-users] Problems with hp.x
Giuseppe Mattioli
giuseppe.mattioli at ism.cnr.it
Mon Jun 15 22:55:59 CEST 2020
Dear Dominik
I suppose that the problem is not in hp.x, but in the application of
the linear-response (LR) method itself to Zn(2+). Zn(2+) is a d10
transition-metal ion, with the 3d shell fully occupied. In ZnO, e.g.,
the Zn 3d band is quite narrow and lies below the O 2p valence band,
and I suppose that the same holds for ZnS, with the Zn 3d band pushing
up the S 3p band. When you apply the LR method to Zn, you compute
quantities such as d(alpha)/dn, where alpha is the (small)
perturbation and n is the occupation of the d orbitals on site I
(see International Journal of Quantum Chemistry 2014, 114, 14 for
details). If the shell is full, you can perturb it as much as you
want, but you will never obtain more than the full occupation that the
shell already has in the unperturbed system. This is most likely the
reason for the unphysical values of the LR U that you obtain.
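To make this concrete, here is a sketch of the LR quantities involved
(my own notation, loosely following the review cited above):

  chi(I,J)  = dn_I / d(alpha_J)          (self-consistent response)
  chi0(I,J) = dn_I / d(alpha_J), bare    (non-self-consistent response)
  U_I       = [ chi0^(-1) - chi^(-1) ]_II

With a completely filled 3d shell, n_I hardly changes under the
perturbation, so both responses are close to zero; their inverses (and
therefore U) become huge and numerically ill-conditioned, which is
consistent with the diverging values you report below.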
If you want to correct the strong delocalization error of the narrow
Zn 3d band within the DFT+U formalism, you must therefore use a
"semiempirical" approach, choosing, e.g., the U value that places the
Zn 3d shell at the correct distance from the valence band maximum. In
this case, I would also recommend a second +U correction on the S 3p
shell, which should ensure a good recovery of the ZnS band gap. I have
used this scheme satisfactorily in the case of ZnO in several
publications, from which you may want to take inspiration (Adv. Energy
Mater. 2014, 4, 1301694).
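As a concrete sketch, the &SYSTEM namelist of your SCF input (quoted
below) could be modified along these lines; the two U values are
arbitrary placeholders that you would tune by hand against the
measured Zn 3d binding energy and the experimental band gap, not
numbers I am recommending:

&SYSTEM
   ibrav = 2
   celldm(1) = 10.291937439
   nat = 2
   ntyp = 2
   ecutwfc = 60.0
   ecutrho = 720.0
   lda_plus_u = .true.
   lda_plus_u_kind = 0
   U_projection_type = 'atomic'
   ! placeholder values, to be tuned semiempirically, not computed with hp.x
   Hubbard_U(1) = 8.0  ! Zn 3d: adjust until the 3d band lies at the right depth below the VBM
   Hubbard_U(2) = 4.0  ! S 3p: adjust together with U(Zn) to recover the experimental gap
/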
HTH
Giuseppe
Quoting dv009200 at fh-muenster.de:
> Hello everyone,
>
> I'm trying to calculate the Hubbard U parameter for Zn in zinc sulfide
> (sphalerite structure) with the help of the hp.x code. The calculations
> terminate normally, without any errors. The problem is that I get
> (presumably) far too high values for U, which also do not converge when
> I take the value from a one-shot calculation, plug it into the SCF
> input, and redo the HP calculation.
>
> For example, in the first step I obtain U = 75.7035, in the second
> iteration U = 804.2405, and in the third U = 30999.2684.
>
> This seems unreasonable, considering that the calculations for the
> examples provided in the 'HP' folder work fine and converge quickly to
> a definite value of U, without such massive changes, when I use the
> scheme described above.
>
> Does anyone have an idea what is causing this trouble in my system? I
> have already tried different PPs, functionals, U_projection_type
> settings, thresholds, and k- and q-point grids, all without success.
>
> Below are my inputs for the SCF and HP calculations.
>
> SCF-input:
> &control
> calculation='scf'
> restart_mode='from_scratch',
> pseudo_dir = '/home/dominik/codes/QE6.5/pseudo/'
> outdir='/home/dominik/codes/QE6.5/tempdir/'
> prefix='zns'
> /
> &SYSTEM
> ibrav = 2
> celldm(1)=10.291937439
> nat = 2
> ntyp = 2
> ecutwfc = 60.0
> ecutrho= 720.0
> lda_plus_u = .true.
> lda_plus_u_kind = 0
> U_projection_type = 'atomic'
> Hubbard_U(1) = 1d-8
> /
> &electrons
> mixing_beta=0.7
> conv_thr=1d-15
> /
> ATOMIC_SPECIES
> Zn 65.39 Zn.pbe-dn-rrkjus_psl.0.2.2.UPF
> S 32.07 S.pbe-n-rrkjus_psl.0.1.UPF
> ATOMIC_POSITIONS {alat}
> Zn 0.000000 0.000000 0.000000
> S 0.250000 0.250000 0.250000
> K_POINTS automatic
> 12 12 12 0 0 0
>
>
> HP-input:
> &inputhp
> prefix='zns'
> outdir='/home/dominik/codes/QE6.5/tempdir/'
> nq1 = 2
> nq2 = 2
> nq3 = 2
> conv_thr_chi = 1.0d-10
> iverbosity =2
> /
>
>
> Best regards
>
>
> Dominik Voigt
>
> Dominik Voigt
> PhD Student University of Applied Sciences Münster
> Department of Physical Chemistry
>
GIUSEPPE MATTIOLI
CNR - ISTITUTO DI STRUTTURA DELLA MATERIA
Via Salaria Km 29,300 - C.P. 10
I-00015 - Monterotondo Scalo (RM)
Mob (*preferred*) +39 373 7305625
Tel + 39 06 90672342 - Fax +39 06 90672316
E-mail: <giuseppe.mattioli at ism.cnr.it>