[QE-users] File .wfc1 is not generated by scf run
Pietro Delugas
pdelugas at sissa.it
Tue Jul 31 10:25:22 CEST 2018
I am not sure what to suggest.
If the issue were the size of the wfc file (although I don't see why it
should be), you can run the postprocessing code with more MPI processes;
this will split the .wfc data into as many files as the number of MPI
processes.
Another thing one can try is to run the scf with wf_collect set to
.false. and disk_io set to 'high', and see whether pw.x is able to write
the .wfc files; these could then be read directly by the pp code (which
in this case must be executed with the same number of processes as pw.x).
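
For the second option, a minimal, untested sketch of what the changes
could look like is below; the process count (8) and the scf.in / pp.in
file names are just placeholders for whatever you actually use:

   &control
      calculation = 'scf'
      restart_mode = 'from_scratch',
      prefix = 'MoTe2ml_super5x5relaxNOsoc_def_sm_scf',
      wf_collect = .FALSE.   ! do not collect wavefunctions into .save
      disk_io = 'high'       ! keep the per-process .wfc files on disk
      pseudo_dir = '/home/mlessio/espresso-5.4.0/pseudo/',
      outdir = './'
   /

   # launch pw.x and pp.x with the same number of MPI processes, e.g.
   mpirun -np 8 pw.x -in scf.in > scf.out
   mpirun -np 8 pp.x -in pp.in  > pp.out

(The exact mpirun syntax depends on how your cluster launches MPI jobs.)
For the first option it is simply a matter of giving more MPI processes
to pp.x when it reads the collected (wf_collect=.true.) data.
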
On 30/07/2018 15:56, Martina Lessio wrote:
> Dear Pietro and Lorenzo,
>
> Thanks so much for your prompt responses; I really appreciate them. Here
> are my answers to your questions:
> - Pietro, thanks so much for explaining the process by which the .wfc1
> file gets created; I was clearly confused about it. I checked and,
> as you said, I have .dat files in the folders corresponding to the
> different k-points in the .save folder. I also don't have a .wfc1 file
> right after running the scf. So this all looks correct based on what
> you said.
> I did not report the error message verbatim; the name of the .wfc1
> file is printed correctly in the post-processing output. I will copy
> below my input for the post-processing code and also the full output I
> get, in case it helps in understanding the issue.
>
> - Lorenzo, yes, I double-checked and my scf calculation is converged.
>
> I would really appreciate any further suggestions. Once again, I did
> not experience these issues with smaller supercells, so maybe that
> gives us a clue as to what the problem is.
> I am copying below the scf input, the PP input and the PP output with
> the error message.
>
> Thank you so much!
>
> All the best,
> Martina
>
> SCF input file:
> &control
> calculation = 'scf'
> restart_mode='from_scratch',
> prefix='MoTe2ml_super5x5relaxNOsoc_def_sm_scf',
> wf_collect=.TRUE.
> pseudo_dir = '/home/mlessio/espresso-5.4.0/pseudo/',
> outdir='./'
> /
> &system
> ibrav= 4, A=17.65, B=17.65, C=16.882, cosAB=-0.5, cosAC=0, cosBC=0,
> nat= 75, ntyp= 2,
> ecutwfc =60.
> occupations='smearing', smearing='gaussian', degauss=0.01
> nspin=1
> /
> &electrons
> mixing_mode = 'plain'
> mixing_beta = 0.2
> conv_thr = 1.0d-10
> /
> ATOMIC_SPECIES
> Te 127.6 Te_ONCV_PBE_FR-1.1.upf
> Mo 95.96 Mo_ONCV_PBE_FR-1.0.upf
> ATOMIC_POSITIONS {crystal}
> Te 0.134763127 0.067705673 0.314838709
> Te 0.135581610 0.267790815 0.313767464
> Te 0.134763126 0.467057438 0.314838722
> Te 0.133426199 0.666852373 0.315066533
> Te 0.133426206 0.866573824 0.315066528
> Te 0.333830680 0.067661368 0.314045748
> Te 0.337736908 0.270977783 0.312935985
> Te 0.337736909 0.466759125 0.312935994
> Te 0.333830676 0.666169311 0.314045769
> Te 0.333147629 0.866573820 0.315066533
> Te 0.532942565 0.067705695 0.314838702
> Te 0.533240880 0.270977796 0.312935965
> Mo 0.533333335 0.466666657 0.289322903
> Te 0.533240864 0.662263084 0.312935994
> Te 0.532942569 0.865236902 0.314838722
> Te 0.733136182 0.066568083 0.315103907
> Te 0.732209205 0.267790808 0.313767451
> Te 0.729022255 0.466759128 0.312935965
> Te 0.729022252 0.662263081 0.312935985
> Te 0.732209231 0.864418420 0.313767464
> Te 0.933431927 0.066568091 0.315103913
> Te 0.933431915 0.266863851 0.315103907
> Te 0.932294341 0.467057436 0.314838702
> Te 0.932338652 0.666169315 0.314045748
> Te 0.932294331 0.865236897 0.314838709
> Te 0.133264211 0.066582612 0.099418260
> Te 0.132725778 0.266362898 0.098974905
> Te 0.133264209 0.466681585 0.099418245
> Te 0.133234326 0.666468628 0.099305464
> Te 0.133234334 0.866765696 0.099305470
> Te 0.333453459 0.066906928 0.098996402
> Te 0.331845525 0.265879796 0.094385362
> Te 0.331845522 0.465965727 0.094385355
> Te 0.333453453 0.666546534 0.098996380
> Te 0.333531374 0.866765693 0.099305464
> Te 0.533318419 0.066582631 0.099418267
> Te 0.534034274 0.265879815 0.094385388
> Te 0.533333335 0.466666657 0.085781188
> Te 0.534034262 0.668154471 0.094385355
> Te 0.533318422 0.866735819 0.099418245
> Te 0.733260257 0.066630122 0.099465712
> Te 0.733637120 0.266362893 0.098974923
> Te 0.734120236 0.465965734 0.094385388
> Te 0.734120239 0.668154464 0.094385362
> Te 0.733637148 0.867274252 0.098974905
> Te 0.933369890 0.066630128 0.099465706
> Te 0.933369876 0.266739776 0.099465712
> Te 0.933417405 0.466681582 0.099418267
> Te 0.933093092 0.666546536 0.098996402
> Te 0.933417392 0.866735813 0.099418260
> Mo 0.066910177 0.133342732 0.207211832
> Mo 0.066910181 0.333567419 0.207211831
> Mo 0.066714643 0.533429235 0.207488709
> Mo 0.066613591 0.733306803 0.206964121
> Mo 0.066714624 0.933285385 0.207488709
> Mo 0.267519938 0.134476688 0.206033961
> Mo 0.268356012 0.334177979 0.204473191
> Mo 0.267519939 0.533043231 0.206033959
> Mo 0.266693218 0.733306792 0.206964121
> Mo 0.266693225 0.933386407 0.206964121
> Mo 0.466956762 0.134476690 0.206033961
> Mo 0.468945823 0.337891653 0.201981378
> Mo 0.468945838 0.531054173 0.201981377
> Mo 0.466956758 0.732480091 0.206033959
> Mo 0.466570767 0.933285385 0.207488709
> Mo 0.666432579 0.133342735 0.207211833
> Mo 0.665821985 0.334177980 0.204473191
> Mo 0.662108323 0.531054164 0.201981378
> Mo 0.665821981 0.731644003 0.204473191
> Mo 0.666432565 0.933089829 0.207211831
> Mo 0.866666673 0.133333333 0.207317092
> Mo 0.866657239 0.333567418 0.207211833
> Mo 0.865523322 0.533043230 0.206033961
> Mo 0.865523308 0.732480089 0.206033961
> Mo 0.866657248 0.933089841 0.207211832
> K_POINTS {automatic}
> 2 2 1 0 0 0
>
> PP input file:
> &inputpp
> prefix='MoTe2ml_super5x5relaxNOsoc_def_sm_scf',
> outdir='/home/mlessio/MoTe2/NormConservingPseudo/MONOLAYER/DEFECT_SIMULATION/RemoveSOC/5x5x1/nspin1/AddSmearing/PP_ILDOS/SCF/'
> filplot = 'MoTe2ml_5x5noSOCdef_VB-0.4eV_VB'
> plot_num= 10
> emin=2.0832
> emax=2.4832
> /
> &plot
> iflag=3
> output_format=6
> fileout='MoTe2ml_5x5noSOCdef_VB-0.4eV_VB.cube'
> /
>
> PP output:
>
> -------------------------------------------------------------------------
> [[27902,1],0]: A high-performance Open MPI point-to-point messaging module
> was unable to find any relevant network interfaces:
>
> Module: OpenFabrics (openib)
> Host: compute-0-5
>
> Another transport will be used instead, although this may result in
> lower performance.
>
> NOTE: You can disable this warning by setting the MCA parameter
> btl_base_warn_component_unused to 0.
> --------------------------------------------------------------------------
>
> Program POST-PROC v.5.4.0 starts on 29Jul2018 at 15:46:19
>
> This program is part of the open-source Quantum ESPRESSO suite
> for quantum simulation of materials; please cite
> "P. Giannozzi et al., J. Phys.:Condens. Matter 21 395502 (2009);
> URL http://www.quantum-espresso.org",
> in publications or presentations arising from this work. More details at
> http://www.quantum-espresso.org/quote
>
> Parallel version (MPI), running on 1 processors
>
> Reading data from directory:
> /home/mlessio/MoTe2/NormConservingPseudo/MONOLAYER/DEFECT_SIMULATION/RemoveSOC/5x5x1/nspin1/AddSmearing/PP_ILDOS/SCF/MoTe2ml_super5x5relaxNOsoc_def_sm_scf.save
>
> Info: using nr1, nr2, nr3 values from input
>
> Info: using nr1, nr2, nr3 values from input
>
> IMPORTANT: XC functional enforced from input :
> Exchange-correlation = PBE ( 1 4 3 4 0 0)
> Any further DFT definition will be discarded
> Please, verify this is what you really want
>
> G-vector sticks info
> --------------------
> sticks: dense smooth PW G-vecs: dense smooth PW
> Sum 18421 18421 4741 1930105 1930105 251743
>
> %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
> Error in routine diropn (10):
> error opening
> /home/mlessio/MoTe2/NormConservingPseudo/MONOLAYER/DEFECT_SIMULATION/RemoveSOC/5x5x1/nspin1/AddSmearing/PP_ILDOS/SCF/MoTe2ml_super5x5relaxNOsoc_def_sm_scf.wfc1
> %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
>
> stopping ...
>
> --------------------------------------------------------------------------
> MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
> with errorcode 1.
>
> NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
> You may or may not see output from other processes, depending on
> exactly when Open MPI kills them.
> --------------------------------------------------------------------------
>
>
> On Mon, Jul 30, 2018 at 7:31 AM, Lorenzo Paulatto <paulatz at gmail.com> wrote:
>
> Are you sure the calculation is converged? If it stops because the
> maximum number of iterations has been reached (check the output
> carefully, as this is not always obvious) the wfc file won't be
> created, unless you set disk_io to some higher value ('medium' or
> 'high', I'm not sure which). Of course the real problem would then be
> that the scf does not converge.
>
> --
> Lorenzo Paulatto
> Written on a virtual keyboard with real fingers
>
> On 29 Jul 2018 22:35, "Martina Lessio" <ml4132 at columbia.edu> wrote:
>
> Dear all,
>
> I am running some simulations of defects in MoTe2 slabs with
> the goal of plotting the integrated local density of states.
> I have been using different supercell sizes in order to
> simulate different defect concentrations. However, when I use
> a relatively large supercell (5x5), I run into the problem that the
> scf run does not seem to write the .wfc1 file (it only writes the
> .save folder). Because the .wfc1 file is missing, the post-processing
> code crashes with the following error message:
>
> Error in routine diropn (10):
> error opening filename.wfc1
>
> I am having a hard time understanding the issue since I
> usually get the .wfc1 file when I run an scf calculation with
> wf_collect=.true. on smaller supercells. My only guess is that
> this file is too large to be written, given that this is a
> relatively large supercell. However, I checked with my cluster
> administrator and there should not be any disk-space issue.
> I am copying below my input file for the scf run that fails to
> write the .wfc1 file (I am running version 5.4 of QE).
>
> Thank you in advance for your help.
>
> All the best,
> Martina
>
> --
> Martina Lessio, Ph.D.
> Frontiers of Science Lecturer in Discipline
> Postdoctoral Research Scientist
> Department of Chemistry
> Columbia University
>
> Input file:
>
>
> &control
> calculation = 'scf'
> restart_mode='from_scratch',
> prefix='MoTe2ml_super5x5relaxNOsoc_def_sm_scf',
> wf_collect=.TRUE.
> pseudo_dir = '/home/mlessio/espresso-5.4.0/pseudo/',
> outdir='./'
> /
>
>
> &system
> ibrav= 4, A=17.65, B=17.65, C=16.882, cosAB=-0.5, cosAC=0,
> cosBC=0,
> nat= 75, ntyp= 2,
> ecutwfc =60.
> occupations='smearing', smearing='gaussian', degauss=0.01
> nspin=1
> /
>
> &electrons
> mixing_mode = 'plain'
> mixing_beta = 0.2
> conv_thr = 1.0d-10
> /
>
> ATOMIC_SPECIES
> Te 127.6 Te_ONCV_PBE_FR-1.1.upf
> Mo 95.96 Mo_ONCV_PBE_FR-1.0.upf
>
> ATOMIC_POSITIONS {crystal}
> Te 0.134763127 0.067705673 0.314838709
> Te 0.135581610 0.267790815 0.313767464
> Te 0.134763126 0.467057438 0.314838722
> Te 0.133426199 0.666852373 0.315066533
> Te 0.133426206 0.866573824 0.315066528
> Te 0.333830680 0.067661368 0.314045748
> Te 0.337736908 0.270977783 0.312935985
> Te 0.337736909 0.466759125 0.312935994
> Te 0.333830676 0.666169311 0.314045769
> Te 0.333147629 0.866573820 0.315066533
> Te 0.532942565 0.067705695 0.314838702
> Te 0.533240880 0.270977796 0.312935965
> Mo 0.533333335 0.466666657 0.289322903
> Te 0.533240864 0.662263084 0.312935994
> Te 0.532942569 0.865236902 0.314838722
> Te 0.733136182 0.066568083 0.315103907
> Te 0.732209205 0.267790808 0.313767451
> Te 0.729022255 0.466759128 0.312935965
> Te 0.729022252 0.662263081 0.312935985
> Te 0.732209231 0.864418420 0.313767464
> Te 0.933431927 0.066568091 0.315103913
> Te 0.933431915 0.266863851 0.315103907
> Te 0.932294341 0.467057436 0.314838702
> Te 0.932338652 0.666169315 0.314045748
> Te 0.932294331 0.865236897 0.314838709
> Te 0.133264211 0.066582612 0.099418260
> Te 0.132725778 0.266362898 0.098974905
> Te 0.133264209 0.466681585 0.099418245
> Te 0.133234326 0.666468628 0.099305464
> Te 0.133234334 0.866765696 0.099305470
> Te 0.333453459 0.066906928 0.098996402
> Te 0.331845525 0.265879796 0.094385362
> Te 0.331845522 0.465965727 0.094385355
> Te 0.333453453 0.666546534 0.098996380
> Te 0.333531374 0.866765693 0.099305464
> Te 0.533318419 0.066582631 0.099418267
> Te 0.534034274 0.265879815 0.094385388
> Te 0.533333335 0.466666657 0.085781188
> Te 0.534034262 0.668154471 0.094385355
> Te 0.533318422 0.866735819 0.099418245
> Te 0.733260257 0.066630122 0.099465712
> Te 0.733637120 0.266362893 0.098974923
> Te 0.734120236 0.465965734 0.094385388
> Te 0.734120239 0.668154464 0.094385362
> Te 0.733637148 0.867274252 0.098974905
> Te 0.933369890 0.066630128 0.099465706
> Te 0.933369876 0.266739776 0.099465712
> Te 0.933417405 0.466681582 0.099418267
> Te 0.933093092 0.666546536 0.098996402
> Te 0.933417392 0.866735813 0.099418260
> Mo 0.066910177 0.133342732 0.207211832
> Mo 0.066910181 0.333567419 0.207211831
> Mo 0.066714643 0.533429235 0.207488709
> Mo 0.066613591 0.733306803 0.206964121
> Mo 0.066714624 0.933285385 0.207488709
> Mo 0.267519938 0.134476688 0.206033961
> Mo 0.268356012 0.334177979 0.204473191
> Mo 0.267519939 0.533043231 0.206033959
> Mo 0.266693218 0.733306792 0.206964121
> Mo 0.266693225 0.933386407 0.206964121
> Mo 0.466956762 0.134476690 0.206033961
> Mo 0.468945823 0.337891653 0.201981378
> Mo 0.468945838 0.531054173 0.201981377
> Mo 0.466956758 0.732480091 0.206033959
> Mo 0.466570767 0.933285385 0.207488709
> Mo 0.666432579 0.133342735 0.207211833
> Mo 0.665821985 0.334177980 0.204473191
> Mo 0.662108323 0.531054164 0.201981378
> Mo 0.665821981 0.731644003 0.204473191
> Mo 0.666432565 0.933089829 0.207211831
> Mo 0.866666673 0.133333333 0.207317092
> Mo 0.866657239 0.333567418 0.207211833
> Mo 0.865523322 0.533043230 0.206033961
> Mo 0.865523308 0.732480089 0.206033961
> Mo 0.866657248 0.933089841 0.207211832
>
> K_POINTS {automatic}
> 2 2 1 0 0 0
>
> --
> Martina Lessio, Ph.D.
> Frontiers of Science Lecturer in Discipline
> Postdoctoral Research Scientist
> Department of Chemistry
> Columbia University
>
>
> _______________________________________________
> users mailing list
> users at lists.quantum-espresso.org
> https://lists.quantum-espresso.org/mailman/listinfo/users