<div dir="ltr"><div class="gmail_default" style="font-size:small">Dear Quantum Espresso forum,</div><div class="gmail_default" style="font-size:small"> Yesterday I noted a mysterious error in ph.x (Version 7.1) observed for NaCl using the primitive cell. The error was quite strange, usually mentioning an error in cdiaghg which is a routine in the LAXlib directory. I now think that it has to do with requesting too many cores for my calculation because my error can be removed by requesting fewer cores. As noted in the original post, the error did not appear in the QE versus 6.3, so presumably LAXlib files have been changed in the 7.1 version. I very much appreciate the fact that QE seems to adapt to the number of nodes and cpus per node available in each run, but perhaps I need to increase my understanding of how this works for the various programs. There was no problem running pw.x or ph.x for the gamma point with 32 cores, but ph.x failed for q /=0. Both pw.x and ph.x with all q vectors seem to have run correctly with 8 cores. Are there some guidelines for choosing how many nodes/cores to request ? In any case, thanks for listening. Sincerely, Natalie Holzwarth</div><div class="gmail_default" style="font-size:small"><br></div><div><div dir="ltr" class="gmail_signature" data-smartmail="gmail_signature">N. A. W. Holzwarth email: <a href="mailto:natalie@wfu.edu" target="_blank">natalie@wfu.edu</a><div>Department of Physics web: <a href="http://www.wfu.edu/~natalie" target="_blank">http://www.wfu.edu/~natalie</a></div><div>Wake Forest University phone: 1-336-758-5510 </div><div>Winston-Salem, NC 27109 USA office: Rm. 300 Olin Physical Lab</div></div></div><br><br><div class="gmail_quote"><div dir="ltr" class="gmail_attr">---------- Forwarded message ---------<br>From: <strong class="gmail_sendername" dir="auto">Holzwarth, Natalie</strong> <span dir="auto"><<a href="mailto:natalie@wfu.edu">natalie@wfu.edu</a>></span><br>Date: Tue, Oct 25, 2022 at 4:50 PM<br>Subject: Very strange issue with ph.x in QE 7.1 which appeared OK in QE 6.3<br>To: Quantum Espresso users Forum <<a href="mailto:users@lists.quantum-espresso.org">users@lists.quantum-espresso.org</a>><br></div><br><br><div dir="ltr"><div style="font-size:small">Dear Quantum Espresso Forum, </div><div style="font-size:small"><br></div><div style="font-size:small">I have been using ph.x from QE 7.1 for quite a few low-symmetry materials but for several simple fcc crystals, the program</div><div style="font-size:small">fails to diagonalize the first non-gamma point phonon matrix (see 7.1G.out) file. However I ran the same program with QE 6.3</div><div style="font-size:small">which diagonalizes all of the phonon modes of NaCl. In this test, I am using pseudos from SSSP. The input file is pasted below </div><div style="font-size:small">and the two output files from ph.x are hopefully attached. Has anyone else seen this issue?. 
</div><div style="font-size:small">Thanks very much for any advice about this, Natalie</div><div style="font-size:small"><br></div><div style="font-size:small">input file for vc-relax and phonon calculation:</div><div style="font-size:small">------------------------------------------------------------</div><div style="font-size:small">cat runQE.slurm<br>#!/bin/tcsh<br>#<br>#SBATCH --nodes=1<br>#SBATCH --ntasks-per-node=32<br>#SBATCH --cpus-per-task=1<br>#SBATCH --account="natalieGrp"<br>#SBATCH --output="JOB-%j.o"<br>#SBATCH --error="JOB-%j.e"<br>#SBATCH --mail-user=<a href="mailto:natalie@wfu.edu" target="_blank">natalie@wfu.edu</a><br>#SBATCH --mail-type=BEGIN,END,FAIL<br>#SBATCH --time=0-300:00:00<br>#SBATCH --mem=96gb<br>#SBATCH --partition=large<br>umask 002<br># Note: SLURM has no batch input for cputime, excluding.<br>#<br>#<br>echo 'hostname' `/bin/hostname`<br>echo 'job directory' `pwd`<br>#<br>setenv TMPDIR /scratch/$SLURM_JOBID<br>echo 'Reset TMPDIR for this job to ' $TMPDIR<br><br>module load apps/quantum-espresso/7.1<br>set PW=pw.x<br>set PH=ph.x<br><br><br>#NOTE:SLURM defaults to running jobs in the directory where submitted;<br>#NOTE:Consider --workdir directive instead; and check functionality!<br>cd ${SLURM_SUBMIT_DIR}<br><br>cat > PSI.in << EOF<br>&CONTROL<br> calculation = "vc-relax",<br> pseudo_dir = '/deac/phy/natalieGrp/natalie/wfurc9/EL7/QE/Downloadedpseudos'<br> verbosity = "high",<br> outdir = "$TMPDIR/",<br> prefix = 'PSI',<br> restart_mode = 'from_scratch',<br> nstep = 300,<br> dt = 20,<br> forc_conv_thr = 1.0D-5,<br> etot_conv_thr = 1.0D-6,<br> tstress = .true.,<br> tprnfor = .true.,<br>/<br>&SYSTEM<br>ibrav = 2,<br>celldm(1) = 10.6,<br>nat = 2,<br>ntyp = 2,<br>nosym =.FALSE.,<br>use_all_frac = .TRUE.,<br>ecutwfc = 81.d0,<br>/<br>&ELECTRONS<br> conv_thr = 1.D-7,<br> electron_maxstep = 200,<br>/<br>&IONS<br>/<br>&CELL<br> cell_dynamics='bfgs',<br> wmass = 1.00,<br> press = 0.0,<br>/<br>ATOMIC_SPECIES<br> Na 22.989769 Na_ONCV_PBEsol-1.0.upf<br> Cl 35.45 Cl.pbesol-n-rrkjus_psl.1.0.0.UPF<br><br>ATOMIC_POSITIONS {crystal}<br> Na 0.0 0.0 0.0<br> Cl 0.5 0.50 0.50<br><br>K_POINTS AUTOMATIC<br> 8 8 8 0 0 0<br>EOF<br><br>mpirun $PW -in PSI.in > PSI.out<br><br>cat > <a href="http://PSI.phG.in" target="_blank">PSI.phG.in</a> << EOF<br>phonons<br><br>&inputph<br>outdir = '$TMPDIR',<br>prefix = 'PSI',<br>epsil = .true.,<br>ldisp = .true.,<br>fildyn = 'dyn.G',<br>tr2_ph = 1.0d-14,<br>start_q=1,<br>last_q=8,<br>nq1 = 4,<br>nq2 =4,<br>nq3 = 4,<br>/<br><br>EOF<br>mpirun $PH -in <a href="http://PSI.phG.in" target="_blank">PSI.phG.in</a> > PSI.phG.out<br><br><br><br><br><br>ls -Flag $TMPDIR<br><br>'rm' -r $TMPDIR<br><br>/usr/local/bin/slurm_mem_report -v<br></div><div style="font-size:small">------------------------------------------------------------</div><div><div dir="ltr" data-smartmail="gmail_signature">N. A. W. Holzwarth email: <a href="mailto:natalie@wfu.edu" target="_blank">natalie@wfu.edu</a><div>Department of Physics web: <a href="http://www.wfu.edu/~natalie" target="_blank">http://www.wfu.edu/~natalie</a></div><div>Wake Forest University phone: 1-336-758-5510 </div><div>Winston-Salem, NC 27109 USA office: Rm. 300 Olin Physical Lab</div></div></div></div>
N. A. W. Holzwarth            email: natalie@wfu.edu
Department of Physics         web: http://www.wfu.edu/~natalie
Wake Forest University        phone: 1-336-758-5510
Winston-Salem, NC 27109 USA   office: Rm. 300 Olin Physical Lab