[QE-users] pp.x segfaults for occupations='fixed' and nspin=2 calculation

Konrad Gruszka konrad.gruszka at pcz.pl
Tue Dec 9 11:04:20 CET 2025


I've tried running on 1 CPU, giving it as much as 384 GB of RAM (the 
maximum for one node). The total RAM usage for the successful 
nonmagnetic calculations is about 45 GB, and the resulting files are 
about 500 MB or less.
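
For cross-checking, pw.x also prints a memory estimate near the top of 
the scf output (the "Estimated max dynamical RAM per process" / 
"Estimated total dynamical RAM" lines); something like the following 
should pull it out of the output file from the run quoted below:

grep -i 'dynamical RAM' ga.scf.out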

From Slurm, I can see that the process did not try to reserve much 
RAM:

JobID         JobName  State   ExitCode  ReqMem  AveRSS  MaxRSS  AllocCPUS
------------  -------  ------  --------  ------  ------  ------  ---------
10201284      GaPP     FAILED  11:0      374G                    1
10201284.ba+  batch    FAILED  11:0              10020K  10020K  1
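
(For completeness, the table above corresponds to an sacct query along 
these lines - the exact field list is inferred from the column headers:

sacct -j 10201284 \
    --format=JobID,JobName,State,ExitCode,ReqMem,AveRSS,MaxRSS,AllocCPUS
)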

What can I do to try to resolve this?

Regards

Konrad

On 9.12.2025 at 09:51, Paolo Giannozzi wrote:
> Segfaults with no other error message are often out-of-memory errors. 
> pp.x may or may not be optimized for memory usage (some parts are not 
> optimized at all, I guess, because it is not worth the effort for 
> typical use cases). If there is something wrong with the nspin=2 case, 
> it should be reproducible with much smaller runs as well.
>
> Paolo
>
>
> On 12/9/25 09:10, Konrad Gruszka via users wrote:
>> Dear QE users,
>>
>> I'm having a problem with a pp.x calculation for a semiconductor 
>> with a gap (occupations='fixed') and magnetism (nspin=2 + 
>> tot_magnetization=1). I've tested this with QE versions 7.3.1 and 7.5.
>>
>> pp.x consistently segfaults regardless of whether I ask for 
>> plot_num=0 or plot_num=6, while without nspin=2 the pp.x run for the 
>> nonmagnetic calculation goes fine. I've also tried projwfc.x to see 
>> the magnetic moments - no luck there either, it also segfaults. The 
>> calculations are done on an HPC system using 192 CPU cores; I also 
>> tried pp.x on 1 CPU:
>>
>> mpirun -np 192  pw.x -npool 4 -ntg 8 -in ga.scf.in > ga.scf.out
>>
>> mpirun -np 192 pp.x -in pp.in > pp.out
>>
>>
>> output from pp.x:
>>
>>       Program POST-PROC v.7.5 starts on  8Dec2025 at 23: 2:54
>>
>>       This program is part of the open-source Quantum ESPRESSO suite
>>       for quantum simulation of materials; please cite
>>           "P. Giannozzi et al., J. Phys.:Condens. Matter 21 395502 
>> (2009);
>>           "P. Giannozzi et al., J. Phys.:Condens. Matter 29 465901 
>> (2017);
>>           "P. Giannozzi et al., J. Chem. Phys. 152 154105 (2020);
>>            URL http://www.quantum-espresso.org",
>>       in publications or presentations arising from this work. More 
>> details at
>>       http://www.quantum-espresso.org/quote
>>
>>       Parallel version (MPI & OpenMP), running on     192 processor 
>> cores
>>       Number of MPI processes:               192
>>       Threads/MPI process:                     1
>>
>>       MPI processes distributed on     1 nodes
>>       R & G space division:  proc/nbgrp/npool/nimage =  192
>>       341092 MiB available memory on the printing compute node when 
>> the environment starts
>>
>>       Reading xml data from directory:
>>
>>       ./tmp/geas2_mag.save/
>>       file Ge.pbe-dn-kjpaw_psl.1.0.0.UPF: wavefunction(s)  4S 4P 3D 
>> renormalized
>>       file Ge.pbe-dn-kjpaw_psl.1.0.0.UPF: wavefunction(s)  4S 4P 3D 
>> renormalized
>>
>>       IMPORTANT: XC functional enforced from input :
>>       Exchange-correlation= PBE
>>                             (   1   4   3   4   0   0   0)
>>       Any further DFT definition will be discarded
>>       Please, verify this is what you really want
>>
>> After this it segfaults (a matter of seconds after starting), with 
>> no additional information printed.
>>
>> Is there any known problem with such a combination of system 
>> parameters, or are my inputs just wrong?
>>
>>
>> Below is my pp.x input file:
>>
>> &INPUTPP
>> filplot = 'ppout.rho'
>> outdir = './tmp'
>> plot_num = 0
>> prefix = 'geas2_mag'
>> /
>> &PLOT
>> fileout = 'rho.cube'
>> iflag = 3
>> output_format = 6
>> /
>>
>> Before the SCF I did a vc-relax; both converged fine. The system is 
>> 2D with a vacancy that induces magnetism.
>>
>> Input for SCF calculation after which I try pp.x:
>>
>> &CONTROL
>>     calculation = 'scf'
>>     prefix = 'geas2_mag'
>>     pseudo_dir = '../pp/'
>>     outdir = './tmp'
>>     restart_mode = 'from_scratch'
>>     etot_conv_thr = 1.0D-5
>>     forc_conv_thr = 1.0D-4
>>     tprnfor = .false.
>>     tstress = .false.
>> /
>>
>> &SYSTEM
>>     ibrav = 0
>>     nat = 71
>>     ntyp = 3
>>     ecutwfc = 80
>>     ecutrho = 640
>>     occupations = 'fixed'
>>     nspin=2
>>     tot_magnetization=1
>>     assume_isolated = '2D'
>>     starting_magnetization(1) = 0.00   ! Ge
>>     starting_magnetization(2) = 0.20   ! GeMAG
>>     starting_magnetization(3) = 0.00   ! As
>>
>> /
>>
>> &ELECTRONS
>>     conv_thr = 1.0D-10
>>     mixing_beta = 0.7d0
>>     electron_maxstep= 500
>>     mixing_mode = 'plain'
>> /
>>
>> ATOMIC_SPECIES
>> Ge 72.64 Ge.pbe-dn-kjpaw_psl.1.0.0.UPF
>> GeMAG 72.64 Ge.pbe-dn-kjpaw_psl.1.0.0.UPF
>> As 74.9216 As.pbe-n-kjpaw_psl.1.0.0.UPF
>>
>> CELL_PARAMETERS (angstrom)
>>    11.132390620  10.714510184   0.000000000
>>   -11.132390620  10.714510184   0.000000000
>>     0.000000000   0.000000000  30.272253940
>>
>> ATOMIC_POSITIONS (angstrom)
>> Ge               0.0163362844        6.1146929139  16.6554928040
>> Ge              -0.0551119261        0.7975784520  13.5266388254
>> GeMAG      1.8553984371        8.6000205492       14.8011166052
>> Ge               1.8553984371        3.0714979901  15.4473783907
>> As               0.0755034661        6.9191645903  14.3261240522
>> As              -0.0239102020        1.5214443595  15.8854390605
>> As              -0.0143271795        8.0302432723  18.2433445958
>> ...
>>
>> ...
>>
>> As              -1.4991101973       10.1257107631  12.9365799592
>> As              -1.8809867461        4.6872226564  17.3505252161
>>
>> K_POINTS automatic
>> 4 4 1 0 0 0
>>
>> I've truncated the input file due to its size, but I can send the 
>> full version if needed.
>>
>> Regards
>>
>> Konrad Gruszka
>>
>

