[QE-users] Problems with scf calculation in qe 7.1 and 7.2

lplokijuhygt at tutanota.com
Fri Apr 7 23:44:43 CEST 2023


 Dear Paolo,

Thank you for your advice. I have already tried this option, but without success: after running 'sudo execstack -s /home/delta/qe-7.2/bin/pw.x' I ran pw.x with iron again and the program produced almost the same messages.
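
For reference, the other stack-related settings that are commonly suggested for this kind of segfault under WSL (assuming a bash shell; the 512m value is only an example) are

  ulimit -s unlimited
  export OMP_STACKSIZE=512m

issued in the same shell before mpirun.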

Regards,
Serhi.

On 7 Apr 2023, at 20:26, paolo.giannozzi at uniud.it wrote:

> Please have a look at this page: https://gitlab.com/QEF/q-e/-/wikis/home and in particular at "Segfault with gfortran and WSL": https://gitlab.com/QEF/q-e/-/wikis/Support/Segfault-with-gfortran-and-WSL
>
> Paolo
>
> On 07/04/2023 19:18, Serhi via users wrote:
>
>> Dear experts,
>>
>>
>> I installed qe-7.2 with Intel oneAPI on Ubuntu 22.04 running under WSL. When I run 'delta at LAPTOP-650A6EI8:~/qe-7.2/wannier90-3.1.0/examples/example08$ mpirun -np 2 pw.x < iron.scf > scf.out' I get the following message:
>>
>> forrtl: severe (174): SIGSEGV, segmentation fault occurred
>>
>> Image              PC                Routine            Line        Source
>>
>> libc.so.6          00007FC441C12520  Unknown               Unknown  Unknown
>>
>> Unknown            00007FFFFBFE27B8  Unknown               Unknown  Unknown
>>
>> --------------------------------------------------------------------------
>>
>> Primary job  terminated normally, but 1 process returned
>>
>> a non-zero exit code. Per user-direction, the job has been aborted.
>>
>> --------------------------------------------------------------------------
>>
>> forrtl: severe (174): SIGSEGV, segmentation fault occurred
>>
>> Image              PC                Routine            Line        Source
>>
>> libc.so.6          00007F0E2AE12520  Unknown               Unknown  Unknown
>>
>> Unknown            00007FFFD05B2538  Unknown               Unknown  Unknown
>>
>> --------------------------------------------------------------------------
>>
>> mpirun detected that one or more processes exited with non-zero status, thus causing
>>
>> the job to be terminated. The first process to do so was:
>>
>>    Process name: [[16591,1],0]
>>
>>    Exit code:    174
>>
>>
>>
>> When I simply run 'delta at LAPTOP-650A6EI8:~/qe-7.2/wannier90-3.1.0/examples/example08$ pw.x < iron.scf > scf.out' I get
>>
>> 'pw.x:19850 terminated with signal 11 at PC=7fffc97b2b38 SP=7fffc97b29d8.  Backtrace:
>>
>> [0x7fffc97b2b38]'
>>
>>
>> The same problem occurred with 'delta at LAPTOP-650A6EI8:~/qe-7.2/wannier90-3.1.0/examples/example06$ mpirun -np 2 pw.x < copper.scf > scf.out', and also without mpirun, but there were no problems with silicon and diamond. In the case of iron and copper, the information in scf.out ended after the first iteration.
>>
>>
>> I compiled qe-7.2 with
>>
>> ./configure F90=mpiifort MPIF90=mpiifort CC=mpicc CXX=icc F77=mpiifort \
>>   LAPACK_LIBS="-L${MKLROOT}/lib/intel64 -lmkl_scalapack_lp64 -lmkl_intel_lp64 -lmkl_sequential -lmkl_core -lmkl_blacs_intelmpi_lp64 -lpthread -lm -ldl" \
>>   BLAS_LIBS="-L${MKLROOT}/lib/intel64 -lmkl_scalapack_lp64 -lmkl_intel_lp64 -lmkl_sequential -lmkl_core -lmkl_blacs_intelmpi_lp64 -lpthread -lm -ldl" \
>>   SCALAPACK_LIBS="-L${MKLROOT}/lib/intel64 -lmkl_scalapack_lp64 -lmkl_intel_lp64 -lmkl_sequential -lmkl_core -lmkl_blacs_intelmpi_lp64 -lpthread -lm -ldl"
>>
>> with the flags taken from the Quantum Espresso tutorial "High Performance Computing" at https://pranabdas.github.io/espresso/setup/hpc.
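>>
>> A related Intel-specific workaround that is sometimes suggested for stack-related segfaults is recompiling with automatic arrays placed on the heap, for example by adding
>>
>>   FFLAGS="-O2 -heap-arrays"
>>
>> to the ./configure command above (the -O2 level is only an example, and whether this is the right place to pass the flag is an assumption on my part).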
>>
>> I got the same output with qe 7.1.
>>
>>
>> Could you kindly help me resolve this issue?
>>
>>
>> Regards,
>>
>>
>> Serhi.
>>
>>
>>
>>
>
> -- 
> Paolo Giannozzi, Dip. Scienze Matematiche Informatiche e Fisiche,
> Univ. Udine, via delle Scienze 206, 33100 Udine Italy, +39-0432-558216
>
