[QE-users] Compiling with Intel's OneAPI - Parallel Performance Issues
Baer, Bradly
bradly.b.baer at Vanderbilt.Edu
Thu Jun 3 23:45:27 CEST 2021
Hello,
1) Ah, I did not notice that. I generally suppress I/O for test jobs, but I did a follow-up phonon calculation to test timings, so I/O was turned on for that one. The timing difference persists regardless of the I/O setting and also carried over into the phonon calculation (roughly 1 hr // 1.5 hr).
2) I have run it multiple times and the timing is reproducible. As mentioned above, the timing issue shows up in ph.x as well. I have 16 physical cores and 32 hyperthreads.
3) I have run with mpirun -np 1, which I take to be what 1 MPI rank means (command sketch below). The CPU/wall timings are much more consistent in that case, but I must confess I am not experienced enough to understand what this result says about the cause of my issue.
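For reference, the kind of invocation I mean, with placeholder input/output names rather than my actual files:

  # single MPI rank
  mpirun -np 1 pw.x -inp scf.in > scf_1rank.out
  # my usual run, one rank per physical core
  mpirun -np 16 pw.x -inp scf.in > scf_16ranks.out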
Thanks,
Brad
--------------------------------------------------------
Bradly Baer
Graduate Research Assistant, Walker Lab
Interdisciplinary Materials Science
Vanderbilt University
________________________________
From: users <users-bounces at lists.quantum-espresso.org> on behalf of Ye Luo <xw111luoye at gmail.com>
Sent: Thursday, June 3, 2021 4:22 PM
To: Quantum ESPRESSO users Forum <users at lists.quantum-espresso.org>
Subject: Re: [QE-users] Compiling with Intel's OneAPI - Parallel Performance Issues
Hi Brad,
1. Your output files differ. One says 'Writing output data file ./pwscf.save' and the other doesn't. Does one have I/O enabled and the other not?
2. Your simulation is very small and you are running it on 16 MPI ranks, so you are largely exercising MPI overhead. Run it a couple of times and see whether the timing is reproducible. Does your machine have 16 physical cores, or 8 cores with 16 hyperthreads?
3. To check whether it is actually a compiler regression, run with 1 MPI rank and compare the timings (see the sketch below).
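Something along these lines would separate the MPI/oversubscription side from the compiler side (the binary paths and input name below are placeholders):

  # physical cores vs. hardware threads
  lscpu | grep -E 'Socket|Core|Thread|^CPU\(s\)'
  # one MPI rank with each build, same input
  mpirun -np 1 /path/to/parallel-studio/pw.x -inp scf.in > ps_1rank.out
  mpirun -np 1 /path/to/oneapi/pw.x -inp scf.in > oneapi_1rank.out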
Ye
===================
Ye Luo, Ph.D.
Computational Science Division & Leadership Computing Facility
Argonne National Laboratory
On Thu, Jun 3, 2021 at 3:36 PM Baer, Bradly <bradly.b.baer at vanderbilt.edu> wrote:
Hello Users,
I have a working QE 6.7 install built with Intel's Parallel Studio from 2020. I want to compile the d3q code, but I have found that my Parallel Studio license has expired, so I must switch to Intel's new OneAPI distribution to continue using ifort, icc, etc. I have configured everything in the same way as with the Parallel Studio version, including using the same make.inc file, but parallel performance is very poor with the OneAPI build.
Attached are the make.inc file I used for both compiles and example pw.x output files from the Parallel Studio and OneAPI builds. The Parallel Studio calculation had CPU//wall times of 5.89s//5.97s, while the OneAPI version shows 5.92s//8.71s, almost a 50% increase in wall time. Both runs used the same input.
Has anyone had experience compiling with the new OneAPI versions of things? Have I missed some small but important change in how the libraries are linked now?
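For anyone who would rather not open the attachment, the linking-related part of my make.inc is roughly of this form (a generic Intel MPI + MKL sketch rather than a verbatim copy of my file; MKLROOT comes from Intel's environment scripts):

  DFLAGS      = -D__DFTI -D__MPI
  MPIF90      = mpiifort
  F90         = ifort
  CC          = icc
  # MKL provides BLAS/LAPACK and, via -D__DFTI, the FFTs
  BLAS_LIBS   = -L${MKLROOT}/lib/intel64 -lmkl_intel_lp64 -lmkl_sequential -lmkl_core
  LAPACK_LIBS =
  FFT_LIBS    =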
Thanks,
Brad
--------------------------------------------------------
Bradly Baer
Graduate Research Assistant, Walker Lab
Interdisciplinary Materials Science
Vanderbilt University
_______________________________________________
Quantum ESPRESSO is supported by MaX (www.max-centre.eu)
users mailing list users at lists.quantum-espresso.org
https://lists.quantum-espresso.org/mailman/listinfo/users
-------------- next part --------------
Attachments:
MPI1ParallelStudio.out (application/octet-stream, 20474 bytes):
<http://lists.quantum-espresso.org/pipermail/users/attachments/20210603/5ed9f32c/attachment.obj>
MPI1OneAPI.out (application/octet-stream, 20474 bytes):
<http://lists.quantum-espresso.org/pipermail/users/attachments/20210603/5ed9f32c/attachment-0001.obj>