[Pw_forum] Gathering 3-d arrays across pools using QE's mp_sum
Vahid Askarpour
vh261281 at dal.ca
Sun Oct 30 20:25:36 CET 2016
Hi Ye,
The changes I am making are part of the EPW/QE code, so the parallelization depends on the QE MPI routines. I have no problem using mp_sum for a 2-D array such as energy(iband,ik): I can use mp_sum to add the contributions from the various nodes and print a total that agrees with the output of the serial run. As far as allocation goes, I use the same approach for my 3-D array as for the 2-D one.
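For reference, the 2-D pattern is roughly the following (a minimal sketch; gather_energy_2d is a made-up name, and the USEd modules are the usual QE ones):

  SUBROUTINE gather_energy_2d( energy, nbnd, nkstot )
    USE kinds,    ONLY : DP
    USE mp,       ONLY : mp_sum
    USE mp_pools, ONLY : inter_pool_comm
    IMPLICIT NONE
    INTEGER,  INTENT(IN)    :: nbnd, nkstot
    REAL(DP), INTENT(INOUT) :: energy(nbnd, nkstot)
    ! on entry each pool has filled only the columns of its own k points,
    ! with everything else zero; the reduction assembles the full array
    CALL mp_sum( energy, inter_pool_comm )
  END SUBROUTINE gather_energy_2d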
The problem arises when I use the 3-D array with mp_sum. I noticed that in the Modules/mp.f90 file, there is a subroutine as follows:
SUBROUTINE mp_sum_rt( msg, gid )
  IMPLICIT NONE
  REAL (DP), INTENT (INOUT) :: msg(:,:,:)
  INTEGER, INTENT(IN) :: gid
#if defined(__MPI)
  INTEGER :: msglen
  msglen = size(msg)
  CALL reduce_base_real( msglen, msg, gid, -1 )
#endif
END SUBROUTINE mp_sum_rt
Here msg has three dimensions. There are other subroutines for 1, 2, and 4 dimensions. If I could somehow get EPW to use the above subroutine, it might work. Adding a statement like:
USE mp, ONLY : mp_sum_rt
at the beginning of the file does not work: the compilation fails because EPW does not see the subroutine. Interestingly, it does see mp_sum, which is in the same mp.f90 file.
So is there a way that I can get EPW to see mp_sum_rt? This might solve my problem.
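(My guess at why, sketched below: in the usual Fortran pattern, mp_sum would be a generic interface whose rank-specific procedures, including mp_sum_rt, are private to the module, so they cannot be USE'd by name; calling mp_sum on a rank-3 REAL(DP) array would then dispatch to mp_sum_rt automatically. The other procedure names below are placeholders, not the literal mp.f90 source.)

  INTERFACE mp_sum
     ! the rank-specific routines are private; only the generic name is
     ! public, so CALL mp_sum(msg3d, gid) resolves to mp_sum_rt by rank/type
     MODULE PROCEDURE mp_sum_r1d, mp_sum_r2d, mp_sum_rt, mp_sum_r4d
  END INTERFACE mp_sum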
Since I know that mp_sum works with 2-D arrays, one option is to rewrite my arrays as 2-D, one array for each of the x, y, z directions. I am trying to avoid this.
Thanks,
Vahid
On Oct 30, 2016, at 3:59 PM, Ye Luo <xw111luoye at gmail.com> wrote:
Hi Vahid,
A segfault in mp_sum doesn't necessarily mean the problem is there. You probably wrote to the output array outside its valid bounds somewhere before mp_sum.
Check your allocation of output and the copy, and make sure they are correct.
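A check like this right before the copy would catch an out-of-bounds write (just a sketch; gather_bands is a made-up routine name and errore is QE's usual error handler):

  ! sanity check before the copy and mp_sum (sketch)
  IF ( k_pool*pool_id + k_pool > SIZE(output,3) .OR. SIZE(input,3) < k_pool ) &
     CALL errore( 'gather_bands', 'input/output third dimension too small', 1 )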
Ye
===================
Ye Luo, Ph.D.
Leadership Computing Facility
Argonne National Laboratory
2016-10-28 14:33 GMT-05:00 Vahid Askarpour <vh261281 at dal.ca>:
Hi Ye,
Thank you for your suggestion. I tried it, and when I ran the code, it seg-faulted. I put flags in the code to see where the segmentation fault occurs. It happens when the code calls mp_sum. It seems that mp_sum may not be able to handle this reduction.
Cheers,
Vahid
On Oct 28, 2016, at 2:51 PM, Ye Luo <xw111luoye at gmail.com> wrote:
In Fortran, an array of any rank is stored contiguously, just like a 1-D array, so mp_sum should be fine.
I did see something strange in your code: you were not copying the things you expected to copy.
How about the following?
output(1:3, 1:nbnds, k_pool*pool_id+1 : k_pool*pool_id+k_pool) = input(1:3, 1:nbnds, 1:k_pool)
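To complete the gather, the destination should be zeroed before the copy and reduced afterwards (a sketch, assuming the k points divide evenly over the pools):

  output = 0.0_DP                          ! every pool starts from zero
  ! ... the copy above fills only this pool's slice ...
  CALL mp_sum( output, inter_pool_comm )   ! sum across pools -> full array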
Ye
===================
Ye Luo, Ph.D.
Leadership Computing Facility
Argonne National Laboratory
2016-10-28 12:29 GMT-05:00 Vahid Askarpour <vh261281 at dal.ca>:
Dear QE Users,
I am working on some modifications to the QE-6.0 code using symmetry. When I try to combine a 3-D array scattered across nodes, I use the following:
output(3,nbnds,(k_pool*pool_id+1:k_pool*pool_id+k_pool))=input(3,nbnds,1:k_pool)
Here, nbnds is the number of bands, k_pool is the number of k points per pool, and pool_id is the id of the pool. I am assuming that the number of k points is divisible by the number of pools.
Then I call mp_sum(output, inter_pool_comm) to combine the segments of input from all the nodes into a single output array.
When I run the modified QE code in parallel, the output file differs from that of the serial run.
Does QE's mp_sum allow the above operation for a 3-D array?
Any hints or suggestions would be greatly appreciated.
Vahid
Vahid Askarpour
Department of Physics and Atmospheric Science
Dalhousie University,
Halifax, NS, Canada
_______________________________________________
Pw_forum mailing list
Pw_forum at pwscf.org
http://pwscf.org/mailman/listinfo/pw_forum