[Q-e-developers] [Pw_forum] MPI-IO access to the same file in QE

Samuel Poncé samuel.pon at gmail.com
Mon Mar 21 20:32:15 CET 2016


Dear QE developers,

This might be a trivial question, but what is the best way in QE for
multiple cores to read different parts of the same file?

As an example, I have a file called
test.ext1

This file contains the following matrix
dim1 = 5
dim2 = 10
dim3 = 10
matrix(dim1,dim2, dim3)

Now, I would like to run 5 MPI jobs such that, after reading the file
"test.ext1", each of them holds the following matrix in memory:
dim1 = 5
dim2 = 10
dim3 = 2
matrix(dim1, dim2, dim3)

So each node holds 1/5 of the full matrix written in the file.
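
Concretely, the split I have in mind is something like this (just a sketch;
the names for the pool index and the number of pools are hypothetical):

! Pool "my_pool_id" (0 .. npool-1, hypothetical names, npool = 5 here)
! owns dim3 indices dim3_start .. dim3_end of the third dimension.
nslab      = dim3 / npool            ! = 2 slabs per pool
dim3_start = my_pool_id * nslab + 1
dim3_end   = dim3_start + nslab - 1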

One way I found to make this work is to copy test.ext1 five times with
the following names:
test.ext1 test.ext2 test.ext3 test.ext4 test.ext5
and then read each of them like this:

! Split dim3 across pools so that I get dim3_start and dim3_end for each
! processor

lrec   = 5 * 10 * 2
filint = trim(prefix) // '.ext'
CALL diropn( iun, 'test', lrec, exst )

DO ir = dim3_start, dim3_end
  ! read one record of this task's copy into the local buffer
  CALL davcio( matrix_node, lrec, iun, ir, -1 )
ENDDO

This works, but it is very inefficient since I copy the full file 5 times.
The problem is that diropn seems to append "nd_nmbr", which suggests that
concurrent access to the same file might not be a good idea?

Is it possible in QE for different nodes in the pool to access different
parts of the same file at the same time?
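
For illustration, what I have in mind would look something like this with
plain MPI-IO instead of the QE I/O routines (an untested sketch; it assumes
the matrix is stored on disk as contiguous double precision numbers without
Fortran record markers, and all file and variable names are placeholders):

! Untested sketch: every rank opens the same test.ext1 and reads only
! its own dim3 slab via MPI-IO.
PROGRAM read_slab
  USE mpi
  IMPLICIT NONE
  INTEGER, PARAMETER :: dim1 = 5, dim2 = 10, dim3 = 10
  INTEGER :: ierr, rank, nproc, fh, nslab
  INTEGER(KIND=MPI_OFFSET_KIND) :: offset
  DOUBLE PRECISION, ALLOCATABLE :: matrix_node(:,:,:)

  CALL MPI_INIT( ierr )
  CALL MPI_COMM_RANK( MPI_COMM_WORLD, rank, ierr )
  CALL MPI_COMM_SIZE( MPI_COMM_WORLD, nproc, ierr )

  nslab = dim3 / nproc                  ! 2 slabs per rank for 5 ranks
  ALLOCATE( matrix_node(dim1, dim2, nslab) )

  ! byte offset of this rank's first slab in the file
  offset = INT( rank, MPI_OFFSET_KIND ) * nslab * dim1 * dim2 * 8

  CALL MPI_FILE_OPEN( MPI_COMM_WORLD, 'test.ext1', MPI_MODE_RDONLY, &
                      MPI_INFO_NULL, fh, ierr )
  CALL MPI_FILE_READ_AT( fh, offset, matrix_node, dim1*dim2*nslab, &
                         MPI_DOUBLE_PRECISION, MPI_STATUS_IGNORE, ierr )
  CALL MPI_FILE_CLOSE( fh, ierr )

  CALL MPI_FINALIZE( ierr )
END PROGRAM read_slab

If the read should be collective within a pool, I suppose MPI_FILE_READ_AT_ALL
on a per-pool communicator would be the variant to use, but I do not know
whether QE already provides a wrapper for this.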

Thank you,

Best,

Samuel

-- 

------------------------------------------------------------------------------------------------
    Dr. Samuel Poncé
    Department of Materials
    University of Oxford
    Parks Road
    Oxford OX1 3PH, UK

    Phone: +44 1865 612789
    email: samuel.ponce at materials.ox.ac.uk
    web: http://giustino.materials.ox.ac.uk/index.php/Site/SamuelPonc%e9
------------------------------------------------------------------------------------------------