[Pw_forum] new question: Solved: mpi compile error?

Tianying Yan tyan at nankai.edu.cn
Thu Dec 2 11:07:22 CET 2004


Dear pwscfers,

The compiled MPI executable only runs on 2 CPUs within the same node, but cannot set up communication across nodes. Am I missing something again? Thank you very much!
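For reference, a typical cross-node launch with an MPICH-style mpirun and a machine file would look something like the following (the hostnames and the input file name are only placeholders):

# machine file listing the nodes that should take part in the run
# (node01/node02 are placeholder hostnames)
cat > hosts <<EOF
node01
node02
EOF

# request 4 processes spread over the nodes listed above
mpirun -np 4 -machinefile hosts ./pw.x < test.in > test.out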

Best regards, Tianying

----- Original Message ----- 
From: Tianying Yan 
To: pw_forum at pwscf.org 
Sent: Wednesday, December 01, 2004 5:45 PM
Subject: Solved: mpi compile error? 


Problem solved by modifying the Makefile so that the directory containing mpif.h is in the include path, or alternatively by copying mpif.h into the include subdirectory under the pwscf root directory. Hopefully this is the correct solution: a test job from the examples gives the same result as the reference output.
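In case it helps anyone else, the change boils down to one of the two sketches below (the path /usr/local/mpich/include is only a placeholder for wherever your MPI installation keeps mpif.h):

# Option 1: add the MPI include directory to the -I flags used when
# compiling the Fortran sources, i.e. extend the include flags in the
# Makefile (or make rules file) with something like
#   -I/usr/local/mpich/include

# Option 2: copy mpif.h into the include subdirectory that is already
# searched via -I../include
cp /usr/local/mpich/include/mpif.h /nfs/s07r1p1/tytan_isc/pwscf/espresso.parallel/include/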

----- Original Message ----- 
From: Tianying Yan 
To: pw_forum at pwscf.org 
Sent: Wednesday, December 01, 2004 4:56 PM
Subject: mpi compile error?


Dear pwscf users,

I am new to this list and have a question about compiling pwscf on a linux64 cluster.

./configure ARCH=linux64
make all

However, compilation fails for the para.f90 routine in the PW subdirectory. I believe the MPI environment is set up correctly, so what is the problem? Any hint would be appreciated.

The error message looks like this:

make[1]: Entering directory `/nfs/s07r1p1/tytan_isc/pwscf/espresso.parallel/PW'
mpif90 -O2 -assume byterecl -I. -I../include -I../Modules -I../PW -I../PH  -nomodule -fpp -D__LINUX64 -D__INTEL -D__MPI -D__PARA -D__FFTW -D__USE_INTERNAL_FFTW  -c para.f90
fortcom: Error: para.f90, line 161: This name does not have a type, and must have an explicit type.   [MPI_REAL8]
     CALL MPI_ALLREDUCE( ps(1+(n-1)*maxb), buff, maxb, MPI_REAL8, &
-------------------------------------------------------^
fortcom: Error: para.f90, line 296: This name does not have a type, and must have an explicit type.   [MPI_REAL8]
     CALL MPI_ALLREDUCE( ps(1+(n-1)*maxb), buff, maxb, MPI_REAL8, &
-------------------------------------------------------^
fortcom: Error: para.f90, line 372: This name does not have a type, and must have an explicit type.   [MPI_REAL8]
  CALL MPI_GATHERV( f_in, recvcount(me_pool), MPI_REAL8, f_out, &
----------------------------------------------^
fortcom: Error: para.f90, line 429: This name does not have a type, and must have an explicit type.   [MPI_REAL8]
  CALL MPI_ALLGATHERV( f_in, recvcount(me_pool), MPI_REAL8, &
-------------------------------------------------^
fortcom: Error: para.f90, line 492: This name does not have a type, and must have an explicit type.   [MPI_REAL8]
  CALL MPI_SCATTERV( f_in, sendcount, displs, MPI_REAL8,   &
----------------------------------------------^
fortcom: Error: para.f90, line 678: This name does not have a type, and must have an explicit type.   [MPI_REAL8]
     CALL MPI_ALLTOALLV( f_aux, sendcount, sdispls, MPI_REAL8, &
----------------------------------------------------^
fortcom: Error: para.f90, line 755: This name does not have a type, and must have an explicit type.   [MPI_REAL8]
     CALL MPI_ALLREDUCE( ps, psr, 1, MPI_REAL8, MPI_MAX, &
-------------------------------------^
fortcom: Error: para.f90, line 812: This name does not have a type, and must have an explicit type.   [MPI_REAL8]
     CALL MPI_SEND( vec, (length*nks), MPI_REAL8, 0, 17, &
---------------------------------------^
fortcom: Error: para.f90, line 949: This name does not have a type, and must have an explicit type.   [MPI_REAL8]
     CALL MPI_ALLREDUCE( ps, psr, 1, MPI_REAL8, MPI_MAX, &
-------------------------------------^
compilation aborted for para.f90 (code 1)
make[1]: *** [para.o] Error 1
make[1]: Leaving directory `/nfs/s07r1p1/tytan_isc/pwscf/espresso.parallel/PW'
make: *** [pw] Error 2

Thanks, Tianying

