MPI problem with AIM
Posted: Mon May 03, 2010 10:53 pm
by gabriel.antonius
Hi everyone,
I'm having trouble running AIM from abinit 6.0.3.
When I compile it with openmpi_intel64/1.4.1 and try to run AIM on a single processor, I get the following error message:
*** An error occurred in MPI_Comm_f2c
*** before MPI was initialized
*** MPI_ERRORS_ARE_FATAL (your MPI job will now abort)
This error does not happen when abinit is compiled without Open MPI, but unfortunately the density files from a parallel build of abinit cannot be analysed with a serial build of AIM.
Any explanation would be welcome.
Re: MPI problem with AIM
Posted: Tue May 04, 2010 1:26 am
by mverstra
This is surprising: the DEN file is independent of the sequential/parallel nature of the code run. Why do you assert the contrary?
Regarding the error, I believe that either:
1) aim was not mpi-enabled in 6.0.3, but I think it was.
2) you missed a compilation step somewhere, or did not clean things properly when going from the sequential to the parallel build (or something like that). Try:
make clean
rm src/mods/*.mod
make
and see if the new parallel executable works better.
Matthieu
Re: MPI problem with AIM
Posted: Tue May 04, 2010 5:20 pm
by gabriel.antonius
The sequential and parallel versions were built in different directories, so there shouldn't be any interference.
I believe there is some difference in the header format of the density files (which are PAWDEN files, by the way) between the sequential and parallel versions. When I try to run the sequential version of aim with a density file from a parallel version of abinit, it cannot read the code version. I get:
ECHO of the ABINIT file header
First record :
.codvsn,headform,fform = ?%? 0 0
followed by some random numbers and then:
Internal Error: type_name(): Bad type
Re: MPI problem with AIM
Posted: Sun May 09, 2010 4:51 pm
by mverstra
Could we have your input file (you may remove atomic positions and types if you wish)? Are you using MPI-IO? Otherwise there is no good reason for the header to be written differently in the two versions, as both runs call the same routine using only 1 processor (again, not true with MPI-IO).
Could it be that the sequential and parallel compilers are different, e.g. ifort and gfortran? This would explain everything, as the binary formats are different. You have to check explicitly inside the mpif90 wrapper... If you compile with default values, this confusion can happen quite easily.
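The wrapper itself can tell you which backend compiler it calls. Assuming standard Open MPI or MPICH-style wrappers (adjust if your installation's wrapper does not recognize these options), something like

mpif90 -showme       # Open MPI: print the full underlying compiler command line
mpif90 -show         # MPICH and derivatives: same idea
mpif90 --version     # usually passed through, so it reports the wrapped compiler's version

should show immediately whether the parallel build wraps ifort or gfortran.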
Matthieu
Re: MPI problem with AIM
Posted: Mon May 10, 2010 8:20 pm
by gabriel.antonius
The sequential and parallel compilers were indeed different, which explains the incompatibility.
That leaves me with the initial MPI problem, since I wish to analyse a density from a parallel calculation. I attached my input files for the production of the PAWDEN file and for aim, but there shouldn't be anything odd. The density file is correctly handled by cut3d.
Re: MPI problem with AIM
Posted: Mon Jul 12, 2010 11:17 pm
by gabriel.antonius
I finally figured out a work-around.
I use mpif90 for the parallel compilation and ifort for the sequential compilation. The PAWDEN file written by the parallel abinit is then readable by the sequential aim.
The problem comes from the fact that, when PAW is used, AIM ends up calling MPI subroutines in wrtout even though MPI has not been initialized. This should be fixed some day.
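For reference, a minimal sketch of the kind of guard that would avoid such a crash, assuming the usual Fortran MPI bindings; the routine and variable names below are illustrative only and are not taken from the actual ABINIT wrtout source:

subroutine get_rank_if_mpi_up(comm, me)
  implicit none
  include 'mpif.h'
  integer, intent(in)  :: comm
  integer, intent(out) :: me
  logical :: mpi_is_up
  integer :: ierr
  ! MPI_Initialized may legally be called even before MPI_Init,
  ! so it can be used to decide whether other MPI calls are safe at all.
  call MPI_Initialized(mpi_is_up, ierr)
  if (mpi_is_up) then
    call MPI_Comm_rank(comm, me, ierr)
  else
    me = 0   ! serial fall-back: behave as the single process
  end if
end subroutine get_rank_if_mpi_up

Any MPI call made without such a check before MPI_Init, like the communicator conversion reported above, triggers the "before MPI was initialized" abort.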
Thanks for the concern!