MPI problem with AIM
MPI problem with AIM
Hi everyone,
I'm having trouble running AIM from abinit 6.0.3.
When I compile it with openmpi_intel64/1.4.1 and try to run AIM on a single processor, I get the following error message:
*** An error occurred in MPI_Comm_f2c
*** before MPI was initialized
*** MPI_ERRORS_ARE_FATAL (your MPI job will now abort)
This error does not happen when abinit is compiled without openmpi. Unfortunately, though, the density files from a parallel build of abinit cannot be analysed with a serial build of AIM.
Any explanation would be welcome.
Gabriel Antonius
Université du Québec à Trois-Rivières
Re: MPI problem with AIM
This is surprising: the DEN file should be independent of whether the code was run sequentially or in parallel. Why do you assert the contrary?
Regarding the error, I believe that either:
1) aim was not MPI-enabled in 6.0.3 (though I think it was), or
2) part of the compilation went wrong, or things were not cleaned properly when switching from the sequential to the parallel build. Try
make clean
rm src/mods/*.mod
make
and see if the new parallel executable works better.
Matthieu
Matthieu Verstraete
University of Liege, Belgium
Re: MPI problem with AIM
The sequential and parallel versions were built in different directories, so there shouldn't be any interference.
I believe there is some difference in the header format of the density files (which are PAWDEN files, by the way) between the sequential and parallel versions. When I run the sequential version of aim on a density file from a parallel version of abinit, it cannot read the code version. I get:
ECHO of the ABINIT file header
First record :
.codvsn,headform,fform = ?%? 0 0
Then some random numbers and
Internal Error: type_name(): Bad type
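For what it's worth, a mangled codvsn like that is what a record-marker mismatch between compilers tends to look like. Here is a minimal sketch (in Python, purely for illustration) of reading that first header record, assuming the layout the echo above suggests: a 6-character codvsn followed by the integers headform and fform, wrapped in 4-byte little-endian Fortran record markers. The marker size and endianness are compiler-dependent, which is exactly what goes wrong when the writing and reading binaries disagree. The function name is invented here.

```python
import struct

def read_first_record(path):
    """Read the first record of an ABINIT-style binary header.

    Assumes a Fortran-unformatted record: a 4-byte little-endian
    length marker, then codvsn (6 chars) plus two 4-byte integers
    (headform, fform), then a trailing copy of the length marker.
    """
    with open(path, "rb") as f:
        (reclen,) = struct.unpack("<i", f.read(4))
        payload = f.read(reclen)
        (trailer,) = struct.unpack("<i", f.read(4))
    if trailer != reclen:
        # Leading/trailing markers disagree: wrong endianness or
        # marker size, i.e. the file came from a different compiler.
        raise ValueError("inconsistent record markers (compiler mismatch?)")
    codvsn = payload[:6].decode("ascii", errors="replace")
    headform, fform = struct.unpack("<ii", payload[6:14])
    return codvsn, headform, fform
```

Reading a file written with a different marker convention fails the trailer check instead of returning garbage, which is a friendlier failure than the "?%? 0 0" above.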
Gabriel Antonius
Université du Québec à Trois-Rivières
Re: MPI problem with AIM
Could we have your input file? (You can remove the atomic positions and types if you wish.) Are you using MPI-IO? Otherwise there is no good reason for the header to be written differently in the two versions, as both runs call the same routine using only one processor (again, not true with MPI-IO).
Could it be that the sequential and parallel compilers are different, e.g. ifort and gfortran? This would explain everything, as their binary formats are different. You have to check explicitly inside the mpif90 wrapper... If you compile with default values this confusion can happen quite easily.
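Since the record markers themselves encode the writing compiler's convention, one can sniff them directly from the file instead of guessing. A rough sketch (Python, for illustration; the function name is invented, and it assumes the file starts with one well-formed record small enough to fit in the probe buffer):

```python
import struct

def sniff_record_markers(path):
    """Guess the Fortran record-marker convention of an unformatted file.

    Tries the four common (endianness, marker size) combinations and
    accepts the first one whose leading and trailing markers of the
    first record agree. Returns (struct_format, marker_size) or None.
    """
    combos = [("<i", 4), (">i", 4), ("<q", 8), (">q", 8)]
    with open(path, "rb") as f:
        head = f.read(64 * 1024)
    for fmt, size in combos:
        if len(head) < size:
            continue
        (reclen,) = struct.unpack(fmt, head[:size])
        end = size + reclen + size
        if 0 < reclen and end <= len(head):
            (trailer,) = struct.unpack(fmt, head[size + reclen:end])
            if trailer == reclen:
                return fmt, size
    return None
```

If the sequential and parallel binaries report different conventions for their own output files, the two builds used different compilers (or different compiler settings).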
Matthieu
Matthieu Verstraete
University of Liege, Belgium
Re: MPI problem with AIM
The sequential and parallel compilers were indeed different, which explains the incompatibility.
That leaves me with the initial MPI problem, since I wish to analyse a density from a parallel calculation. I attached my input files for the production of the PAWDEN file and for aim, but there shouldn't be anything odd. The density file is correctly handled by cut3d.
Attachments:
- in_pawden.in: the input producing the PAW density
- aim.in: the input for aim (most basic one)
Gabriel Antonius
Université du Québec à Trois-Rivières
Re: MPI problem with AIM
I finally figured out a workaround.
I use mpif90 for the parallel compilation and ifort for the sequential compilation, so the PAWDEN file written by parallel abinit can be read by sequential aim.
The original error comes from the fact that, when PAW is used, AIM ends up calling MPI subroutines in wrtout even though MPI has not been initialized. This should be fixed some day.
Thanks for the concern!
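The eventual fix in the Fortran source would presumably either initialize MPI in aim or guard the communicator calls in wrtout with a check like MPI_Initialized. A toy illustration of that guard pattern, written in Python rather than Fortran (every name below is invented for illustration only):

```python
# Stand-in for the MPI library's initialization state.
_initialized = False

def init_mpi():
    """Stands in for MPI_Init: mark the MPI layer as live."""
    global _initialized
    _initialized = True

def wrtout(msg):
    """Stands in for ABINIT's wrtout: only touch MPI once it is live,
    otherwise fall back to plain serial output."""
    if _initialized:
        return f"[rank 0] {msg}"  # would route through MPI communicator calls
    return msg                    # serial fallback, no MPI calls at all
```

With such a guard, calling wrtout before initialization degrades to serial output instead of aborting inside MPI_Comm_f2c.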
Gabriel Antonius
Université du Québec à Trois-Rivières