Hi,
I would like to share my experience with the MPI packages and ABINIT 6.10.x under Fedora 16 (x86_64).
I tried these two packages without success:
Code:
openmpi-1.5-4.fc16.x86_64
openmpi-devel-1.5-4.fc16.x86_64
mpich2-1.4.1p1-1.fc16.x86_64
mpich2-devel-1.4.1p1-1.fc16.x86_64
1) OpenMPI => runtime problem
Code:
Your architecture is not able to handle 16, 8, 4 or 2-bytes FORTRAN file record markers!
You cannot use ABINIT and MPI/IO.
MPI_ERROR_STRING: MPI_ERR_UNKNOWN: unknown error
I have not yet understood what the problem was...
but version 1.5.x is not yet supported by our test farm: it is not a stable version of OpenMPI.

2) MPICH2 => compilation problem
There is a bug in the 1.4.1p1 version of mpich2 (resolved in the svn version...);
see this topic for more details:
viewtopic.php?f=3&t=1206#p4228

I then compiled openmpi 1.4.3 from sources and everything worked.
In brief, until one of the two MPI packages becomes operational for abinit,
here is a recipe to compile abinit 6.12.x (with all plugins + openmpi or mpich2) under a fresh Fedora 16 (x86_64):

1) installed packages:
Code:
yum install gcc-gfortran.x86_64
yum install netcdf
yum install netcdf-devel
yum install atlas
yum install atlas-devel
yum install fftw3
yum install fftw3-devel
yum install openmpi
yum install openmpi-devel
yum install mpich2
yum install mpich2-devel
yum install libxc
yum install libxc-devel
yum install patch
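Most of the packages above come in runtime/-devel pairs, so the list can also be generated with a short loop. This is just a sketch (the function name is mine; the package names are the ones from the list above):

```shell
#!/bin/sh
# Sketch: print the install commands for the runtime/-devel pairs
# from the list above (gcc-gfortran and patch have no -devel twin).
gen_install_cmds() {
    for pkg in netcdf atlas fftw3 openmpi mpich2 libxc; do
        echo "yum install -y $pkg $pkg-devel"
    done
    echo "yum install -y gcc-gfortran.x86_64 patch"
}
gen_install_cmds
```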
2) create a file "build.ac" with these lines :
Code:
#enable_fallbacks="no"
enable_exports="yes"
enable_pkg_check="yes"
enable_gw_dpc = yes
enable_mpi = yes
enable_mpi_io = yes
with_mpi_prefix = /usr/lib64/openmpi
with_trio_flavor="netcdf+etsf_io+fox"
with_dft_flavor = atompaw+bigdft+libxc+wannier90
with_linalg_flavor = atlas
with_linalg_libs = -L/usr/lib64/atlas -llapack -lf77blas -lcblas -latlas
with_fft_flavor = fftw3
with_fft_incs = -I/usr/include/
with_fft_libs = -L/usr/lib64 -lfftw3
with_netcdf_incs = -I/usr/lib64/gfortran/modules
with_netcdf_libs = -L/usr/lib64 -lnetcdf -lnetcdff
with_libxc_incs="-I/usr/include -I/usr/lib64/gfortran/modules"
with_libxc_libs="-L/usr/lib64 -lxc"
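The only build.ac line that depends on the MPI flavor is with_mpi_prefix. A small helper makes the switch explicit (a sketch; the function name is mine, the paths are the Fedora x86_64 defaults used above):

```shell
#!/bin/sh
# Sketch: map an MPI flavor to its Fedora x86_64 install prefix,
# i.e. the value to put into build.ac's with_mpi_prefix.
# (Hypothetical helper; only the two flavors discussed in this post.)
mpi_prefix() {
    case "$1" in
        openmpi) echo "/usr/lib64/openmpi" ;;
        mpich2)  echo "/usr/lib64/mpich2"  ;;
        *)       return 1 ;;
    esac
}

mpi_prefix openmpi   # prints /usr/lib64/openmpi
```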
3) execute these commands:
Code:
export PATH=/usr/lib64/openmpi/bin/:$PATH
export LD_LIBRARY_PATH=/usr/lib64/openmpi/lib:$LD_LIBRARY_PATH
./configure --with-config-file=./build.ac
make mj4
make install
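If the two export lines above end up in a login script, they are re-applied on every new shell; a small guard (the helper name is mine, not part of the recipe) keeps PATH from accumulating duplicate entries:

```shell
#!/bin/sh
# Sketch: prepend a directory to PATH only when it is not already
# present, so re-sourcing the exports stays idempotent.
path_prepend() {
    case ":$PATH:" in
        *":$1:"*) ;;                  # already there: do nothing
        *) PATH="$1:$PATH" ;;
    esac
}

path_prepend /usr/lib64/openmpi/bin
path_prepend /usr/lib64/openmpi/bin   # second call changes nothing
```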
All binaries are installed in "/usr/local/bin"; you can then add that directory to the PATH:
Code:
export PATH=/usr/local/bin:$PATH
Notes: tested with 6.11.x:
- OK for ./configure and compilation
- some numerical errors in some tests (not impossible to resolve...)
- does not work with the libxc 1.1 package
Code:
6.11.2-private/r782
======================================================
Tests SEQ start at 13:37 and done after 1734s
test built_in OK
========================================================
Serie        #tests  #succes  #passed  #failed  #missing
========================================================
atompaw           1        1        0        0         0
bigdft           13       13        0        0         0
etsf_io          10       10        0        0         0
fast             27       26        0        1         0
fox               2        1        0        1         0
gwdp             31       17       13        1         0
libxc            13        2        1       10         0
tutoplugs         4        0        4        0         0
tutorespfn       46       34        7        5         0
tutorial         57       45        9        3         0
unitary           4        4        0        0         0
v1               96       91        1        4         0
v2               95       80       10        5         0
v3               93       80        9        4         0
v4               94       73       18        3         0
v5               99       72       12       15         0
v6              101       79       18        4         0
wannier90         3        0        3        0         0
paral            66       49       17        0         0
mpiio             9        8        1        0         0
========================================================
Paral Tests DONE ( time elapsed: 1055s )
========================================================
Powered by Analysis V2.7.0rc1
Date : 12/05/2011
========================================================
Regards,
jmb