
installing abinit 6.4.3 on rocks clusters 5.3

Posted: Mon Jan 24, 2011 7:44 am
by asanee
Dear All,

I have a problem configuring ABINIT. I tried installing it in a parallel environment using this configure line:

./configure --enable-64bit-flags="yes" --enable-mpi="yes" --with-mpi-prefix="/opt/mpich2/gnu/include/"

Unfortunately, I am new to running ABINIT in parallel. The configure step itself goes well.
Then, when I type "make mj4", I get a message that looks like this:

[root@cluster abinit-6.4.3]# make mj4
make multi multi_nprocs=4
make[1]: Entering directory `/state/partition1/apps/abinit-6.4.3'
cd prereqs && make -j4
make[2]: Entering directory `/state/partition1/apps/abinit-6.4.3/prereqs'
Making all in linalg
make[3]: Entering directory `/state/partition1/apps/abinit-6.4.3/prereqs/linalg'
make -f ../../prereqs/linalg/linalg.mk
make[4]: Entering directory `/state/partition1/apps/abinit-6.4.3/prereqs/linalg'
lapack-abinit_6.0 has been uncompressed.
touch configure-stamp
lapack-abinit_6.0 has been configured.
cd blas && make FC="gfortran -m64" FCFLAGS="-m64 -ffixed-form -m64 -g -O2 -mtune=native -march=native -mfpmath=sse" AR="ar" ARFLAGS="rc" RANLIB="ranlib"
make[5]: Entering directory `/state/partition1/apps/abinit-6.4.3/prereqs/linalg/blas'
gfortran -m64 -m64 -ffixed-form -m64 -g -O2 -mtune=native -march=native -mfpmath=sse -c caxpy.f
gfortran -m64 -m64 -ffixed-form -m64 -g -O2 -mtune=native -march=native -mfpmath=sse -c ccopy.f
gfortran -m64 -m64 -ffixed-form -m64 -g -O2 -mtune=native -march=native -mfpmath=sse -c cdotc.f
caxpy.f:0: error: bad value (native) for -march= switch
caxpy.f:0: error: bad value (native) for -mtune= switch
ccopy.f:0: error: bad value (native) for -march= switch
ccopy.f:0: error: bad value (native) for -mtune= switch
make[5]: *** [caxpy.o] Error 1
make[5]: *** Waiting for unfinished jobs....
make[5]: *** [ccopy.o] Error 1
cdotc.f:0: error: bad value (native) for -march= switch
cdotc.f:0: error: bad value (native) for -mtune= switch
make[5]: *** [cdotc.o] Error 1
make[5]: Leaving directory `/state/partition1/apps/abinit-6.4.3/prereqs/linalg/blas'
make[4]: *** [build-stamp] Error 2
make[4]: Leaving directory `/state/partition1/apps/abinit-6.4.3/prereqs/linalg'
make[3]: *** [package-ready] Error 2
make[3]: Leaving directory `/state/partition1/apps/abinit-6.4.3/prereqs/linalg'
make[2]: *** [all-recursive] Error 1
make[2]: Leaving directory `/state/partition1/apps/abinit-6.4.3/prereqs'
make[1]: *** [multi] Error 2
make[1]: Leaving directory `/state/partition1/apps/abinit-6.4.3'
make: *** [mj4] Error 2

Did I make a mistake somewhere? Please advise me.

Thanks,
Asanee

Re: installing abinit 6.4.3 on rocks clusters 5.3

Posted: Mon Jan 24, 2011 11:46 am
by Alain_Jacques
Hello Asanee,

Two things look fishy to me.
First of all, you configured with MPI enabled, but gfortran is invoked instead of the mpif90 wrapper. I bet the --with-mpi-prefix is wrong (it should point to the root directory of your MPI implementation, i.e. where bin/, lib/ and include/ live).
Secondly, gfortran chokes on -march=native (and -mtune=native). It's probably quite an old version, no longer supported by Abinit; check it with gfortran --version.
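
As an illustration of the prefix layout configure expects, here is a minimal shell sketch. The check_mpi_prefix helper and the throwaway directory tree are hypothetical, not part of Abinit or any MPI distribution:

```shell
# check_mpi_prefix is a hypothetical helper (not an Abinit tool): it
# verifies that a prefix contains bin/, lib/ and include/ subdirectories
# plus an executable bin/mpif90 wrapper, which is what configure looks for.
check_mpi_prefix() {
  prefix="$1"
  for sub in bin lib include; do
    [ -d "$prefix/$sub" ] || { echo "missing $prefix/$sub"; return 1; }
  done
  [ -x "$prefix/bin/mpif90" ] || { echo "missing $prefix/bin/mpif90"; return 1; }
  echo "layout ok"
}

# Demo against a throwaway tree standing in for a real MPI root.
tmp=$(mktemp -d)
mkdir -p "$tmp/bin" "$tmp/lib" "$tmp/include"
touch "$tmp/bin/mpif90"
chmod +x "$tmp/bin/mpif90"
check_mpi_prefix "$tmp"   # prints "layout ok"
```

Run the same kind of check by hand (ls on bin/, lib/ and include/) against whatever you pass to --with-mpi-prefix before configuring.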

Kind regards,

Alain

Re: installing abinit 6.4.3 on rocks clusters 5.3

Posted: Mon Jan 24, 2011 1:01 pm
by asanee
Thank you, Alain_Jacques, for answering me.

First of all, I am a new Linux user. Should I change the MPI prefix to

./configure --enable-64bit-flags="yes" --enable-mpi="yes" --with-mpi-prefix="/usr/lib/openmpi/1.3.2-gcc/include/"

As far as I know, the mpi.h file is in there ("/usr/lib/openmpi/1.3.2-gcc/include/"), and the gfortran version is 4.1.2,
which might be too old. Could you help me with the next step if I get the same error as before?
Should I change to another compiler instead of gfortran, or just update gfortran?

Thanks,
Asanee

Re: installing abinit 6.4.3 on rocks clusters 5.3

Posted: Mon Jan 24, 2011 2:04 pm
by Alain_Jacques
If you use --with-mpi-prefix="/SomeDirectory", configure expects to find mpi.h in /SomeDirectory/include/mpi.h, mpif90 in /SomeDirectory/bin/mpif90, libmpi.so in /SomeDirectory/lib/libmpi.so ... and so on. If your MPI implementation is scattered, then use --with-mpi-includes="-I/usr/lib/openmpi/1.3.2-gcc/include/" and --with-mpi-libs="-L/usr/lib/openmpi/1.3.2-gcc/lib -lmpi" instead. Now, these libs seem to be associated with an old gfortran 4.1.2, which is not supported anymore (because it is quite buggy).

So try either

Code:

export LD_LIBRARY_PATH=/opt/mpich2/gnu/lib:$LD_LIBRARY_PATH
./configure --enable-64bit-flags="yes" --enable-mpi="yes" --with-mpi-prefix="/opt/mpich2/gnu/" --disable-linalg  FCFLAGS="-O2" CFLAGS="-O2" CXXFLAGS="-O2"
by reference to your first post, or

Code:

./configure --enable-64bit-flags="yes" --enable-mpi="yes" --with-mpi-includes="-I/usr/lib/openmpi/1.3.2-gcc/include/" --with-mpi-libs="-L/usr/lib/openmpi/1.3.2-gcc/lib -lmpi" --disable-linalg FCFLAGS="-O2" CFLAGS="-O2" CXXFLAGS="-O2"

and maybe add --disable-all-plugins if you run into trouble with one of them.

You mentioned that you work on a cluster; it is reasonable to think that gfortran 4.1.2 is not the only Fortran compiler available on it. Sorry, I don't know where they hide ... ask the system admin. Abinit supports quite a variety of compilers ... GNU, Intel, PGI, ... Also ask for the MPI associated with each compiler (an MPI is built for a specific compiler, i.e. don't try to mix them). Then I can help you further with this info.

Updating to a new gfortran means compiling a new compiler - not for the faint of heart; you'll probably have to prepare new gmp, mpfr and mpc libraries, and new MPI libs when finished. Better to check first whether this hasn't already been done on your system.


Alain

Re: installing abinit 6.4.3 on rocks clusters 5.3

Posted: Tue Jan 25, 2011 6:22 am
by asanee
Dear Alain_Jacques

I would appreciate your kind help. Anyway, I am trying to configure my cluster, which has a few nodes.
Until now I have only been familiar with running Abinit in serial.
My system is quite large and takes a long time to run, so I plan to use mpirun to help my calculations. So ...
I followed your recommendation, with a small change, as follows:

./configure --enable-mpi="yes" --with-mpi-includes="-I/usr/lib/openmpi/1.3.2-gcc/include/" --with-mpi-libs="-L/usr/lib/openmpi/1.3.2-gcc/lib/" -disable-all-plugins FC="f95" FCFLAGS="-L/usr/bin/"

As a result, this seemed to avoid the gfortran problems (as I understood it). However, I got stuck on an error I don't understand, shown below (this is only part of the message):

Error: Symbol 'mpi_comm_null' at (1) has no IMPLICIT type
In file xallgather_mpi.F90:197

Included at m_xmpi.F90:628

call MPI_ALLGATHER(xval,nelem,MPI_INTEGER,recvcounts,nelem,MPI_INTEGER,space
1
Error: Symbol 'mpi_integer' at (1) has no IMPLICIT type
In file xallgather_mpi.F90:135

Included at m_xmpi.F90:628

call MPI_ALLGATHER(charval,20,MPI_CHARACTER,recvcounts,20,MPI_CHARACTER,spac
1
Error: Symbol 'mpi_character' at (1) has no IMPLICIT type
In file xallgather_mpi.F90:133

Included at m_xmpi.F90:628

if (spaceComm /= MPI_COMM_SELF .and. spaceComm /= MPI_COMM_NULL) then
1
Error: Symbol 'mpi_comm_self' at (1) has no IMPLICIT type
In file xallgather_mpi.F90:133

Included at m_xmpi.F90:628

if (spaceComm /= MPI_COMM_SELF .and. spaceComm /= MPI_COMM_NULL) then
1
Error: Symbol 'mpi_comm_null' at (1) has no IMPLICIT type
In file xallgather_mpi.F90:73

Included at m_xmpi.F90:628

if (spaceComm /= MPI_COMM_SELF .and. spaceComm /= MPI_COMM_NULL) then
1
Error: Symbol 'mpi_comm_self' at (1) has no IMPLICIT type
In file xallgather_mpi.F90:73

Included at m_xmpi.F90:628

if (spaceComm /= MPI_COMM_SELF .and. spaceComm /= MPI_COMM_NULL) then
1
Error: Symbol 'mpi_comm_null' at (1) has no IMPLICIT type
In file xallgather_mpi.F90:75

Included at m_xmpi.F90:628

call MPI_ALLGATHER(xval,1,MPI_INTEGER,recvcounts,1,MPI_INTEGER,spaceComm,ier)
1
Error: Symbol 'mpi_integer' at (1) has no IMPLICIT type
In file m_xmpi.F90:517

if (spaceComm /= MPI_COMM_NULL) then
1
Error: Symbol 'mpi_comm_null' at (1) has no IMPLICIT type
In file m_xmpi.F90:467

if ( spaceComm/=MPI_COMM_NULL ) then
1
Error: Symbol 'mpi_comm_null' at (1) has no IMPLICIT type
In file m_xmpi.F90:421

if ( spaceComm/=MPI_COMM_NULL ) then
1
Error: Symbol 'mpi_comm_null' at (1) has no IMPLICIT type
In file m_xmpi.F90:335

call MPI_ATTR_GET(MPI_COMM_WORLD, MPI_TAG_UB, attribute_val, lflag, ierr) ! De
1
Error: Symbol 'mpi_tag_ub' at (1) has no IMPLICIT type
In file m_xmpi.F90:335

call MPI_ATTR_GET(MPI_COMM_WORLD, MPI_TAG_UB, attribute_val, lflag, ierr) ! De
1
Error: Symbol 'mpi_comm_world' at (1) has no IMPLICIT type
make[3]: *** [m_xmpi.o] Error 1
make[3]: Leaving directory `/state/partition1/home/abinit-6.4.3/src/12_hide_mpi'
make[2]: *** [all-recursive] Error 1
make[2]: Leaving directory `/state/partition1/home/abinit-6.4.3/src'
make[1]: *** [all-recursive] Error 1
make[1]: Leaving directory `/state/partition1/home/abinit-6.4.3'
make: *** [all] Error 2


For now, I don't understand this. Should I go back to serial runs again? I'm sorry for my limited Linux knowledge, but thank you again.

Thanks,
Asanee

Re: installing abinit 6.4.3 on rocks clusters 5.3

Posted: Tue Jan 25, 2011 2:41 pm
by pouillon
You should do exactly as Alain told you in his first suggestion. If you do otherwise, it will likely not work.

An alternative to setting CFLAGS, CXXFLAGS and FCFLAGS is to use the following options:

Code:

--with-fc-vendor="gnu" --with-fc-version="4.2"

Re: installing abinit 6.4.3 on rocks clusters 5.3

Posted: Tue Jan 25, 2011 5:58 pm
by Alain_Jacques
Here are some recipes to build a new compiler toolchain ... let's try to get a clean gfortran 4.4 working on your box. As you currently have gcc 4.1.2 installed, I assume you're running some sort of RHEL 5 or a variant (CentOS, Scientific Linux, ...).

1. download GNU MP from ftp://ftp.gmplib.org/pub/gmp-4.3.2/gmp-4.3.2.tar.bz2 Configure, build, check and install with

Code:

./configure --prefix=/opt/gmp-4.3.2
make
make check
make install


2. download MPFR from http://www.mpfr.org/mpfr-2.4.2/mpfr-2.4.2.tar.bz2 Configure, ... with

Code:

./configure --prefix=/opt/mpfr-2.4.2 --with-gmp=/opt/gmp-4.3.2
make
make check
make install


3. download gcc-4.4.5 from a mirror of http://gcc.gnu.org/ Untar. Create a separate build directory, cd to it then configure, .... with

Code:

PathToGCCSource/configure --prefix=/opt/gcc-4.4.5 --enable-bootstrap --enable-shared --enable-threads=posix --with-system-zlib --enable-languages=c,c++,fortran --with-cpu=generic --with-mpfr=/opt/mpfr-2.4.2 --with-gmp=/opt/gmp-4.3.2
make
make install

You should now have a brand new gcc/gfortran in /opt/gcc-4.4.5.

4. prepare your environment to use the new toolchain ...

Code:

export PATH=/opt/gcc-4.4.5/bin:$PATH
export LD_LIBRARY_PATH=/opt/gcc-4.4.5/lib:$LD_LIBRARY_PATH

From now on, a plain gfortran -v should report the new compiler.
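
The reason prepending matters: the shell resolves commands left to right along $PATH, so the first matching directory wins. A tiny demo of that mechanism with throwaway stub scripts (nothing to do with the real gcc install):

```shell
# Two stub "gfortran" scripts in temporary directories stand in for
# the system compiler (4.1.2) and the freshly built one (4.4.5).
old=$(mktemp -d)
new=$(mktemp -d)
printf '#!/bin/sh\necho gfortran-4.1.2\n' > "$old/gfortran"
printf '#!/bin/sh\necho gfortran-4.4.5\n' > "$new/gfortran"
chmod +x "$old/gfortran" "$new/gfortran"

PATH="$old:$PATH"   # the "system" compiler is found first
PATH="$new:$PATH"   # prepend the new toolchain: it now shadows the old one
hash -r             # drop any cached command lookups
gfortran            # prints gfortran-4.4.5
```

The same logic applies to LD_LIBRARY_PATH for the runtime libraries: the directory listed first is searched first.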

5. Build a new MPI (don't try to recycle the one already on your system). Download http://www.open-mpi.org/software/ompi/v1.4/downloads/openmpi-1.4.3.tar.bz2 Configure, ... with

Code:

./configure --prefix=/opt/openmpi-1.4.3_gcc-4.4.5 CC=/opt/gcc-4.4.5/bin/gcc CXX=/opt/gcc-4.4.5/bin/g++ F77=/opt/gcc-4.4.5/bin/gfortran FC=/opt/gcc-4.4.5/bin/gfortran CPP=/opt/gcc-4.4.5/bin/cpp
make
make check
make install


6. append the parallel environment ...

Code:

export PATH=/opt/openmpi-1.4.3_gcc-4.4.5/bin:$PATH
export LD_LIBRARY_PATH=/opt/openmpi-1.4.3_gcc-4.4.5/lib:$LD_LIBRARY_PATH

From now on, mpif90 -show should point to the new MPI wrapper.

7. back to abinit 6.4.3. Try to configure with

Code:

./configure --enable-64bit-flags="yes" --enable-mpi="yes" --with-mpi-prefix=/opt/openmpi-1.4.3_gcc-4.4.5 --enable-linalg="no"


Working?

Alain

Re: installing abinit 6.4.3 on rocks clusters 5.3

Posted: Wed Jan 26, 2011 8:42 am
by asanee
Thank you both for your recommendations.

I tried exactly what both of you suggested, but I couldn't build the new compiler following Alain's kind recommendation.
Thanks, Pouillon, for the suggestion. I re-configured again using the first option, as Alain guided me:

./configure --enable-mpi="yes" --with-mpi-prefix="/opt/mpich2/gnu/" --disable-linalg FCFLAGS="-O2" CFLAGS="-O2" CXXFLAGS="-O2" --disable-all-plugins

and got the following output:


==============================================================================
=== C support ===
==============================================================================

checking for gcc... /opt/mpich2/gnu//bin/mpicc
checking whether the C compiler works... yes
checking for C compiler default output file name... a.out
checking for suffix of executables...
checking whether we are cross compiling... no
checking for suffix of object files... o
checking whether we are using the GNU C compiler... yes
checking whether /opt/mpich2/gnu//bin/mpicc accepts -g... yes
checking for /opt/mpich2/gnu//bin/mpicc option to accept ISO C89... none needed
checking for style of include used by make... GNU
checking dependency style of /opt/mpich2/gnu//bin/mpicc... gcc3
checking how to run the C preprocessor... /opt/mpich2/gnu//bin/mpicc -E
checking which type of compiler we have... gnu 4.1
checking for ANSI C header files... yes
checking for sys/types.h... yes
checking for sys/stat.h... yes
checking for stdlib.h... yes
checking for string.h... yes
checking for memory.h... yes
checking for strings.h... yes
checking for inttypes.h... yes
checking for stdint.h... yes
checking for unistd.h... yes
checking whether byte ordering is bigendian... no

==============================================================================
=== C++ support ===
==============================================================================

checking whether we are using the GNU C++ compiler... yes
checking whether /opt/mpich2/gnu//bin/mpicxx accepts -g... yes
checking dependency style of /opt/mpich2/gnu//bin/mpicxx... gcc3
checking which type of C++ compiler we have... gnu 4.1

==============================================================================
=== Fortran support ===
==============================================================================

checking whether we are using the GNU Fortran compiler... yes
checking whether /opt/mpich2/gnu//bin/mpif90 accepts -g... yes
checking which type of Fortran compiler we have... gnu 4.1
checking fortran 90 modules extension... mod
checking for Fortran flag to compile .F90 files... none
configure: determining Fortran module case
checking whether Fortran modules are upper-case... no
checking how to get verbose linking output from /opt/mpich2/gnu//bin/mpif90... -v
checking for Fortran libraries of /opt/mpich2/gnu//bin/mpif90... -L/opt/mpich2/gnu/lib -L/usr/lib/gcc/x86_64-redhat-linux/4.1.2 -L/usr/lib/gcc/x86_64-redhat-linux/4.1.2/../../../../lib64 -L/lib/../lib64 -L/usr/lib/../lib64 -lmpichf90 -lmpich -lopa -lpthread -lrt -luuid -lgfortranbegin -lgfortran -lm
checking for dummy main to link with Fortran libraries... none
checking for Fortran name-mangling scheme... lower case, underscore, no extra underscore
.
.
.
==============================================================================
=== Connectors ===
==============================================================================

checking whether the C compiler supports MPI... yes
checking whether the C++ compiler supports MPI... yes
checking whether the Fortran Compiler supports MPI... yes
checking whether MPI is usable... yes
configure: enabling MPI I/O support
checking whether to build MPI code... yes
checking whether to build MPI I/O code... yes
checking whether to build MPI I/O untested features... no
checking whether to build MPI time tracing code... no
checking which level of MPI is supported by the Fortran compiler... 2
checking whether to activate GPU support... no
checking whether to use transferable I/O libraries... no
checking whether to use optimized timer libraries... no
checking whether to use optimized linear algebra libraries... no
checking whether to activate ScaLAPACK support... no
checking whether to use optimized math libraries... no
checking whether to use optimized FFT libraries... no
checking whether to use DFT libraries... no

This seems to be good. However, when I continued with "make mj4", I got the message below.
The screen showed many files compiling successfully, but then ...

In file interfaces_67_common.F90:580

real(dp), intent(out), allocatable :: csix(:,:)
1
Error: ALLOCATABLE attribute conflicts with DUMMY attribute at (1)
/opt/mpich2/gnu//bin/mpif90 -m64 -DHAVE_CONFIG_H -I. -I../.. -I../../src/incs -I../../src/incs -ffree-form -J/export/home/asanee/Desktop/abinit-6.4.3/src/mods -I/export/home/asanee/Desktop/abinit-6.4.3/src/mods -O2 -c -o calc_fc.o calc_fc.F90
/opt/mpich2/gnu//bin/mpif90 -m64 -DHAVE_CONFIG_H -I. -I../.. -I../../src/incs -I../../src/incs -ffree-form -J/export/home/asanee/Desktop/abinit-6.4.3/src/mods -I/export/home/asanee/Desktop/abinit-6.4.3/src/mods -O2 -c -o clnup2.o clnup2.F90
/opt/mpich2/gnu//bin/mpif90 -m64 -DHAVE_CONFIG_H -I. -I../.. -I../../src/incs -I../../src/incs -ffree-form -J/export/home/asanee/Desktop/abinit-6.4.3/src/mods -I/export/home/asanee/Desktop/abinit-6.4.3/src/mods -O2 -c -o dens_in_sph.o dens_in_sph.F90
/opt/mpich2/gnu//bin/mpif90 -m64 -DHAVE_CONFIG_H -I. -I../.. -I../../src/incs -I../../src/incs -ffree-form -J/export/home/asanee/Desktop/abinit-6.4.3/src/mods -I/export/home/asanee/Desktop/abinit-6.4.3/src/mods -O2 -c -o dielmt2.o dielmt2.F90
/opt/mpich2/gnu//bin/mpif90 -m64 -DHAVE_CONFIG_H -I. -I../.. -I../../src/incs -I../../src/incs -ffree-form -J/export/home/asanee/Desktop/abinit-6.4.3/src/mods -I/export/home/asanee/Desktop/abinit-6.4.3/src/mods -O2 -c -o dielmt.o dielmt.F90
/opt/mpich2/gnu//bin/mpif90 -m64 -DHAVE_CONFIG_H -I. -I../.. -I../../src/incs -I../../src/incs -ffree-form -J/export/home/asanee/Desktop/abinit-6.4.3/src/mods -I/export/home/asanee/Desktop/abinit-6.4.3/src/mods -O2 -c -o dieltcel.o dieltcel.F90
In file interfaces_67_common.F90:572

subroutine evdw_wannier(csix,corrvdw,nwan,vdw_nwan,&
1
Error: Symbol 'csix' at (1) has no IMPLICIT type
/opt/mpich2/gnu//bin/mpif90 -m64 -DHAVE_CONFIG_H -I. -I../.. -I../../src/incs -I../../src/incs -ffree-form -J/export/home/asanee/Desktop/abinit-6.4.3/src/mods -I/export/home/asanee/Desktop/abinit-6.4.3/src/mods -O2 -c -o ewald.o ewald.F90
/opt/mpich2/gnu//bin/mpif90 -m64 -DHAVE_CONFIG_H -I. -I../.. -I../../src/incs -I../../src/incs -ffree-form -J/export/home/asanee/Desktop/abinit-6.4.3/src/mods -I/export/home/asanee/Desktop/abinit-6.4.3/src/mods -O2 -c -o ewald2.o ewald2.F90
make[5]: *** [interfaces_67_common.o] Error 1
make[5]: *** Waiting for unfinished jobs....
make[5]: Leaving directory `/state/partition1/home/asanee/Desktop/abinit-6.4.3/src/67_common'
make[4]: *** [all-recursive] Error 1
make[4]: Leaving directory `/state/partition1/home/asanee/Desktop/abinit-6.4.3/src'
make[3]: *** [all-recursive] Error 1
make[3]: Leaving directory `/state/partition1/home/asanee/Desktop/abinit-6.4.3'
make[2]: *** [all] Error 2
make[2]: Leaving directory `/state/partition1/home/asanee/Desktop/abinit-6.4.3'
make[1]: *** [multi] Error 2
make[1]: Leaving directory `/state/partition1/home/asanee/Desktop/abinit-6.4.3'
make: *** [mj4] Error 2

I think the error might come from something that doesn't concern the compiler.
So, for now, I would appreciate both of your help, and I'm sorry to impose on you given my limited
Linux experience. Anyway, would it be possible to switch to Fedora 14 (which might have a newer compiler), install MPICH2, and then
run Abinit on a single computer? This would be easier than the Rocks cluster.

Many thanks to you both,
Asanee

Re: installing abinit 6.4.3 on rocks clusters 5.3

Posted: Wed Jan 26, 2011 11:29 am
by Alain_Jacques
Hello Asanee,

Fedora 14 comes with gcc 4.5 ... this compiler version is well supported. And yes, you can use MPICH2 on a single SMP box - multicore and/or multi-CPU; in that case, I configure MPICH2 with the gforker process manager and the ch3:nemesis communication device.
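
For reference, a sketch of such a single-box MPICH2 build; the install prefix is arbitrary, while --with-pm and --with-device are the stock MPICH2 1.x configure options for choosing the process manager and communication device:

```shell
# Hypothetical prefix; adjust to taste.
./configure --prefix=/opt/mpich2-local \
            --with-pm=gforker \
            --with-device=ch3:nemesis
make
make install
```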

Kind regards,

Alain

Re: installing abinit 6.4.3 on rocks clusters 5.3

Posted: Wed Jan 26, 2011 2:14 pm
by asanee
Dear Alain

Thanks for replying. I decided to install Abinit on Fedora 14, which has compiler 4.5.1 in it, as you mentioned before.
The compiling procedure was straightforward following your suggestions, and it's working. Thanks again. MPICH2 and Abinit are well compatible.
But one thing that's quite different from a serial run: does mpirun need as much memory (RAM) for the whole process as a single-core run did before?
I checked with the system monitor and observed that it uses very little, even for a big system calculation. Am I doing something wrong?

Finally, thank you again for all your kind help.
Asanee