option, parallelism,...
Moderators: fgoudreault, mcote
Forum rules
Please have a look at ~abinit/doc/config/build-config.ac in the source package for detailed and up-to-date information about the configuration of Abinit 8 builds.
For a video explanation on how to build Abinit 7.x for Linux, please go to:
http://www.youtube.com/watch?v=DppLQ-KQA68.
IMPORTANT: when an answer solves your problem, please check the little green V-like button on its upper-right corner to accept it.
IvanHito
- Posts: 3
- Joined: Thu Jun 12, 2014 9:07 am
Post
by IvanHito » Thu Jun 12, 2014 11:01 am
Greetings to all!
I'm quite new to this forum and not very experienced with the abinit code, so please excuse me if something is wrong with my question.
My current problem is compiling abinit on a supercomputer with the mvapich release of MPI. These are the configuration options I use:
Code:
./configure --enable-debug=paranoid --enable-mpi="yes" --disable-mpi-inplace --disable-mpi-io --with-mpi-prefix="/common/mvapich-1.2rc1-gcc" --enable-64bit-flags="yes"
The configuration always completes successfully (as far as I can tell). The config log is in the attachment. The real problem arises at the make step. This is the error:
Code:
/common/mvapich-1.2rc1-gcc/bin/mpif90 -DHAVE_CONFIG_H -I. -I../.. -I../../src/incs -I../../src/incs -I/gpfs/NETHOME/ipsm1/Lobzenko/abinit-7.6.4/fallbacks/exports/include -free -module /gpfs/NETHOME/ipsm1/Lobzenko/abinit-7.6.4/src/mods -O0 -g -debug all -check uninit -ftrapuv -traceback -warn all -fp-stack-check -extend-source -vec-report0 -noaltparam -nofpscomp -c -o m_GreenHyb.o m_GreenHyb.F90
...
...
------------------------------^
m_GreenHyb.F90(866): error #6404: This name does not have a type and must have an explicit type.   [MPI_IN_PLACE]
CALL MPI_ALLGATHERV(MPI_IN_PLACE, 0, MPI_DATATYPE_NULL,&
------------------------------^
compilation aborted from m_GreenHyb.F90 (code 1)
make[3]: *** [m_GreenHyb.o] Error 1
make[2]: *** [all-recursive] Error 1
make[1]: *** [all-recursive] Error 1
make: *** [all] Error 2
Appreciate any hints on solving this.

Attachments
config.log
- (138.48 KiB) Downloaded 307 times
pouillon
- Posts: 651
- Joined: Wed Aug 19, 2009 10:08 am
- Location: Spain
Post
by pouillon » Thu Jun 12, 2014 12:15 pm
This is a bug in Abinit: the case where MPI_IN_PLACE is not supported has not been taken into account in the faulty module. I'll notify the developers of this section and we'll keep you posted.
Thank you for your report.
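For context, the usual way to support MPI stacks that lack MPI_IN_PLACE is to guard the in-place call behind a configure-time preprocessor symbol and fall back to sending from a temporary copy. A minimal Fortran sketch of the pattern (the symbol HAVE_MPI2_INPLACE and all variable names here are illustrative, not taken from the Abinit sources):

```fortran
! Fragment, not a complete program: buf, counts, displs, me (this rank),
! comm and ierr are assumed to be declared and set by the caller.
#if defined HAVE_MPI2_INPLACE
  ! In-place gather: each rank contributes its own slice of buf.
  call MPI_ALLGATHERV(MPI_IN_PLACE, 0, MPI_DATATYPE_NULL, &
&      buf, counts, displs, MPI_DOUBLE_PRECISION, comm, ierr)
#else
  ! Fallback for MPI libraries without MPI_IN_PLACE: send a copy of
  ! this rank's slice so that send and receive buffers do not alias.
  allocate(tmp(counts(me)))
  tmp(:) = buf(displs(me)+1:displs(me)+counts(me))
  call MPI_ALLGATHERV(tmp, counts(me), MPI_DOUBLE_PRECISION, &
&      buf, counts, displs, MPI_DOUBLE_PRECISION, comm, ierr)
  deallocate(tmp)
#endif
```

The copy costs one extra buffer per call, which is why the in-place variant is preferred whenever the MPI library supports it.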
Yann Pouillon
Simune Atomistics
Donostia-San Sebastián, Spain
Jordan
- Posts: 282
- Joined: Tue May 07, 2013 9:47 am
Post
by Jordan » Thu Jun 12, 2014 6:05 pm
Sorry for the bug.
Here is a patch that should fix it; let me know if it does not.
Download and copy the attached file into your abinit directory:
Code:
$ pwd
/foo/bar/abinit-7.6.4/
$ patch -p0 < 62_ctqmc.patch.log
Then try to compile again.
Cheers,
Jordan
Attachments
62_ctqmc.patch.log
- Patch for 62_ctqmc without MPI_IN_PLACE
- (5.1 KiB) Downloaded 306 times
IvanHito
- Posts: 3
- Joined: Thu Jun 12, 2014 9:07 am
Post
by IvanHito » Mon Jun 16, 2014 1:54 pm
I didn't have access to the computer for the last three days; now I'm back to work. I applied the patch you kindly prepared and restarted make (after a distclean and the same ./configure as above). But it failed again, though in a different part of the compilation. This is what it complains about now:
Code:
make[3]: Entering directory `/gpfs/NETHOME/ipsm1/Lobzenko/abinit-7.6.4/src/98_main'
/common/mvapich-1.2rc1-gcc/bin/mpif90 -DHAVE_CONFIG_H -I. -I../.. -I../../src/incs -I../../src/incs -I/gpfs/NETHOME/ipsm1/Lobzenko/abinit-7.6.4/fallbacks/exports/include -free -module /gpfs/NETHOME/ipsm1/Lobzenko/abinit-7.6.4/src/mods -O0 -g -debug all -check uninit -ftrapuv -traceback -warn all -fp-stack-check -extend-source -vec-report0 -noaltparam -nofpscomp -g -debug all -check uninit -ftrapuv -traceback -warn all -fp-stack-check -extend-source -vec-report0 -noaltparam -nofpscomp -c -o abinit-abinit.o `test -f 'abinit.F90' || echo './'`abinit.F90
make[3]: Warning: File `../../src/95_drive/lib95_drive.a' has modification time 4.1 s in the future
/common/mvapich-1.2rc1-gcc/bin/mpif90 -free -module /gpfs/NETHOME/ipsm1/Lobzenko/abinit-7.6.4/src/mods -O0 -g -debug all -check uninit -ftrapuv -traceback -warn all -fp-stack-check -extend-source -vec-report0 -noaltparam -nofpscomp -g -debug all -check uninit -ftrapuv -traceback -warn all -fp-stack-check -extend-source -vec-report0 -noaltparam -nofpscomp -static-intel -static-libgcc -static-intel -static-libgcc -o abinit abinit-abinit.o -static-intel -static-libgcc ../../src/95_drive/lib95_drive.a ../../src/79_seqpar_mpi/lib79_seqpar_mpi.a ../../src/77_ddb/lib77_ddb.a ../../src/77_suscep/lib77_suscep.a ../../src/72_response/lib72_response.a ../../src/71_bse/lib71_bse.a ../../src/70_gw/lib70_gw.a ../../src/69_wfdesc/lib69_wfdesc.a ../../src/68_dmft/lib68_dmft.a ../../src/68_recursion/lib68_recursion.a ../../src/68_rsprc/lib68_rsprc.a ../../src/67_common/lib67_common.a ../../src/66_paw/lib66_paw.a ../../src/66_wfs/lib66_wfs.a ../../src/65_psp/lib65_psp.a ../../src/65_nonlocal/lib65_nonlocal.a ../../src/64_atompaw/lib64_atompaw.a ../../src/62_occeig/lib62_occeig.a ../../src/62_iowfdenpot/lib62_iowfdenpot.a ../../src/62_wvl_wfs/lib62_wvl_wfs.a ../../src/62_poisson/lib62_poisson.a ../../src/62_cg_noabirule/lib62_cg_noabirule.a ../../src/62_ctqmc/lib62_ctqmc.a ../../src/61_ionetcdf/lib61_ionetcdf.a ../../src/57_iovars/lib57_iovars.a ../../src/57_iopsp_parser/lib57_iopsp_parser.a ../../src/56_recipspace/lib56_recipspace.a ../../src/56_xc/lib56_xc.a ../../src/56_mixing/lib56_mixing.a ../../src/56_io_mpi/lib56_io_mpi.a ../../src/53_abiutil/lib53_abiutil.a ../../src/53_spacepar/lib53_spacepar.a ../../src/53_ffts/lib53_ffts.a ../../src/52_fft_mpi_noabirule/lib52_fft_mpi_noabirule.a ../../src/51_manage_mpi/lib51_manage_mpi.a ../../src/49_gw_toolbox_oop/lib49_gw_toolbox_oop.a ../../src/47_xml/lib47_xml.a ../../src/45_geomoptim/lib45_geomoptim.a ../../src/44_abitypes_defs/lib44_abitypes_defs.a ../../src/43_wvl_wrappers/lib43_wvl_wrappers.a ../../src/43_ptgroups/lib43_ptgroups.a 
../../src/42_parser/lib42_parser.a ../../src/42_nlstrain/lib42_nlstrain.a ../../src/42_libpaw/lib42_libpaw.a ../../src/41_xc_lowlevel/lib41_xc_lowlevel.a ../../src/41_geometry/lib41_geometry.a ../../src/32_util/lib32_util.a ../../src/28_numeric_noabirule/lib28_numeric_noabirule.a ../../src/27_toolbox_oop/lib27_toolbox_oop.a ../../src/21_psiesta_noabirule/lib21_psiesta_noabirule.a ../../src/18_timing/lib18_timing.a ../../src/16_hideleave/lib16_hideleave.a ../../src/14_hidewrite/lib14_hidewrite.a ../../src/12_hide_mpi/lib12_hide_mpi.a ../../src/11_qespresso_ext/lib11_qespresso_ext.a ../../src/11_memory_mpi/lib11_memory_mpi.a ../../src/10_defs/lib10_defs.a ../../src/01_linalg_ext/lib01_linalg_ext.a ../../src/01_interfaces_ext/lib01_interfaces_ext.a -L/gpfs/NETHOME/ipsm1/Lobzenko/abinit-7.6.4/fallbacks/exports/lib -llapack -lblas -lrt -L/usr/lib64 -L/common/mvapich-1.2rc1-gcc/lib/shared -L/common/mvapich-1.2rc1-gcc/lib -L/opt/intel/composerxe-2011.3.174/compiler/lib/intel64 -L/opt/intel/composerxe-2011.3.174/mkl/lib/intel64 -L/usr/lib/gcc/x86_64-redhat-linux/4.1.2 -L/usr/lib/gcc/x86_64-redhat-linux/4.1.2/../../../../lib64 -L/usr/lib/gcc/x86_64-redhat-linux/4.1.2/../../.. -L/lib64 -L/lib -L/usr/lib -lmpichf90nc -lmpichfarg -lmpich -libverbs -libumad -lpthread -lrt -lifport -lifcore -limf -lsvml -lm -lipgo -lirc -lirc_s -ldl -L/usr/lib64 -L/common/mvapich-1.2rc1-gcc/lib/shared -L/common/mvapich-1.2rc1-gcc/lib -L/opt/intel/composerxe-2011.3.174/compiler/lib/intel64 -L/opt/intel/composerxe-2011.3.174/mkl/lib/intel64 -L/usr/lib/gcc/x86_64-redhat-linux/4.1.2 -L/usr/lib/gcc/x86_64-redhat-linux/4.1.2/../../../../lib64 -L/usr/lib/gcc/x86_64-redhat-linux/4.1.2/../../.. -L/lib64 -L/lib -L/usr/lib -lmpichf90nc -lmpichfarg -lmpich -libverbs -libumad -lpthread -lrt -lifport -lifcore -limf -lsvml -lm -lipgo -lirc -lirc_s -ldl
../../src/12_hide_mpi/lib12_hide_mpi.a(m_xmpi.o): In function `xmpi_comm_set_errhandler':
/gpfs/NETHOME/ipsm1/Lobzenko/abinit-7.6.4/src/12_hide_mpi/m_xmpi.F90:1779: undefined reference to `mpi_comm_get_errhandler_'
/gpfs/NETHOME/ipsm1/Lobzenko/abinit-7.6.4/src/12_hide_mpi/m_xmpi.F90:1780: undefined reference to `mpi_comm_set_errhandler_'
make[3]: *** [abinit] Error 1
make[3]: Leaving directory `/gpfs/NETHOME/ipsm1/Lobzenko/abinit-7.6.4/src/98_main'
make[2]: *** [all-recursive] Error 1
make[2]: Leaving directory `/gpfs/NETHOME/ipsm1/Lobzenko/abinit-7.6.4/src'
make[1]: *** [all-recursive] Error 1
make[1]: Leaving directory `/gpfs/NETHOME/ipsm1/Lobzenko/abinit-7.6.4'
make: *** [all] Error 2
gmatteo
- Posts: 291
- Joined: Sun Aug 16, 2009 5:40 pm
Post
by gmatteo » Mon Jun 16, 2014 8:08 pm
Enable the calls to MPI_Errhandler_get/MPI_Errhandler_set and comment out the calls to MPI_comm_get_Errhandler/MPI_comm_set_Errhandler in subroutine xmpi_comm_set_errhandler of m_xmpi.F90:
=== modified file 'src/12_hide_mpi/m_xmpi.F90'
--- src/12_hide_mpi/m_xmpi.F90 2014-05-12 09:22:46 +0000
+++ src/12_hide_mpi/m_xmpi.F90 2014-06-16 18:06:01 +0000
@@ -2179,14 +2179,14 @@
mpierr1=MPI_SUCCESS; mpierr2=MPI_SUCCESS
-#if defined HAVE_MPI1
+!#if defined HAVE_MPI1
call MPI_Errhandler_get(my_comm,old_err_handler,mpierr1)
call MPI_Errhandler_set(my_comm,new_err_handler,mpierr2)
-#endif
-#if defined HAVE_MPI2
- call MPI_comm_get_Errhandler(my_comm,old_err_handler,mpierr1)
- call MPI_comm_set_Errhandler(my_comm,new_err_handler,mpierr2)
-#endif
+!#endif
+!#if defined HAVE_MPI2
+! call MPI_comm_get_Errhandler(my_comm,old_err_handler,mpierr1)
+! call MPI_comm_set_Errhandler(my_comm,new_err_handler,mpierr2)
+!#endif
ierror=MPI_SUCCESS
if (mpierr1/=MPI_SUCCESS) then
IvanHito
- Posts: 3
- Joined: Thu Jun 12, 2014 9:07 am
Post
by IvanHito » Tue Jun 17, 2014 1:49 pm
I didn't get it at first glance, but after some attempts I understood what you meant. The compilation finished successfully. I used the following version of m_xmpi.F90:
Code:
mpierr1=MPI_SUCCESS; mpierr2=MPI_SUCCESS
#if defined HAVE_MPI1
call MPI_Errhandler_get(my_comm,old_err_handler,mpierr1)
call MPI_Errhandler_set(my_comm,new_err_handler,mpierr2)
#endif
#if defined HAVE_MPI2
!! call MPI_comm_get_Errhandler(my_comm,old_err_handler,mpierr1)
!! call MPI_comm_set_Errhandler(my_comm,new_err_handler,mpierr2)
call MPI_Errhandler_get(my_comm,old_err_handler,mpierr1)
call MPI_Errhandler_set(my_comm,new_err_handler,mpierr2)
#endif
ierror=MPI_SUCCESS
So basically I just commented out the two MPI-2 calls and copied in the MPI-1 ones in their place. Provided this doesn't corrupt my calculations in the future, I'd say the problem is SOLVED. Is it?
gmatteo
- Posts: 291
- Joined: Sun Aug 16, 2009 5:40 pm
Post
by gmatteo » Tue Jun 17, 2014 10:32 pm
[quote]
So basically I just commented out the two MPI-2 calls and copied in the MPI-1 ones in their place. Provided this doesn't corrupt my calculations in the future, I'd say the problem is SOLVED. Is it?
[/quote]
Yes, the compilation error due to the missing MPI functions is solved. Note, however, that your MPI library is outdated.
Try running the tests in ~abinit/tests with the runtests.py script to validate the build.
If you encounter problems with Abinit in parallel, try a more recent version of mvapich.
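For example, from the source tree (the suite name and job count below are illustrative; run `./runtests.py --help` to see the actual options of your version):

```
$ cd abinit-7.6.4/tests
$ ./runtests.py fast -j4
```

A clean run of the fast suite is a good first sanity check of both the serial and the MPI build.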