Note: Gromacs has a well-behaved build system that allows its core parts to be built separately from the MPI-aware mdrun part. The same Gromacs libraries are therefore used regardless of which MPI implementation you choose; only the mdrun binary differs (see the table below).

To use any of the Gromacs versions below, load the corresponding modulefiles and then initialize the Gromacs environment with

$ source $Path/bin/GMXRC

where $Path is the installation path of the chosen build from the table below.
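As a concrete sketch, setting up the GCC/ACML build of version 4.5.5 from the table below could look like this (the module names are taken from the job file further down and may differ for other builds):

```shell
# Sketch: initialize the GCC + ACML build of Gromacs 4.5.5.
# Module names follow the job file below; adjust them for other builds.
module add gcc/4.7.0
module add acml/5.1.0/gfortran/gfortran64_fma4
source /cluster/Apps/gromacs/4.5.5/gcc_acml/bin/GMXRC
# Afterwards the Gromacs tools (grompp, mdrun_openmpi, ...) are on the PATH.
```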

Version 4.5.5 — GCC 4.7.0, ACML 5.2.0 fma4 — /cluster/Apps/gromacs/4.5.5/gcc_acml/
  • OpenMPI 1.6.1:       bin/mdrun_openmpi        (known to work: ?)
  • Platform MPI 8.3:    bin/mdrun_platform_mpi   (known to work: ?)
  • Intel MPI:           bin/mdrun_intelmpi

Version 4.5.5 — Intel Composer 2011 SP1 10.319, Intel MKL 10.319 — /cluster/Apps/gromacs/4.5.5/icc_mkl/
  • Intel MPI:           bin/mdrun_intelmpi       (known to work: ?)
  • OpenMPI 1.6.1:       bin/mdrun_openmpi        (known to work: ?)

Version 4.0.7 — GCC 4.7.0
  • Intel MPI, ACML 5.1.0 fma4:      /cluster/Apps/gromacs/4.0.7/gcc_intelmpi_acml/   (known to work: ?)
  • OpenMPI 1.6.1, ACML 5.2.0 fma4:  /cluster/Apps/gromacs/4.0.7/gcc_openmpi_acml/    (known to work: ?)

Below is a complete job file that has been tested and is known to work.

In this experiment the number of processors was varied in order to find the best performance and to observe the scaling behavior.

Note that mdrun is given an input file, “file.tpr”, which must be prepared beforehand:

#BSUB -q short             # job queue
#BSUB -o proc64-%J.output  # stdout is sent to proc64-${LSB_JOBID}.output
#BSUB -e proc64-%J.error   # stderr is sent to proc64-${LSB_JOBID}.error
#BSUB -J name_job          # name of the job
#BSUB -n 64                # number of processors
#BSUB -R 'span[hosts=1]'   # run all processes on a single host
#BSUB -W 300               # runtime limit in minutes (short queue: t <= 300)
#BSUB -app Reserve1800M    # memory reservation; varies from system to system
#BSUB -N                   # send a job report by email when done
module add gcc/4.7.0
module add mpi/intelmpi/
module add acml/5.1.0/gfortran/gfortran64_fma4
# -deffnm takes the base name without extension: "file" makes mdrun read file.tpr
mpiexec.hydra -genvall /cluster/Apps/gromacs/4.5.5/gcc_intelmpi_acml/bin/mdrun -deffnm file
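A job file like the one above is typically submitted to LSF by redirecting it into bsub, so that the #BSUB directives in the file are parsed (the file name jobfile.lsf is only a placeholder):

```shell
# Submit the job script; bsub reads the #BSUB directives from stdin.
bsub < jobfile.lsf   # "jobfile.lsf" is a placeholder name
# Check the status of your pending and running jobs:
bjobs
```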

So the only input file you need next to your job file is the “.tpr” file. It can easily be generated on your own PC (no need for the Mogon facilities):

grompp -f conditions.mdp -c geometry.gro -p -o file.tpr

(With no file name given, -p makes grompp use its default topology file, topol.top.)
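Once generated, the .tpr file has to be copied into the job directory on the cluster, e.g. with scp (user name, host name, and target path below are placeholders):

```shell
# Placeholder user/host/path: copy the run input next to the job file on the cluster
scp file.tpr username@mogon:/path/to/jobdir/
```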

For more information concerning the Gromacs “.tpr” input, please consult the Gromacs documentation.

software/gromacs.txt · Last modified: 2014/03/13 15:04 by schlarbm