# Using alternative compilers or dependencies
## Using an alternative compiler
### Intel
**GCC is the most supported compiler**

The ITSR Applications Team have compiled the majority of applications available via modules on Apocrita using GCC. Whilst other compilers are available on Apocrita as outlined below, you will receive the best support from us should you choose to use GCC as well.
If you don't specify a compiler when installing something with Spack, Spack will use what it considers to be the best compiler available (usually GCC). But what if we prefer to use one of the other compilers defined in our personal `compilers.yaml` file?
First, we can check which compilers are available:
```console
$ spack -C ${HOME}/spack-config-templates/0.23.1 compilers
==> Available compilers
-- gcc rocky9-x86_64 --------------------------------------------
gcc@14.2.0 gcc@12.2.0 gcc@11.4.1
-- oneapi rocky9-x86_64 -----------------------------------------
oneapi@2024.1.0
```
So, we can see a list of available compilers as previously defined.
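For reference, an entry for the Intel compiler in a personal `compilers.yaml` looks roughly like the sketch below; the compiler paths shown are placeholders, not the actual locations on Apocrita:

```yaml
# Illustrative compilers.yaml entry for Spack 0.23; the paths below are
# placeholders and will differ on your system.
compilers:
- compiler:
    spec: oneapi@=2024.1.0
    paths:
      cc: /path/to/oneapi/2024.1.0/bin/icx
      cxx: /path/to/oneapi/2024.1.0/bin/icpx
      f77: /path/to/oneapi/2024.1.0/bin/ifx
      fc: /path/to/oneapi/2024.1.0/bin/ifx
    operating_system: rocky9
    target: x86_64
    modules: []
```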
To use a specific compiler, you need to add it to the install command after the `%` sigil. So, let's spec `nano@6.4` again, but this time we will use the Intel compiler, adding `%oneapi@2024.1.0` to the end of the command:
```console
$ spack -C ${HOME}/spack-config-templates/0.23.1 spec nano@6.4 %oneapi@2024.1.0
- nano@6.4%oneapi@2024.1.0 build_system=autotools arch=linux-rocky9-x86_64_v4
[^] ^gettext@0.22.5%gcc@11.4.1+bzip2+curses+git~libunistring+libxml2+pic+shared+tar+xz build_system=autotools arch=linux-rocky9-x86_64_v4
[^] ^bzip2@1.0.8%gcc@11.4.1~debug~pic+shared build_system=generic arch=linux-rocky9-x86_64_v4
[^] ^diffutils@3.10%gcc@11.4.1 build_system=autotools arch=linux-rocky9-x86_64_v4
[^] ^gcc-runtime@11.4.1%gcc@11.4.1 build_system=generic arch=linux-rocky9-x86_64_v4
[^] ^libxml2@2.10.3%gcc@11.4.1+pic~python+shared build_system=autotools arch=linux-rocky9-x86_64_v4
[^] ^zlib-ng@2.1.6%gcc@11.4.1+compat+new_strategies+opt+pic+shared build_system=autotools arch=linux-rocky9-x86_64_v4
[^] ^tar@1.34%gcc@11.4.1 build_system=autotools zip=pigz arch=linux-rocky9-x86_64_v4
[^] ^pigz@2.8%gcc@11.4.1 build_system=makefile arch=linux-rocky9-x86_64_v4
[^] ^zstd@1.5.6%gcc@11.4.1+programs build_system=makefile compression=none libs=shared,static arch=linux-rocky9-x86_64_v4
[^] ^xz@5.4.6%gcc@11.4.1~pic build_system=autotools libs=shared,static arch=linux-rocky9-x86_64_v4
[e] ^glibc@2.34%gcc@11.4.1 build_system=autotools arch=linux-rocky9-x86_64_v4
[^] ^gmake@4.4.1%gcc@11.4.1~guile build_system=generic arch=linux-rocky9-x86_64_v4
- ^intel-oneapi-runtime@2024.1.0%oneapi@2024.1.0 build_system=generic arch=linux-rocky9-x86_64_v4
- ^gcc-runtime@11.4.1%gcc@11.4.1 build_system=generic arch=linux-rocky9-x86_64_v4
[^] ^ncurses@6.5%gcc@11.4.1~symlinks+termlib abi=none build_system=autotools patches=7a351bc arch=linux-rocky9-x86_64_v4
[^] ^pkgconf@2.2.0%gcc@11.4.1 build_system=autotools arch=linux-rocky9-x86_64_v4
```
So, a similar output to before: most of the dependencies are already upstream, and the only things that need to be compiled and installed personally are `intel-oneapi-runtime@2024.1.0` and the actual `nano` binary itself. So let's go ahead and run an install:
```console
$ spack -C ${HOME}/spack-config-templates/0.23.1 install -j ${NSLOTS} nano@6.4 %oneapi@2024.1.0
[+] /usr (external glibc-2.34-xri56vcyzs7kkvakhoku3fefc46nw25y)
==> Installing gcc-runtime-11.4.1-ahrtqhe73rrbhmrvurbmn3cvqvwmmuia [2/15]
==> No binary for gcc-runtime-11.4.1-ahrtqhe73rrbhmrvurbmn3cvqvwmmuia found: installing from source
==> No patches needed for gcc-runtime
==> gcc-runtime: Executing phase: 'install'
==> gcc-runtime: Successfully installed gcc-runtime-11.4.1-ahrtqhe73rrbhmrvurbmn3cvqvwmmuia
Stage: 0.00s. Install: 0.10s. Post-install: 0.59s. Total: 0.88s
[+] /data/scratch/abc123/spack/apps/linux-rocky9-x86_64_v4/gcc-11.4.1/gcc-runtime/11.4.1-ahrtqhe
[+] /share/apps/rocky9/spack/apps/linux-rocky9-x86_64_v4/gcc-11.4.1/gcc-runtime/11.4.1-llid4hw
==> Installing intel-oneapi-runtime-2024.1.0-6lnrxuhulirsr6spzpq4l2jo7jzxhkt6 [4/15]
==> No binary for intel-oneapi-runtime-2024.1.0-6lnrxuhulirsr6spzpq4l2jo7jzxhkt6 found: installing from source
==> No patches needed for intel-oneapi-runtime
==> intel-oneapi-runtime: Executing phase: 'install'
==> intel-oneapi-runtime: Successfully installed intel-oneapi-runtime-2024.1.0-6lnrxuhulirsr6spzpq4l2jo7jzxhkt6
Stage: 0.00s. Install: 2.69s. Post-install: 0.23s. Total: 2.98s
[+] /data/scratch/abc123/spack/apps/linux-rocky9-x86_64_v4/oneapi-2024.1.0/intel-oneapi-runtime/2024.1.0-6lnrxuh
[+] /share/apps/rocky9/spack/apps/linux-rocky9-x86_64_v4/gcc-11.4.1/bzip2/1.0.8-uj4wyhx
[+] /share/apps/rocky9/spack/apps/linux-rocky9-x86_64_v4/gcc-11.4.1/gmake/4.4.1-xchit5a
[+] /share/apps/rocky9/spack/apps/linux-rocky9-x86_64_v4/gcc-11.4.1/zstd/1.5.6-my7tyw6
[+] /share/apps/rocky9/spack/apps/linux-rocky9-x86_64_v4/gcc-11.4.1/ncurses/6.5-4n2uzha
[+] /share/apps/rocky9/spack/apps/linux-rocky9-x86_64_v4/gcc-11.4.1/xz/5.4.6-rwn7pno
[+] /share/apps/rocky9/spack/apps/linux-rocky9-x86_64_v4/gcc-11.4.1/zlib-ng/2.1.6-g2yruc3
[+] /share/apps/rocky9/spack/apps/linux-rocky9-x86_64_v4/gcc-11.4.1/libxml2/2.10.3-q6zmsq6
[+] /share/apps/rocky9/spack/apps/linux-rocky9-x86_64_v4/gcc-11.4.1/pigz/2.8-somkvv4
[+] /share/apps/rocky9/spack/apps/linux-rocky9-x86_64_v4/gcc-11.4.1/tar/1.34-ivzmnos
[+] /share/apps/rocky9/spack/apps/linux-rocky9-x86_64_v4/gcc-11.4.1/gettext/0.22.5-udcuonu
==> Installing nano-6.4-xqyv4jsv6ecorffkjj4cfwrswzx35jas [15/15]
==> No binary for nano-6.4-xqyv4jsv6ecorffkjj4cfwrswzx35jas found: installing from source
==> Using cached archive: /data/scratch/abc123/spack/cache/_source-cache/archive/41/4199ae8ca78a7796de56de1a41b821dc47912c0307e9816b56cc317df34661c0.tar.xz
==> No patches needed for nano
==> nano: Executing phase: 'autoreconf'
==> nano: Executing phase: 'configure'
==> nano: Executing phase: 'build'
==> nano: Executing phase: 'install'
==> nano: Successfully installed nano-6.4-xqyv4jsv6ecorffkjj4cfwrswzx35jas
Stage: 0.22s. Autoreconf: 0.00s. Configure: 55.33s. Build: 1.80s. Install: 1.62s. Post-install: 0.43s. Total: 1m 1.06s
[+] /data/scratch/abc123/spack/apps/linux-rocky9-x86_64_v4/oneapi-2024.1.0/nano/6.4-xqyv4js
```
So, once again:

- The existing upstream dependencies have been used from the central location rather than being reinstalled.
- Spack has noticed that `intel-oneapi-runtime@2024.1.0` and `nano@6.4` compiled against Intel are missing, so it has re-used the existing cached source tarballs, compiled them, and installed them to the location defined under `install_tree: root:` in `config.yaml` (sketched below).
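For context, the relevant part of a personal `config.yaml` looks roughly like this; the `root` path is an example matching the `/data/scratch/abc123` install locations above and will differ for your account:

```yaml
# Illustrative config.yaml excerpt; the root path will differ for your account.
config:
  install_tree:
    root: /data/scratch/abc123/spack/apps
```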
Let's now return to our `spack find` command:
```console
$ spack -C ${HOME}/spack-config-templates/0.23.1 find -x -p nano
-- linux-rocky9-x86_64_v4 / gcc@12.2.0 --------------------------
nano@7.2 /share/apps/rocky9/spack/apps/linux-rocky9-x86_64_v4/gcc-12.2.0/nano/7.2-4ew6nde
-- linux-rocky9-x86_64_v4 / gcc@14.2.0 --------------------------
nano@6.4 /data/scratch/abc123/spack/apps/linux-rocky9-x86_64_v4/gcc-14.2.0/nano/6.4-6hegmja
nano@8.2 /share/apps/rocky9/spack/apps/linux-rocky9-x86_64_v4/gcc-14.2.0/nano/8.2-7onanyb
-- linux-rocky9-x86_64_v4 / oneapi@2024.1.0 ---------------------
nano@6.4 /data/scratch/abc123/spack/apps/linux-rocky9-x86_64_v4/oneapi-2024.1.0/nano/6.4-xqyv4js
==> 4 installed packages
```
And we now have a module file available for the version compiled against Intel (note that the module name lists `oneapi-2024.1.0` and not `gcc-14.2.0`):
```console
$ module avail -l nano
- Package/Alias -----------------------.- Versions --------.- Last mod. -------
/data/scratch/abc123/spack/privatemodules/linux-rocky9-x86_64_v4:
nano/6.4-gcc-14.2.0 2025/05/15 16:28:01
nano/6.4-oneapi-2024.1.0 2025/05/16 11:24:37
/share/apps/rocky9/environmentmodules/apocrita-modules/spack:
nano/7.2-gcc-12.2.0 2025/05/01 10:40:45
nano/8.2-gcc-14.2.0 2025/05/15 16:27:07
$ module load nano/6.4-oneapi-2024.1.0
$ which nano
/data/scratch/abc123/spack/apps/linux-rocky9-x86_64_v4/oneapi-2024.1.0/nano/6.4-xqyv4js/bin/nano
$ nano --version
GNU nano, version 6.4
(C) 2022 the Free Software Foundation and various contributors
Compiled options: --disable-libmagic --enable-utf8
```
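As a quick sanity check, you can also inspect the binary's shared-library dependencies to confirm it was built with the Intel toolchain; a minimal sketch (the exact library names and paths will vary):

```console
# With the module loaded, look for Intel runtime libraries (e.g. libimf)
# among the linked libraries; the output will vary by system and package.
$ ldd $(which nano) | grep -i intel
```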
## Using alternative dependencies
### Intel/Intel MPI
**Open MPI is the most supported MPI implementation**

The ITSR Applications Team have compiled almost all centrally installed applications available on Apocrita against bespoke installations of Open MPI integrating advanced features such as InfiniBand, and you will receive the best support from us if you also use Open MPI (defined as an `external` package, as advised in the `packages.yaml` documentation above). Using other MPI implementations as detailed below is possible, but not as well supported.
You'll have noticed that the above examples for Gromacs and LAMMPS all use the `^openmpi` dependency. But it is also possible to compile using the Intel compiler and Intel MPI instead.
Let's take LAMMPS as a simple example. The default centrally installed CPU versions (the `lammps` modules) have been compiled using GCC and Open MPI, using the following variant:

```
lammps ^openmpi
```
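If you want to confirm how an installed version was built, `spack find` can show hashes and variants, including for upstream installations; for example:

```console
# -l shows hashes, -v shows build variants.
$ spack -C ${HOME}/spack-config-templates/0.23.1 find -lv lammps
```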
Let's run a `spec` command using our personal config scope, but this time swapping out `^openmpi` and using the Intel compiler and Intel MPI:
**Use the `intel-oneapi-*` Spack packages**

Spack uses the `intel-oneapi-*` naming scheme for all Intel packages, such as `intel-oneapi-compilers`, `intel-oneapi-mpi`, etc. Search the Spack Packages site for more details.
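A quick way to enumerate these packages from the command line:

```console
# List all Spack packages whose names match "intel-oneapi".
$ spack list intel-oneapi
```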
We will use the Intel compiler, Intel MPI and the Intel MKL libraries (which will provide `fftw` via the `+cluster` variant of Intel MKL and the `fft=mkl` variant of LAMMPS itself).
```console
$ spack -C ${HOME}/spack-config-templates/0.23.1 spec lammps fft=mkl %oneapi ^intel-oneapi-mpi ^intel-oneapi-mkl+cluster
- lammps@20240829.1%oneapi@2024.1.0~adios~amoeba~asphere~atc~awpmd~bocs~body~bpm~brownian~cg-dna~cg-spica~class2~colloid~colvars~compress~coreshell~cuda~cuda_mps~curl~dielectric~diffraction~dipole~dpd-basic~dpd-meso~dpd-react~dpd-smooth~drude~eff~electrode~extra-compute~extra-dump~extra-fix~extra-molecule~extra-pair~fep~ffmpeg~granular~h5md~heffte~intel~interlayer~ipo~jpeg~kim~kokkos+kspace~latboltz~lepton+lib~machdyn~manifold+manybody~mc~meam~mesont~mgpt~misc~ml-hdnnp~ml-iap~ml-pod~ml-rann~ml-snap~ml-uf3~mofff+molecule~molfile+mpi~netcdf~opencl+openmp~openmp-package~opt~orient~peri~phonon~plugin~plumed~png~poems~ptm~python~qeq~qtb~reaction~reaxff~replica~rheo+rigid~rocm~shock~smtbq~sph~spin~srd~tally~tools~uef~voronoi~vtk~yaff build_system=cmake build_type=Release fft=mkl fftw_precision=double generator=make gpu_precision=mixed lammps_sizes=smallbig arch=linux-rocky9-x86_64_v4
[^] ^cmake@3.27.9%gcc@12.2.0~doc+ncurses+ownlibs build_system=generic build_type=Release arch=linux-rocky9-x86_64_v4
[^] ^curl@8.7.1%gcc@12.2.0~gssapi~ldap~libidn2~librtmp~libssh~libssh2+nghttp2 build_system=autotools libs=shared,static tls=openssl arch=linux-rocky9-x86_64_v4
[^] ^nghttp2@1.57.0%gcc@12.2.0 build_system=autotools arch=linux-rocky9-x86_64_v4
[^] ^diffutils@3.10%gcc@11.4.1 build_system=autotools arch=linux-rocky9-x86_64_v4
[^] ^openssl@3.3.0%gcc@12.2.0~docs+shared build_system=generic certs=mozilla arch=linux-rocky9-x86_64_v4
[^] ^ca-certificates-mozilla@2023-05-30%gcc@12.2.0 build_system=generic arch=linux-rocky9-x86_64_v4
[^] ^perl@5.38.0%gcc@11.4.1+cpanm+opcode+open+shared+threads build_system=generic patches=714e4d1 arch=linux-rocky9-x86_64_v4
[^] ^berkeley-db@18.1.40%gcc@11.4.1+cxx~docs+stl build_system=autotools patches=26090f4,b231fcc arch=linux-rocky9-x86_64_v4
[^] ^bzip2@1.0.8%gcc@11.4.1~debug~pic+shared build_system=generic arch=linux-rocky9-x86_64_v4
[^] ^gdbm@1.23%gcc@11.4.1 build_system=autotools arch=linux-rocky9-x86_64_v4
[^] ^readline@8.2%gcc@11.4.1 build_system=autotools patches=bbf97f1 arch=linux-rocky9-x86_64_v4
[^] ^pkgconf@2.2.0%gcc@11.4.1 build_system=autotools arch=linux-rocky9-x86_64_v4
[^] ^gcc-runtime@12.2.0%gcc@12.2.0 build_system=generic arch=linux-rocky9-x86_64_v4
[e] ^glibc@2.34%gcc@11.4.1 build_system=autotools arch=linux-rocky9-x86_64_v4
[^] ^ncurses@6.5%gcc@11.4.1~symlinks+termlib abi=none build_system=autotools patches=7a351bc arch=linux-rocky9-x86_64_v4
[^] ^zlib-ng@2.1.6%gcc@11.4.1+compat+new_strategies+opt+pic+shared build_system=autotools arch=linux-rocky9-x86_64_v4
[e] ^glibc@2.34%gcc@12.2.0 build_system=autotools arch=linux-rocky9-x86_64_v4
[^] ^gmake@4.4.1%gcc@11.4.1~guile build_system=generic arch=linux-rocky9-x86_64_v4
[^] ^gcc-runtime@11.4.1%gcc@11.4.1 build_system=generic arch=linux-rocky9-x86_64_v4
- ^intel-oneapi-mkl@2024.2.2%oneapi@2024.1.0+cluster+envmods~gfortran~ilp64+shared build_system=generic mpi_family=mpich threads=none arch=linux-rocky9-x86_64_v4
[^] ^intel-tbb@2021.9.0%gcc@12.2.0~ipo+shared+tm build_system=cmake build_type=Release cxxstd=default generator=make patches=91755c6 arch=linux-rocky9-x86_64_v4
[^] ^hwloc@2.9.1%gcc@12.2.0~cairo~cuda~gl~libudev+libxml2~netloc~nvml~oneapi-level-zero~opencl+pci~rocm build_system=autotools libs=shared,static arch=linux-rocky9-x86_64_v4
[^] ^libpciaccess@0.17%gcc@12.2.0 build_system=autotools arch=linux-rocky9-x86_64_v4
[^] ^libtool@2.4.7%gcc@11.4.1 build_system=autotools arch=linux-rocky9-x86_64_v4
[^] ^findutils@4.9.0%gcc@11.4.1 build_system=autotools patches=440b954 arch=linux-rocky9-x86_64_v4
[^] ^m4@1.4.19%gcc@11.4.1+sigsegv build_system=autotools patches=9dc5fbd,bfdffa7 arch=linux-rocky9-x86_64_v4
[^] ^libsigsegv@2.14%gcc@11.4.1 build_system=autotools arch=linux-rocky9-x86_64_v4
[^] ^util-macros@1.19.3%gcc@12.2.0 build_system=autotools arch=linux-rocky9-x86_64_v4
[^] ^libxml2@2.10.3%gcc@11.4.1+pic~python+shared build_system=autotools arch=linux-rocky9-x86_64_v4
[^] ^xz@5.4.6%gcc@11.4.1~pic build_system=autotools libs=shared,static arch=linux-rocky9-x86_64_v4
[^] ^intel-oneapi-mpi@2021.12.1%gcc@12.2.0+envmods~external-libfabric~generic-names~ilp64 build_system=generic arch=linux-rocky9-x86_64_v4
- ^intel-oneapi-runtime@2024.1.0%oneapi@2024.1.0 build_system=generic arch=linux-rocky9-x86_64_v4
[+] ^gcc-runtime@12.2.0%gcc@12.2.0 build_system=generic arch=linux-rocky9-x86_64_v4
```
So, a new personal variant of `intel-oneapi-mkl` needs to be compiled against Intel MPI (which is already available upstream), as well as our new variant of LAMMPS.
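To build it, the install command mirrors the spec above; a sketch, assuming the same personal config scope and a session providing `${NSLOTS}`:

```console
$ spack -C ${HOME}/spack-config-templates/0.23.1 install -j ${NSLOTS} \
    lammps fft=mkl %oneapi ^intel-oneapi-mpi ^intel-oneapi-mkl+cluster
```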
If you follow through and install that personal variant then, assuming you broadly follow the example `modules.yaml` template, the module should be named something like:
```
lammps/20240829.1-intel-oneapi-mpi-2021.12.1-oneapi-2024.1.0
```
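That name follows from the module projections in the template; a rough sketch of the kind of `modules.yaml` projections that would produce it (the actual template may differ):

```yaml
# Illustrative projections only; the real template may differ.
modules:
  default:
    tcl:
      projections:
        all: '{name}/{version}-{compiler.name}-{compiler.version}'
        ^mpi: '{name}/{version}-{^mpi.name}-{^mpi.version}-{compiler.name}-{compiler.version}'
```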
And when loading the above module and running `mpirun -np ${NSLOTS} lmp -help`, you should see a section like this:
```
Compiler: Intel LLVM C++ 202401.0 / Intel(R) oneAPI DPC++/C++ Compiler 2024.1.0 (2024.1.0.20240308) with OpenMP 5.1
C++ standard: C++11
MPI v3.1: Intel(R) MPI Library 2021.12 for Linux* OS
Accelerator configuration:
FFT information:
FFT precision = double
FFT engine = mpiFFT
FFT library = MKL with threads
```
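Finally, a minimal sketch of using the new module in a batch job, assuming an SGE-style job script; the parallel environment, resource requests and input file `in.lj` are placeholders to adapt:

```bash
#!/bin/bash
#$ -cwd
#$ -pe parallel 96   # placeholder parallel environment and slot count
#$ -l h_rt=1:0:0     # placeholder runtime request

# Load the personally installed Intel MPI variant of LAMMPS.
module load lammps/20240829.1-intel-oneapi-mpi-2021.12.1-oneapi-2024.1.0

# Run across the allocated slots; in.lj is a placeholder input deck.
mpirun -np ${NSLOTS} lmp -in in.lj
```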