# Spack: A Package Manager on the UL HPC Platform

[<img width='400px' src='https://cdn.rawgit.com/spack/spack/develop/share/spack/logo/spack-logo-text.svg'/>](https://spack.readthedocs.io/en/latest/#)

## Introduction to Spack

[Spack](https://spack.io) is a package manager designed for HPC systems. It builds software from source, supports many versions and configurations of the same package side by side, and can reuse compilers, MPI libraries, and other tools already provided on the cluster. This guide shows how to set up Spack on the UL HPC platform and use it to build FEniCS.
## Setting up Spack

!!! note
    The guide is also applicable to other HPC clusters where users need to manage components such as MPI libraries, compilers, and other software through the `module` system.

### Connection to a compute node

```{.sh .copy}
si -N 1 -n 16 -c 1 -t 0-02:00:00 # on iris: -C broadwell or -C skylake
```

??? note "Allocation Details"

    `si` is a shell function that wraps the `salloc` command to simplify interactive Slurm job allocation.
    It stands for:

    ```bash
    salloc -p interactive --qos debug -C batch ${options}
    ```

    - `${options}`: any additional arguments passed to `si` (e.g., `-N`, `-n`, `-c`, `-t`, etc.)

    ```bash
    si -N 1 -n 16 -c 1 -t 0-02:00:00
    ```

    This allocates:

    - 1 node (`-N 1`)
    - 16 MPI tasks (`-n 16`)
    - 1 CPU per task (`-c 1`)
    - a wall time of 2 hours (`-t 0-02:00:00`)

    !!! info "Iris Cluster"

        On the **Iris** cluster, use `-C broadwell` or `-C skylake` to select the CPU architecture.

        **Example:**
        ```bash
        si -N 1 -n 16 -c 1 -t 0-02:00:00 -C broadwell
        ```
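On systems where the `si` helper is not predefined, an equivalent shell function can be declared manually. This is a sketch based on the expansion quoted above; the partition (`interactive`), QOS (`debug`), and constraint (`batch`) names follow the uni.lu convention.

```shell
# Sketch of the `si` helper described above: forwards any extra
# options (-N, -n, -c, -t, ...) to salloc on top of the fixed ones.
si() {
    salloc -p interactive --qos debug -C batch "$@"
}
```

Defining it in `~/.bashrc` makes it available in every shell session.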

### Clone & Setup Spack

Clone and set up Spack in `$HOME` - it has much better performance for
small files than `$SCRATCH`.

``` { .sh .copy }
cd $HOME
git clone --depth=2 https://github.com/spack/spack.git
cd spack
```

To make Spack available in your shell session, source its environment setup script:

``` { .sh .copy }
source $HOME/spack/share/spack/setup-env.sh
```

For convenience, this line can be added to your `~/.bashrc` so that Spack is automatically available in every new shell session.
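A guarded form of that `~/.bashrc` addition avoids errors in shells on machines where Spack is not present (a sketch; the path matches the clone location above):

```shell
# Source Spack's setup script only if it exists, so shells on
# machines without the clone do not fail at login.
SPACK_SETUP="$HOME/spack/share/spack/setup-env.sh"
if [ -f "$SPACK_SETUP" ]; then
    source "$SPACK_SETUP"
fi
```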

### Define System-Provided Packages

`packages.yaml` is a Spack configuration file that tells Spack which tools and versions already exist on the cluster, so that Spack uses them instead of building everything again. Create the file at `$HOME/.spack/packages.yaml`:

``` { .sh .copy }
mkdir -p $HOME/.spack
touch $HOME/.spack/packages.yaml
```

with the following contents:

``` { .yaml .copy }
packages:
  gcc:
    externals:
    - spec: [email protected]+binutils languages:='c,c++,fortran'
      modules:
      - compiler/GCC/13.2.0
      extra_attributes:
        compilers:
          c: /opt/apps/easybuild/systems/aion/rhel810-20250405/2023b/epyc/software/GCCcore/13.2.0/bin/gcc
          cxx: /opt/apps/easybuild/systems/aion/rhel810-20250405/2023b/epyc/software/GCCcore/13.2.0/bin/g++
          fortran: /opt/apps/easybuild/systems/aion/rhel810-20250405/2023b/epyc/software/GCCcore/13.2.0/bin/gfortran
    buildable: false
  binutils:
    externals:
    - spec: [email protected]
      modules:
      - tools/binutils/2.40-GCCcore-13.2.0
    buildable: false
  libevent:
    externals:
    - spec: [email protected]
      modules:
      - lib/libevent/2.1.12-GCCcore-13.2.0
    buildable: false
  libfabric:
    externals:
    - spec: [email protected]
      modules:
      - lib/libfabric/1.19.0-GCCcore-13.2.0
    buildable: false
  libpciaccess:
    externals:
    - spec: [email protected]
      modules:
      - system/libpciaccess/0.17-GCCcore-13.2.0
    buildable: false
  libxml2:
    externals:
    - spec: [email protected]
      modules:
      - lib/libxml2/2.11.5-GCCcore-13.2.0
    buildable: false
  hwloc:
    externals:
    - spec: [email protected]
      modules:
      - system/hwloc/2.9.2-GCCcore-13.2.0
    buildable: false
  mpi:
    buildable: false
  munge:
    externals:
    - spec: [email protected]  # assumed version; adjust to match `munge --version`
      prefix: /usr
    buildable: false
  numactl:
    externals:
    - spec: [email protected]
      modules:
      - tools/numactl/2.0.16-GCCcore-13.2.0
    buildable: false
  openmpi:
    variants: fabrics=ofi,ucx schedulers=slurm
    externals:
    - spec: [email protected]
      modules:
      - mpi/OpenMPI/4.1.6-GCC-13.2.0
    buildable: false
  pmix:
    externals:
    - spec: [email protected]
      modules:
      - lib/PMIx/4.2.6-GCCcore-13.2.0
    buildable: false
  slurm:
    externals:
    - spec: [email protected] sysconfdir=/etc/slurm
      prefix: /usr
    buildable: false
  ucx:
    externals:
    - spec: [email protected]
      modules:
      - lib/UCX/1.15.0-GCCcore-13.2.0
    buildable: false
  zlib:
    externals:
    - spec: [email protected]
      modules:
      - lib/zlib/1.2.13-GCCcore-13.2.0
    buildable: false
```
This tells Spack to use the system-provided GCC, binutils, and OpenMPI with the native fabrics.
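Most stanzas in the file above follow the same pattern: one external spec pinned to a module, plus `buildable: false`. As an illustration of that pattern, a small hypothetical helper (not part of Spack) can generate such stanzas from name/version/module triples:

```shell
# Hypothetical generator for the repetitive packages.yaml stanzas above.
# emit_external NAME VERSION MODULE prints one external entry that pins
# NAME@VERSION to MODULE and forbids Spack from rebuilding it.
emit_external() {
    name=$1 version=$2 module=$3
    printf '  %s:\n    externals:\n' "$name"
    printf '    - spec: %s@%s\n' "$name" "$version"
    printf '      modules:\n      - %s\n' "$module"
    printf '    buildable: false\n'
}

printf 'packages:\n'
emit_external zlib 1.2.13 lib/zlib/1.2.13-GCCcore-13.2.0
emit_external ucx  1.15.0 lib/UCX/1.15.0-GCCcore-13.2.0
```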

## Building FEniCS

Create an environment and install FEniCS:

``` { .sh .copy }
cd ~
spack env create -d fenicsx-main-20230126/
spack env activate fenicsx-main-20230126/
spack add py-fenics-dolfinx@main fenics-dolfinx+adios2 adios2+python petsc+mumps
# Change @main to e.g. @0.7.2 in the above if you want a fixed version.
spack concretize
spack install -j16
```
Alternatively, put the same specs directly in the `spack.yaml` file in `$SPACK_ENV`:

``` { .yaml .copy }
spack:
  # add package specs to the `specs` list
  specs:
  - py-fenics-dolfinx@main
  - fenics-dolfinx@main+adios2
  - petsc+mumps
  - adios2+python
  view: true
  concretizer:
    unify: true
```
The following packages are also commonly used in FEniCS scripts and may be useful:

``` { .sh .copy }
spack add gmsh+opencascade py-numba py-scipy py-matplotlib
```
It is possible to build a specific version (git ref) of DOLFINx. Note that the hash must be the full 40-character commit hash, and it is best to pin appropriate git refs on all components. The component list below follows the FEniCSx packages used elsewhere in this guide; each `<full-commit-hash>` placeholder must be replaced with a real hash.

``` { .yaml .copy }
# This is a Spack Environment file.
#
# It describes a set of packages to be installed, along with
# configuration settings.
spack:
  # add package specs to the `specs` list
  specs:
  - fenics-basix@git.<full-commit-hash>=main
  - py-fenics-basix@git.<full-commit-hash>=main
  - py-fenics-ffcx@git.<full-commit-hash>=main
  - py-fenics-ufl@git.<full-commit-hash>=main
  - fenics-dolfinx@git.<full-commit-hash>=main+adios2
  - py-fenics-dolfinx@git.<full-commit-hash>=main
  - petsc+mumps
  - adios2+python
  view: true
  concretizer:
    unify: true
```
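The full commit hash that a branch currently points to can be read with `git ls-remote`, without cloning the repository. A small sketch (the DOLFINx repository URL in the usage comment is an assumption):

```shell
# Print the full 40-character commit hash that a branch of a remote
# repository currently points to, without cloning it.
full_hash() {
    repo=$1 branch=$2
    git ls-remote "$repo" "refs/heads/$branch" | cut -f1
}

# Example (requires network access):
#   full_hash https://github.com/FEniCS/dolfinx.git main
```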

It is also possible to build only the C++ layer using

``` { .sh .copy }
spack add fenics-dolfinx@main+adios2 py-fenics-ffcx@main petsc+mumps
```

To rebuild FEniCSx from the main branches inside an existing environment:

``` { .sh .copy }
spack install --overwrite -j16 fenics-basix py-fenics-basix py-fenics-ffcx fenics-ufcx py-fenics-ufl fenics-dolfinx py-fenics-dolfinx
```

## Testing the build

Quickly test the build with:

``` { .sh .copy }
srun python -c "from mpi4py import MPI; import dolfinx"
```

## Using the build

See the uni.lu documentation for full details - using the environment should be as
simple as adding the following, where `...` is the name/folder of your environment.

``` { .sh .copy }
#!/bin/bash -l
source $HOME/spack/share/spack/setup-env.sh
spack env activate ...
```

## Known issues

Workaround for the broken Python module lookup for gmsh on the uni.lu cluster:

``` { .sh .copy }
export PYTHONPATH=$SPACK_ENV/.spack-env/view/lib64/:$PYTHONPATH
```

Workaround for the broken Python module lookup for adios2 (appears to be broken in Spack itself):

``` { .sh .copy }
export PYTHONPATH=$(find $SPACK_ENV/.spack-env -type d -name 'site-packages' | grep venv):$PYTHONPATH
```
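Both workarounds can be made safer with a small helper that only prepends a directory to `PYTHONPATH` when it actually exists; this is a sketch, not part of the official workflow:

```shell
# Prepend DIR to PYTHONPATH only if it exists, keeping the variable
# clean on systems that do not exhibit the lookup problem.
prepend_pythonpath() {
    dir=$1
    if [ -d "$dir" ]; then
        export PYTHONPATH="$dir${PYTHONPATH:+:$PYTHONPATH}"
    fi
}

prepend_pythonpath "$SPACK_ENV/.spack-env/view/lib64"
```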