Installation

xLLiM can be installed in several ways depending on your environment and preference. The recommended approach is via conda-forge, which handles all native dependencies automatically.

2. PyPI

Pre-built wheels are available for Linux x86_64 (manylinux_2_28) with Python 3.11 and 3.12:

pip install xllim
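To keep xLLiM isolated from other projects, you can install it into a fresh virtual environment first (a standard Python workflow, nothing xLLiM-specific):

```shell
# Create and activate an isolated environment, then install the wheel from PyPI.
python3 -m venv .venv
source .venv/bin/activate
pip install --upgrade pip
pip install xllim
```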

On macOS and Windows, use conda (section 1. Conda-forge (recommended)) or Docker (section 3. Docker / Apptainer (Singularity)) instead.

Other Python versions (3.10, 3.13, …) are not officially tested. You can try building from the source distribution; pip falls back to it automatically when no wheel matches your platform. However, you must first install the native system libraries manually (see section 4. Manual installation (build from source)).
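If you want to force pip to use the source distribution even on a platform where a wheel exists (for instance, to test a local toolchain), pip's `--no-binary` option does this; the system libraries from section 4 must already be installed:

```shell
# Ignore pre-built wheels for xllim and build from the source distribution instead.
pip install --no-binary xllim xllim
```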

3. Docker / Apptainer (Singularity)

Pre-built container images are published for every release. Two formats are available: a standard Docker image for workstations and cloud environments, and an Apptainer / Singularity SIF image for HPC clusters where Docker is unavailable.

Docker

A minimal Docker image based on python:3.11-slim is available, with xLLiM pre-installed.

Prerequisites: install Docker Engine (Linux/macOS) or Docker Desktop (Windows).

Pull the image:

docker pull ghcr.io/xllim-tools/xllim/xllim:latest

Run your project by mounting your source directory:

docker run -it \
  -v "$(pwd)":/workspace \
  -w /workspace \
  ghcr.io/xllim-tools/xllim/xllim:latest \
  python main.py

You can also open an interactive shell:

docker run -it -v "$(pwd)":/workspace -w /workspace ghcr.io/xllim-tools/xllim/xllim:latest bash

Note

Specific versions of the image are also available. See the xLLiM GHCR registry for the full list.

Apptainer / Singularity (HPC)

Apptainer (formerly Singularity) is a container runtime designed for HPC environments where Docker is typically not available or not allowed. It runs rootless and is natively supported on most HPC schedulers (SLURM, PBS, etc.).

For every release, a SIF (Singularity Image Format) image is automatically built from the Docker image and published to the same GHCR registry with a -sif tag suffix.

Prerequisite: Apptainer must be installed (typically already available on HPC clusters - check with your sysadmin). See the Apptainer quick-start guide.

Pull the latest SIF image:

apptainer pull xllim.sif oras://ghcr.io/xllim-tools/xllim/xllim:latest-sif

Or pull a specific version (replace <version> with the desired release tag):

apptainer pull xllim.sif oras://ghcr.io/xllim-tools/xllim/xllim:<version>-sif

Note

Available versions are listed in the xLLiM GHCR registry.

Run your project by binding your working directory:

apptainer run -B "$(pwd)":/workspace --pwd /workspace xllim.sif python main.py

Or open an interactive shell:

apptainer shell -B "$(pwd)":/workspace xllim.sif
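On a SLURM cluster, the run command above can be wrapped in a batch script. The sketch below is illustrative only: the job name, CPU count, and time limit are placeholders to adapt to your cluster, and the `module load` line only applies where Apptainer is provided through the module system:

```shell
#!/bin/bash
#SBATCH --job-name=xllim-run
#SBATCH --cpus-per-task=4
#SBATCH --time=01:00:00

# Some clusters expose Apptainer via environment modules; harmless if not.
module load apptainer 2>/dev/null || true

# Bind the submission directory into the container and run the project script.
apptainer run -B "$SLURM_SUBMIT_DIR":/workspace --pwd /workspace \
    xllim.sif python main.py
```

Submit it with `sbatch` from the directory containing `xllim.sif` and `main.py`.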

4. Manual installation (build from source)

If you need to compile xLLiM from source (e.g., for an unsupported platform or Python version, or for development), please refer to the requirements below.

Build dependencies

Name

Version

Notes

C++ compiler

C++17 support

System-installed (e.g., g++ >= 9, clang++ >= 5)

CMake

>= 3.21

Auto-installed by pip

Ninja

any

Auto-installed by pip

scikit-build-core

>= 0.7.0, < 1

Auto-installed by pip

Python

>= 3.11, < 3.13

System-installed

Numpy

>= 2

Auto-installed by pip

Pybind11

>= 2.12

Auto-installed by pip

Armadillo

>= 12.6, < 13

System-installed

Boost

>= 1.78, < 2

System-installed; components: system, thread, random

BLAS / LAPACK

any

System-installed (OpenBLAS, MKL, Apple Accelerate…)

Carma

>= 0.8.0

System-installed (see below)
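Before building, you can sanity-check the system-installed dependencies. The commands below are common diagnostics; the header paths assume a default Debian/Ubuntu layout and may differ on your system:

```shell
# Compiler available and recent enough for C++17
g++ --version | head -n 1

# Boost version as recorded in the installed headers
grep "#define BOOST_LIB_VERSION" /usr/include/boost/version.hpp

# Armadillo version as recorded in the installed headers
grep -m1 ARMA_VERSION_MAJOR /usr/include/armadillo_bits/arma_version.hpp
```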

Build from source

Since xLLiM contains C++ extensions with Python bindings, building from source requires native system libraries that pip cannot install automatically. The instructions below use a Debian/Ubuntu-based system as an example; adapt the package names for your distribution.

  1. Clone the repository:

    git clone https://github.com/xllim-tools/xllim.git
    cd xllim
    
  2. Install system dependencies:

    sudo apt-get update
    
    # Compilation tools
    sudo apt-get install -y --no-install-recommends g++ cmake ninja-build
    
    # BLAS/LAPACK (OpenBLAS shown; any conforming implementation works)
    sudo apt-get install -y --no-install-recommends libopenblas-dev liblapack-dev
    
    # Armadillo
    sudo apt-get install -y --no-install-recommends libarmadillo-dev
    
    # Python development headers
    sudo apt-get install -y --no-install-recommends python3-dev
    
    # Boost (components: system, thread, random)
    sudo apt-get install -y --no-install-recommends libboost-dev libboost-system-dev libboost-thread-dev libboost-random-dev
    

    Warning

    Boost version: xLLiM requires Boost >= 1.78.0 (enforced by CMake). Ubuntu 24.04+ ships a compatible version out of the box. On older distributions (e.g., Ubuntu 22.04 ships 1.74), build Boost from source:

    curl -L https://archives.boost.io/release/1.78.0/source/boost_1_78_0.tar.gz -o boost.tar.gz
    tar -xzf boost.tar.gz
    cd boost_1_78_0
    ./bootstrap.sh --prefix=/usr/local
    sudo ./b2 --with-system --with-thread --with-random link=static variant=release -j$(nproc) install
    cd .. && rm -rf boost_1_78_0 boost.tar.gz
    

    Then export CMAKE_ARGS before running pip install (scikit-build-core forwards it to CMake) so the build finds this installation instead of the system one:

    export CMAKE_ARGS="-DBoost_ROOT=/usr/local -DBoost_NO_SYSTEM_PATHS=ON -DBoost_USE_STATIC_LIBS=ON"
    

    carma - not available in standard apt repositories; build from source:

    curl -L https://github.com/RUrlus/carma/archive/refs/tags/v0.8.0.tar.gz -o carma.tar.gz
    tar -xzf carma.tar.gz
    cd carma-0.8.0 && mkdir build && cd build
    cmake -DCARMA_INSTALL_LIB=ON ..
    sudo cmake --build . --config Release --target install
    cd ../..
    rm -rf carma-0.8.0 carma.tar.gz
    

    Tip

    If you are using conda, all of the above can be installed in one step:

    conda install -c conda-forge armadillo boost libblas liblapack carma
    
  3. Build and install:

    pip install .
    
  4. Verify the installation:

    python -c "import xllim"
    
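If you want a slightly more informative check than a bare import, the sketch below (generic Python, nothing xLLiM-specific beyond the module name) prints the underlying loader error when the import fails:

```python
# Purely illustrative check: try importing the compiled extension and report
# what went wrong if it fails. This catches the common case where the Python
# package installed but a native shared library (Boost, Armadillo, BLAS)
# cannot be loaded at import time.
def check_xllim() -> bool:
    """Return True if the compiled xllim extension imports cleanly."""
    try:
        import xllim  # noqa: F401
        return True
    except ImportError as exc:
        print(f"xllim failed to import: {exc}")
        return False

print("OK" if check_xllim() else "xllim not available")
```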

Manual CMake build (for development)

For C++ development or debugging, drive CMake directly:

cmake -S . -B build -G Ninja -DCMAKE_BUILD_TYPE=Debug -DXLLIM_BUILD_TESTS=ON
cmake --build build
ctest --test-dir build
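During development it is often convenient to rebuild incrementally and run only a subset of the tests; `ctest -R` filters tests by a name regex (the pattern below is just an example, not an actual xLLiM test name):

```shell
# Incremental rebuild (Ninja recompiles only what changed)
cmake --build build

# Run only tests whose names match a regex, printing output on failure
ctest --test-dir build -R gllim --output-on-failure
```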