# Installation
xLLiM can be installed in several ways depending on your environment and preference.
The recommended approach is via conda-forge, which handles all native dependencies automatically.
## 1. Conda-forge (recommended)
xLLiM is published on conda-forge, a community-maintained channel that provides
up-to-date, cross-platform packages. The corresponding xllim-feedstock contains
the conda packaging recipe and tracks each release; any changes to the recipe
that drives the xllim conda builds should be made there (see the
xllim-feedstock README for instructions).
### Installing a conda distribution
If you don’t already have conda installed, several distributions are available. They all provide the conda command (or a drop-in equivalent) and can install packages from conda-forge:
| Distribution | Description |
|---|---|
| Micromamba | Standalone, fast C++ reimplementation of conda. No base environment, no Python required. Defaults to conda-forge. |
| Miniforge | Community-maintained minimal installer. Defaults to conda-forge. |
| Miniconda | Minimal installer from Anaconda. Defaults to Anaconda's `defaults` channel. |
| Anaconda | Full-featured distribution with many pre-installed packages and a GUI. Defaults to Anaconda's `defaults` channel. |
Note
Recommended: Micromamba or Miniforge. Both default to conda-forge exclusively, so environments
are clean and consistent. They are also significantly faster than the classic conda solver.
Example: installing Micromamba on Linux or macOS:
```shell
"${SHELL}" <(curl -L micro.mamba.pm/install.sh)
```
Restart your shell (or source ~/.bashrc), then verify:
```shell
micromamba --version
```
On Windows, see the Micromamba installation docs.
Warning
Anaconda/Miniconda users: channel conflict. Anaconda and Miniconda default to Anaconda’s proprietary
defaults channel, which is not the same as conda-forge. Mixing channels can lead to subtle
dependency conflicts and broken environments, because the two channels build packages independently and
their ABI guarantees are not always compatible.
If you use Anaconda or Miniconda, create a conda-forge-only environment before installing xLLiM:
```shell
conda create -n <my-env> -c conda-forge --override-channels
conda activate <my-env>
conda config --env --add channels conda-forge
conda config --env --set channel_priority strict
```
See the conda-forge documentation on channel conflicts for more details.
### Installing xLLiM
Create and activate an environment (<my-env> is a placeholder), then install xLLiM:
```shell
# With conda or mamba
conda create -n <my-env>
conda activate <my-env>
conda install -c conda-forge xllim

# With micromamba
micromamba create -n <my-env>
micromamba activate <my-env>
micromamba install -c conda-forge xllim
```
## 2. PyPI
Pre-built wheels are available for Linux x86_64 (manylinux_2_28) with Python 3.11 and 3.12:
```shell
pip install xllim
```
On macOS and Windows, no wheels are published; conda (section 1. Conda-forge (recommended)) or a container (section 3. Docker / Apptainer (Singularity)) is recommended instead.
Other Python versions (3.10, 3.13…): not officially tested. You can try building from the source
distribution - pip will fall back to it automatically when no wheel matches. However, you must first
install the native system libraries manually (see section 4. Manual installation (build from source)).
## 3. Docker / Apptainer (Singularity)
Pre-built container images are published for every release. Two formats are available: a standard Docker image for workstations and cloud environments, and an Apptainer / Singularity SIF image for HPC clusters where Docker is unavailable.
### Docker
A minimal Docker image based on python:3.11-slim is available, with xLLiM pre-installed.
Prerequisites: install Docker Engine (Linux/macOS) or Docker Desktop (Windows).
Pull the image:
```shell
docker pull ghcr.io/xllim-tools/xllim/xllim:latest
```
Run your project by mounting your source directory:
```shell
docker run -it \
  -v $(pwd):/workspace \
  -w /workspace \
  ghcr.io/xllim-tools/xllim/xllim:latest \
  python main.py
```
You can also open an interactive shell:
```shell
docker run -it -v $(pwd):/workspace -w /workspace ghcr.io/xllim-tools/xllim/xllim:latest bash
```
Note
Specific versions of the image are also available. See the xLLiM GHCR registry for the full list.
### Apptainer / Singularity (HPC)
Apptainer (formerly Singularity) is a container runtime designed for HPC environments where Docker is typically not available or not allowed. It runs rootless and is natively supported on most HPC schedulers (SLURM, PBS, etc.).
For every release, a SIF (Singularity Image Format) image is automatically built from the Docker
image and published to the same GHCR registry with a -sif tag suffix.
Prerequisite: Apptainer must be installed (typically already available on HPC clusters - check with your sysadmin). See the Apptainer quick-start guide.
Pull the latest SIF image:
```shell
apptainer pull xllim.sif oras://ghcr.io/xllim-tools/xllim/xllim:latest-sif
```
Or pull a specific version (replace <version> with the desired release tag):
```shell
apptainer pull xllim.sif oras://ghcr.io/xllim-tools/xllim/xllim:<version>-sif
```
Note
Available versions are listed in the xLLiM GHCR registry.
Run your project by binding your working directory:
```shell
apptainer run -B $(pwd):/workspace --pwd /workspace xllim.sif python main.py
```
Or open an interactive shell:
```shell
apptainer shell -B $(pwd):/workspace xllim.sif
```
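On a SLURM cluster, the `apptainer run` invocation above slots directly into a batch script. The sketch below writes a minimal job file; the job name, resource requests, and `main.py` entry point are illustrative placeholders, not xLLiM requirements, so adapt them to your cluster and project:

```shell
# Write a minimal SLURM batch script that runs the project inside the SIF image.
# All #SBATCH values are hypothetical examples; adjust for your cluster.
cat > run_xllim.slurm <<'EOF'
#!/bin/bash
#SBATCH --job-name=xllim-job
#SBATCH --ntasks=1
#SBATCH --cpus-per-task=4
#SBATCH --time=01:00:00

# Bind the submission directory and run the entry script inside the container
apptainer run -B "$PWD":/workspace --pwd /workspace xllim.sif python main.py
EOF
```

Submit the job with `sbatch run_xllim.slurm`; SLURM runs the script on a compute node, where Apptainer needs no root privileges.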
## 4. Manual installation (build from source)
If you need to compile xLLiM from source (e.g., for an unsupported platform or Python version, or for development),
please refer to the requirements below.
### Build dependencies
| Name | Version | Notes |
|---|---|---|
| C++ compiler | C++17 support | System-installed (e.g., g++ >= 9, clang++ >= 5) |
| CMake | >= 3.21 | Auto-installed by pip |
| Ninja | any | Auto-installed by pip |
| scikit-build-core | >= 0.7.0, < 1 | Auto-installed by pip |
| Python | >= 3.11, < 3.13 | System-installed |
| NumPy | >= 2 | Auto-installed by pip |
| pybind11 | >= 2.12 | Auto-installed by pip |
| Armadillo | >= 12.6, < 13 | System-installed |
| Boost | >= 1.78, < 2 | System-installed; components: system, thread, random |
| BLAS / LAPACK | any | System-installed (OpenBLAS, MKL, Apple Accelerate…) |
| Carma | >= 0.8.0 | System-installed (see below) |
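Before starting a build, it can save time to confirm that the system-installed executables from the table are on your PATH; CMake, Ninja, and the Python build dependencies are pulled in by pip automatically, and the C++ libraries (Armadillo, Boost, BLAS/LAPACK, Carma) are located by CMake at configure time rather than by this kind of check. A small sketch:

```shell
# Check the system-installed tools from the table above.
# Libraries are not covered here; CMake finds them at configure time.
missing=""
for tool in g++ python3; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "$tool: found"
  else
    missing="$missing $tool"
  fi
done
if [ -n "$missing" ]; then
  echo "Missing tools:$missing (install them before building)"
fi
```

If anything is reported missing, install it with your system package manager before running `pip install .`.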
### Build from source
Since xLLiM contains C++ extensions with Python bindings, building from source requires native system libraries that pip cannot install automatically. The instructions below use a Debian/Ubuntu-based system as an example - adapt package names for your distribution.
Clone the repository:
```shell
git clone https://github.com/xllim-tools/xllim.git
cd xllim
```
Install system dependencies:
```shell
sudo apt update

# Compilation tools
sudo apt-get install -y --no-install-recommends g++ cmake ninja-build

# BLAS/LAPACK (OpenBLAS shown; any conforming implementation works)
sudo apt-get install -y --no-install-recommends libopenblas-dev liblapack-dev

# Armadillo
sudo apt-get install -y --no-install-recommends libarmadillo-dev

# Python development headers
sudo apt-get install -y --no-install-recommends python3-dev

# Boost (components: system, thread, random)
sudo apt-get install -y --no-install-recommends libboost-dev libboost-system-dev libboost-thread-dev libboost-random-dev
```
Warning
Boost version: xLLiM requires Boost >= 1.78.0 (enforced by CMake). Ubuntu 24.04+ ships a compatible version out of the box. On older distributions (e.g., Ubuntu 22.04 ships 1.74), build Boost from source:

```shell
curl -L https://archives.boost.io/release/1.78.0/source/boost_1_78_0.tar.gz -o boost.tar.gz
tar -xzf boost.tar.gz
cd boost_1_78_0
./bootstrap.sh --prefix=/usr/local
./b2 --with-system --with-thread --with-random link=static variant=release -j$(nproc) install
cd .. && rm -rf boost_1_78_0 boost.tar.gz
```

Then set `CMAKE_ARGS` so CMake finds this installation instead of the system one:

```shell
export CMAKE_ARGS="-DBoost_ROOT=/usr/local -DBoost_NO_SYSTEM_PATHS=ON -DBoost_USE_STATIC_LIBS=ON"
```
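Before rebuilding xLLiM against a custom Boost, a quick check that the headers actually landed under the prefix CMake will search can save a confusing configure-time failure. This sketch assumes the `/usr/local` prefix used with `bootstrap.sh` above; adjust `BOOST_PREFIX` if you chose another location:

```shell
# Verify the custom Boost prefix before pointing CMake at it.
# /usr/local matches the --prefix passed to bootstrap.sh earlier.
BOOST_PREFIX=/usr/local
if [ -d "$BOOST_PREFIX/include/boost" ]; then
  echo "Boost headers found under $BOOST_PREFIX"
else
  echo "No Boost headers under $BOOST_PREFIX; check the --prefix used with bootstrap.sh"
fi
```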
Carma is not available in the standard apt repositories; build it from source:

```shell
curl -L https://github.com/RUrlus/carma/archive/refs/tags/v0.8.0.tar.gz -o carma.tar.gz
tar -xzf carma.tar.gz
cd carma-0.8.0 && mkdir build && cd build
cmake -DCARMA_INSTALL_LIB=ON ..
cmake --build . --config Release --target install
cd ../..
rm -rf carma-0.8.0 carma.tar.gz
```
Tip
If you are using conda, all of the above can be installed in one step:
```shell
conda install armadillo boost libblas liblapack carma
```
Build and install:
```shell
pip install .
```
Verify the installation:
```python
import xllim
```
### Manual CMake build (for development)
For C++ development or debugging, drive CMake directly:
```shell
cmake -S . -B build -G Ninja -DCMAKE_BUILD_TYPE=Debug -DXLLIM_BUILD_TESTS=ON
cmake --build build
ctest --test-dir build
```