xFormers is:

- **Customizable building blocks**: Independent, customizable building blocks that can be used without boilerplate code. The components are domain-agnostic and xFormers is used by researchers in vision, NLP and more.
- **Research first**: xFormers contains bleeding-edge components that are not yet available in mainstream libraries like PyTorch.
- **Built with efficiency in mind**: Because speed of iteration matters, components are as fast and as memory-efficient as possible. xFormers contains its own CUDA kernels, but dispatches to other libraries when relevant.
## Installing xFormers

Latest stable with conda (linux):

```bash
conda install xformers -c xformers
```

Latest stable with pip (pick the wheel index matching your PyTorch CUDA/ROCm build):

```bash
# cuda 11.8 version
python -m pip install -U xformers --index-url https://download.pytorch.org/whl/cu118
# cuda 12.1 version
python -m pip install -U xformers --index-url https://download.pytorch.org/whl/cu121
# rocm 6.1 version (linux only)
python -m pip install -U xformers --index-url https://download.pytorch.org/whl/rocm6.1
```

Development binaries:

```bash
# Use either conda or pip, same requirements as for the stable version above
conda install xformers -c xformers/label/dev
pip install --pre -U xformers
```

Install latest from source:

```bash
# (Optional) Makes the build much faster
pip install ninja
# Set TORCH_CUDA_ARCH_LIST if running and building on different GPU types
pip install -v -U git+https://github.com/facebookresearch/xformers.git@main#egg=xformers
# (this can take dozens of minutes)
```
## Benchmarks

**Memory-efficient MHA**

Setup: A100 on f16, measured total time for a forward+backward pass.

Note that this is exact attention, not an approximation, obtained just by calling `xformers.ops.memory_efficient_attention`.
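To see why a memory-efficient formulation can still be *exact*: softmax statistics can be accumulated online over key/value blocks, so the full attention matrix is never materialized. Below is a minimal NumPy sketch of that idea (this is only an illustration of the math, not xFormers' actual CUDA kernel; the function names are made up):

```python
import numpy as np

def naive_attention(q, k, v):
    # Reference: materializes the full [M, M] attention matrix.
    s = (q @ k.T) / np.sqrt(q.shape[-1])
    p = np.exp(s - s.max(axis=-1, keepdims=True))
    p /= p.sum(axis=-1, keepdims=True)
    return p @ v

def chunked_attention(q, k, v, chunk=16):
    # Walks over key/value blocks while keeping running softmax
    # statistics (row-wise max and normalizer), so only an [M, chunk]
    # slice of scores exists at any time -- yet the result is exact.
    scale = 1.0 / np.sqrt(q.shape[-1])
    row_max = np.full((q.shape[0], 1), -np.inf)  # running row-wise max
    denom = np.zeros((q.shape[0], 1))            # running softmax normalizer
    acc = np.zeros((q.shape[0], v.shape[1]))     # running weighted sum of values
    for i in range(0, k.shape[0], chunk):
        s = (q @ k[i:i + chunk].T) * scale
        new_max = np.maximum(row_max, s.max(axis=-1, keepdims=True))
        p = np.exp(s - new_max)
        rescale = np.exp(row_max - new_max)  # corrects previously accumulated terms
        denom = denom * rescale + p.sum(axis=-1, keepdims=True)
        acc = acc * rescale + p @ v[i:i + chunk]
        row_max = new_max
    return acc / denom

rng = np.random.default_rng(0)
q, k, v = (rng.standard_normal((64, 32)) for _ in range(3))
assert np.allclose(naive_attention(q, k, v), chunked_attention(q, k, v))
```

Both functions compute the same softmax-attention output up to floating-point error; the chunked variant just trades the O(M²) score matrix for running statistics, which is the principle the fused kernels build on.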
**More benchmarks**
xFormers provides many components, and more benchmarks are available in BENCHMARKS.md.
## Troubleshooting

This command will provide information on an xFormers installation, and what kernels are built/available:

```bash
python -m xformers.info
```
If the build from source fails, check the following:

- Make sure that NVCC and the current CUDA runtime match. Depending on your setup, you may be able to change the CUDA runtime with `module unload cuda; module load cuda/xx.x`, possibly also `nvcc`.
- Make sure that the `TORCH_CUDA_ARCH_LIST` env variable is set to the architectures that you want to support. A suggested setup (slow to build but comprehensive) is `export TORCH_CUDA_ARCH_LIST="6.0;6.1;6.2;7.0;7.2;7.5;8.0;8.6"`.
- If the build runs out of memory, reduce the number of parallel compilation jobs with `MAX_JOBS` (eg `MAX_JOBS=2`).
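Putting those two knobs together, an illustrative source build for a single GPU architecture on a memory-constrained machine might look like this (the `8.0` value targets A100-class GPUs and is only an example; use the compute capability of your own hardware):

```shell
# Build only for one architecture (here SM 8.0, i.e. A100-class GPUs; example value)
export TORCH_CUDA_ARCH_LIST="8.0"
# Cap parallel compile jobs so the build does not run out of RAM
MAX_JOBS=2 pip install -v -U git+https://github.com/facebookresearch/xformers.git@main#egg=xformers
```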
- If you get an `UnsatisfiableError` when installing with conda, make sure you have PyTorch installed in your conda environment, and that your setup (PyTorch version, cuda version, python version, OS) matches an existing binary for xFormers.

## License

xFormers has a BSD-style license, as found in the LICENSE file.
## Citing xFormers

If you use xFormers in your publication, please cite it by using the following BibTeX entry:
```bibtex
@Misc{xFormers2022,
  author =       {Benjamin Lefaudeux and Francisco Massa and Diana Liskovich and Wenhan Xiong and Vittorio Caggiano and Sean Naren and Min Xu and Jieru Hu and Marta Tintore and Susan Zhang and Patrick Labatut and Daniel Haziza and Luca Wehrstedt and Jeremy Reizenstein and Grigory Sizov},
  title =        {xFormers: A modular and hackable Transformer modelling library},
  howpublished = {\url{https://github.com/facebookresearch/xformers}},
  year =         {2022}
}
```
## Credits

The following repositories are used in xFormers, either in close to original form or as an inspiration: