flash-attn compatibility checklist
Most “install flash-attn” failures are compatibility mismatches. Use this checklist to pick Python, PyTorch, and CUDA versions that actually work together, then install from a matching wheel.
- Python Versions: Python 3.10-3.13 support, version requirements per flash-attn release
- PyTorch Versions: PyTorch 2.0-2.9 compatibility, CUDA build requirements
- Windows Support: Windows wheel availability, WSL2 alternative, common issues
Compatibility Checklist
1. Platform tag: choose the platform you're installing on (Linux x86_64, Linux ARM64, or Windows).
2. Python version: ensure your Python version matches the wheel's tag (e.g., cp312 for Python 3.12). See Python compatibility.
3. PyTorch version: your installed torch build must match the torch version the wheel was compiled against (the wheel's torch ABI). See PyTorch compatibility.
4. CUDA version: your driver and runtime (and, if building from source, your CUDA toolkit) must be compatible with the wheel's CUDA build.
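To see which tags your own environment corresponds to, here is a stdlib-only sketch (the torch import is optional and guarded, since torch may not be installed yet):

```python
import sys
import sysconfig

# Python tag as it appears in wheel filenames, e.g. cp312 for CPython 3.12.
py_tag = f"cp{sys.version_info.major}{sys.version_info.minor}"

# Platform tag, e.g. linux_x86_64 or win_amd64
# (manylinux wheels cover plain linux platforms).
plat_tag = sysconfig.get_platform().replace("-", "_").replace(".", "_")

print("Python tag:", py_tag)
print("Platform:  ", plat_tag)

# torch is optional here; if installed, report its version and CUDA build.
try:
    import torch
    print("torch:     ", torch.__version__)   # e.g. 2.4.1+cu121
    print("CUDA build:", torch.version.cuda)  # None on CPU-only builds
except ImportError:
    print("torch not installed")
```

Compare these values against the tags embedded in the wheel filename before downloading.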
Check your environment
```shell
python --version
python -c "import torch; print(torch.__version__)"
nvidia-smi
```

If you don't have a GPU or you're on CPU-only PyTorch, many flash-attn builds won't apply.
Pick a wheel + install
Once the versions line up, use the wheel finder to get a compatible wheel URL:
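Once you have a candidate wheel URL, you can sanity-check the filename's tags against your interpreter before installing. A minimal sketch, assuming a standard PEP 427 wheel filename; the filename below is a made-up example, not a real release asset:

```python
import sys

def wheel_tags(filename: str) -> dict:
    """Split a PEP 427 wheel filename into its tag components."""
    stem = filename.removesuffix(".whl")
    # Format: name-version-pythontag-abitag-platformtag
    name, version, py_tag, abi_tag, plat_tag = stem.rsplit("-", 4)
    return {"name": name, "version": version, "python": py_tag,
            "abi": abi_tag, "platform": plat_tag}

# Hypothetical filename for illustration; check the wheel finder for real ones.
tags = wheel_tags("flash_attn-2.6.3+cu123torch2.4-cp312-cp312-linux_x86_64.whl")

# The wheel's Python tag must match your interpreter (item 2 of the checklist).
this_py = f"cp{sys.version_info.major}{sys.version_info.minor}"
print(tags["python"], "matches interpreter:", tags["python"] == this_py)
```

If the Python or platform tag doesn't match, pip will refuse the wheel with a "not a supported wheel on this platform" error, so this check saves a failed download.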