Installation
You can install Outlines with pip:
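```bash
pip install outlines
```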
Outlines supports OpenAI, Transformers, Mamba, llama.cpp, ExLlamaV2, and vLLM, but you will need to install them manually:
```bash
pip install openai
pip install transformers datasets accelerate torch
pip install llama-cpp-python
pip install exllamav2 transformers torch
pip install mamba_ssm transformers torch
pip install vllm
```
If you encounter any problems using Outlines with these libraries, take a look at their installation instructions. The installation of `openai` and `transformers` should be straightforward, but other libraries have specific hardware requirements.
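As a quick sanity check after installing a backend, you can confirm that everything imports and print the installed Outlines version. This is a minimal sketch assuming you installed the Transformers backend; swap `transformers` for whichever library you chose:

```bash
# Print the installed Outlines version and confirm the backend imports cleanly.
python -c "from importlib.metadata import version; import outlines, transformers; print(version('outlines'))"
```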
Optional Dependencies
Outlines provides multiple optional dependency sets to support different backends and use cases. You can install them as needed using:
pip install "outlines[vllm]"
for vLLM, optimized for high-throughput inference.pip install "outlines[transformers]"
for Hugging Face Transformers.pip install "outlines[mlx]"
for MLX-LM, optimized for Apple silicon.pip install "outlines[openai]"
to use OpenAI’s API.pip install "outlines[llamacpp]"
for llama.cpp, a lightweight LLM inference engine.pip install "outlines[exllamav2]"
for ExLlamaV2, optimized for NVIDIA GPUs.
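If you need more than one backend, pip lets you combine extras in a single command. For example, using the extras listed above:

```bash
pip install "outlines[transformers,llamacpp]"
```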
Bleeding Edge
You can install the latest version of Outlines from the repository's `main` branch:
```bash
pip install git+https://github.com/dottxt-ai/outlines.git@main
```
This can be useful, for instance, when a fix has been merged but not yet released.
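To confirm which version you ended up with (for example, to verify that you are on a pre-release build from `main`), you can inspect the installed package metadata:

```bash
pip show outlines
```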
Installing for Development
See the contributing documentation for instructions on how to install Outlines for development, including an example using the dot-install method for one of the backends.
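The contributing documentation is authoritative, but as a rough sketch, a dot-install (an editable install of a local clone) typically looks like the following; the exact optional extras to include are defined in the project's pyproject.toml:

```bash
git clone https://github.com/dottxt-ai/outlines.git
cd outlines
pip install -e .   # editable ("dot") install of the local checkout
```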