Toolkits
Download and run AI models locally with Clarifai toolkits
Toolkits let you run large language models (LLMs) and other generative AI models locally on your machine. They provide a standardized way to initialize a new Clarifai model directory structure that's fully compatible with the Clarifai platform.
Using toolkits, you can download and set up models from popular sources such as Hugging Face or Ollama and quickly prepare them for deployment or integration with Clarifai.
To download and initialize a model using a toolkit we support, run:
clarifai model init --toolkit <toolkit-name>
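For example, to scaffold a model directory backed by the Ollama toolkit, you might run the command below. The lowercase identifier ollama is an assumption for illustration; run clarifai model init --help to see the toolkit names your CLI version accepts.

# Scaffold a Clarifai-compatible model directory using the Ollama toolkit (identifier assumed)
clarifai model init --toolkit ollama

This initializes the standard Clarifai model directory structure described above, ready to customize and deploy to the Clarifai platform. The toolkits currently supported are listed below.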
- Ollama: Download and run Ollama models locally and expose them via a public API
- Hugging Face: Download and run Hugging Face models locally and make them available via a public API
- LM Studio: Download and run LM Studio models locally and expose them via a public API
- vLLM: Download and serve vLLM models locally and expose them via a public API
- SGLang: Run models using the SGLang runtime format and make them available via a public API