Providers
Make inferences with Clarifai using an OpenAI-compatible format
Clarifai supports various providers that you can use to interact with different models. We offer an OpenAI-compatible API endpoint, allowing any OpenAI-compatible library or client to seamlessly send requests directly to Clarifai.
Base URL for Clarifai's OpenAI-compatible endpoint: https://api.clarifai.com/v2/ext/openai/v1
This integration capability offers several advantages, including:
- Access to diverse models — Harness Clarifai's rich array of models directly within your OpenAI projects, expanding your AI capabilities.
- Standardized interaction — Interact with Clarifai-hosted models using familiar OpenAI API patterns and interfaces, reducing the learning curve and streamlining development.
- Enhanced flexibility — Leverage the power of Clarifai's platform while maintaining the flexibility of your chosen OpenAI development environment.
Usage-based billing is handled directly through Clarifai — not through OpenAI or any other provider. Also, while most OpenAI parameters are supported, certain advanced features may be unavailable depending on the specific model or endpoint.
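Because the endpoint follows the OpenAI chat-completions format, a request can be built with nothing but the Python standard library. The sketch below constructs (but does not send) such a request against the base URL above; `CLARIFAI_PAT` stands in for your Clarifai Personal Access Token, and the model identifier is a placeholder to be replaced with a real Clarifai model.

```python
import json
import urllib.request

# Clarifai's OpenAI-compatible endpoint (from this page).
BASE_URL = "https://api.clarifai.com/v2/ext/openai/v1"

# Placeholders for illustration -- substitute your own Clarifai
# Personal Access Token and a real model identifier.
CLARIFAI_PAT = "YOUR_CLARIFAI_PAT"
MODEL = "model-id-placeholder"

# A standard OpenAI chat-completions payload; the Clarifai endpoint
# accepts the same request shape.
payload = {
    "model": MODEL,
    "messages": [{"role": "user", "content": "Hello!"}],
}

request = urllib.request.Request(
    f"{BASE_URL}/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Authorization": f"Bearer {CLARIFAI_PAT}",
        "Content-Type": "application/json",
    },
    method="POST",
)

# urllib.request.urlopen(request) would send it; omitted here because
# a valid PAT is required.
print(request.full_url)
```

Equivalently, any OpenAI-compatible SDK can be pointed at this endpoint by setting its base URL to the address above and passing a Clarifai PAT as the API key, as the provider-specific pages below show.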
📄️ OpenAI
Run inferences on Clarifai models using OpenAI
📄️ LiteLLM
Run inferences on Clarifai models using LiteLLM
📄️ Vercel
Run inferences on Clarifai models using Vercel