Deploy Open-Source MCP Servers
Upload, deploy, and interact with open-source MCP servers on Clarifai
Besides building a custom MCP (Model Context Protocol) server for the Clarifai platform, you can also upload any open-source MCP server and expose it as a managed API endpoint — just like any model in the platform.
You can integrate third-party MCP servers with Clarifai by adding an mcp_server configuration to your config.yaml file.
This allows you to:
- Expose MCP servers as HTTP APIs accessible through Clarifai
- Use the FastMCP client to interact with deployed MCP servers
- Seamlessly integrate MCP tools with LLMs to extend model capabilities
Step 1: Perform Prerequisites
Get an MCP Server
You can get open-source MCP servers from third-party repositories, such as mcpservers.org or mcp.so.
For this example, let's use the DuckDuckGo MCP server to demonstrate how to upload and deploy an open-source MCP server on Clarifai. This server provides tools for web search, browsing, and information retrieval, and requires no authentication tokens or secrets — making it easy to deploy and use. You can also follow its tutorial here.
Get an Agentic Model
Integrating large language models (LLMs) with MCP servers enables agentic capabilities, allowing models to discover and use external tools autonomously to complete tasks. MCP servers expose functionalities that models can invoke as function-calling tools during conversations.
With MCP server integration, an agentic model can iteratively discover tools, execute them, and reason over the results to produce more capable and context-aware responses.
Note: For a model to support agentic behavior through MCP servers on the Clarifai platform, it must extend the standard OpenAIModelClass with the AgenticModelClass. This enables:
- Tool discovery and execution handled by the agentic model class
- Iterative tool calling within a single predict or generate request
- Compatibility with the OpenAI-compatible API and Clarifai SDKs
- Support for both streaming and non-streaming modes
You can see an example implementation of AgenticModelClass in this 1/model.py file.
To upload a model with agentic capabilities, simply use the AgenticModelClass — all other functionalities and steps remain the same as uploading a standard model on Clarifai. You can follow this example.
Several example models with agentic capabilities enabled are available on the Clarifai platform.
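Since agentic models remain compatible with the OpenAI-compatible API, calling one looks the same as calling any other chat model. The sketch below illustrates this; the model path is a placeholder, and the base URL reflects Clarifai's OpenAI-compatible endpoint (verify both against your deployment):

```python
import os

def chat_with_agentic_model(prompt: str) -> str:
    """Send a prompt to a deployed agentic model through Clarifai's
    OpenAI-compatible endpoint. The model URL below is a placeholder."""
    # Imported lazily so the sketch can be read without `openai` installed.
    from openai import OpenAI

    client = OpenAI(
        base_url="https://api.clarifai.com/v2/ext/openai/v1",
        api_key=os.environ["CLARIFAI_PAT"],
    )
    response = client.chat.completions.create(
        # Placeholder model URL; use your agentic model's actual URL.
        model="https://clarifai.com/USER_ID/APP_ID/models/MODEL_ID",
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content

# chat_with_agentic_model("Search for today's top headline.")  # needs a live deployment
```

During such a call, the agentic model class handles any tool discovery and iterative tool execution server-side, so the client code stays a plain chat-completion request.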
Install Packages
Install the following Python packages to work with the DuckDuckGo Browser MCP server:
- clarifai — The latest version of the Clarifai Python SDK, required to integrate your MCP server with the Clarifai platform. This package also comes with the Clarifai Command Line Interface (CLI), which you’ll use to upload the server.
- fastmcp — The core framework for interacting with MCP servers.
- openai — The OpenAI client library, used with Clarifai’s OpenAI-compatible endpoint to run inferences.
- anyio — An asynchronous I/O library used by FastMCP.
- requests — A lightweight HTTP client for making HTTP requests.
- mcp — The Model Context Protocol library.
You can run the following command to install them:
- Bash
pip install --upgrade clarifai fastmcp openai anyio requests mcp
Get Credentials
You need to have the following Clarifai credentials:
- App ID — Create a Clarifai application and get its ID. This is where your MCP server will reside on the Clarifai platform.
- User ID — In the collapsible left sidebar, select Settings and choose Account from the dropdown list. Then, locate your user ID.
- Personal Access Token (PAT) — From the same Settings option, choose Secrets to generate or copy your PAT. This token is used to authenticate your connection with the Clarifai platform.
Then, set the CLARIFAI_PAT as an environment variable.
- Unix-Like Systems
- Windows
export CLARIFAI_PAT=YOUR_PERSONAL_ACCESS_TOKEN_HERE
set CLARIFAI_PAT=YOUR_PERSONAL_ACCESS_TOKEN_HERE
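To confirm the variable is visible to your Python process before proceeding, a quick check (nothing Clarifai-specific here):

```python
import os

# Report whether the PAT is visible to the current process.
pat = os.environ.get("CLARIFAI_PAT", "")
status = "set" if pat else "missing"
print(f"CLARIFAI_PAT is {status}")
```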
Create Files
On the Clarifai platform, MCP servers are treated just like models and follow the same underlying architecture.
To upload an MCP server, you need to create a project directory and organize your files according to Clarifai’s custom model requirements, as shown below:
your_model_directory/
├── 1/
│ └── model.py
├── requirements.txt
├── config.yaml
├── Dockerfile
└── client.py
- your_model_directory/ — The root directory containing all files related to your MCP server.
- 1/ — A required subdirectory that contains the model implementation (note that the folder name is 1).
- model.py — Implements the core logic of the MCP server.
- requirements.txt — Specifies the Python dependencies required to run the server.
- config.yaml — Defines metadata and configuration settings used when uploading the MCP server to Clarifai.
- Dockerfile — Defines the runtime environment used to build and run your MCP server on the Clarifai platform.
- client.py — An example client script you can use to interact with the MCP server after it has been uploaded.
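If you are starting from scratch, the layout above can be scaffolded in one go (the directory name follows the example; rename your_model_directory as you like):

```shell
mkdir -p your_model_directory/1
touch your_model_directory/1/model.py \
      your_model_directory/requirements.txt \
      your_model_directory/config.yaml \
      your_model_directory/Dockerfile \
      your_model_directory/client.py
```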
Create a Cluster and Nodepool
You'll need to deploy your MCP server to a dedicated compute cluster and nodepool. This action provisions the necessary resources to run your server and handle requests efficiently.
Learn how to create a cluster and nodepool here.
Note: Ensure that your cluster and nodepool meet the compute resource requirements specified in your
config.yamlfile.
Step 2: Prepare model.py File
When building a custom MCP server from scratch, model.py is where you implement the server’s core logic, including defining and exposing tools.
However, when uploading an open-source MCP server, you do not need to reimplement this logic. Instead, you only need to define a class that inherits from StdioMCPModelClass, which is designed to run and manage stdio-based MCP servers, such as the Browser MCP server.
Here is the model.py file that defines a StdioMCPModelClass for the Browser MCP server:
- model.py
from clarifai.runners.models.stdio_mcp_class import StdioMCPModelClass

class BrowserMCPServerClass(StdioMCPModelClass):
    pass
The StdioMCPModelClass abstracts away the complexity of managing stdio-based MCP servers. By inheriting from it and configuring the mcp_server section in config.yaml, your server is ready to be uploaded.
Specifically, StdioMCPModelClass automatically:
- Starts the MCP server as a stdio process
- Discovers all available MCP tools
- Exposes the tools through an HTTP API
- Handles tool execution and response formatting
This makes it easy to deploy open-source MCP servers on Clarifai with minimal code.
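The stdio mechanics can be pictured with a minimal sketch: the class launches the configured command as a child process and exchanges newline-delimited JSON-RPC messages over its stdin/stdout. The echo-style child below is a stand-in for a real server binary such as uvx duckduckgo-mcp-server:

```python
import subprocess
import sys

# Stand-in for the mcp_server section of config.yaml; the child process
# simply echoes one line back, mimicking a request/response round trip.
mcp_server = {"command": sys.executable, "args": ["-c", "print(input())"]}

proc = subprocess.Popen(
    [mcp_server["command"], *mcp_server["args"]],
    stdin=subprocess.PIPE,
    stdout=subprocess.PIPE,
    text=True,
)
# A real client would exchange JSON-RPC messages here (initialize,
# tools/list, tools/call); we just round-trip a single line.
out, _ = proc.communicate('{"jsonrpc": "2.0", "method": "ping"}\n')
print(out.strip())  # → {"jsonrpc": "2.0", "method": "ping"}
```

This is only an illustration of the transport; StdioMCPModelClass layers the full MCP handshake, tool discovery, and HTTP exposure on top of it.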
Step 3: Prepare the config.yaml File
The config.yaml file defines the build, deployment, and runtime configuration for a custom model — or, in this case, an MCP server — on the Clarifai platform. It tells Clarifai how to build the execution environment and where the server should live within your account.
When integrating an open-source MCP server, this file is also where you specify the server’s mcp_server configuration, which you can obtain from the repository where it's hosted.
Here is the config.yaml file for the Browser MCP server:
- config.yaml
model:
  id: browser-mcp-server
  app_id: app_id
  user_id: user_id
  model_type_id: mcp

build_info:
  python_version: "3.11"

inference_compute_info:
  cpu_limit: 1000m
  cpu_memory: 1Gi
  num_accelerators: 0

mcp_server:
  command: "uvx"
  args: ["duckduckgo-mcp-server"]
Let’s break down what each part of the configuration does.
- model — Defines where the MCP server will be uploaded on the Clarifai platform:
  - id — Provide a unique identifier for the model (server)
  - app_id — Provide your Clarifai app where the server will reside
  - user_id — Provide your Clarifai user ID
  - model_type_id — Specifies the model type; use mcp for MCP servers
- build_info — Specifies the Python version used to build the runtime environment. Note that Clarifai currently supports Python 3.11 and 3.12 (default).
- inference_compute_info — Defines the compute resources allocated when the MCP server is running:
  - cpu_limit: 1000m — Allocates 1 CPU core
  - cpu_memory: 1Gi — Allocates 1 GB of RAM
  - num_accelerators: 0 — No hardware accelerators (e.g., GPUs), which is typical for MCP servers
Note: Ensure your Clarifai compute cluster and nodepool meet these requirements before deploying the model.
- mcp_server — Specifies how the MCP server process is started:
  - command — The executable used to launch the server (e.g., npx, uvx, python)
  - args — Arguments passed to the command
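Before uploading, it can save a round trip to sanity-check that the file contains the sections described above. This is only a sketch (in practice you would parse the file with PyYAML); it checks for required keys as substrings of the example configuration:

```python
# The example config.yaml content from above, inlined for the check.
config_text = """\
model:
  id: browser-mcp-server
  app_id: app_id
  user_id: user_id
  model_type_id: mcp

build_info:
  python_version: "3.11"

inference_compute_info:
  cpu_limit: 1000m
  cpu_memory: 1Gi
  num_accelerators: 0

mcp_server:
  command: "uvx"
  args: ["duckduckgo-mcp-server"]
"""

# Keys Clarifai expects for an MCP server upload.
required = ["model:", "model_type_id: mcp", "inference_compute_info:", "mcp_server:", "command:"]
missing = [key for key in required if key not in config_text]
print("missing sections:", missing)  # → missing sections: []
```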
To deploy a different MCP server, simply update the mcp_server section in config.yaml. For example:
- GitHub MCP Server (the official GitHub MCP server)
mcp_server:
  command: "npx"
  args: ["-y", "@modelcontextprotocol/server-github"]
Note: Learn how to deploy it on Clarifai here.
- Custom Python MCP Server (a custom, user-implemented MCP server)
mcp_server:
  command: "python"
  args: ["-m", "my_mcp_server"]
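Once a server is deployed, the client.py mentioned earlier can use the FastMCP client to talk to it over HTTP. A minimal sketch follows; the endpoint URL pattern and authentication handling are assumptions, so copy the real URL from your deployed server's page on Clarifai:

```python
import asyncio

async def list_deployed_tools(server_url: str) -> list[str]:
    # fastmcp is imported lazily so the sketch can be read standalone;
    # install it with `pip install fastmcp`.
    from fastmcp import Client

    # The HTTP transport is inferred from the URL by the FastMCP client.
    async with Client(server_url) as client:
        tools = await client.list_tools()
        return [tool.name for tool in tools]

# Hypothetical endpoint pattern; replace with the URL shown for your deployed server.
SERVER_URL = (
    "https://api.clarifai.com/v2/ext/mcp/v1/"
    "users/USER_ID/apps/APP_ID/models/browser-mcp-server"
)
# asyncio.run(list_deployed_tools(SERVER_URL))  # requires a live deployment and CLARIFAI_PAT
```

The returned tool names are exactly those the open-source server exposes; for the DuckDuckGo server that includes its web search and content-fetch tools.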