Clarifai MCP Servers

Build performant MCP Servers with Clarifai


The Model Context Protocol (MCP) is a standardized, secure framework for building servers that expose data and functionality to LLM-based applications. Think of it as a specialized web API built specifically for LLM interactions.
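Under the hood, MCP clients and servers exchange JSON-RPC 2.0 messages. As an illustrative sketch (the `tools/list` method name follows the MCP specification; the exact envelope your transport produces may differ), a request asking a server to enumerate its tools looks like this:

```python
import json

# A JSON-RPC 2.0 request asking an MCP server to enumerate its tools.
request = {
    "jsonrpc": "2.0",    # protocol version, always "2.0"
    "id": 1,             # client-chosen id used to match the response
    "method": "tools/list",
}

# Serialize to the string that actually travels over the wire.
wire_message = json.dumps(request)
print(wire_message)
```

The server replies with a response envelope carrying the same `id`, which is how the client pairs requests with results.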

Clarifai provides the infrastructure and tooling to define, deploy, and host custom MCP servers. This lets you seamlessly integrate your proprietary data sources, custom APIs, and application-specific functionality with various LLM applications.

Let's illustrate how you can build a simple MCP server.


Prerequisites

Install Clarifai Package

Install the latest version of the clarifai Python SDK. This also installs the Clarifai Command Line Interface (CLI), which we'll use for uploading the model.

 pip install --upgrade clarifai 

Set a PAT Key

You need to set the CLARIFAI_PAT (Personal Access Token) as an environment variable. You can generate a PAT on your personal settings page by navigating to the Security section.

This token is essential for authenticating your connection to the Clarifai platform.

 export CLARIFAI_PAT=YOUR_PERSONAL_ACCESS_TOKEN_HERE 
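If you want to confirm the token is visible to Python before going further, a quick standard-library check (CLARIFAI_PAT is the variable name the SDK reads; the helper function here is just for illustration) looks like:

```python
import os

def check_pat() -> bool:
    """Return True if CLARIFAI_PAT is set and non-empty."""
    pat = os.environ.get("CLARIFAI_PAT", "")
    if not pat:
        print("CLARIFAI_PAT is not set; export it before using the SDK.")
        return False
    return True
```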

Create Files

Create a project directory and organize your files as indicated below to fit the requirements of building servers for the Clarifai platform.

your_model_directory/
├── 1/
│   └── model.py
├── requirements.txt
├── config.yaml
└── client.py
  • your_model_directory/ – The root directory containing all files related to your server.
    • 1/ – A subdirectory that holds the model file (note that the folder must be named 1).
      • model.py – Contains the main MCP server implementation.
    • requirements.txt – Lists the Python dependencies required to run your server.
    • config.yaml – Contains metadata and configuration settings, such as compute requirements, needed for uploading the model to Clarifai.
    • client.py – Contains the example client demonstrating usage.

Add the following snippets to each of the respective files.

model.py

import json
import os
from typing import Any

from pydantic import Field

from clarifai.runners.models.mcp_class import MCPModelClass

from fastmcp import FastMCP  # Use fastmcp v2, not the built-in mcp

# Initialize the server
server = FastMCP("my-first-mcp-server", instructions="", stateless_http=True)


@server.tool("calculate_sum", description="Add two numbers together")
def sum(a: Any = Field(description="first number"), b: Any = Field(description="second number")):
    return float(a) + float(b)


@server.tool("weather", description="Get the current weather information for the given city")
def weather(city: str = Field(description="The city to get weather for")):
    if city.lower() == "philly":
        return "It's always sunny in Philadelphia!"
    elif city.lower() == "seattle":
        return "It's always rainy in Seattle!"
    else:
        return f"In {city} it's 74 F and cloudy."


@server.tool("list_files", description="List files in a directory")
def list_files(directory: str = Field(description="The directory to list files in")) -> list[str] | str:
    try:
        return os.listdir(directory)
    except FileNotFoundError:
        return f"Directory {directory} not found."


@server.tool("send_slack_message", description="Send a message to a Slack channel")
def send_slack_message(
    channel: str = Field(description="The Slack channel to send the message to"),
    message: str = Field(description="The message to send"),
) -> str:
    import requests

    # Read the Slack API token from the environment
    slack_token = os.environ.get("SLACK_API_TOKEN")
    headers = {
        "Content-Type": "application/json",
        "Authorization": f"Bearer {slack_token}",
    }
    payload = {
        "channel": channel,
        "text": message,
    }
    url = "https://slack.com/api/chat.postMessage"
    response = requests.post(url, headers=headers, data=json.dumps(payload))
    response.raise_for_status()  # Raise an exception for HTTP errors (4xx or 5xx)
    response_json = response.json()

    if response_json.get("ok"):
        return "Message sent successfully!"
    else:
        return f"Failed to send message: {response.text}"


@server.tool("sandbox", description="Run code")
def sandbox(code: str = Field(description="Code to run")) -> str:
    # Warning: eval() executes arbitrary code; for demonstration only.
    return eval(code)


# Static resource
@server.resource("config://version")
def get_version():
    return "2.0.1"


# Dynamic resource template
@server.resource("users://{user_id}/profile")
def get_profile(user_id: int):
    # Fetch profile for user_id...
    return {"name": f"User {user_id}", "status": "active"}


@server.prompt()
def summarize_request(text: str) -> str:
    """Generate a prompt asking for a summary."""
    return f"Please summarize the following text:\n\n{text}"


class MyModelClass(MCPModelClass):
    def get_server(self) -> FastMCP:
        return server
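Conceptually, the server routes each tools/call request to the matching decorated function. The toy dispatcher below is not fastmcp's actual implementation, just a sketch of how a name-to-function registry resolves a request like the ones the client sends later:

```python
# Toy registry mimicking how a tool decorator maps names to functions.
TOOLS = {}

def tool(name):
    def register(fn):
        TOOLS[name] = fn
        return fn
    return register

@tool("weather")
def weather(city: str) -> str:
    if city.lower() == "philly":
        return "It's always sunny in Philadelphia!"
    return f"In {city} it's 74 F and cloudy."

def dispatch(request: dict):
    """Route a tools/call-style request to the registered function."""
    params = request["params"]
    return TOOLS[params["name"]](**params["arguments"])

result = dispatch({
    "method": "tools/call",
    "params": {"name": "weather", "arguments": {"city": "philly"}},
})
print(result)  # It's always sunny in Philadelphia!
```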

requirements.txt

clarifai==11.4.7
anyio==4.9.0
mcp==1.9.0
fastmcp==2.3.4

config.yaml

important

In the model section of the config.yaml file, specify your model ID, Clarifai user ID, and Clarifai app ID. These will define where your model will be uploaded on the Clarifai platform.

build_info:
  python_version: '3.11'
inference_compute_info:
  cpu_limit: 500m
  cpu_memory: 500Mi
  num_accelerators: 0
model:
  app_id: app-id
  id: model-id
  model_type_id: mcp
  user_id: user-id

client.py

import json

from clarifai.client import Model

model = Model.from_current_context()

# Example model predictions from different model methods:

# List the tools exposed by the server
s = json.dumps({
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/list",
})

response = model.mcp_transport(msg=s)
print(response)

# Call the "weather" tool
s = json.dumps({
    "jsonrpc": "2.0",
    "id": 2,
    "method": "tools/call",
    "params": {
        "name": "weather",
        "arguments": {
            "city": "philly"
        },
        "model_config": {}
    }
})

response = model.mcp_transport(msg=s)
print(response)
Mock Data

This example includes mock data and fallback implementations when external services are not available, allowing you to test the MCP interface without requiring all external dependencies.

Run an Example

After setting up the required files, navigate to your directory and run the following command to install the dependencies:

 pip install -r requirements.txt 

Then, run the client example:

 python client.py 
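The printed responses are JSON-RPC reply envelopes. Assuming a standard envelope (the exact shape returned by Clarifai's transport may differ, and the sample response string here is hypothetical), you can pull the tool output out like this:

```python
import json

def extract_result(raw: str):
    """Return the 'result' of a JSON-RPC response, or raise on an error reply."""
    reply = json.loads(raw)
    if "error" in reply:
        raise RuntimeError(reply["error"].get("message", "unknown MCP error"))
    return reply.get("result")

# A hypothetical response to the weather tools/call above.
raw = '{"jsonrpc": "2.0", "id": 2, "result": {"content": [{"type": "text", "text": "It\'s always sunny in Philadelphia!"}]}}'
print(extract_result(raw))
```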

Upload to Clarifai

You can upload the MCP server to the Clarifai platform by navigating to its directory and running the following command:

 clarifai model upload