Upload Your First Model
Upload a custom model to the Clarifai platform
The Clarifai platform allows you to upload custom models for a wide range of use cases. With just a few simple steps, you can get your models up and running and leverage the platform’s powerful capabilities.
Let’s walk through how to upload a simple custom model that appends the phrase "Hello World" to any input text.
To learn more about how to upload different types of models, check out this comprehensive guide.
Step 1: Perform Prerequisites
Install Clarifai Package
Install the latest version of the clarifai Python SDK. This also installs the Clarifai Command Line Interface (CLI), which we'll use for uploading the model.
- Bash
pip install --upgrade clarifai
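To confirm the package landed in the environment you expect, you can check the installed version from Python. This is just a quick sanity check; it assumes pip installed into the same interpreter you run below.

```python
# Check which clarifai version (if any) is installed in this environment
from importlib.metadata import PackageNotFoundError, version

try:
    print("clarifai", version("clarifai"))
except PackageNotFoundError:
    print("clarifai is not installed in this environment")
```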
Set a PAT Key
You need to set the CLARIFAI_PAT (Personal Access Token) as an environment variable. You can generate the PAT key in your personal settings page by navigating to the Security section.
This token is essential for authenticating your connection to the Clarifai platform.
- Unix-Like Systems
- Windows
export CLARIFAI_PAT=YOUR_PERSONAL_ACCESS_TOKEN_HERE
set CLARIFAI_PAT=YOUR_PERSONAL_ACCESS_TOKEN_HERE
On Windows, the Clarifai Python SDK expects a HOME environment variable, which isn’t set by default. To ensure compatibility with file paths used by the SDK, set HOME to the value of your USERPROFILE. You can set it in your Command Prompt this way: set HOME=%USERPROFILE%.
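Before moving on, you can verify from Python that the token is visible to your scripts. This minimal sketch only checks that the variable is set in the current environment; it does not validate the token itself.

```python
import os

# The SDK reads the token from this environment variable at runtime
pat = os.environ.get("CLARIFAI_PAT")

if pat:
    print("CLARIFAI_PAT is set")
else:
    print("CLARIFAI_PAT is missing - export it before using the SDK")
```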
Step 2: Create Files
You can automatically generate the required files by running the clarifai model init command in the terminal from your current directory. After the files are created, you can modify them as needed.
Create a project directory and organize your files as indicated below to fit the requirements of uploading models to the Clarifai platform.
your_model_directory/
├── 1/
│ └── model.py
├── requirements.txt
└── config.yaml
- your_model_directory/ – The root directory containing all files related to your custom model.
- 1/ – A subdirectory that holds the model file (note that this folder must be named 1).
- model.py – Contains the code that defines your model, including running inference.
- requirements.txt – Lists the Python dependencies required to run your model.
- config.yaml – Contains metadata and configuration settings, such as compute requirements, needed for uploading the model to Clarifai.
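If you prefer to create the layout by hand rather than with clarifai model init, the structure above can be scaffolded with a few shell commands (your_model_directory is just a placeholder name):

```shell
# Create the required layout; the model code must live in a folder named "1"
mkdir -p your_model_directory/1
touch your_model_directory/1/model.py
touch your_model_directory/requirements.txt
touch your_model_directory/config.yaml
```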
Add the following snippets to each of the respective files.
model.py
- Python
from clarifai.runners.models.model_class import ModelClass

class MyFirstModel(ModelClass):
    """A custom model that adds 'Hello World' to the end of a text."""

    @ModelClass.method
    def predict(self, text1: str = "") -> str:
        """
        This is the method that will be called when the model is run.
        It takes in an input and returns an output.
        """
        output_text = text1 + " Hello World"
        return output_text
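The transformation itself is plain string concatenation, so you can sanity-check the logic locally before uploading. The standalone predict function below simply mirrors the method body, outside the Clarifai wrapper:

```python
# Standalone copy of the method body, for a quick local check
def predict(text1: str = "") -> str:
    return text1 + " Hello World"

print(predict("Yes, I uploaded it!"))  # -> Yes, I uploaded it! Hello World
```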
requirements.txt
- Text
clarifai>=11.3.0
config.yaml
In the model section of the config.yaml file, specify your model ID, Clarifai user ID, and Clarifai app ID. These will define where your model will be uploaded on the Clarifai platform.
- YAML
model:
  id: "my-first-model"
  user_id: "YOUR_USER_ID_HERE"
  app_id: "YOUR_APP_ID_HERE"
  model_type_id: "text-to-text"

build_info:
  python_version: "3.11"

inference_compute_info:
  cpu_limit: "1"
  cpu_memory: "5Gi"
  num_accelerators: 0
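A malformed config.yaml is a common cause of upload failures, so it can help to confirm the file parses and carries the fields the upload needs. The sketch below embeds the same YAML as a string so it runs standalone; it assumes PyYAML, which the clarifai package already pulls in as a dependency.

```python
import yaml

CONFIG = """
model:
  id: "my-first-model"
  user_id: "YOUR_USER_ID_HERE"
  app_id: "YOUR_APP_ID_HERE"
  model_type_id: "text-to-text"
build_info:
  python_version: "3.11"
inference_compute_info:
  cpu_limit: "1"
  cpu_memory: "5Gi"
  num_accelerators: 0
"""

# Parse the config and spot-check the fields the upload depends on
config = yaml.safe_load(CONFIG)
assert {"id", "user_id", "app_id", "model_type_id"} <= config["model"].keys()
print(config["model"]["id"])  # -> my-first-model
```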
Step 3: Upload the Model
Once your custom model is ready, upload it to the Clarifai platform by navigating to the directory containing the model and running the following command:
- CLI
clarifai model upload
Step 4: Deploy the Model
After successfully uploading your model to the Clarifai platform, the terminal will prompt you to proceed with deployment and set up your model for inference.
Follow the prompts to:
- Set up a cluster – This serves as the foundation of your compute environment.
- Create a nodepool – A nodepool is a group of compute nodes within your cluster that provides the resources needed to run your model.
- Deploy your model – Once the nodepool is ready, deploy your model to make it available for inference.
Build Logs Example
clarifai model upload
[INFO] 14:10:21.295509 No checkpoints specified in the config file | thread=21092
[INFO] 14:10:21.306110 Setup: Using Python version 3.11 from the config file to build the Dockerfile | thread=21092
[INFO] 14:10:21.313102 Setup: Validating requirements.txt file at C:\Users\Alfrick\Desktop\upload-model\new1\requirements.txt using uv pip compile | thread=21092
[INFO] 14:10:27.021075 Setup: Requirements.txt file validated successfully | thread=21092
[INFO] 14:10:27.043079 Setup: Linting Python files: ['C:\\Users\\Alfrick\\Desktop\\upload-model\\new1\\1\\model.py'] | thread=21092
[INFO] 14:10:27.183777 Setup: Python code linted successfully, no errors found. | thread=21092
[INFO] 14:10:27.475527 New model will be created at https://clarifai.com/alfrick/upload-models-2/models/my-new1 with it's first version. | thread=21092
Press Enter to continue...
[INFO] 14:10:35.884498 Uploading file... | thread=16512
[INFO] 14:10:35.886430 Upload complete! | thread=16512
Status: Upload in progress, Progress: 0% - Starting upload. request_id: sdk-python-11.5.4-b4a7ae57c854497e9dc66cd06c92bStatus: Upload done, Progress: 0% - Completed upload of files, initiating model version image build.. request_id: sdk-pStatus: Model image is currently being built., Progress: 0% - Model version image is being built. request_id: sdk-pytho[INFO] 14:10:36.541215 Created Model Version ID: 44e470e310484daaaf98df96b9f20d0f | thread=21092
[INFO] 14:10:36.541215 Full url to that version is: https://clarifai.com/alfrick/upload-models-2/models/my-new1 | thread=21092
[INFO] 14:10:42.075543 2025-06-27 11:10:35.933343 INFO: Downloading uploaded model from storage... | thread=21092
[INFO] 14:10:47.022986 2025-06-27 11:10:41.738352 INFO: Done downloading model
2025-06-27 11:10:41.741974 INFO: Extracting upload...
2025-06-27 11:10:41.747170 INFO: Done extracting upload
2025-06-27 11:10:41.749617 INFO: Parsing requirements file for model version ID ****df96b9f20d0f
2025-06-27 11:10:41.775586 INFO: Dockerfile found at /shared/context/Dockerfile
cat: /shared/context/downloader/hf_token: No such file or directory
2025-06-27 11:10:42.482697 INFO: Setting up credentials
amazon-ecr-credential-helper
Version: 0.8.0
Git commit: ********
2025-06-27 11:10:42.487124 INFO: Building image...
#1 \[internal] load build definition from Dockerfile
#1 transferring dockerfile: 2.72kB done
#1 DONE 0.0s
#2 resolve image config for docker-image://docker.io/docker/dockerfile:1.13-labs
#2 DONE 0.1s
#3 docker-image://docker.io/docker/dockerfile:1.13-labs@sha256:************18b8
#3 resolve docker.io/docker/dockerfile:1.13-labs@sha256:************18b8 done
#3 CACHED
#4 \[internal] load metadata for public.ecr.aws/clarifai-models/python-base:3.11-********
#4 DONE 0.1s
#5 \[internal] load .dockerignore
#5 transferring context: 2B done
#5 DONE 0.0s
#6 \[internal] load build context
#6 transferring context: 2.02kB done
#6 DONE 0.0s
#7 \[final 1/8] FROM public.ecr.aws/clarifai-models/python-base:3.11-********@sha256:************0579
#7 resolve public.ecr.aws/clarifai-models/python-base:3.11-********@sha256:************0579 done
#7 CACHED
#8 \[final 2/8] COPY --link requirements.txt /home/nonroot/requirements.txt
#8 merging done
#8 DONE 0.0s
#9 \[final 3/8] RUN ["uv", "pip", "install", "--no-cache-dir", "-r", "/home/nonroot/requirements.txt"]
#9 0.084 Using Python 3.11.13 environment at: /venv
#9 0.508 Resolved 30 packages in 421ms
#9 0.515 Downloading grpcio (5.8MiB)
#9 0.516 Downloading pillow (4.4MiB)
#9 0.517 Downloading aiohttp (1.6MiB)
#9 0.517 Downloading numpy (16.1MiB)
#9 0.520 Downloading ruff (10.8MiB)
#9 0.779 Downloading aiohttp
#9 0.779 Downloading ruff
#9 0.803 Downloading grpcio
#9 0.818 Downloading pillow
#9 1.009 Downloading numpy
#9 1.010 Prepared 21 packages in 501ms
#9 1.028 Installed 21 packages in 17ms
#9 1.028 + aiohappyeyeballs==2.6.1
#9 1.028 + aiohttp==3.12.13
#9 1.028 + aiosignal==1.3.2
#9 1.028 + attrs==25.3.0
#9 1.028 + clarifai==11.5.4
#9 1.028 + clarifai-grpc==11.5.5
#9 1.028 + clarifai-protocol==0.0.24
#9 1.028 + click==8.2.1
#9 1.028 + contextlib2==21.6.0
#9 1.028 + frozenlist==1.7.0
#9 1.028 + googleapis-common-protos==1.70.0
#9 1.028 + grpcio==1.73.1
#9 1.028 + multidict==6.6.0
#9 1.028 + numpy==2.3.1
#9 1.028 + pillow==11.2.1
#9 1.028 + propcache==0.3.2
#9 1.028 + protobuf==6.31.1
#9 1.028 + ruff==0.11.4
#9 1.028 + schema==0.7.5
#9 1.028 + tabulate==0.9.0
#9 1.028 + yarl==1.20.1
#9 DONE 1.1s
#10 \[final 4/8] RUN ["uv", "pip", "show", "--no-cache-dir", "clarifai"]
#10 0.075 Using Python 3.11.13 environment at: /venv
#10 0.077 Name: clarifai
#10 0.077 Version: 11.5.4
#10 0.077 Location: /venv/lib/python3.11/site-packages
#10 0.077 Requires: aiohttp, clarifai-grpc, clarifai-protocol, click, fsspec, numpy, pillow, pyyaml, requests, ruff, schema, tabulate, tqdm, uv
#10 0.077 Required-by: clarifai-protocol
#10 DONE 0.1s
#11 \[final 5/8] COPY --chown=nonroot:nonroot downloader/unused.yaml /home/nonroot/main/1/checkpoints/.cache/unused.yaml
#11 DONE 0.0s
#12 \[final 6/8] RUN ["python", "-m", "clarifai.cli", "model", "download-checkpoints", "/home/nonroot/main", "--out_path", "/home/nonroot/main/1/checkpoints", "--stage", "build"]
#12 0.850 [INFO] 11:10:44.893856 No checkpoints specified in the config file | thread=140360215559744
#12 DONE 0.9s
#13 \[final 7/8] COPY --link=true 1 /home/nonroot/main/1
#13 DONE 0.0s
#14 \[final 8/8] COPY --link=true requirements.txt config.yaml /home/nonroot/main/
#14 DONE 0.0s
#15 exporting to image
#15 exporting layers | thread=21092
[INFO] 14:10:53.115042 #12 DONE 0.9s
#13 \[final 7/8] COPY --link=true 1 /home/nonroot/main/1
#13 DONE 0.0s
#14 \[final 8/8] COPY --link=true requirements.txt config.yaml /home/nonroot/main/
#14 DONE 0.0s
#15 exporting to image
#15 exporting layers
#15 exporting layers 4.1s done
#15 exporting manifest sha256:************89bf done
#15 exporting config sha256:************2f89 done
#15 pushing layers
#15 ...
#16 \[auth] sharing credentials for 891377382885.dkr.ecr.us-east-1.amazonaws.com
#16 DONE 0.0s
#15 exporting to image | thread=21092
[INFO] 14:10:57.018544 #15 pushing layers 2.6s done
#15 pushing manifest for ****/prod/pytorch:****df96b9f20d0f@sha256:************89bf
#15 pushing manifest for ****/prod/pytorch:****df96b9f20d0f@sha256:************89bf 0.4s done
#15 DONE 7.0s
2025-06-27 11:10:52.020948 INFO: Done building image!!! | thread=21092
[INFO] 14:10:57.019540 Model build complete! | thread=21092
[INFO] 14:10:57.019540 Build time elapsed 20.5s) | thread=21092
[INFO] 14:10:57.019540 Check out the model at https://clarifai.com/alfrick/upload-models-2/models/my-new1 version: 44e470e310484daaaf98df96b9f20d0f | thread=21092
[INFO] 14:10:57.027747
XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
# Here is a code snippet to use this model:
XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
| thread=21092
[INFO] 14:10:57.027747
# Clarifai Model Client Script
# Set the environment variables `CLARIFAI_DEPLOYMENT_ID` and `CLARIFAI_PAT` to run this script.
# Example usage:
import os
from clarifai.client import Model
from clarifai.runners.utils import data_types
model = Model("https://clarifai.com/alfrick/upload-models-2/models/my-new1",
deployment_id = os.environ['CLARIFAI_DEPLOYMENT_ID'], # Only needed for dedicated deployed models
)
# Example model prediction from different model methods:
response = model.predict(text1="What is the future of AI?")
print(response)
| thread=21092
Do you want to deploy the model? (y/n): y
[INFO] 14:12:22.988696 Checking for available compute clusters... | thread=21092
[INFO] 14:12:24.793067 Available compute clusters: | thread=21092
[INFO] 14:12:24.794064 1. local-dev-compute-cluster (Default Local Dev Compute Cluster) | thread=21092
Choose a compute cluster (1-1) or 'n' to create a new one: n
[INFO] 14:12:42.729645 Please create a new compute cluster by visiting: https://clarifai.com/settings/compute/new | thread=21092
Do you want to open the compute cluster creation page in your browser? (y/n): y
After creating the compute cluster, press Enter to continue...
[INFO] 14:14:14.758749 Re-checking for available compute clusters... | thread=21092
[INFO] 14:14:15.348357 Available compute clusters: | thread=21092
[INFO] 14:14:15.349357 1. new-cluster () | thread=21092
[INFO] 14:14:15.349357 2. local-dev-compute-cluster (Default Local Dev Compute Cluster) | thread=21092
Choose a compute cluster (1-2): 1
[INFO] 14:14:28.311012 Checking for available nodepools in compute cluster 'new-cluster'... | thread=21092
[INFO] 14:14:30.277793 Available nodepools: | thread=21092
[INFO] 14:14:30.278676 1. new-nodepool () | thread=21092
Choose a nodepool (1-1) or 'n' to create a new one: 1
[INFO] 14:14:45.306485 Please create a new deployment by visiting: https://clarifai.com/settings/compute/deployments/new?computeClusterId=new-cluster&nodePoolId=new-nodepool | thread=21092
Do you want to open the deployment creation page in your browser? (y/n): y
[INFO] 14:14:49.806321 After creating the deployment, your model will be ready for inference! | thread=21092
[INFO] 14:14:49.807324 You can always return to view your deployments at: https://clarifai.com/settings/compute/deployments/new?computeClusterId=new-cluster&nodePoolId=new-nodepool | thread=21092
Step 5: Predict With Model
Once your model is successfully deployed, you can start making predictions with it.
- Python
- Node.js SDK
import os
from clarifai.client import Model
# Set your Personal Access Token (PAT)
os.environ["CLARIFAI_PAT"] = "YOUR_PAT_HERE"
# Initialize with model URL
model = Model(url="https://clarifai.com/alfrick/docs-demos/models/my-first-model")
response = model.predict(text1="Yes, I uploaded it!")
print(response)
import { Model } from "clarifai-nodejs";

const model = new Model({
  url: "https://clarifai.com/alfrick/docs-demos/models/my-first-model",
  authConfig: {
    pat: process.env.CLARIFAI_PAT,
  },
});

const response = await model.predict({
  // see available methodNames using model.availableMethods()
  methodName: "predict",
  text1: "Yes, I uploaded it!",
});

console.log(JSON.stringify(response));

// get response data from the response object
Model.getOutputDataFromModelResponse(response);
Output Example
Yes, I uploaded it! Hello World
Congratulations!
You've successfully uploaded your first model to the Clarifai platform and run inference with it!