Create and Run Pipelines [API]
Create, upload, and run pipelines via our API effortlessly
Clarifai Pipelines let you design and launch asynchronous, multi-step AI workflows on our platform. You can automate complex processes, orchestrate AI agents, and run long-running jobs at scale.
Let’s walk through how you can create, upload, and run pipelines via our API.
Step 1: Perform Prerequisites
Sign Up or Log In
To get started, log in to your Clarifai account or sign up for a new one. If you’re creating a new account, a default application will be automatically created for you.
Next, gather the following details — they’re needed to create and manage pipelines programmatically:
- App ID – Go to your application’s page and choose the Overview option in the collapsible left sidebar. Get the app ID from there.
- User ID – Go to Settings in the collapsible left sidebar and choose the Account option. Then, copy your user ID from that page.
- PAT (Personal Access Token) – From the same Settings menu, navigate to the Secrets page to generate or copy your PAT. This token authenticates your requests to the Clarifai platform.
Set your PAT as an environment variable before continuing:
- Unix-Like Systems
- Windows
export CLARIFAI_PAT=YOUR_PERSONAL_ACCESS_TOKEN_HERE
set CLARIFAI_PAT=YOUR_PERSONAL_ACCESS_TOKEN_HERE
Install the Clarifai Python SDK
Install the latest version of the clarifai Python SDK. It includes the Clarifai CLI, which you can use to interact with pipelines from the command line.
pip install --upgrade clarifai
Create a Cluster and Nodepool
A compute cluster and nodepool define where your pipeline runs within Clarifai’s compute environment. They are required to allocate and manage the resources your pipeline needs for execution.
You need to create them and get their IDs.
Step 2: Initialize a Pipeline Project
Run the following command to create a new pipeline project in your current directory:
- CLI
clarifai pipeline init
Note:
- You can initialize a project in a specific location by providing a `PIPELINE_PATH`. For example, running `clarifai pipeline init <pipeline-name>` creates a new directory named `<pipeline-name>` and populates it with all the required pipeline project files.
- You can shorten `pipeline` to `pl` when running pipeline commands. For example: `clarifai pl init`.
After running the command, you’ll be prompted to provide the following details:
- `User ID` — Your Clarifai user ID.
- `App ID` — Your Clarifai application ID where the pipeline lives.
- `Pipeline ID` — The unique ID for your pipeline (or press Enter to use the default).
- `Number of steps` — How many steps your pipeline should have (or press Enter to use the default). You must specify at least one step.
- `Step names` — A name for each pipeline step (or press Enter to use the defaults).
Example Output
clarifai pipeline init
Welcome to Clarifai Pipeline Initialization!
Please provide the following information:
User ID: alfrick
App ID: pipelines-1
Pipeline ID [hello-world-pipeline]:
Number of pipeline steps [2]:
Name for step 1 [stepA]:
Name for step 2 [stepB]:
Creating pipeline 'hello-world-pipeline' with steps: stepA, stepB
[INFO] 08:25:06.765535 Created /Users/macbookpro/Desktop/pipelines/one/config.yaml | thread=8329666752
[INFO] 08:25:06.766398 Created /Users/macbookpro/Desktop/pipelines/one/README.md | thread=8329666752
[INFO] 08:25:06.767111 Created /Users/macbookpro/Desktop/pipelines/one/stepA/config.yaml | thread=8329666752
[INFO] 08:25:06.767417 Created /Users/macbookpro/Desktop/pipelines/one/stepA/requirements.txt | thread=8329666752
[INFO] 08:25:06.768022 Created /Users/macbookpro/Desktop/pipelines/one/stepA/1/pipeline_step.py | thread=8329666752
[INFO] 08:25:06.768514 Created /Users/macbookpro/Desktop/pipelines/one/stepB/config.yaml | thread=8329666752
[INFO] 08:25:06.768764 Created /Users/macbookpro/Desktop/pipelines/one/stepB/requirements.txt | thread=8329666752
[INFO] 08:25:06.769000 Created /Users/macbookpro/Desktop/pipelines/one/stepB/1/pipeline_step.py | thread=8329666752
[INFO] 08:25:06.769042 Pipeline initialization complete in /Users/macbookpro/Desktop/pipelines/one | thread=8329666752
[INFO] 08:25:06.769080 Next steps: | thread=8329666752
[INFO] 08:25:06.769116 1. Implement your pipeline step logic in the generated pipeline_step.py files | thread=8329666752
[INFO] 08:25:06.769151 2. Add dependencies to requirements.txt files as needed | thread=8329666752
[INFO] 08:25:06.769187 3. Run 'clarifai pipeline upload config.yaml' to upload your pipeline | thread=8329666752
Pipeline Steps
A step is a self-contained unit of execution in a Clarifai Pipeline. It represents one stage of your workflow and performs a specific task, then optionally passes its output to the next step.
For example, a step might preprocess input data, call an AI model, transform results, or interact with an external API.
In your pipeline project structure, each step is represented by its own folder. Steps run sequentially, in parallel, or according to the dependencies defined in the pipeline configuration.
Each step runs in an isolated, containerized environment with:
- Its own dependencies (`requirements.txt`)
- Its own configuration (`config.yaml`)
- Its own implementation logic (`1/pipeline_step.py`)
This design makes steps modular, reusable, and independently versioned — allowing you to update or improve one step without impacting the rest of the pipeline.
Note: You can also manage individual pipeline steps using the `pipelinestep` (or `ps`) command. This allows you to create, update, and reuse pipeline steps independently of the full pipeline. For example:
- `clarifai pipelinestep init` – Initialize a new pipeline step project structure.
- `clarifai pipelinestep upload` – Upload a pipeline step to the Clarifai platform.
After running clarifai pipeline init, the CLI scaffolds a complete boilerplate project for you. Each file and folder represents a specific part of how your pipeline is configured, versioned, and executed.
Here is the structure of the generated project:
├── config.yaml # Pipeline configuration
├── stepA/ # First pipeline step
│ ├── config.yaml # Step A configuration
│ ├── requirements.txt # Step A dependencies
│ └── 1/
│ └── pipeline_step.py # Step A implementation logic
├── stepB/ # Second pipeline step
│ ├── config.yaml # Step B configuration
│ ├── requirements.txt # Step B dependencies
│ └── 1/
│ └── pipeline_step.py # Step B implementation logic
└── README.md # Generated project documentation
Step 3: Modify the Files
The Clarifai CLI generates a structured, multi-step pipeline project, where each step is independently configurable, versioned, and deployable.
Next, you can customize the generated files to define your pipeline’s behavior.
Let’s walk through what each file does in the default project setup.
config.yaml (root)
Example: config.yaml (root)
pipeline:
  id: "hello-world-pipeline"
  user_id: "user-id"
  app_id: "app-id"
  step_directories:
    - stepA
    - stepB
  orchestration_spec:
    argo_orchestration_spec: |
      apiVersion: argoproj.io/v1alpha1
      kind: Workflow
      spec:
        entrypoint: sequence
        arguments:
          parameters:
            - name: input_text
              value: "Input Text Here"
        templates:
          - name: sequence
            steps:
              - - name: step-0
                  templateRef:
                    name: users/user-id/apps/app-id/pipeline_steps/stepA
                    template: users/user-id/apps/app-id/pipeline_steps/stepA
                  arguments:
                    parameters:
                      - name: input_text
                        value: "{{workflow.parameters.input_text}}"
              - - name: step-1
                  templateRef:
                    name: users/user-id/apps/app-id/pipeline_steps/stepB
                    template: users/user-id/apps/app-id/pipeline_steps/stepB
                  arguments:
                    parameters:
                      - name: input_text
                        value: "{{workflow.parameters.input_text}}"

# Optional: Define secrets for pipeline steps
# config:
#   step_version_secrets:
#     step-0:
#       API_KEY: users/alfrick/apps//secrets/my-api-key
#       DB_PASSWORD: users/alfrick/apps/secrets/db-secret
#     step-1:
#       EMAIL_TOKEN: users/alfrick/apps/secrets/email-token
The root config.yaml file defines your pipeline’s identity, structure, and execution logic. It tells Clarifai what the pipeline is, which steps it contains, and how those steps should be orchestrated at runtime.
Pipeline execution is powered by Argo Workflows, an open-source, Kubernetes-native orchestration engine used to schedule and manage containerized jobs. Argo models multi-step workflows as a sequence of tasks and captures the dependencies between them using a directed acyclic graph (DAG). Clarifai leverages this under the hood to coordinate step execution, manage dependencies, and handle long-running workloads.
Let’s break down the config.yaml file section by section.
Pipeline Metadata
pipeline:
  id: "hello-world-pipeline"
  user_id: "user-id"
  app_id: "app-id"
These fields define your pipeline’s identity and location when uploaded to the Clarifai platform. They are automatically filled based on the values you provide during initialization.
Step Directories
step_directories:
  - stepA
  - stepB
This section tells Clarifai which folders in your project contain pipeline steps. The execution order itself is defined by the Argo Workflow Definition, as explained below.
Each listed directory corresponds to one pipeline step created during initialization. As mentioned earlier, each step contains its own config.yaml, requirements.txt, and executable logic, allowing steps to be configured and maintained independently.
Argo Workflow Definition
orchestration_spec:
  argo_orchestration_spec: |
    apiVersion: argoproj.io/v1alpha1
    kind: Workflow
    spec:
      entrypoint: sequence
      arguments:
        parameters:
          - name: input_text
            value: "Input Text Here"
This section is the control center of your pipeline. It defines how Clarifai uses Argo Workflows to orchestrate and execute your multi-step workflow as a DAG.
Here’s what each part does:
- `apiVersion` — Specifies the version of the Argo Workflow API being used.
- `kind: Workflow` — Tells the system this is a multi-step Argo workflow.
- `entrypoint` — Defines the starting point of execution. In this case, the workflow begins at a template named `sequence`.
- Input parameters — These are runtime variables passed into your pipeline when you run it. Here, a parameter called `input_text` is defined. If no value is provided at runtime, it defaults to `"Input Text Here"`. This value can then be consumed by any pipeline step.
Step Execution Order
templates:
  - name: sequence
    steps:
      - - name: step-0
          templateRef:
            name: users/user-id/apps/app-id/pipeline_steps/stepA
            template: users/user-id/apps/app-id/pipeline_steps/stepA
          arguments:
            parameters:
              - name: input_text
                value: "{{workflow.parameters.input_text}}"
      - - name: step-1
          templateRef:
            name: users/user-id/apps/app-id/pipeline_steps/stepB
            template: users/user-id/apps/app-id/pipeline_steps/stepB
          arguments:
            parameters:
              - name: input_text
                value: "{{workflow.parameters.input_text}}"
This section defines how the individual pipeline steps are connected and the exact order in which they run.
- `step-0` — Executes first. It references the deployed pipeline step `stepA` and receives the `input_text` parameter.
- `step-1` — Executes after `step-0` completes. It references `stepB` and receives the same `input_text` parameter.
Note: This structure defines a sequential workflow. Under the hood, Argo models this sequence as a DAG, ensuring that `stepB` only runs once its dependency (`stepA`) has successfully completed. The `templateRef` fields point to the deployed step implementations in your project directories (`stepA/` and `stepB/`). When deployed, Clarifai transforms these into executable Argo templates that power the pipeline.
Note: You can also structure your pipeline as a parallel workflow, where multiple steps run simultaneously instead of sequentially.
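As an illustrative sketch (reusing the placeholder names and paths from the example above), two steps can be made to run concurrently by placing them in the same inner list of the Argo `steps` field; in Argo Workflows, entries within one inner list execute in parallel:

```yaml
templates:
  - name: parallel-group
    steps:
      # Both entries share one inner list, so Argo launches them
      # concurrently instead of waiting for one to finish.
      - - name: step-0
          templateRef:
            name: users/user-id/apps/app-id/pipeline_steps/stepA
            template: users/user-id/apps/app-id/pipeline_steps/stepA
        - name: step-1
          templateRef:
            name: users/user-id/apps/app-id/pipeline_steps/stepB
            template: users/user-id/apps/app-id/pipeline_steps/stepB
```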
Optional: Secrets Configuration
config:
  step_version_secrets:
    step-0:
      API_KEY: users/user-id/apps/secrets/my-api-key
      DB_PASSWORD: users/user-id/apps/secrets/db-secret
    step-1:
      EMAIL_TOKEN: users/user-id/apps/secrets/email-token
This section lets you securely inject environment secrets into specific pipeline steps. It’s especially useful for handling sensitive data, like third-party API keys, without hardcoding them into your source code.
Each key (for example, API_KEY or DB_PASSWORD) becomes an environment variable inside the corresponding step’s runtime environment. The values reference secrets stored in the Clarifai secrets manager, which ensures they are encrypted, access-controlled, and never exposed directly in your project files.
Note: Once the secrets are mounted, you can access them in your `pipeline_step.py` file using standard environment variable calls, for example:
import os
api_key = os.environ['API_KEY']
The secret value is then available to your code at runtime.
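As a small, hypothetical convenience (not part of the generated scaffold), you could wrap this lookup in a helper so a missing secret fails with a clear message instead of a bare `KeyError`:

```python
import os

def get_secret(name: str) -> str:
    """Read a secret that Clarifai injected as an environment variable.

    Raises a descriptive error if the secret was not mounted, which is
    easier to debug than a bare KeyError (e.g. when running locally).
    """
    value = os.environ.get(name)
    if value is None:
        raise RuntimeError(
            f"Secret {name!r} is not set; check step_version_secrets in config.yaml"
        )
    return value
```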
step/config.yaml
Example: step/config.yaml
pipeline_step:
  id: "stepA"
  user_id: "user-id"
  app_id: "app-id"
  pipeline_step_input_params:
    - name: input_text
      description: "Text input for processing"
  build_info:
    python_version: "3.12"
  pipeline_step_compute_info:
    cpu_limit: "500m"
    cpu_memory: "500Mi"
    num_accelerators: 0
Each pipeline step includes its own config.yaml file, which defines it as a self-contained, containerized unit of execution. These files usually share a common structure and similar resource settings, helping keep your pipeline consistent, predictable, and easy to manage.
Let’s break down the step/config.yaml file section by section.
Step Metadata
pipeline_step:
  id: "stepA"
  user_id: "user-id"
  app_id: "app-id"
This section defines how Clarifai identifies, scopes, and stores this pipeline step. These values are automatically filled with the information you provide when you initialize the pipeline project.
Note: Clarifai Pipelines support cross-application orchestration for reusing logic. To integrate a shared step into your pipeline, simply configure it with the corresponding `user_id`, `app_id`, or step `id` where the resource is hosted.
Input Interface
pipeline_step_input_params:
  - name: input_text
    description: "Text input for processing"
This defines the data this step expects when it runs. The description helps document what this input is used for and how it should be provided by upstream steps or users.
Runtime Environment
build_info:
  python_version: "3.12"
This specifies the environment used to run the step, ensuring compatibility with your code and dependencies.
Compute Resources Allocation
pipeline_step_compute_info:
  cpu_limit: "500m"
  cpu_memory: "500Mi"
  num_accelerators: 0
This section controls the resources allocated to the step during execution. It is the most critical part for performance and cost.
- `cpu_limit: "500m"` allocates half of one CPU core's processing power (500 millicores = 0.5 CPU cores). This indicates the step is a lightweight task (like text parsing), not a heavy computation.
- `cpu_memory: "500Mi"` limits memory usage to 500 mebibytes of RAM.
- `num_accelerators: 0` means no GPU or other accelerators are used. This keeps costs low, as the step runs on standard CPU infrastructure.
step/requirements.txt
Example: step/requirements.txt
clarifai==11.10.2
# Add your pipeline step dependencies here
# Example:
# torch>=1.9.0
# transformers>=4.20.0
Each pipeline step has its own requirements.txt file, which specifies the Python packages required for that specific step to run.
These dependencies are installed in an isolated, containerized environment, meaning one step can use a different set of libraries or versions without affecting other steps in the pipeline.
1/pipeline_step.py
Example: 1/pipeline_step.py
import argparse
import clarifai
from clarifai.utils.logging import logger

def main():
    parser = argparse.ArgumentParser(description='stepA processing step.')
    parser.add_argument('--input_text', type=str, required=True, help='Text input for processing')
    args = parser.parse_args()

    logger.info(clarifai.__version__)

    # TODO: Implement your pipeline step logic here
    logger.info(f"stepA processed: {args.input_text}")

if __name__ == "__main__":
    main()
Each pipeline step has its own pipeline_step.py file, which contains the core implementation for that specific step. This is where you define all the logic the step executes — including data processing, API interactions, model inference, and output transformations.
When the pipeline runs, Clarifai spins up this step’s container and executes this script.
Note: The `pipeline_step.py` file is nested inside the step's `1/` directory. The folder is named `1` to fit Clarifai's naming convention.
Let’s break down the example of a pipeline_step.py file section by section.
Command-line Input Handling
parser = argparse.ArgumentParser(description='stepA processing step.')
parser.add_argument('--input_text', type=str, required=True, help='Text input for processing')
args = parser.parse_args()
The step receives its inputs as command-line arguments when Clarifai runs it. The `--input_text` argument corresponds to the `pipeline_step_input_params` you defined in the step's `config.yaml`.
Clarifai automatically maps pipeline parameters to these arguments when the step is executed.
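To see how that mapping works, here's a minimal, self-contained sketch that parses the same argument the way it would arrive on the command line (the literal value is just an example):

```python
import argparse

# Same argument definition as in the generated pipeline_step.py.
parser = argparse.ArgumentParser(description="stepA processing step.")
parser.add_argument("--input_text", type=str, required=True,
                    help="Text input for processing")

# At runtime the script is invoked roughly like:
#   python 1/pipeline_step.py --input_text "Input Text Here"
# which argparse receives as the argv list below.
args = parser.parse_args(["--input_text", "Input Text Here"])
print(args.input_text)
```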
Logging the Clarifai SDK Version
logger.info(clarifai.__version__)
This section logs the version of the Clarifai Python SDK installed inside the step’s container. It’s useful for debugging and verifying that the correct environment is being used.
Step Logic Placeholder
# TODO: Implement your pipeline step logic here
logger.info(f"stepA processed: {args.input_text}")
This section is where your step's core logic lives. In this example, it simply logs the input it receives, but in a real pipeline, this is where you would implement the actual work of the step, such as preprocessing input data or invoking an LLM.
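For instance, a hypothetical `stepA` implementation might replace the TODO with a simple text-normalization routine (purely illustrative; your real logic could call a model or an external API instead):

```python
def process(input_text: str) -> str:
    # Illustrative step logic: collapse runs of whitespace and strip
    # leading/trailing spaces before passing the text downstream.
    return " ".join(input_text.split())
```

Inside `main()`, you would then call `process(args.input_text)` and log or emit the result for the next step.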
Step 4: Upload the Pipeline
Run the following command in your project directory to upload the pipeline and its associated pipeline steps to Clarifai.
- CLI
clarifai pipeline upload
Note: You can specify a path to the pipeline configuration file or a directory containing the root `config.yaml` file. If not specified, the current directory is used by default.
Example Output
clarifai pipeline upload
[INFO] 17:13:59.215943 Starting pipeline upload from config: ./config.yaml | thread=8329666752
[INFO] 17:13:59.216027 Uploading 2 pipeline steps... | thread=8329666752
[INFO] 17:13:59.216063 Uploading pipeline step from directory: stepA | thread=8329666752
[INFO] 17:13:59.221287 No config-lock.yaml found, will upload pipeline step | thread=8329666752
[INFO] 17:13:59.221721 Created Dockerfile at /Users/macbookpro/Desktop/pipelines/one/stepA/Dockerfile | thread=8329666752
[INFO] 17:14:01.613332 Creating new pipeline step stepA | thread=8329666752
[INFO] 17:14:02.248411 Successfully created pipeline step stepA | thread=8329666752
[INFO] 17:14:02.263631 Uploading pipeline step content... | thread=6135148544
[INFO] 17:14:02.263920 Upload complete! | thread=6135148544
Status: Upload done, Upload Progress: 100%, Details: Completed upload of files, initiating pipelineStep version image build. request_id: sdk-python-11
Status: Pipeline Step image is currently being built., Upload Progress: 100%, Details: Pipeline Step Version image is being built. request_id: sdk-pyt
[INFO] 17:14:03.580160 54c3894e0cc62981ba3a4
Created Pipeline Step Version ID: 36e752e546334fa28d73bcbdc86d37a7 | thread=8329666752
[INFO] 17:14:08.499014 2025-11-26 14:14:03.535648 INFO: Downloading uploaded buildable from storage...
2025-11-26 14:14:04.413025 INFO: Done downloading buildable from storage
2025-11-26 14:14:04.417035 INFO: Extracting upload...
2025-11-26 14:14:04.422114 INFO: Done extracting upload
2025-11-26 14:14:04.425067 INFO: Parsing requirements file for buildable version ID ****bcbdc86d37a7
2025-11-26 14:14:04.452415 INFO: Dockerfile found at /shared/context/Dockerfile
2025-11-26 14:14:05.160145 INFO: Setting up credentials
amazon-ecr-credential-helper
Version: 0.8.0
Git commit: ********
2025-11-26 14:14:05.165486 INFO: Building image...
#1 \[internal] load build definition from Dockerfile
#1 transferring dockerfile: 776B done
#1 WARN: FromAsCasing: 'as' and 'FROM' keywords' casing do not match (line 1)
#1 WARN: ****orm: Setting platform to predefined $TARGETPLATFORM in FROM is redundant as this is the default behavior (line 1)
#1 DONE 0.0s
#2 \[linux/amd64 internal] load metadata for public.ecr.aws/clarifai-models/python-base:3.12-********
#2 DONE 0.3s
#3 \[linux/arm64 internal] load metadata for public.ecr.aws/clarifai-models/python-base:3.12-********
#3 DONE 0.3s
#4 \[internal] load .dockerignore
#4 transferring context: 2B done
#4 DONE 0.0s
#5 \[internal] load build context
#5 transferring context: 1.28kB done
#5 DONE 0.0s
#6 \[linux/amd64 1/5] FROM public.ecr.aws/clarifai-models/python-base:3.12-********@sha256:************2a62
#6 resolve public.ecr.aws/clarifai-models/python-base:3.12-********@sha256:************2a62 done
#6 CACHED
#7 \[linux/arm64 1/5] FROM public.ecr.aws/clarifai-models/python-base:3.12-********@sha256:************2a62
#7 resolve public.ecr.aws/clarifai-models/python-base:3.12-********@sha256:************2a62 done
#7 CACHED
#8 \[linux/amd64 2/5] COPY --link requirements.txt /home/nonroot/requirements.txt
#8 merging done
#8 DONE 0.0s
#9 \[linux/arm64 2/5] COPY --link requirements.txt /home/nonroot/requirements.txt
#9 merging done
#9 DONE 0.0s
#10 \[linux/arm64 3/5] RUN ["pip", "install", "--no-cache-dir", "-r", "/home/nonroot/requirements.txt"] | thread=8329666752
[INFO] 17:14:14.728456 #10 4.906 Collecting clarifai==11.10.2 (from -r /home/nonroot/requirements.txt (line 1))
#10 ...
#11 \[linux/amd64 3/5] RUN ["pip", "install", "--no-cache-dir", "-r", "/home/nonroot/requirements.txt"]
#11 0.495 Collecting clarifai==11.10.2 (from -r /home/nonroot/requirements.txt (line 1))
#11 0.537 Downloading clarifai-11.10.2-py3-none-any.whl.metadata (23 kB)
#11 0.578 Collecting clarifai-grpc>=11.10.3 (from clarifai==11.10.2->-r /home/nonroot/requirements.txt (line 1))
#11 0.584 Downloading clarifai_grpc-11.10.9-py3-none-any.whl.metadata (4.4 kB)
#11 0.622 Collecting clarifai-protocol>=0.0.33 (from clarifai==11.10.2->-r /home/nonroot/requirements.txt (line 1))
#11 0.627 Downloading clarifai_protocol-0.0.34-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl.metadata (14 kB)
#11 0.760 Collecting numpy>=1.22.0 (from clarifai==11.10.2->-r /home/nonroot/requirements.txt (line 1))
#11 0.764 Downloading numpy-2.3.5-cp312-cp312-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl.metadata (62 kB)
#11 0.780 Requirement already satisfied: tqdm>=4.65.0 in /venv/lib/python3.12/site-packages (from clarifai==11.10.2->-r /home/nonroot/requirements.txt (line 1)) (4.67.1)
#11 0.780 Requirement already satisfied: PyYAML>=6.0.1 in /venv/lib/python3.12/site-packages (from clarifai==11.10.2->-r /home/nonroot/requirements.txt (line 1)) (6.0.2)
#11 0.787 Collecting schema==0.7.5 (from clarifai==11.10.2->-r /home/nonroot/requirements.txt (line 1))
#11 0.792 Downloading schema-0.7.5-py2.py3-none-any.whl.metadata (34 kB)
#11 0.906 Collecting Pillow>=9.5.0 (from clarifai==11.10.2->-r /home/nonroot/requirements.txt (line 1))
#11 0.910 Downloading pillow-12.0.0-cp312-cp312-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl.metadata (8.8 kB)
#11 0.921 Collecting tabulate>=0.9.0 (from clarifai==11.10.2->-r /home/nonroot/requirements.txt (line 1))
#11 0.925 Downloading tabulate-0.9.0-py3-none-any.whl.metadata (34 kB)
#11 0.929 Requirement already satisfied: fsspec>=2024.6.1 in /venv/lib/python3.12/site-packages (from clarifai==11.10.2->-r /home/nonroot/requirements.txt (line 1)) (2025.2.0)
#11 0.940 Collecting click>=8.1.7 (from clarifai==11.10.2->-r /home/nonroot/requirements.txt (line 1))
#11 0.943 Downloading click-8.3.1-py3-none-any.whl.metadata (2.6 kB)
#11 0.945 Requirement already satisfied: requests>=2.32.3 in /venv/lib/python3.12/site-packages (from clarifai==11.10.2->-r /home/nonroot/requirements.txt (line 1)) (2.32.3)
#11 1.298 Collecting aiohttp>=3.10.0 (from clarifai==11.10.2->-r /home/nonroot/requirements.txt (line 1))
#11 1.303 Downloading aiohttp-3.13.2-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl.metadata (8.1 kB)
#11 1.474 Collecting uv==0.7.12 (from clarifai==11.10.2->-r /home/nonroot/requirements.txt (line 1))
#11 1.479 Downloading uv-0.7.12-py3-none-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (11 kB)
#11 1.719 Collecting ruff==0.11.4 (from clarifai==11.10.2->-r /home/nonroot/requirements.txt (line 1))
#11 1.724 Downloading ruff-0.11.4-py3-none-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (25 kB)
#11 1.793 Collecting psutil==7.0.0 (from clarifai==11.10.2->-r /home/nonroot/requirements.txt (line 1))
#11 1.797 Downloading psutil-7.0.0-cp36-abi3-manylinux_2_12_x86_64.manylinux2010_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (22 kB)
#11 1.813 Collecting pygments>=2.19.2 (from clarifai==11.10.2->-r /home/nonroot/requirements.txt (line 1))
#11 1.817 Downloading pygments-2.19.2-py3-none-any.whl.metadata (2.5 kB)
#11 2.258 Collecting pydantic_core==2.33.2 (from clarifai==11.10.2->-r /home/nonroot/requirements.txt (line 1))
#11 2.262 Downloading pydantic_core-2.33.2-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (6.8 kB)
#11 2.273 Collecting packaging==25.0 (from clarifai==11.10.2->-r /home/nonroot/requirements.txt (line 1))
#11 2.277 Downloading packaging-25.0-py3-none-any.whl.metadata (3.3 kB)
#11 2.281 Requirement already satisfied: typing-extensions!=4.7.0,>=4.6.0 in /venv/lib/python3.12/site-packages (from pydantic_core==2.33.2->clarifai==11.10.2->-r /home/nonroot/requirements.txt (line 1)) (4.12.2)
#11 2.287 Collecting contextlib2>=0.5.5 (from schema==0.7.5->clarifai==11.10.2->-r /home/nonroot/requirements.txt (line 1))
#11 2.291 Downloading contextlib2-21.6.0-py2.py3-none-any.whl.metadata (4.1 kB)
#11 2.303 Collecting aiohappyeyeballs>=2.5.0 (from aiohttp>=3.10.0->clarifai==11.10.2->-r /home/nonroot/requirements.txt (line 1))
#11 2.307 Downloading aiohappyeyeballs-2.6.1-py3-none-any.whl.metadata (5.9 kB)
#11 2.314 Collecting aiosignal>=1.4.0 (from aiohttp>=3.10.0->clarifai==11.10.2->-r /home/nonroot/requirements.txt (line 1))
#11 2.317 Downloading aiosignal-1.4.0-py3-none-any.whl.metadata (3.7 kB)
#11 2.328 Collecting attrs>=17.3.0 (from aiohttp>=3.10.0->clarifai==11.10.2->-r /home/nonroot/requirements.txt (line 1))
#11 2.332 Downloading attrs-25.4.0-py3-none-any.whl.metadata (10 kB)
#11 2.401 Collecting frozenlist>=1.1.1 (from aiohttp>=3.10.0->clarifai==11.10.2->-r /home/nonroot/requirements.txt (line 1))
#11 2.406 Downloading frozenlist-1.8.0-cp312-cp312-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl.metadata (20 kB)
#11 2.559 Collecting multidict<7.0,>=4.5 (from aiohttp>=3.10.0->clarifai==11.10.2->-r /home/nonroot/requirements.txt (line 1))
#11 2.563 Downloading multidict-6.7.0-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl.metadata (5.3 kB)
#11 2.599 Collecting propcache>=0.2.0 (from aiohttp>=3.10.0->clarifai==11.10.2->-r /home/nonroot/requirements.txt (line 1))
#11 2.603 Downloading propcache-0.4.1-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl.metadata (13 kB)
#11 2.771 Collecting yarl<2.0,>=1.17.0 (from aiohttp>=3.10.0->clarifai==11.10.2->-r /home/nonroot/requirements.txt (line 1))
#11 2.776 Downloading yarl-1.22.0-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl.metadata (75 kB)
#11 3.138 Collecting grpcio>=1.53.2 (from clarifai-grpc>=11.10.3->clarifai==11.10.2->-r /home/nonroot/requirements.txt (line 1))
#11 3.143 Downloading grpcio-1.76.0-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.whl.metadata (3.7 kB)
#11 3.248 Collecting protobuf>=5.29.5 (from clarifai-grpc>=11.10.3->clarifai==11.10.2->-r /home/nonroot/requirements.txt (line 1))
#11 3.252 Downloading protobuf-6.33.1-cp39-abi3-manylinux2014_x86_64.whl.metadata (593 bytes)
#11 3.264 Collecting googleapis-common-protos>=1.57.0 (from clarifai-grpc>=11.10.3->clarifai==11.10.2->-r /home/nonroot/requirements.txt (line 1))
#11 3.268 Downloading googleapis_common_protos-1.72.0-py3-none-any.whl.metadata (9.4 kB)
#11 3.303 Requirement already satisfied: charset-normalizer<4,>=2 in /venv/lib/python3.12/site-packages (from requests>=2.32.3->clarifai==11.10.2->-r /home/nonroot/requirements.txt (line 1)) (3.4.1)
#11 3.303 Requirement already satisfied: idna<4,>=2.5 in /venv/lib/python3.12/site-packages (from requests>=2.32.3->clarifai==11.10.2->-r /home/nonroot/requirements.txt (line 1)) (3.10)
#11 3.304 Requirement already satisfied: urllib3<3,>=1.21.1 in /venv/lib/python3.12/site-packages (from requests>=2.32.3->clarifai==11.10.2->-r /home/nonroot/requirements.txt (line 1)) (2.3.0)
#11 3.304 Requirement already satisfied: certifi>=2017.4.17 in /venv/lib/python3.12/site-packages (from requests>=2.32.3->clarifai==11.10.2->-r /home/nonroot/requirements.txt (line 1)) (2025.1.31)
#11 3.373 Downloading clarifai-11.10.2-py3-none-any.whl (306 kB)
#11 3.390 Downloading packaging-25.0-py3-none-any.whl (66 kB)
#11 3.395 Downloading psutil-7.0.0-cp36-abi3-manylinux_2_12_x86_64.manylinux2010_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (277 kB)
#11 3.406 Downloading pydantic_core-2.33.2-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (2.0 MB)
#11 3.445 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 2.0/2.0 MB 59.3 MB/s eta 0:00:00
#11 3.451 Downloading ruff-0.11.4-py3-none-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (11.3 MB)
#11 3.523 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 11.3/11.3 MB 164.1 MB/s eta 0:00:00
#11 3.527 Downloading schema-0.7.5-py2.py3-none-any.whl (17 kB)
#11 3.532 Downloading uv-0.7.12-py3-none-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (17.8 MB)
#11 3.597 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 17.8/17.8 MB 284.8 MB/s eta 0:00:00
#11 3.602 Downloading aiohttp-3.13.2-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl (1.8 MB)
#11 3.607 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1.8/1.8 MB 499.4 MB/s eta 0:00:00
#11 3.612 Downloading clarifai_grpc-11.10.9-py3-none-any.whl (302 kB)
#11 3.624 Downloading clarifai_protocol-0.0.34-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl (449 kB)
#11 3.629 Downloading click-8.3.1-py3-none-any.whl (108 kB)
#11 3.632 Downloading numpy-2.3.5-cp312-cp312-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl (16.6 MB)
#11 3.672 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 16.6/16.6 MB 436.6 MB/s eta 0:00:00
#11 3.675 Downloading pillow-12.0.0-cp312-cp312-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl (7.0 MB)
#11 3.694 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7.0/7.0 MB 409.3 MB/s eta 0:00:00
#11 3.698 Downloading pygments-2.19.2-py3-none-any.whl (1.2 MB)
#11 3.702 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1.2/1.2 MB 402.1 MB/s eta 0:00:00
#11 3.706 Downloading tabulate-0.9.0-py3-none-any.whl (35 kB)
#11 3.710 Downloading aiohappyeyeballs-2.6.1-py3-none-any.whl (15 kB)
#11 3.714 Downloading aiosignal-1.4.0-py3-none-any.whl (7.5 kB)
#11 3.717 Downloading attrs-25.4.0-py3-none-any.whl (67 kB)
#11 3.721 Downloading contextlib2-21.6.0-py2.py3-none-any.whl (13 kB)
#11 3.725 Downloading frozenlist-1.8.0-cp312-cp312-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl (242 kB)
#11 3.729 Downloading googleapis_common_protos-1.72.0-py3-none-any.whl (297 kB)
#11 3.733 Downloading grpcio-1.76.0-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.whl (6.6 MB)
#11 3.757 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 6.6/6.6 MB 321.3 MB/s eta 0:00:00
#11 3.760 Downloading multidict-6.7.0-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl (256 kB)
#11 3.764 Downloading propcache-0.4.1-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl (221 kB)
#11 3.768 Downloading protobuf-6.33.1-cp39-abi3-manylinux2014_x86_64.whl (323 kB)
#11 3.773 Downloading yarl-1.22.0-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl (377 kB)
#11 3.943 Installing collected packages: uv, tabulate, ruff, pygments, pydantic_core, psutil, protobuf, propcache, Pillow, packaging, numpy, multidict, grpcio, frozenlist, contextlib2, click, attrs, aiohappyeyeballs, yarl, schema, googleapis-common-protos, aiosignal, clarifai-grpc, aiohttp, clarifai-protocol, clarifai
#11 ...
#10 \[linux/arm64 3/5] RUN ["pip", "install", "--no-cache-dir", "-r", "/home/nonroot/requirements.txt"]
#10 5.322 Downloading clarifai-11.10.2-py3-none-any.whl.metadata (23 kB)
#10 5.792 Collecting clarifai-grpc>=11.10.3 (from clarifai==11.10.2->-r /home/nonroot/requirements.txt (line 1))
#10 5.807 Downloading clarifai_grpc-11.10.9-py3-none-any.whl.metadata (4.4 kB)
#10 6.197 Collecting clarifai-protocol>=0.0.33 (from clarifai==11.10.2->-r /home/nonroot/requirements.txt (line 1))
#10 6.213 Downloading clarifai_protocol-0.0.34-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl.metadata (14 kB) | thread=8329666752
[INFO] 17:14:19.500282 #10 7.682 Collecting numpy>=1.22.0 (from clarifai==11.10.2->-r /home/nonroot/requirements.txt (line 1))
#10 7.696 Downloading numpy-2.3.5-cp312-cp312-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl.metadata (62 kB)
#10 7.888 Requirement already satisfied: tqdm>=4.65.0 in /venv/lib/python3.12/site-packages (from clarifai==11.10.2->-r /home/nonroot/requirements.txt (line 1)) (4.67.1)
#10 7.892 Requirement already satisfied: PyYAML>=6.0.1 in /venv/lib/python3.12/site-packages (from clarifai==11.10.2->-r /home/nonroot/requirements.txt (line 1)) (6.0.2)
#10 7.943 Collecting schema==0.7.5 (from clarifai==11.10.2->-r /home/nonroot/requirements.txt (line 1))
#10 7.956 Downloading schema-0.7.5-py2.py3-none-any.whl.metadata (34 kB)
#10 ...
#11 \[linux/amd64 3/5] RUN ["pip", "install", "--no-cache-dir", "-r", "/home/nonroot/requirements.txt"]
#11 5.453 Attempting uninstall: packaging
#11 5.454 Found existing installation: packaging 24.2
#11 5.458 Uninstalling packaging-24.2:
#11 5.463 Successfully uninstalled packaging-24.2
#11 7.939 Successfully installed Pillow-12.0.0 aiohappyeyeballs-2.6.1 aiohttp-3.13.2 aiosignal-1.4.0 attrs-25.4.0 clarifai-11.10.2 clarifai-grpc-11.10.9 clarifai-protocol-0.0.34 click-8.3.1 contextlib2-21.6.0 frozenlist-1.8.0 googleapis-common-protos-1.72.0 grpcio-1.76.0 multidict-6.7.0 numpy-2.3.5 packaging-25.0 propcache-0.4.1 protobuf-6.33.1 psutil-7.0.0 pydantic_core-2.33.2 pygments-2.19.2 ruff-0.11.4 schema-0.7.5 tabulate-0.9.0 uv-0.7.12 yarl-1.22.0
#11 8.028
#11 8.028 \[notice] A new release of pip is available: 25.0.1 -> 25.3
#11 8.028 \[notice] To update, run: pip install --upgrade pip
#11 DONE 8.2s
#12 \[linux/amd64 4/5] COPY --link=true 1 /home/nonroot/main/1
#12 DONE 0.0s
#13 \[linux/amd64 5/5] COPY --link=true requirements.txt config.yaml /home/nonroot/main/
#13 DONE 0.0s
#10 \[linux/arm64 3/5] RUN ["pip", "install", "--no-cache-dir", "-r", "/home/nonroot/requirements.txt"]
#10 9.264 Collecting Pillow>=9.5.0 (from clarifai==11.10.2->-r /home/nonroot/requirements.txt (line 1))
#10 9.278 Downloading pillow-12.0.0-cp312-cp312-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl.metadata (8.8 kB)
#10 9.363 Collecting tabulate>=0.9.0 (from clarifai==11.10.2->-r /home/nonroot/requirements.txt (line 1))
#10 9.377 Downloading tabulate-0.9.0-py3-none-any.whl.metadata (34 kB)
#10 9.418 Requirement already satisfied: fsspec>=2024.6.1 in /venv/lib/python3.12/site-packages (from clarifai==11.10.2->-r /home/nonroot/requirements.txt (line 1)) (2025.2.0)
#10 9.519 Collecting click>=8.1.7 (from clarifai==11.10.2->-r /home/nonroot/requirements.txt (line 1))
#10 9.531 Downloading click-8.3.1-py3-none-any.whl.metadata (2.6 kB)
#10 9.546 Requirement already satisfied: requests>=2.32.3 in /venv/lib/python3.12/site-packages (from clarifai==11.10.2->-r /home/nonroot/requirements.txt (line 1)) (2.32.3) | thread=8329666752
[INFO] 17:14:23.845921 #10 13.42 Collecting aiohttp>=3.10.0 (from clarifai==11.10.2->-r /home/nonroot/requirements.txt (line 1))
#10 13.43 Downloading aiohttp-3.13.2-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl.metadata (8.1 kB)
#10 15.26 Collecting uv==0.7.12 (from clarifai==11.10.2->-r /home/nonroot/requirements.txt (line 1))
#10 15.28 Downloading uv-0.7.12-py3-none-manylinux_2_28_aarch64.whl.metadata (11 kB) | thread=8329666752
[INFO] 17:14:29.359543 #10 18.01 Collecting ruff==0.11.4 (from clarifai==11.10.2->-r /home/nonroot/requirements.txt (line 1))
#10 18.03 Downloading ruff-0.11.4-py3-none-manylinux_2_17_aarch64.manylinux2014_aarch64.whl.metadata (25 kB)
#10 18.64 Collecting psutil==7.0.0 (from clarifai==11.10.2->-r /home/nonroot/requirements.txt (line 1))
#10 18.65 Downloading psutil-7.0.0-cp36-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl.metadata (22 kB)
#10 18.81 Collecting pygments>=2.19.2 (from clarifai==11.10.2->-r /home/nonroot/requirements.txt (line 1))
#10 18.82 Downloading pygments-2.19.2-py3-none-any.whl.metadata (2.5 kB) | thread=8329666752
[INFO] 17:14:34.195938 #10 23.91 Collecting pydantic_core==2.33.2 (from clarifai==11.10.2->-r /home/nonroot/requirements.txt (line 1))
#10 23.92 Downloading pydantic_core-2.33.2-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl.metadata (6.8 kB)
#10 24.02 Collecting packaging==25.0 (from clarifai==11.10.2->-r /home/nonroot/requirements.txt (line 1))
#10 24.03 Downloading packaging-25.0-py3-none-any.whl.metadata (3.3 kB)
#10 24.07 Requirement already satisfied: typing-extensions!=4.7.0,>=4.6.0 in /venv/lib/python3.12/site-packages (from pydantic_core==2.33.2->clarifai==11.10.2->-r /home/nonroot/requirements.txt (line 1)) (4.12.2)
#10 24.11 Collecting contextlib2>=0.5.5 (from schema==0.7.5->clarifai==11.10.2->-r /home/nonroot/requirements.txt (line 1))
#10 24.12 Downloading contextlib2-21.6.0-py2.py3-none-any.whl.metadata (4.1 kB)
#10 24.23 Collecting aiohappyeyeballs>=2.5.0 (from aiohttp>=3.10.0->clarifai==11.10.2->-r /home/nonroot/requirements.txt (line 1))
#10 24.24 Downloading aiohappyeyeballs-2.6.1-py3-none-any.whl.metadata (5.9 kB)
#10 24.28 Collecting aiosignal>=1.4.0 (from aiohttp>=3.10.0->clarifai==11.10.2->-r /home/nonroot/requirements.txt (line 1))
#10 24.29 Downloading aiosignal-1.4.0-py3-none-any.whl.metadata (3.7 kB)
#10 24.38 Collecting attrs>=17.3.0 (from aiohttp>=3.10.0->clarifai==11.10.2->-r /home/nonroot/requirements.txt (line 1))
#10 24.40 Downloading attrs-25.4.0-py3-none-any.whl.metadata (10 kB)
#10 24.97 Collecting frozenlist>=1.1.1 (from aiohttp>=3.10.0->clarifai==11.10.2->-r /home/nonroot/requirements.txt (line 1))
#10 24.98 Downloading frozenlist-1.8.0-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl.metadata (20 kB)
#10 26.78 Collecting multidict<7.0,>=4.5 (from aiohttp>=3.10.0->clarifai==11.10.2->-r /home/nonroot/requirements.txt (line 1))
#10 26.79 Downloading multidict-6.7.0-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl.metadata (5.3 kB)
#10 27.18 Collecting propcache>=0.2.0 (from aiohttp>=3.10.0->clarifai==11.10.2->-r /home/nonroot/requirements.txt (line 1))
#10 27.19 Downloading propcache-0.4.1-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl.metadata (13 kB) | thread=8329666752
#10 29.18 Collecting yarl<2.0,>=1.17.0 (from aiohttp>=3.10.0->clarifai==11.10.2->-r /home/nonroot/requirements.txt (line 1))
#10 29.19 Downloading yarl-1.22.0-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl.metadata (75 kB) | thread=8329666752
[INFO] 17:14:44.744245 #10 33.12 Collecting grpcio>=1.53.2 (from clarifai-grpc>=11.10.3->clarifai==11.10.2->-r /home/nonroot/requirements.txt (line 1))
#10 33.13 Downloading grpcio-1.76.0-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.whl.metadata (3.7 kB)
#10 34.34 Collecting protobuf>=5.29.5 (from clarifai-grpc>=11.10.3->clarifai==11.10.2->-r /home/nonroot/requirements.txt (line 1))
#10 34.35 Downloading protobuf-6.33.1-cp39-abi3-manylinux2014_aarch64.whl.metadata (593 bytes)
#10 34.47 Collecting googleapis-common-protos>=1.57.0 (from clarifai-grpc>=11.10.3->clarifai==11.10.2->-r /home/nonroot/requirements.txt (line 1))
#10 34.48 Downloading googleapis_common_protos-1.72.0-py3-none-any.whl.metadata (9.4 kB)
#10 34.92 Requirement already satisfied: charset-normalizer<4,>=2 in /venv/lib/python3.12/site-packages (from requests>=2.32.3->clarifai==11.10.2->-r /home/nonroot/requirements.txt (line 1)) (3.4.1)
#10 34.93 Requirement already satisfied: idna<4,>=2.5 in /venv/lib/python3.12/site-packages (from requests>=2.32.3->clarifai==11.10.2->-r /home/nonroot/requirements.txt (line 1)) (3.10)
#10 34.93 Requirement already satisfied: urllib3<3,>=1.21.1 in /venv/lib/python3.12/site-packages (from requests>=2.32.3->clarifai==11.10.2->-r /home/nonroot/requirements.txt (line 1)) (2.3.0)
#10 34.93 Requirement already satisfied: certifi>=2017.4.17 in /venv/lib/python3.12/site-packages (from requests>=2.32.3->clarifai==11.10.2->-r /home/nonroot/requirements.txt (line 1)) (2025.1.31)
#10 35.46 Downloading clarifai-11.10.2-py3-none-any.whl (306 kB)
#10 35.47 Downloading packaging-25.0-py3-none-any.whl (66 kB)
#10 35.49 Downloading psutil-7.0.0-cp36-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl (279 kB)
#10 35.51 Downloading pydantic_core-2.33.2-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl (1.9 MB)
#10 35.56 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1.9/1.9 MB 71.6 MB/s eta 0:00:00
#10 35.58 Downloading ruff-0.11.4-py3-none-manylinux_2_17_aarch64.manylinux2014_aarch64.whl (10.4 MB)
#10 35.73 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 10.4/10.4 MB 75.2 MB/s eta 0:00:00
#10 35.75 Downloading schema-0.7.5-py2.py3-none-any.whl (17 kB)
#10 35.76 Downloading uv-0.7.12-py3-none-manylinux_2_28_aarch64.whl (16.6 MB)
#10 36.00 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 16.6/16.6 MB 74.0 MB/s eta 0:00:00
#10 36.02 Downloading aiohttp-3.13.2-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl (1.7 MB)
#10 36.06 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1.7/1.7 MB 76.8 MB/s eta 0:00:00
#10 36.07 Downloading clarifai_grpc-11.10.9-py3-none-any.whl (302 kB)
#10 36.09 Downloading clarifai_protocol-0.0.34-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl (407 kB)
#10 36.11 Downloading click-8.3.1-py3-none-any.whl (108 kB)
#10 36.12 Downloading numpy-2.3.5-cp312-cp312-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl (14.3 MB)
#10 36.34 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 14.3/14.3 MB 73.3 MB/s eta 0:00:00
#10 36.35 Downloading pillow-12.0.0-cp312-cp312-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl (6.3 MB)
#10 36.47 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 6.3/6.3 MB 65.5 MB/s eta 0:00:00
#10 36.48 Downloading pygments-2.19.2-py3-none-any.whl (1.2 MB)
#10 36.51 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1.2/1.2 MB 71.1 MB/s eta 0:00:00
#10 36.53 Downloading tabulate-0.9.0-py3-none-any.whl (35 kB)
#10 36.54 Downloading aiohappyeyeballs-2.6.1-py3-none-any.whl (15 kB)
#10 36.55 Downloading aiosignal-1.4.0-py3-none-any.whl (7.5 kB)
#10 36.56 Downloading attrs-25.4.0-py3-none-any.whl (67 kB)
#10 36.58 Downloading contextlib2-21.6.0-py2.py3-none-any.whl (13 kB)
#10 36.59 Downloading frozenlist-1.8.0-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl (243 kB)
#10 36.61 Downloading googleapis_common_protos-1.72.0-py3-none-any.whl (297 kB)
#10 36.62 Downloading grpcio-1.76.0-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.whl (6.4 MB)
#10 36.73 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 6.4/6.4 MB 73.0 MB/s eta 0:00:00
#10 36.74 Downloading multidict-6.7.0-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl (258 kB)
#10 36.76 Downloading propcache-0.4.1-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl (225 kB)
#10 36.77 Downloading protobuf-6.33.1-cp39-abi3-manylinux2014_aarch64.whl (324 kB)
#10 36.79 Downloading yarl-1.22.0-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl (372 kB) | thread=8329666752
#10 38.44 Installing collected packages: uv, tabulate, ruff, pygments, pydantic_core, psutil, protobuf, propcache, Pillow, packaging, numpy, multidict, grpcio, frozenlist, contextlib2, click, attrs, aiohappyeyeballs, yarl, schema, googleapis-common-protos, aiosignal, clarifai-grpc, aiohttp, clarifai-protocol, clarifai | thread=8329666752
[INFO] 17:14:59.489056 #10 48.51 Attempting uninstall: packaging
#10 48.52 Found existing installation: packaging 24.2
#10 48.55 Uninstalling packaging-24.2:
#10 48.58 Successfully uninstalled packaging-24.2 | thread=8329666752
[INFO] 17:15:20.793720 #10 68.14 Successfully installed Pillow-12.0.0 aiohappyeyeballs-2.6.1 aiohttp-3.13.2 aiosignal-1.4.0 attrs-25.4.0 clarifai-11.10.2 clarifai-grpc-11.10.9 clarifai-protocol-0.0.34 click-8.3.1 contextlib2-21.6.0 frozenlist-1.8.0 googleapis-common-protos-1.72.0 grpcio-1.76.0 multidict-6.7.0 numpy-2.3.5 packaging-25.0 propcache-0.4.1 protobuf-6.33.1 psutil-7.0.0 pydantic_core-2.33.2 pygments-2.19.2 ruff-0.11.4 schema-0.7.5 tabulate-0.9.0 uv-0.7.12 yarl-1.22.0
#10 69.27
#10 69.27 \[notice] A new release of pip is available: 25.0.1 -> 25.3
#10 69.27 \[notice] To update, run: pip install --upgrade pip
#10 DONE 69.8s
#14 \[linux/arm64 4/5] COPY --link=true 1 /home/nonroot/main/1
#14 DONE 0.0s
#15 \[linux/arm64 5/5] COPY --link=true requirements.txt config.yaml /home/nonroot/main/
#15 DONE 0.0s
#16 exporting to image
#16 exporting layers | thread=8329666752
[INFO] 17:15:24.795350 #16 exporting layers 7.3s done
#16 exporting manifest sha256:************2bb1 done
#16 exporting config sha256:************fbda done
#16 exporting manifest sha256:************87fc done
#16 exporting config sha256:************29d1 done
#16 exporting manifest list sha256:************2d60 done
#16 pushing layers
#16 ...
#17 \[auth] sharing credentials for 891377382885.dkr.ecr.us-east-1.amazonaws.com | thread=8329666752
#17 DONE 0.0s
#16 exporting to image
#16 pushing layers 3.0s done
#16 pushing manifest for ****/prod/pytorch:****bcbdc86d37a7@sha256:************2d60
#16 pushing manifest for ****/prod/pytorch:****bcbdc86d37a7@sha256:************2d60 1.0s done
#16 DONE 11.3s
2025-11-26 14:15:26.682504 INFO: Done building image!!! | thread=8329666752
[INFO] 17:15:31.301665 ng... (elapsed 86.5s)
Pipeline step build complete! | thread=8329666752
[INFO] 17:15:31.302080 Build time elapsed 87.7s | thread=8329666752
[INFO] 17:15:31.305764 Generated config-lock.yaml at /Users/macbookpro/Desktop/pipelines/one/stepA/config-lock.yaml | thread=8329666752
[INFO] 17:15:31.305864 Generated config-lock.yaml for pipeline step with version 36e752e546334fa28d73bcbdc86d37a7 | thread=8329666752
[INFO] 17:15:31.306267 Successfully uploaded pipeline step stepA with version 36e752e546334fa28d73bcbdc86d37a7 | thread=8329666752
[INFO] 17:15:31.306353 Uploading pipeline step from directory: stepB | thread=8329666752
[INFO] 17:15:31.308736 No config-lock.yaml found, will upload pipeline step | thread=8329666752
[INFO] 17:15:31.309571 Created Dockerfile at /Users/macbookpro/Desktop/pipelines/one/stepB/Dockerfile | thread=8329666752
[INFO] 17:15:35.054580 Creating new pipeline step stepB | thread=8329666752
[INFO] 17:15:35.724878 Successfully created pipeline step stepB | thread=8329666752
[INFO] 17:15:35.735020 Uploading pipeline step content... | thread=6135721984
[INFO] 17:15:35.735406 Upload complete! | thread=6135721984
Status: Upload done, Upload Progress: 100%, Details: Completed upload of files, initiating pipelineStep version image build. request_id: sdk-python-11
Status: Pipeline Step image is currently being built., Upload Progress: 100%, Details: Pipeline Step Version image is being built. request_id: sdk-pyt
[INFO] 17:15:36.353718 a4e848dcda9d7ec46af50
Created Pipeline Step Version ID: 5e93c74ef8ae456ab353aa5e60e46f97 | thread=8329666752
[INFO] 17:15:42.766275 2025-11-26 14:15:37.007123 INFO: Downloading uploaded buildable from storage...
2025-11-26 14:15:37.835820 INFO: Done downloading buildable from storage
2025-11-26 14:15:37.839806 INFO: Extracting upload...
2025-11-26 14:15:37.844892 INFO: Done extracting upload
2025-11-26 14:15:37.847844 INFO: Parsing requirements file for buildable version ID ****aa5e60e46f97
2025-11-26 14:15:37.875202 INFO: Dockerfile found at /shared/context/Dockerfile
2025-11-26 14:15:38.599716 INFO: Setting up credentials
amazon-ecr-credential-helper
Version: 0.8.0
Git commit: ********
2025-11-26 14:15:38.604653 INFO: Building image...
#1 \[internal] load build definition from Dockerfile
#1 transferring dockerfile: 776B done
#1 WARN: FromAsCasing: 'as' and 'FROM' keywords' casing do not match (line 1)
#1 WARN: ****orm: Setting platform to predefined $TARGETPLATFORM in FROM is redundant as this is the default behavior (line 1)
#1 DONE 0.0s
#2 \[linux/amd64 internal] load metadata for public.ecr.aws/clarifai-models/python-base:3.12-********
#2 ...
#3 \[linux/arm64 internal] load metadata for public.ecr.aws/clarifai-models/python-base:3.12-********
#3 DONE 0.2s
#4 \[internal] load .dockerignore
#4 transferring context: 2B done
#4 DONE 0.0s
#2 \[linux/amd64 internal] load metadata for public.ecr.aws/clarifai-models/python-base:3.12-********
#2 DONE 0.2s
#5 \[internal] load build context
#5 transferring context: 1.28kB done
#5 DONE 0.0s
#6 \[linux/amd64 1/5] FROM public.ecr.aws/clarifai-models/python-base:3.12-********@sha256:************2a62
#6 resolve public.ecr.aws/clarifai-models/python-base:3.12-********@sha256:************2a62 done
#6 DONE 0.0s
#7 \[linux/amd64 2/5] COPY --link requirements.txt /home/nonroot/requirements.txt
#7 CACHED
#8 \[linux/amd64 3/5] RUN ["pip", "install", "--no-cache-dir", "-r", "/home/nonroot/requirements.txt"]
#8 CACHED
#9 \[linux/arm64 1/5] FROM public.ecr.aws/clarifai-models/python-base:3.12-********@sha256:************2a62
#9 resolve public.ecr.aws/clarifai-models/python-base:3.12-********@sha256:************2a62 done
#9 DONE 0.0s
#10 \[linux/arm64 2/5] COPY --link requirements.txt /home/nonroot/requirements.txt
#10 CACHED
#11 \[linux/arm64 3/5] RUN ["pip", "install", "--no-cache-dir", "-r", "/home/nonroot/requirements.txt"]
#11 CACHED
#12 \[linux/arm64 4/5] COPY --link=true 1 /home/nonroot/main/1
#12 DONE 0.0s
#13 \[linux/amd64 4/5] COPY --link=true 1 /home/nonroot/main/1
#13 DONE 0.0s
#14 \[linux/arm64 5/5] COPY --link=true requirements.txt config.yaml /home/nonroot/main/
#14 DONE 0.0s
#15 \[linux/amd64 5/5] COPY --link=true requirements.txt config.yaml /home/nonroot/main/
#15 DONE 0.0s
#16 exporting to image
#16 exporting layers done
#16 exporting manifest sha256:************2819 done
#16 exporting config sha256:************daac done
#16 exporting manifest sha256:************df9b done
#16 exporting config sha256:************2d14 done
#16 exporting manifest list sha256:************d0a7 done
#16 pushing layers
#16 ...
#17 \[auth] sharing credentials for 891377382885.dkr.ecr.us-east-1.amazonaws.com
#17 DONE 0.0s
#16 exporting to image
#16 pushing layers 1.0s done
#16 pushing manifest for ****/prod/pytorch:****aa5e60e46f97@sha256:************d0a7
#16 pushing manifest for ****/prod/pytorch:****aa5e60e46f97@sha256:************d0a7 0.9s done
#16 DONE 2.0s
2025-11-26 14:15:40.875095 INFO: Done building image!!! | thread=8329666752
[INFO] 17:15:48.372727 ng... (elapsed 7.4s)
Pipeline step build complete! | thread=8329666752
[INFO] 17:15:48.373267 Build time elapsed 12.0s | thread=8329666752
[INFO] 17:15:48.378134 Generated config-lock.yaml at /Users/macbookpro/Desktop/pipelines/one/stepB/config-lock.yaml | thread=8329666752
[INFO] 17:15:48.378296 Generated config-lock.yaml for pipeline step with version 5e93c74ef8ae456ab353aa5e60e46f97 | thread=8329666752
[INFO] 17:15:48.378631 Successfully uploaded pipeline step stepB with version 5e93c74ef8ae456ab353aa5e60e46f97 | thread=8329666752
[INFO] 17:15:48.381909 Updated templateRef from users/alfrick/apps/pipelines-1/pipeline_steps/stepA to users/alfrick/apps/pipelines-1/pipeline_steps/stepA/versions/36e752e546334fa28d73bcbdc86d37a7 | thread=8329666752
[INFO] 17:15:48.381975 Updated templateRef from users/alfrick/apps/pipelines-1/pipeline_steps/stepB to users/alfrick/apps/pipelines-1/pipeline_steps/stepB/versions/5e93c74ef8ae456ab353aa5e60e46f97 | thread=8329666752
[INFO] 17:15:48.383960 Creating pipeline hello-world-pipeline... | thread=8329666752
[INFO] 17:15:48.386609 Updated templateRef from users/alfrick/apps/pipelines-1/pipeline_steps/stepA to users/alfrick/apps/pipelines-1/pipeline_steps/stepA/versions/36e752e546334fa28d73bcbdc86d37a7 | thread=8329666752
[INFO] 17:15:48.386652 Updated templateRef from users/alfrick/apps/pipelines-1/pipeline_steps/stepB to users/alfrick/apps/pipelines-1/pipeline_steps/stepB/versions/5e93c74ef8ae456ab353aa5e60e46f97 | thread=8329666752
[INFO] 17:15:52.139766 Successfully created pipeline hello-world-pipeline | thread=8329666752
[INFO] 17:15:52.140368 Pipeline ID: hello-world-pipeline | thread=8329666752
[INFO] 17:15:52.140444 Pipeline version ID: 48862434906f482e94a2ec638a4233a1 | thread=8329666752
[INFO] 17:15:52.144477 Generated lockfile: /Users/macbookpro/Desktop/pipelines/one/config-lock.yaml | thread=8329666752
[INFO] 17:15:52.144650 Pipeline upload completed successfully with lockfile! | thread=8329666752
When you run the upload command, the CLI reads your pipeline configuration, uploads each step, and automatically builds a container image for every step using its code, configs, and dependencies.
- Dockerfile — For each step, the CLI auto-generates a Dockerfile behind the scenes, which defines how the step’s container image is built (base image, Python version, dependency installation, and entrypoint setup). You can also provide your own customized Dockerfile.
Example: Dockerfile
FROM --platform=$TARGETPLATFORM public.ecr.aws/clarifai-models/python-base:3.12-df565436eea93efb3e8d1eb558a0a46df29523ec as final
COPY --link requirements.txt /home/nonroot/requirements.txt
# Update clarifai package so we always have latest protocol to the API. Everything should land in /venv
RUN ["pip", "install", "--no-cache-dir", "-r", "/home/nonroot/requirements.txt"]
# Copy in the actual files like config.yaml, requirements.txt, and most importantly 1/pipeline_step.py for the actual pipeline step.
COPY --link=true 1 /home/nonroot/main/1
# At this point we only need these for validation in the SDK.
COPY --link=true requirements.txt config.yaml /home/nonroot/main/
- Lock file — If the CLI detects that no lock file exists, it auto-creates a new config-lock.yaml to “freeze” the exact step configuration and build details for that version. This lock file captures the resolved configuration, including the Python runtime (3.12), compute resources, input parameters, and a unique version ID. The step’s config-lock.yaml file also stores an MD5 hash of the step contents, which is used to track changes and ensure reproducible builds.
Tip: Use the --no-lockfile flag to disable generation of config-lock.yaml. Example: clarifai pipeline upload --no-lockfile.
Example: config-lock.yaml (root)
pipeline:
id: hello-world-pipeline
user_id: alfrick
app_id: pipelines-1
version_id: 48862434906f482e94a2ec638a4233a1
orchestration_spec:
argo_orchestration_spec: |
apiVersion: argoproj.io/v1alpha1
kind: Workflow
spec:
arguments:
parameters:
- name: input_text
value: Input Text Here
entrypoint: sequence
templates:
- name: sequence
steps:
- - arguments:
parameters:
- name: input_text
value: '{{workflow.parameters.input_text}}'
name: step-0
templateRef:
name: users/alfrick/apps/pipelines-1/pipeline_steps/stepA/versions/36e752e546334fa28d73bcbdc86d37a7
template: users/alfrick/apps/pipelines-1/pipeline_steps/stepA/versions/36e752e546334fa28d73bcbdc86d37a7
- - arguments:
parameters:
- name: input_text
value: '{{workflow.parameters.input_text}}'
name: step-1
templateRef:
name: users/alfrick/apps/pipelines-1/pipeline_steps/stepB/versions/5e93c74ef8ae456ab353aa5e60e46f97
template: users/alfrick/apps/pipelines-1/pipeline_steps/stepB/versions/5e93c74ef8ae456ab353aa5e60e46f97
Example: step/config-lock.yaml
build_info:
python_version: '3.12'
hash:
algo: md5
value: f66785f39de8b22c68ee9c02929e7969
id: 36e752e546334fa28d73bcbdc86d37a7
pipeline_step:
app_id: pipelines-1
id: stepA
user_id: user-id
pipeline_step_compute_info:
cpu_limit: 500m
cpu_memory: 500Mi
num_accelerators: 0
pipeline_step_input_params:
- description: Text input for processing
name: input_text
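The build_info.hash entry above is what lets the CLI detect whether a step has changed between uploads. The exact file set and ordering that the Clarifai CLI hashes are internal details, but the idea can be sketched in a few lines of Python (an illustrative approximation only, not the CLI's actual algorithm):

```python
import hashlib
from pathlib import Path

def step_content_hash(step_dir: str) -> str:
    """MD5 digest over a step directory's files, sorted by relative path.

    Illustrative only: the Clarifai CLI's real hashing scheme (which files
    it includes, ordering, separators) is an internal implementation detail.
    """
    digest = hashlib.md5()
    root = Path(step_dir)
    for path in sorted(p for p in root.rglob("*") if p.is_file()):
        # Mix in both the file's path and its contents, so renames
        # and edits each change the resulting digest.
        digest.update(path.relative_to(root).as_posix().encode())
        digest.update(path.read_bytes())
    return digest.hexdigest()
```

Comparing a digest like this against the value stored in the step's config-lock.yaml is a cheap way to tell whether a local step directory has drifted from the uploaded version.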
- Container image — Using the generated Dockerfile and the locked configuration, Clarifai then builds a multi-architecture container image on its infrastructure. This image becomes a versioned, immutable artifact that’s tightly coupled with the config-lock.yaml, ensuring every future pipeline execution runs with the exact same code, environment, and resource settings.
Note: If you modify your pipeline and upload it again to the Clarifai platform, a new version is automatically created. Only the updated sections are uploaded, ensuring efficient version management.
Step 5: Run the Pipeline
After successfully uploading your pipeline, you can execute it using the Clarifai CLI.
Run the following command from your current project directory to start the pipeline and monitor its progress until completion or timeout.
Note: You must specify both a compute cluster ID and a nodepool ID.
- CLI
clarifai pipeline run --compute_cluster_id cluster_id_here --nodepool_id nodepool_id_here
Note: The clarifai pipeline run command also requires the user ID, app ID, pipeline ID, and pipeline version ID, which it reads directly from config-lock.yaml.
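If you want to inspect those identifiers yourself before running, you can pull them out of the root config-lock.yaml. A minimal stdlib-only sketch for the flat key: value lines shown earlier (for real work, parse the file with a proper YAML library such as PyYAML instead):

```python
import re

def read_pipeline_fields(lock_text: str) -> dict:
    """Extract top-level pipeline identifiers from config-lock.yaml text.

    Stdlib-only sketch that matches simple `key: value` lines;
    use a real YAML parser (e.g. PyYAML) in production code.
    """
    fields = {}
    for key in ("id", "user_id", "app_id", "version_id"):
        m = re.search(rf"^\s*{key}:\s*(\S+)\s*$", lock_text, re.M)
        if m:
            fields[key] = m.group(1)
    return fields

# Excerpt matching the root config-lock.yaml generated above.
lock = read_pipeline_fields("""
pipeline:
  id: hello-world-pipeline
  user_id: alfrick
  app_id: pipelines-1
  version_id: 48862434906f482e94a2ec638a4233a1
""")
print(lock)
```

The run command combines these lockfile fields with the compute cluster and nodepool IDs you pass on the command line.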
Example Output
clarifai pipeline run --compute_cluster_id advanced-cluster-b4z7 --nodepool_id advanced-nodepool-y43h
[INFO] 11:14:29.351647 Found config-lock.yaml, using it as default config source | thread=8329666752
[INFO] 11:14:29.372765 Starting pipeline run for pipeline hello-world-pipeline | thread=8329666752
---
[INFO] 11:16:32.978439 Pipeline run status: 64001 (JOB_RUNNING) | thread=8329666752
[INFO] 11:16:32.978534 Pipeline run in progress: 64001 (JOB_RUNNING) | thread=8329666752
[INFO] 11:16:45.383997 [LOG] time="2025-11-27T08:16:16.670Z" level=info msg="Starting Workflow Executor" version=v3.6.2
time="2025-11-27T08:16:16.672Z" level=info msg="Using executor retry strategy" Duration=1s Factor=1.6 Jitter=0.5 Steps=5
time="2025-11-27T08:16:16.672Z" level=info msg="Executor initialized" deadline="0001-01-01 00:00:00 +0000 UTC" includeScriptOutput=false namespace=prod-alfrick podName=cl-****65fa78cdf145-cl-****faeb11d525aa-4115062186 templateName=cl-****faeb11d525aa version="&Version{Version:v3.6.2,BuildDate:2024-12-02T14:13:35Z,GitCommit:********,GitTag:v3.6.2,GitTreeState:clean,GoVersion:go1.23.3,Compiler:gc,Platform:linux/amd64,}"
time="2025-11-27T08:16:16.687Z" level=info msg="Starting deadline monitor"
{"msg": "11.10.2", "@timestamp": "2025-11-27T08:16:28.188937Z", "filename": "pipeline_step.py", "stack_info": null, "lineno": 12, "taskName": null, "level": "info"}
{"msg": "stepA processed: Input Text Here", "@timestamp": "2025-11-27T08:16:28.189087Z", "filename": "pipeline_step.py", "stack_info": null, "lineno": 15, "taskName": null, "level": "info"}
time="2025-11-27T08:16:28.803Z" level=info msg="sub-process exited" argo=true error="<nil>"
time="2025-11-27T08:16:29.698Z" level=info msg="Main container completed" error="<nil>"
time="2025-11-27T08:16:29.698Z" level=info msg="No Script output reference in workflow. Capturing script output ignored"
time="2025-11-27T08:16:29.698Z" level=info msg="No output parameters"
time="2025-11-27T08:16:29.698Z" level=info msg="No output artifacts"
time="2025-11-27T08:16:29.721Z" level=info msg="Alloc=17269 TotalAlloc=21042 Sys=30805 NumGC=3 Goroutines=8" | thread=8329666752
[INFO] 11:16:45.384670 Pipeline run monitoring... (elapsed 129.5s) | thread=8329666752
[INFO] 11:16:45.384850 Pipeline run status: 64001 (JOB_RUNNING) | thread=8329666752
[INFO] 11:16:45.384979 Pipeline run in progress: 64001 (JOB_RUNNING) | thread=8329666752
[INFO] 11:16:58.557830 [LOG] time="2025-11-27T08:16:40.798Z" level=info msg="Starting Workflow Executor" version=v3.6.2
time="2025-11-27T08:16:40.801Z" level=info msg="Using executor retry strategy" Duration=1s Factor=1.6 Jitter=0.5 Steps=5
time="2025-11-27T08:16:40.801Z" level=info msg="Executor initialized" deadline="0001-01-01 00:00:00 +0000 UTC" includeScriptOutput=false namespace=prod-alfrick podName=cl-****65fa78cdf145-cl-****82ec90ebac58-2572335262 templateName=cl-****82ec90ebac58 version="&Version{Version:v3.6.2,BuildDate:2024-12-02T14:13:35Z,GitCommit:********,GitTag:v3.6.2,GitTreeState:clean,GoVersion:go1.23.3,Compiler:gc,Platform:linux/amd64,}"
time="2025-11-27T08:16:40.814Z" level=info msg="Starting deadline monitor"
{"msg": "11.10.2", "@timestamp": "2025-11-27T08:16:42.679105Z", "filename": "pipeline_step.py", "stack_info": null, "lineno": 12, "taskName": null, "level": "info"}
{"msg": "stepB processed: Input Text Here", "@timestamp": "2025-11-27T08:16:42.679306Z", "filename": "pipeline_step.py", "stack_info": null, "lineno": 15, "taskName": null, "level": "info"}
time="2025-11-27T08:16:43.308Z" level=info msg="sub-process exited" argo=true error="<nil>"
time="2025-11-27T08:16:43.815Z" level=info msg="Main container completed" error="<nil>"
time="2025-11-27T08:16:43.815Z" level=info msg="No Script output reference in workflow. Capturing script output ignored"
time="2025-11-27T08:16:43.815Z" level=info msg="No output parameters"
time="2025-11-27T08:16:43.815Z" level=info msg="No output artifacts"
time="2025-11-27T08:16:43.890Z" level=info msg="Alloc=17164 TotalAlloc=20956 Sys=30805 NumGC=3 Goroutines=8"
time="2025-11-27T08:16:43.897Z" level=info msg="Deadline monitor stopped"
time="2025-11-27T08:16:43.898Z" level=info msg="stopping progress monitor (context done)" error="context canceled" | thread=8329666752
[INFO] 11:16:58.558881 Pipeline run monitoring... (elapsed 142.7s) | thread=8329666752
[INFO] 11:16:58.559035 Pipeline run status: 64001 (JOB_RUNNING) | thread=8329666752
[INFO] 11:16:58.559159 Pipeline run in progress: 64001 (JOB_RUNNING) | thread=8329666752
[INFO] 11:17:10.044546 Pipeline run monitoring... (elapsed 154.2s) | thread=8329666752
[INFO] 11:17:10.044669 Pipeline run status: 64002 (JOB_COMPLETED) | thread=8329666752
[INFO] 11:17:10.044715 Pipeline run completed successfully! | thread=8329666752
{
"status": "success",
"pipeline_version_run": {
"id": "e8c41c0b6d334daba757a73e99ef0e2d",
"pipeline_version": {
"id": "48862434906f482e94a2ec638a4233a1",
"app_id": "pipelines-1",
"user_id": "alfrick",
"orchestration_spec": {
"argo_orchestration_spec": {
"api_version": "argoproj.io/v1alpha1",
"spec_json": "{\"apiVersion\": \"argoproj.io/v1alpha1\", \"kind\": \"Workflow\", \"spec\": {\"entrypoint\": \"sequence\", \"arguments\": {\"parameters\": [{\"name\": \"input_text\", \"value\": \"Input Text Here\"}]}, \"templates\": [{\"name\": \"sequence\", \"steps\": [[{\"name\": \"step-0\", \"templateRef\": {\"name\": \"users/alfrick/apps/pipelines-1/pipeline_steps/stepA/versions/36e752e546334fa28d73bcbdc86d37a7\", \"template\": \"users/alfrick/apps/pipelines-1/pipeline_steps/stepA/versions/36e752e546334fa28d73bcbdc86d37a7\"}, \"arguments\": {\"parameters\": [{\"name\": \"input_text\", \"value\": \"{{workflow.parameters.input_text}}\"}]}}], [{\"name\": \"step-1\", \"templateRef\": {\"name\": \"users/alfrick/apps/pipelines-1/pipeline_steps/stepB/versions/5e93c74ef8ae456ab353aa5e60e46f97\", \"template\": \"users/alfrick/apps/pipelines-1/pipeline_steps/stepB/versions/5e93c74ef8ae456ab353aa5e60e46f97\"}, \"arguments\": {\"parameters\": [{\"name\": \"input_text\", \"value\": \"{{workflow.parameters.input_text}}\"}]}}]]}]}}"
}
},
"pipeline_id": "hello-world-pipeline",
"created_at": "2025-11-26T14:15:51.680568Z",
"modified_at": "2025-11-26T14:15:51.680568Z"
},
"nodepools": [
{
"id": "advanced-nodepool-y43h",
"created_at": "2025-11-27T06:02:02.006900Z",
"modified_at": "2025-11-27T06:02:02.006900Z",
"node_capacity_type": {
"capacity_types": [
"ON_DEMAND_TYPE"
]
},
"instance_types": [
{
"id": "g6e.xlarge",
"description": "g6e.xlarge",
"compute_info": {
"cpu_memory": "29033Mi",
"num_accelerators": 1,
"accelerator_memory": "46068Mi",
"accelerator_type": [
"NVIDIA-L40S"
],
"cpu_limit": "3535m"
},
"price": "65.000000",
"cloud_provider": {
"id": "aws",
"name": "aws"
},
"region": "us-east-1",
"feature_flag_group": "ComputeResourceMedium"
}
],
"max_instances": 1,
"visibility": {
"gettable": "PRIVATE"
},
"enforced_max_instances": 1
}
],
"orchestration_status": {
"argo_status": {
"status": "{\"phase\":\"Succeeded\",\"startedAt\":\"2025-11-27T08:14:41Z\",\"finishedAt\":\"2025-11-27T08:16:48Z\",\"progress\":\"2/2\",\"nodes\":{\"cl-71575952e4355045ad2965fa78cdf145\":{\"id\":\"cl-71575952e4355045ad2965fa78cdf145\",\"name\":\"cl-71575952e4355045ad2965fa78cdf145\",\"displayName\":\"cl-71575952e4355045ad2965fa78cdf145\",\"type\":\"Steps\",\"templateName\":\"sequence\",\"templateScope\":\"local/cl-71575952e4355045ad2965fa78cdf145\",\"phase\":\"Succeeded\",\"startedAt\":\"2025-11-27T08:14:41Z\",\"finishedAt\":\"2025-11-27T08:16:48Z\",\"progress\":\"2/2\",\"resourcesDuration\":{\"cpu\":1,\"memory\":26},\"children\":[\"cl-71575952e4355045ad2965fa78cdf145-3359065953\"],\"outboundNodes\":[\"cl-71575952e4355045ad2965fa78cdf145-2572335262\"]},\"cl-71575952e4355045ad2965fa78cdf145-2572335262\":{\"id\":\"cl-71575952e4355045ad2965fa78cdf145-2572335262\",\"name\":\"cl-71575952e4355045ad2965fa78cdf145[1].step-1\",\"displayName\":\"step-1\",\"type\":\"Pod\",\"templateRef\":{\"name\":\"cl-d0a20398d7e446f0512982ec90ebac58\",\"template\":\"cl-d0a20398d7e446f0512982ec90ebac58\"},\"templateScope\":\"local/cl-71575952e4355045ad2965fa78cdf145\",\"phase\":\"Succeeded\",\"boundaryID\":\"cl-71575952e4355045ad2965fa78cdf145\",\"startedAt\":\"2025-11-27T08:16:38Z\",\"finishedAt\":\"2025-11-27T08:16:43Z\",\"progress\":\"1/1\",\"resourcesDuration\":{\"cpu\":0,\"memory\":8},\"inputs\":{\"parameters\":[{\"name\":\"input_text\",\"default\":\"\",\"value\":\"Input Text Here\",\"description\":\"Text input for processing\"},{\"name\":\"CLARIFAI_API_BASE\",\"value\":\"api.clarifai.com:443\"},{\"name\":\"CLARIFAI_USER_ID\",\"value\":\"alfrick\"},{\"name\":\"CLARIFAI_COMPUTE_CLUSTER_ID\",\"value\":\"advanced-cluster-b4z7\"},{\"name\":\"CLARIFAI_NODEPOOL_ID\",\"value\":\"advanced-nodepool-y43h\"},{\"name\":\"CLARIFAI_PIPELINE_VER_RUN_ID\",\"value\":\"e8c41c0b6d334daba757a73e99ef0e2d\"}]},\"outputs\":{\"exitCode\":\"0\"},\"hostNodeName\":\"ip-10-7-183-105.ec2.internal\"},\"cl-71575952e4355045ad2965fa78cdf145-3292102572\":{\"id\":\"cl-71575952e4355045ad2965fa78cdf145-3292102572\",\"name\":\"cl-71575952e4355045ad2965fa78cdf145[1]\",\"displayName\":\"[1]\",\"type\":\"StepGroup\",\"templateScope\":\"local/cl-71575952e4355045ad2965fa78cdf145\",\"phase\":\"Succeeded\",\"boundaryID\":\"cl-71575952e4355045ad2965fa78cdf145\",\"startedAt\":\"2025-11-27T08:16:38Z\",\"finishedAt\":\"2025-11-27T08:16:48Z\",\"progress\":\"1/1\",\"resourcesDuration\":{\"cpu\":0,\"memory\":8},\"nodeFlag\":{},\"children\":[\"cl-71575952e4355045ad2965fa78cdf145-2572335262\"]},\"cl-71575952e4355045ad2965fa78cdf145-3359065953\":{\"id\":\"cl-71575952e4355045ad2965fa78cdf145-3359065953\",\"name\":\"cl-71575952e4355045ad2965fa78cdf145[0]\",\"displayName\":\"[0]\",\"type\":\"StepGroup\",\"templateScope\":\"local/cl-71575952e4355045ad2965fa78cdf145\",\"phase\":\"Succeeded\",\"boundaryID\":\"cl-71575952e4355045ad2965fa78cdf145\",\"startedAt\":\"2025-11-27T08:14:41Z\",\"finishedAt\":\"2025-11-27T08:16:38Z\",\"progress\":\"2/2\",\"resourcesDuration\":{\"cpu\":1,\"memory\":26},\"nodeFlag\":{},\"children\":[\"cl-71575952e4355045ad2965fa78cdf145-4115062186\"]},\"cl-71575952e4355045ad2965fa78cdf145-4115062186\":{\"id\":\"cl-71575952e4355045ad2965fa78cdf145-4115062186\",\"name\":\"cl-71575952e4355045ad2965fa78cdf145[0].step-0\",\"displayName\":\"step-0\",\"type\":\"Pod\",\"templateRef\":{\"name\":\"cl-4495e1a8eea2dae2dad6faeb11d525aa\",\"template\":\"cl-4495e1a8eea2dae2dad6faeb11d525aa\"},\"templateScope\":\"local/cl-71575952e4355045ad2965fa78cdf145\",\"phase\":\"Succeeded\",\"boundaryID\":\"cl-71575952e4355045ad2965fa78cdf145\",\"startedAt\":\"2025-11-27T08:14:41Z\",\"finishedAt\":\"2025-11-27T08:16:29Z\",\"progress\":\"1/1\",\"resourcesDuration\":{\"cpu\":1,\"memory\":18},\"inputs\":{\"parameters\":[{\"name\":\"input_text\",\"default\":\"\",\"value\":\"Input Text Here\",\"description\":\"Text input for processing\"},{\"name\":\"CLARIFAI_API_BASE\",\"value\":\"api.clarifai.com:443\"},{\"name\":\"CLARIFAI_USER_ID\",\"value\":\"alfrick\"},{\"name\":\"CLARIFAI_COMPUTE_CLUSTER_ID\",\"value\":\"advanced-cluster-b4z7\"},{\"name\":\"CLARIFAI_NODEPOOL_ID\",\"value\":\"advanced-nodepool-y43h\"},{\"name\":\"CLARIFAI_PIPELINE_VER_RUN_ID\",\"value\":\"e8c41c0b6d334daba757a73e99ef0e2d\"}]},\"outputs\":{\"exitCode\":\"0\"},\"children\":[\"cl-71575952e4355045ad2965fa78cdf145-3292102572\"],\"hostNodeName\":\"ip-10-7-183-105.ec2.internal\"}},\"storedTemplates\":{\"namespaced/cl-4495e1a8eea2dae2dad6faeb11d525aa/cl-4495e1a8eea2dae2dad6faeb11d525aa\":{\"name\":\"cl-4495e1a8eea2dae2dad6faeb11d525aa\",\"inputs\":{\"parameters\":[{\"name\":\"input_text\",\"default\":\"\",\"description\":\"Text input for processing\"},{\"name\":\"CLARIFAI_API_BASE\"},{\"name\":\"CLARIFAI_USER_ID\"},{\"name\":\"CLARIFAI_COMPUTE_CLUSTER_ID\"},{\"name\":\"CLARIFAI_NODEPOOL_ID\"},{\"name\":\"CLARIFAI_PIPELINE_VER_RUN_ID\"}]},\"outputs\":{},\"metadata\":{},\"container\":{\"name\":\"\",\"image\":\"data.clarifai.com/users/alfrick/pipeline_step_versions/36e752e546334fa28d73bcbdc86d37a7/image\",\"command\":[\"python\",\"/home/nonroot/main/1/pipeline_step.py\"],\"args\":[\"--input_text\",\"{{inputs.parameters.input_text}}\"],\"env\":[{\"name\":\"CLARIFAI_API_BASE\",\"value\":\"{{inputs.parameters.CLARIFAI_API_BASE}}\"},{\"name\":\"CLARIFAI_USER_ID\",\"value\":\"{{inputs.parameters.CLARIFAI_USER_ID}}\"},{\"name\":\"CLARIFAI_COMPUTE_CLUSTER_ID\",\"value\":\"{{inputs.parameters.CLARIFAI_COMPUTE_CLUSTER_ID}}\"},{\"name\":\"CLARIFAI_NODEPOOL_ID\",\"value\":\"{{inputs.parameters.CLARIFAI_NODEPOOL_ID}}\"},{\"name\":\"CLARIFAI_PIPELINE_VER_RUN_ID\",\"value\":\"{{inputs.parameters.CLARIFAI_PIPELINE_VER_RUN_ID}}\"},{\"name\":\"CLARIFAI_PAT\",\"valueFrom\":{\"secretKeyRef\":{\"name\":\"cl-a021ef5e66fb7933a00b83f07a07a835\",\"key\":\"clarifaiPAT\"}}}],\"resources\":{\"limits\":{\"cpu\":\"500m\",\"memory\":\"500Mi\"},\"requests\":{\"cpu\":\"500m\",\"memory\":\"500Mi\"}}}},\"namespaced/cl-d0a20398d7e446f0512982ec90ebac58/cl-d0a20398d7e446f0512982ec90ebac58\":{\"name\":\"cl-d0a20398d7e446f0512982ec90ebac58\",\"inputs\":{\"parameters\":[{\"name\":\"input_text\",\"default\":\"\",\"description\":\"Text input for processing\"},{\"name\":\"CLARIFAI_API_BASE\"},{\"name\":\"CLARIFAI_USER_ID\"},{\"name\":\"CLARIFAI_COMPUTE_CLUSTER_ID\"},{\"name\":\"CLARIFAI_NODEPOOL_ID\"},{\"name\":\"CLARIFAI_PIPELINE_VER_RUN_ID\"}]},\"outputs\":{},\"metadata\":{},\"container\":{\"name\":\"\",\"image\":\"data.clarifai.com/users/alfrick/pipeline_step_versions/5e93c74ef8ae456ab353aa5e60e46f97/image\",\"command\":[\"python\",\"/home/nonroot/main/1/pipeline_step.py\"],\"args\":[\"--input_text\",\"{{inputs.parameters.input_text}}\"],\"env\":[{\"name\":\"CLARIFAI_API_BASE\",\"value\":\"{{inputs.parameters.CLARIFAI_API_BASE}}\"},{\"name\":\"CLARIFAI_USER_ID\",\"value\":\"{{inputs.parameters.CLARIFAI_USER_ID}}\"},{\"name\":\"CLARIFAI_COMPUTE_CLUSTER_ID\",\"value\":\"{{inputs.parameters.CLARIFAI_COMPUTE_CLUSTER_ID}}\"},{\"name\":\"CLARIFAI_NODEPOOL_ID\",\"value\":\"{{inputs.parameters.CLARIFAI_NODEPOOL_ID}}\"},{\"name\":\"CLARIFAI_PIPELINE_VER_RUN_ID\",\"value\":\"{{inputs.parameters.CLARIFAI_PIPELINE_VER_RUN_ID}}\"},{\"name\":\"CLARIFAI_PAT\",\"valueFrom\":{\"secretKeyRef\":{\"name\":\"cl-a021ef5e66fb7933a00b83f07a07a835\",\"key\":\"clarifaiPAT\"}}}],\"resources\":{\"limits\":{\"cpu\":\"500m\",\"memory\":\"500Mi\"},\"requests\":{\"cpu\":\"500m\",\"memory\":\"500Mi\"}}}}},\"conditions\":[{\"type\":\"PodRunning\",\"status\":\"False\"},{\"type\":\"Completed\",\"status\":\"True\"}],\"resourcesDuration\":{\"cpu\":1,\"memory\":26},\"artifactRepositoryRef\":{\"default\":true,\"artifactRepository\":{}},\"artifactGCStatus\":{\"notSpecified\":true},\"taskResultsCompletionStatus\":{\"cl-71575952e4355045ad2965fa78cdf145-2572335262\":true,\"cl-71575952e4355045ad2965fa78cdf145-4115062186\":true}}"
},
"status": {
"code": "JOB_COMPLETED",
"description": "Argo workflow at phase: Succeeded, progress: 2/2, message: "
}
},
"user_id": "user-id",
"app_id": "pipelines-1",
"created_at": "2025-11-27T08:14:36.197820Z",
"modified_at": "2025-11-27T08:17:00.924518Z",
"orchestration_spec": {
"argo_orchestration_spec": {
"api_version": "argoproj.io/v1alpha1",
"spec_json": "{\"apiVersion\": \"argoproj.io/v1alpha1\", \"kind\": \"Workflow\", \"spec\": {\"entrypoint\": \"sequence\", \"arguments\": {\"parameters\": [{\"name\": \"input_text\", \"value\": \"Input Text Here\"}]}, \"templates\": [{\"name\": \"sequence\", \"steps\": [[{\"name\": \"step-0\", \"templateRef\": {\"name\": \"users/alfrick/apps/pipelines-1/pipeline_steps/stepA/versions/36e752e546334fa28d73bcbdc86d37a7\", \"template\": \"users/alfrick/apps/pipelines-1/pipeline_steps/stepA/versions/36e752e546334fa28d73bcbdc86d37a7\"}, \"arguments\": {\"parameters\": [{\"name\": \"input_text\", \"value\": \"{{workflow.parameters.input_text}}\"}]}}], [{\"name\": \"step-1\", \"templateRef\": {\"name\": \"users/alfrick/apps/pipelines-1/pipeline_steps/stepB/versions/5e93c74ef8ae456ab353aa5e60e46f97\", \"template\": \"users/alfrick/apps/pipelines-1/pipeline_steps/stepB/versions/5e93c74ef8ae456ab353aa5e60e46f97\"}, \"arguments\": {\"parameters\": [{\"name\": \"input_text\", \"value\": \"{{workflow.parameters.input_text}}\"}]}}]]}]}}"
}
}
}
}
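If you capture this JSON response in a script, you can check the run outcome programmatically. Below is a minimal sketch that extracts the final status code from the response; the snippet uses a trimmed-down copy of the example response above (only the fields needed for the check), so the exact nesting matches what the API returned in this walkthrough:

```python
import json

# Trimmed copy of the example response above, keeping only the
# fields needed to determine the run outcome.
response_json = """
{
  "status": "success",
  "pipeline_version_run": {
    "id": "e8c41c0b6d334daba757a73e99ef0e2d",
    "orchestration_status": {
      "status": {
        "code": "JOB_COMPLETED",
        "description": "Argo workflow at phase: Succeeded, progress: 2/2, message: "
      }
    }
  }
}
"""

run = json.loads(response_json)["pipeline_version_run"]
code = run["orchestration_status"]["status"]["code"]

if code == "JOB_COMPLETED":
    print(f"Run {run['id']} finished successfully")
else:
    print(f"Run {run['id']} ended with status {code}")
```

A status code of `JOB_COMPLETED` (64002 in the monitoring logs above) indicates the run succeeded, while `JOB_RUNNING` (64001) means it is still in progress.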
Run Command Options
The table below summarizes the options available with the `clarifai pipeline run` command.

| Option | Type | Description | Defaults / Notes |
|---|---|---|---|
| **Targeting & Identity** | | | |
| `--pipeline_id` | TEXT | ID of the pipeline to execute | |
| `--user_id` | TEXT | User ID owning the pipeline | |
| `--app_id` | TEXT | App ID containing the pipeline | |
| `--pipeline_version_id` | TEXT | Specific version of the pipeline to run | |
| `--pipeline_url` | TEXT | Full URL to the pipeline resource | Alternative to providing the IDs separately |
| **Execution Control** | | | |
| `--config` | PATH | Path to a local configuration file for the run | |
| `--pipeline_version_run_id` | TEXT | Custom ID for this specific execution run | A UUID is generated automatically if omitted |
| `--nodepool_id` | TEXT | Specific nodepool to execute on | |
| `--compute_cluster_id` | TEXT | Specific compute cluster to execute on | |
| **Inputs & Parameters** | | | |
| `--set` | TEXT | Override parameter values inline | Format: `key=value`. Can be used multiple times, e.g. `--set prompt="hello" --set temperature="0.7"` |
| `--overrides-file` | PATH | Path to a JSON/YAML file for bulk parameter overrides | |
| **Monitoring & Logging** | | | |
| `--monitor` | FLAG | Watch an existing run instead of starting a new one | Requires `--pipeline_version_run_id` |
| `--monitor_interval` | INTEGER | Frequency of status checks | Default: 10 seconds |
| `--timeout` | INTEGER | Maximum time to wait for completion | Default: 3600 seconds (1 hour) |
| `--log_file` | PATH | File path to save execution logs | If omitted, logs are printed to the console |
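For example, the options above can be combined as follows. The IDs shown (`YOUR_USER_ID`, `YOUR_APP_ID`, cluster, nodepool, and run IDs) are placeholders — substitute the values from your own account:

```shell
# Start a new run, overriding a pipeline parameter and saving logs to a file
clarifai pipeline run \
  --pipeline_id hello-world-pipeline \
  --user_id YOUR_USER_ID \
  --app_id YOUR_APP_ID \
  --compute_cluster_id YOUR_CLUSTER_ID \
  --nodepool_id YOUR_NODEPOOL_ID \
  --set input_text="Input Text Here" \
  --log_file pipeline-run.log

# Re-attach to an existing run and watch it until completion,
# polling every 15 seconds instead of the default 10
clarifai pipeline run \
  --pipeline_id hello-world-pipeline \
  --user_id YOUR_USER_ID \
  --app_id YOUR_APP_ID \
  --monitor \
  --pipeline_version_run_id YOUR_RUN_ID \
  --monitor_interval 15
```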
That's it!