Integrations
Kubernetes

Seamlessly Orchestrate ML Pipelines on Kubernetes with ZenML

Leverage the power of Kubernetes to orchestrate and scale your machine learning workflows using ZenML's Kubernetes integration. This lightweight, minimalist orchestrator enables you to run ML pipelines on Kubernetes clusters without the complexity of managing additional frameworks like Kubeflow.

Features with ZenML

  • Orchestrate ZenML pipelines on Kubernetes with minimal configuration
  • Scale ML workloads seamlessly across Kubernetes clusters
  • Easily transition from local to distributed orchestration (see the stack sketch after this list)
  • Ideal for teams new to distributed ML looking for a lightweight solution
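
The local-to-distributed transition mentioned above happens at the stack level: once a Kubernetes orchestrator is registered (see the steps further down), you switch between a local stack and a Kubernetes-backed stack without touching pipeline code. Below is a minimal sketch, assuming a stack named k8s_stack and placeholder names for the orchestrator, artifact store, and container registry components:

# Assemble a stack around the Kubernetes orchestrator (component names are placeholders)
>>> zenml stack register k8s_stack \
    -o <ORCHESTRATOR_NAME> \
    -a <ARTIFACT_STORE_NAME> \
    -c <CONTAINER_REGISTRY_NAME>

# Switch to the Kubernetes stack for distributed runs ...
>>> zenml stack set k8s_stack

# ... and back to the default local stack for quick iteration
>>> zenml stack set default

Because pipeline code is stack-agnostic, the same Python entrypoint runs unchanged on either stack.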

Main Features

  • Automated container orchestration and scaling
  • Declarative configuration using YAML files
  • Support for GPU-accelerated workloads (see the sketch after this list)
  • Rich ecosystem of tools and extensions
  • Industry standard for managing containerized applications
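
GPU scheduling is typically expressed through the same pod settings used for CPU and memory in the walkthrough below, for example by requesting the nvidia.com/gpu resource and pinning pods to GPU nodes. Here is a minimal sketch, assuming a cluster whose GPU nodes expose the standard nvidia.com/gpu resource and carry a hypothetical gpu: "true" node label (both are cluster-specific assumptions):

from zenml.integrations.kubernetes.flavors.kubernetes_orchestrator_flavor import (
    KubernetesOrchestratorSettings,
)

# Assumptions: GPU nodes expose "nvidia.com/gpu" (NVIDIA device plugin)
# and are labeled "gpu: true"; adjust both to match your cluster.
gpu_settings = KubernetesOrchestratorSettings(
    pod_settings={
        "resources": {
            "limits": {"nvidia.com/gpu": "1"},
        },
        "node_selectors": {"gpu": "true"},
    },
)

These settings can be passed to the @pipeline decorator (or an individual @step) exactly like the kubernetes_settings object in the walkthrough below.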

How to use ZenML with Kubernetes

# Step 1: Register a new Kubernetes orchestrator
>>> zenml orchestrator register <ORCHESTRATOR_NAME> \
    --flavor=kubernetes
    
# Step 2: Authenticate the Kubernetes orchestrator
# Option 1 (recommended): Service Connector
>>> zenml orchestrator connect <ORCHESTRATOR_NAME> --connector <CONNECTOR_NAME>
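
# (Sketch) The <CONNECTOR_NAME> referenced above has to exist first. Assuming
# the Kubernetes Service Connector is auto-configured from your active local
# kubeconfig context, it can be registered like this:
>>> zenml service-connector register <CONNECTOR_NAME> \
    --type kubernetes \
    --auto-configure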

# Option 2 (not recommended): Explicit authentication
>>> zenml orchestrator register <ORCHESTRATOR_NAME> \
    --flavor=kubernetes \
    --kubernetes_context=<KUBERNETES_CONTEXT>

# Step 3: Update your stack to use the Kubernetes orchestrator
>>> zenml stack update -o <ORCHESTRATOR_NAME>

from zenml import step, pipeline
from zenml.integrations.kubernetes.flavors.kubernetes_orchestrator_flavor import (
    KubernetesOrchestratorSettings,
)

kubernetes_settings = KubernetesOrchestratorSettings(
    pod_settings={
        "resources": {
            "requests": {"cpu": "1", "memory": "1Gi"},
            "limits": {"cpu": "2", "memory": "2Gi"},
        },
        "labels": {
            "app": "ml-pipeline",
            "environment": "production",
            "team": "mlops",
        },
    },
    orchestrator_pod_settings={
        "resources": {
            "requests": {"cpu": "1", "memory": "1Gi"},
            "limits": {"cpu": "2", "memory": "2Gi"},
        },
        "labels": {
            "app": "zenml-orchestrator",
            "component": "pipeline-runner",
            "team": "mlops",
        },
    },
)


@step
def load_data() -> dict:
    # Load data here
    return {1: [1, 2], 2: [3, 4]}


@step
def preprocess_data(raw_data: dict) -> dict:
    # Preprocess data here
    return {k: v * 2 for k, v in raw_data.items()}


@pipeline(
    settings={
        "orchestrator.kubernetes": kubernetes_settings,
    }
)
def my_kubernetes_pipeline():
    # Pipeline steps here
    raw_data = load_data()
    preprocess_data(raw_data)


if __name__ == "__main__":
    my_kubernetes_pipeline()
    

This code snippet demonstrates how to configure a ZenML pipeline to run on a Kubernetes cluster. The KubernetesOrchestratorSettings object specifies the Kubernetes-specific settings for the pipeline, such as pod resources and labels, and is passed to the @pipeline decorator via the settings parameter. The pipeline steps themselves are defined as usual.
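
Settings do not have to be applied pipeline-wide: the same kind of object can also be attached to a single step, which is useful when only one step needs larger pods. A minimal sketch reusing the pattern above (the train_model step and its resource numbers are illustrative placeholders):

from zenml import step
from zenml.integrations.kubernetes.flavors.kubernetes_orchestrator_flavor import (
    KubernetesOrchestratorSettings,
)

# Larger pods only for this one step; all other steps keep the
# pipeline-level settings shown above.
train_settings = KubernetesOrchestratorSettings(
    pod_settings={
        "resources": {
            "requests": {"cpu": "4", "memory": "8Gi"},
            "limits": {"cpu": "8", "memory": "16Gi"},
        },
    },
)


@step(settings={"orchestrator.kubernetes": train_settings})
def train_model(data: dict) -> dict:
    # Placeholder training logic
    return data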

Additional Resources
ZenML Kubernetes Integration Documentation
Kubernetes Documentation


Start Your Free Trial Now

  • No new paradigms - Bring your own tools and infrastructure
  • No data leaves your servers, we only track metadata
  • Free trial included - no strings attached, cancel anytime

Connect Your ML Pipelines to a World of Tools

Expand your ML pipelines with Apache Airflow and 50+ other ZenML integrations
Google Cloud Storage (GCS)
TensorBoard
Evidently
Amazon S3
PyTorch Lightning
LightGBM
Seldon
Docker
TensorFlow
Argilla
Great Expectations