Develop ML pipelines locally that run on any MLOps stack

Automate, track, and version your entire ML lifecycle - without the complexity of cloud and tool integrations
Free 14-day Trial - No card required

Trusted by 1,000s of top companies to power their MLOps platforms

Airbus · Brevo · DeepL · Frontiers · Hemato · Ikea · Mercado Libre · Playtika · Rivian · Riverbank · Walmart · Wayflyer
Why ZenML?

Reduce complexity and standardize your ML workflows

ZenML unifies ML workflows into one streamlined, integrated platform.
Without ZenML
With ZenML
Features

Beautifully structured workflows

Modular code for production-ready ML projects from day one.
Dependency management and automated dockerization (see the sketch below).
Simple templates for common ML paradigms.
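For illustration, here is a rough sketch of what in-code dependency management and dockerization can look like (assuming a recent ZenML release; the pipeline name and package list are placeholders, not part of the original copy):

```python
# Sketch: declare the Python requirements that ZenML should bake into the
# Docker image it builds when this pipeline runs on a remote stack.
from zenml import pipeline
from zenml.config import DockerSettings

docker_settings = DockerSettings(requirements=["scikit-learn", "pandas"])

@pipeline(settings={"docker": docker_settings})
def training_pipeline():
    ...  # steps go here
```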

Reproducible machine learning

Automated data versioning of common ML artifacts (pandas, Polars, PyTorch datasets, etc.)
Reading from and writing to common data stores - with no code changes!
Automated lineage and provenance (see the sketch below)
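As a minimal sketch of what this looks like in practice (step and pipeline names are illustrative, and a recent ZenML release is assumed): the DataFrame returned by one step is stored and versioned automatically, so any step that consumes it is tied to that exact artifact version.

```python
import pandas as pd
from zenml import step, pipeline

@step
def load_data() -> pd.DataFrame:
    # The returned DataFrame is saved and versioned automatically as an artifact.
    return pd.DataFrame({"feature": [1.0, 2.0, 3.0], "label": [0, 1, 0]})

@step
def summarize(df: pd.DataFrame) -> float:
    # This step records which artifact version it consumed (lineage/provenance).
    return float(df["label"].mean())

@pipeline
def data_pipeline():
    summarize(load_data())

if __name__ == "__main__":
    data_pipeline()
```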

No MLOps infrastructure hassle

Deploys on any machine learning infrastructure, such as AWS SageMaker, Google Vertex AI, or AzureML.
Model lifecycle management with integration into popular experiment tracking and model registry tools like MLflow, Weights & Biases, and Neptune.
50+ Integrations with the most popular cloud and open-source tools

Built-in security best practices

Separate config from code with modular settings
Centralized credentials management
Secret management integrated with Vault, GCP Secret Store, AWS Secrets Manager, and more

Fully compliant data science

Comes with a beautiful UI for full observability and collaboration between teams.
Set up data quality gates and embed visualizations in your reports
Create alerts and monitoring for your machine learning pipelines.
Product

How ZenML Works

pip install zenml is all you need
Python script that shows what a ZenML step looks like

Start Local With Python

Give your Python functions superpowers by adding a simple decorator (see the example below).
Run everything locally - no complex infrastructure setup until it's needed.
No restrictions - use the existing ML tools you know and love.
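A minimal sketch of the decorator-based workflow described above (assuming a recent ZenML release; the function and pipeline names are illustrative):

```python
from zenml import step, pipeline

@step
def train_model(learning_rate: float) -> str:
    # Your existing training code goes here, unchanged.
    return f"trained a model with lr={learning_rate}"

@pipeline
def my_pipeline(learning_rate: float = 0.01):
    train_model(learning_rate)

if __name__ == "__main__":
    # Runs on the default local stack - no extra infrastructure needed.
    my_pipeline()
```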

Connect Your MLOps Stack

Architect your own MLOps stack - with pre-made integrations or by creating your own.
Switch between local, staging, dev, and production environments easily (see the sketch below).
Data scientists write data science code, while engineers set up the MLOps infrastructure.
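As a rough sketch of how switching stacks typically works (the stack and component names below are illustrative, and the CLI commands shown in the comments are the usual way to register and activate a stack - your setup may differ):

```python
# The pipeline definition stays exactly the same; only the active stack
# decides where it runs. Registering and activating a remote stack is
# usually done once via the CLI, for example:
#
#   zenml stack register cloud_stack -o <orchestrator> -a <artifact-store>
#   zenml stack set cloud_stack
#
from zenml import pipeline

@pipeline
def my_pipeline():
    ...  # the same steps you ran locally

if __name__ == "__main__":
    my_pipeline()  # now executes on whichever stack is currently active
```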

Run in Production

Run the same code on any stack.
Organize data, models, and code across your tooling stack, all in one place.
Establish one source of truth for all ML activities.
Support

Frequently asked questions

Everything you need to know about the product.
What is the difference between ZenML and other machine learning orchestrators?
Unlike other machine learning pipeline frameworks, ZenML is not opinionated about the orchestration layer. You start writing locally, and then deploy your pipeline on an orchestrator defined in your MLOps stack. ZenML supports many orchestrators natively and can easily be extended to other orchestrators. Read more about why you might want to write your machine learning pipelines in a platform-agnostic way here.
Does ZenML integrate with my MLOps stack (cloud, ML libraries, other tools etc.)?
As long as you're working in Python, you can leverage the entire ecosystem. In terms of machine learning infrastructure, ZenML pipelines can already be deployed on Kubernetes, AWS SageMaker, GCP Vertex AI, Kubeflow, Apache Airflow, and many more. Artifact, secrets, and container storage are also supported for all major cloud providers.
Does ZenML help in GenAI / LLMOps use-cases?
Yes! ZenML is fully compatible with and intended to be used to productionize LLM applications. There are examples in the ZenML projects repository that showcase our integrations with Llama Index, OpenAI, and Langchain. Check them out here!
How can I build my MLOps/LLMOps platform using ZenML?
The best way is to start simple. The starter and production guides walk you through how to build a minimal cloud MLOps stack. You can then extend it with numerous other components such as experiment trackers, model deployers, model registries, and more!
What is the difference between the open source and cloud product?
ZenML is and always will be open-source at its heart. The core framework is freely available on GitHub, and you can run and manage it in-house without using the cloud product. The cloud product, on the other hand, offers one of the best experiences of using ZenML: it includes a managed version of the OSS product, plus cloud-only features that create the best collaborative experience for companies scaling their ML efforts. You can see a more detailed comparison here.
Still not clear?
Ask us on Slack
HashiCorp
"ZenML offers the capability to build end-to-end ML workflows that seamlessly integrate with various components of the ML stack, such as different providers, data stores, and orchestrators. This enables teams to accelerate their time to market by bridging the gap between data scientists and engineers, while ensuring consistent implementation regardless of the underlying technology."
Harold Giménez
SVP R&D at HashiCorp
Stanford University
"Many, many teams still struggle with managing models, datasets, code, and monitoring as they deploy ML models into production. ZenML provides a solid toolkit for making that easy in the Python ML world"
Chris Manning
Professor of Linguistics and CS at Stanford
MadeWithML
"ZenML allows you to quickly and responsibly go from POC to production ML systems while enabling reproducibility, flexibitiliy, and above all, sanity."
Goku Mohandas
Founder of MadeWithML
Salesforce
"ZenML allows orchestrating ML pipelines independent of any infrastructure or tooling choices. ML teams can free their minds of tooling FOMO from the fast-moving MLOps space, with the simple and extensible ZenML interface. No more vendor lock-in, or massive switching costs!"
Richard Socher
Former Chief Scientist at Salesforce and Founder of You.com
WiseTech Global
"Thanks to ZenML we've set up a pipeline where before we had only Jupyter notebooks. It helped us tremendously with data and model versioning, and we really look forward to any future improvements!"
Francesco Pudda
Machine Learning Engineer at WiseTech Global

Start Your Free Trial Now

No new paradigms - Bring your own tools and infrastructure
No data leaves your servers; we only track metadata
Free trial included - no strings attached, cancel anytime