Ship ML models fast, your way

ZenML simplifies and standardizes your MLOps processes. Structure code in pipelines. Integrate your tools. Deploy to the cloud.

Trusted by 1,000s of top companies to standardize their MLOps workflows

Adeo, Airbus, Aisera, Bosch, Brevo, DeepL, Devoteam, EarthDaily Agro, Frontiers, Goodyear, Hemato, Ikea, Leroy Merlin, Mercado Libre, NielsenIQ, Playtika, Riverbank, Rivian, SymphonyAI, Telefonica, Walmart, Wayflyer, WiseTech Global
Why ZenML?

Go from manual chaos to a streamlined MLOps workflow

Without ZenML
With ZenML
Using ZenML

It's extremely simple to plug in ZenML

Just add Python decorators to your existing code and see the magic happen
Experiment tracker
Automatically track experiments in your experiment tracker
Pythonic
Return pythonic objects and have them versioned automatically
Track metadata
Track model metadata and lineage
Switch orchestrators
Switch easily between local and cloud orchestration
Data dependencies
Define data dependencies and modularize your entire codebase
@step(experiment_tracker="mlflow")
def read_data_from_snowflake(config: pydantic.BaseModel) -> pd.DataFrame:
    df = read_data(client.get_secret("snowflake_credentials"))
    mlflow.log_metric("data_shape", df.shape)
    return df

@step(
    settings={"resources": ResourceSettings(memory="240Gb")},
    model=Model(name="my_model", model_registry="mlflow"),
)
def my_trainer(df: pd.DataFrame) -> transformers.AutoModel:
    tokenizer, model = train_model(df)
    return model

@pipeline(
    active_stack="databricks_stack",
    on_failure=on_failure_hook,
)
def my_pipeline():
    df = read_data_from_snowflake()
    my_trainer(df)

my_pipeline()
Remove sensitive information
Remove sensitive information from your code
Infrastructure
Choose resources abstracted from infrastructure
Frameworks
Works with any framework, classical ML or LLMs
Alerts
Easily define alerts for observability
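
The decorator approach shown above can be illustrated with a minimal stand-in in plain Python. Note that `step`, `pipeline`, and `RUN_LOG` below are hypothetical toy implementations for illustration only, not ZenML's actual API; they only show how decorators can turn plain functions into tracked, composable pipeline steps:

```python
import functools

RUN_LOG = []  # toy stand-in for a metadata/tracking backend

def step(func):
    """Wrap a function so each call is recorded before returning its output."""
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        result = func(*args, **kwargs)
        RUN_LOG.append((func.__name__, type(result).__name__))
        return result
    return wrapper

def pipeline(func):
    """Mark a function as a pipeline entrypoint (a no-op in this sketch)."""
    return func

@step
def load_data():
    return [1, 2, 3]

@step
def train(data):
    return sum(data) / len(data)  # a stand-in "model"

@pipeline
def my_pipeline():
    data = load_data()
    return train(data)

print(my_pipeline())  # 2.0
print(RUN_LOG)        # [('load_data', 'list'), ('train', 'float')]
```

Because the wrapping happens at the decorator layer, the functions themselves stay plain Python; the real framework applies the same idea to add tracking, versioning, and orchestration.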
No compliance headaches

Your VPC, your data

ZenML is a metadata layer on top of your existing infrastructure, meaning all data and compute stays on your side.
A fully flexible framework

Build MLOps Your Way

Boost your MLOps with unmatched flexibility, whether you use our integrations or seamlessly integrate your own components.

Use standard MLOps integrations


Integrate your own stack

Hugging Face

Seamless Integrations

Leverage our expanding suite of integrations, built to work alongside your own custom components.

No Vendor Lock-In

Commit to innovation, not contracts. ZenML supports a robust ecosystem that evolves with your project's needs.

Bring your own stack

ZenML adapts to your existing tools, ensuring a smooth integration with your current workflow.
NEW! ZenML Studio

Integrate ZenML with our VSCode Extension

Work with your ZenML pipelines, stacks, and manage your server right in the comfort of your IDE

Looking to Get Ahead in MLOps & LLMOps?

Subscribe to the ZenML newsletter and receive regular product updates, tutorials, examples, and more.
We care about your data; read our privacy policy.
Support

Frequently asked questions

Everything you need to know about the product.
What is the difference between ZenML and other machine learning orchestrators?
Unlike other machine learning pipeline frameworks, ZenML does not take an opinion on the orchestration layer. You start writing locally, and then deploy your pipeline on an orchestrator defined in your MLOps stack. ZenML supports many orchestrators natively and can be easily extended to others. Read more about why you might want to write your machine learning pipelines in a platform-agnostic way here.
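The idea behind orchestrator-agnostic pipelines can be sketched in plain Python. The classes below are hypothetical stand-ins, not ZenML components: the pipeline definition stays the same, and only the execution strategy changes:

```python
from typing import Callable, List

class LocalOrchestrator:
    """Runs steps in-process, in order."""
    def run(self, steps: List[Callable]) -> list:
        return [s() for s in steps]

class LoggingOrchestrator:
    """Toy stand-in for a remote backend: same contract, different execution."""
    def run(self, steps: List[Callable]) -> list:
        results = []
        for s in steps:
            print(f"submitting {s.__name__} to remote backend...")
            results.append(s())  # a real backend would ship this elsewhere
        return results

def extract():
    return [3, 1, 2]

def train():
    return "model-v1"

pipeline_steps = [extract, train]

# Same pipeline definition, different orchestrator: only the strategy changes.
local = LocalOrchestrator().run(pipeline_steps)
remote = LoggingOrchestrator().run(pipeline_steps)
assert local == remote
```

Because the steps never reference the execution environment directly, swapping the orchestrator is a configuration change rather than a code change.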
Does ZenML integrate with my MLOps stack (cloud, ML libraries, other tools etc.)?
As long as you're working in Python, you can leverage the entire ecosystem. In terms of machine learning infrastructure, ZenML pipelines can already be deployed on Kubernetes, AWS SageMaker, GCP Vertex AI, Kubeflow, Apache Airflow, and many more. Artifact, secrets, and container storage are also supported for all major cloud providers.
Does ZenML help in GenAI / LLMOps use-cases?
Yes! ZenML is fully compatible and is intended to be used to productionize LLM applications. There are examples in the ZenML projects repository that showcase our integrations with Llama Index, OpenAI, and LangChain. Check them out here!
How can I build my MLOps/LLMOps platform using ZenML?
The best way is to start simple. The starter and production guides walk you through how to build a minimal cloud MLOps stack. You can then extend it with numerous other components such as experiment trackers, model deployers, model registries, and more!
What is the difference between the open source and Pro product?
ZenML is and always will be open source at its heart. The core framework is freely available on GitHub, and you can run and manage it in-house without using the Pro product. ZenML Pro, on the other hand, offers one of the best experiences for using ZenML: a managed version of the OSS product, plus Pro-only features that create the best collaborative experience for companies scaling their ML efforts. You can see a more detailed comparison here.
Still not clear?
Ask us on Slack
HashiCorp
"ZenML offers the capability to build end-to-end ML workflows that seamlessly integrate with various components of the ML stack, such as different providers, data stores, and orchestrators. This enables teams to accelerate their time to market by bridging the gap between data scientists and engineers, while ensuring consistent implementation regardless of the underlying technology."
Harold Giménez
SVP R&D at HashiCorp
Stanford University
"Many, many teams still struggle with managing models, datasets, code, and monitoring as they deploy ML models into production. ZenML provides a solid toolkit for making that easy in the Python ML world"
Chris Manning
Professor of Linguistics and CS at Stanford
MadeWithML
"ZenML allows you to quickly and responsibly go from POC to production ML systems while enabling reproducibility, flexibility, and above all, sanity."
Goku Mohandas
Founder of MadeWithML
Salesforce
"ZenML allows orchestrating ML pipelines independent of any infrastructure or tooling choices. ML teams can free their minds of tooling FOMO from the fast-moving MLOps space, with the simple and extensible ZenML interface. No more vendor lock-in, or massive switching costs!"
Richard Socher
Former Chief Scientist Salesforce and Founder of You.com
Wisetech Global
"Thanks to ZenML, we've set up a pipeline where before we had only Jupyter notebooks. It helped us tremendously with data and model versioning, and we really look forward to any future improvements!"
Francesco Pudda
Machine Learning Engineer at WiseTech Global

Start Your Free Trial Now

No new paradigms - Bring your own tools and infrastructure
No data leaves your servers, we only track metadata
Free trial included - no strings attached, cancel anytime