
ZenML vs Dagster
Discover how ZenML stacks up against Dagster in the world of data pipeline orchestration. While Dagster offers a flexible, open-source platform for building and managing data pipelines, ZenML provides a more specialized solution focused on machine learning workflows. Compare ZenML's ML-centric features and integrations with Dagster's general-purpose pipeline orchestration capabilities. Learn how ZenML can streamline your ML operations with its intuitive pipeline definition, built-in experiment tracking, and seamless integration with popular ML frameworks, while Dagster caters to a broader range of data engineering and ETL use cases.
Feature-by-feature comparison
| Feature | ZenML | Dagster |
|---|---|---|
| ML Workflow Orchestration | Specialized for machine learning pipelines | General-purpose data pipeline orchestration, not purpose-built for MLOps |
| ML Framework Integration | Built-in integrations with popular ML frameworks (scikit-learn, TensorFlow, PyTorch) | Requires custom integration with ML frameworks |
| Experiment Tracking | Built-in experiment tracking and comparison | Relies on external tools for experiment tracking |
| Model Registry | Integrated model registry for versioning and deployment | No built-in model registry |
| Data Processing | Supports data processing tasks within ML pipelines | Robust support for data processing and ETL workflows |
| Pipeline Definition | Clean and intuitive pipeline definition using Python decorators | Flexible pipeline definition using Python or YAML |
| Cloud Integration | Built-in support for popular cloud platforms (AWS, GCP, Azure) | Integrates with various cloud platforms and data stores |
| Scalability | Scales ML workloads across different compute backends | Scales data pipelines through various execution engines |
| Workflow Scheduling | Supports scheduled execution of ML pipelines | Robust scheduling and triggering of data pipelines |
| Community and Ecosystem | Growing community focused on ML workflows | Large and active community around data engineering and ETL |
Code comparison
ZenML:

```python
import pandas as pd
from zenml import pipeline, step
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_squared_error

@step
def ingest_data():
    return pd.read_csv("data/dataset.csv")

@step
def train_model(df):
    X, y = df.drop("target", axis=1), df["target"]
    model = RandomForestRegressor(n_estimators=100)
    model.fit(X, y)
    return model

@step
def evaluate_model(model, df):
    X, y = df.drop("target", axis=1), df["target"]
    rmse = mean_squared_error(y, model.predict(X)) ** 0.5
    print(f"RMSE: {rmse}")

@pipeline
def ml_pipeline():
    df = ingest_data()
    model = train_model(df)
    evaluate_model(model, df)

ml_pipeline()
```

Dagster:

```python
import pandas as pd
from dagster import job, op
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_squared_error

@op
def ingest_data():
    return pd.read_csv("data/dataset.csv")

@op
def train_model(df):
    X, y = df.drop("target", axis=1), df["target"]
    model = RandomForestRegressor(n_estimators=100)
    model.fit(X, y)
    return model

@op
def evaluate_model(model, df):
    X, y = df.drop("target", axis=1), df["target"]
    rmse = mean_squared_error(y, model.predict(X)) ** 0.5
    print(f"RMSE: {rmse}")

@job
def ml_pipeline():
    df = ingest_data()
    model = train_model(df)
    evaluate_model(model, df)

ml_pipeline.execute_in_process()
```
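In both `evaluate_model` steps, the `** 0.5` applied to `mean_squared_error` is the conversion from MSE to root-mean-squared error. A quick dependency-free check of that arithmetic (the values here are made up for illustration):

```python
# Toy values for illustration only; not from any real dataset.
y_true = [3.0, 5.0, 7.0]
y_pred = [2.0, 5.0, 9.0]

# Mean squared error, then its square root -> RMSE, mirroring
# rmse = mean_squared_error(y, model.predict(X)) ** 0.5
mse = sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)
rmse = mse ** 0.5
print(f"RMSE: {rmse:.4f}")  # RMSE: 1.2910
```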
- ZenML is purpose-built for machine learning workflows, offering a more specialized and optimized experience than Dagster's general-purpose pipeline orchestration.
- ZenML ships built-in integrations with popular ML frameworks such as scikit-learn, TensorFlow, and PyTorch, making it easier to incorporate ML tasks into your pipelines.
- ZenML's built-in experiment tracking and model registry streamline the management of your ML experiments and model versions.
- ZenML provides a clean, intuitive way to define ML pipelines with Python decorators, letting you focus on the core logic of your workflows.
- ZenML places a strong emphasis on MLOps best practices, keeping your workflows reproducible, traceable, and compliant with governance requirements.
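Both tools build on the same underlying idea: a decorator wraps each step function so the framework can track its execution and wire steps together. A minimal dependency-free sketch of that pattern (the `step` decorator and step bodies here are illustrative, not either library's actual API):

```python
from functools import wraps

execution_log = []  # the framework's record of which steps ran, in order

def step(func):
    """Toy stand-in for a ZenML @step / Dagster @op decorator:
    logs the call before delegating to the wrapped function."""
    @wraps(func)
    def wrapper(*args, **kwargs):
        execution_log.append(func.__name__)
        return func(*args, **kwargs)
    return wrapper

@step
def ingest_data():
    return [1.0, 2.0, 3.0]

@step
def train_model(data):
    # Stand-in "model": just the mean of the data.
    return sum(data) / len(data)

def ml_pipeline():
    data = ingest_data()
    return train_model(data)

result = ml_pipeline()
print(execution_log)  # ['ingest_data', 'train_model']
print(result)         # 2.0
```

A real orchestrator does far more (caching, retries, artifact storage, lineage), but this is the hook that makes steps trackable in the first place.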