Last updated: October 17, 2022.
We missed the last release blog, so we’re making up for it with a longer one today. ZenML 0.7.2 and 0.7.3 are both packed with new features, from new integrations to better stack management. Read on to learn more!
📊 Experiment Tracking Components
PR #530 adds a new stack component to ZenML’s ever-growing list: experiment_trackers lets you configure your experiment tracking tools with ZenML. Examples of experiment tracking tools are Weights & Biases, MLflow, and Neptune, among others.
Existing users might be confused, as ZenML has supported MLflow and wandb for a while now without such a component. However, the new MLFlowExperimentTracker and WandbExperimentTracker components give users more control over how MLflow and wandb are configured, which lets these tools work in more scenarios than the previously limited local use cases.
As an example, let’s take the MLFlowExperimentTracker. Users can add it to the stack as follows:
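A minimal sketch of that registration with the 0.7.x CLI (the component and stack names here are illustrative, and the exact flags may differ slightly in your version):

```shell
# Register an MLflow experiment tracker; with no tracking URI set,
# it defaults to a local MLflow instance
zenml experiment-tracker register local_mlflow_tracker --type=mlflow

# Register a stack that includes it, reusing the default metadata store,
# artifact store, and orchestrator
zenml stack register local_mlflow_stack \
    -m default -a default -o default \
    -e local_mlflow_tracker
```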
And the above will be pointed towards a local MLflow instance.
However, let’s say you want to have a stack that points to a shared MLflow instance. You can simply create a new stack now pointing to that configuration:
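A hedged sketch of what that might look like — the tracking URI and credentials below are placeholders for your own deployment, and the flag names assume the 0.7.x MLFlowExperimentTracker configuration:

```shell
# Point the tracker at a hypothetical shared MLflow deployment
zenml experiment-tracker register shared_mlflow_tracker --type=mlflow \
    --tracking_uri="https://mlflow.example.com" \
    --tracking_username="<username>" \
    --tracking_password="<password>"

# Register a new stack that uses the shared tracker
zenml stack register shared_mlflow_stack \
    -m default -a default -o default \
    -e shared_mlflow_tracker
```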
And that’s it: we now have a shared experiment tracking component in our stack. We have tested this on an MLflow instance deployed on GCP, using the default orchestrator. Pro tip: MLflow can be deployed easily with one click using this awesome repo by Artefactory.
🤗 HuggingFace Support + Weights & Biases + ZenML 💗 By The Community
The latest ZenML release brings two amazing new integrations with HuggingFace and Weights & Biases. Both these integrations were contributions from the growing ZenML community and we could not be more grateful 🙏! In a timely fashion, Richard Socher gave a shoutout to all three companies in his latest interview with Future by a16z.
🔎 XGBoost and LightGBM support
XGBoost and LightGBM are two of the most widely used gradient boosting libraries out there. That’s why we introduced new materializers for both in the latest ZenML release! Now you can pass the following objects into your steps:
📂 Parameterized S3FS support to enable non-AWS S3 storage (minio, ceph)
A common complaint about the S3 Artifact Store integration was that it was hard to parameterize it to support non-AWS S3 storage such as MinIO and Ceph. The latest release makes this simple! When registering an S3ArtifactStore from the CLI, you can now pass in client_kwargs, config_kwargs, or s3_additional_kwargs as a JSON string. For example:
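A sketch of what this could look like for a MinIO deployment — the store name, bucket path, and endpoint URL below are placeholders, not working values:

```shell
# Hypothetical MinIO deployment; endpoint and bucket are placeholders
zenml artifact-store register minio_store --type=s3 \
    --path="s3://my-zenml-bucket" \
    --client_kwargs='{"endpoint_url": "http://minio.example.com:9000"}'
```

The JSON string is forwarded to the underlying S3 filesystem client, so the same mechanism works for any S3-compatible backend.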
See PR #532 for more details.
🧱 New CLI commands to update stack components
We added functionality to allow users to update stacks that already exist. This shows the basic workflow:
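A hedged sketch of the kind of commands involved (the stack and component names are illustrative, and the exact subcommands may differ in your version):

```shell
# Swap a component of an existing stack, e.g. its artifact store
zenml stack update my_stack -a my_new_artifact_store

# Modify an already-registered component's configuration
zenml artifact-store update my_new_artifact_store --path="s3://other-bucket"

# Rename a stack
zenml stack rename my_stack my_better_named_stack
```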
More details are in the CLI docs. Users can add new stack components to a pre-existing stack, or they can modify already-present stack components. They can also rename their stack and individual stack components.
🐛 Seldon Core authentication through ZenML secrets
The Seldon Core Model Deployer stack component was updated in this release to allow the configuration of ZenML secrets with credentials that authenticate Seldon to access the Artifact Store. The Seldon Core integration provides three different secret schemas for the three flavors of Artifact Store: AWS, GCP, and Azure, but custom secrets can be used as well. For more information on how to use this feature please refer to our Seldon Core deployment example.
Lastly, this release includes numerous other changes, such as ensuring that the PyTorch materializer works across all artifact stores and that the Kubeflow Metadata Store can be easily queried locally.
🙌 Talk to Us
Join our Slack to be part of the growing ZenML community. We would love to talk to you and see if ZenML is helping you, and get your input as to where it should go next!