Create a Repo > Start an IDE/Notebook > Train > Tune > Deploy > Monitor
How to set up your first project in DKube by creating and connecting your code and data repositories. Learn which kinds of code and data sources are available by default in DKube.
How to set up a Jupyter Notebook or RStudio IDE to import or write your program code, including Kubeflow pipeline DSL. Learn about the supported ML frameworks and how to import a custom image into your notebook.
How to launch a single model training or data preprocessing run through the DKube UI, passing new parameters to the training program via the UI and designated environment variables. Learn how to track the performance characteristics and lineage of the run, including the dataset and model versions used or created in the process.
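The parameter-passing mechanism above can be sketched as a training script that reads its hyperparameters from environment variables. The variable names `EPOCHS` and `LEARNING_RATE` below are illustrative assumptions, not DKube-defined names; use whatever names your run configuration declares.

```python
import os

def load_hyperparameters():
    """Read training parameters injected via environment variables.

    EPOCHS and LEARNING_RATE are hypothetical names chosen for this
    sketch; defaults apply when a variable is not set for the run.
    """
    return {
        "epochs": int(os.environ.get("EPOCHS", "10")),
        "learning_rate": float(os.environ.get("LEARNING_RATE", "0.001")),
    }

params = load_hyperparameters()
print(params)
```

Reading parameters from the environment keeps the training code unchanged between runs: only the values supplied through the UI differ.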
How to launch hyperparameter tuning runs using the Katib-based tuning available in Kubeflow, and how to pick the best model from the multiple training runs as the winner based on pre-set criteria.
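The "pick the winner" step can be sketched as selecting, from the metric each trial reports, the run that best satisfies a pre-set objective. The trial records below are illustrative data structures, not a DKube or Katib API.

```python
# Minimal sketch of best-trial selection across tuning runs.
# Each trial pairs the hyperparameters tried with the metric it achieved.

def pick_best_trial(trials, metric="accuracy", goal="maximize"):
    """Return the trial with the best value for the chosen metric."""
    key = lambda t: t["metrics"][metric]
    return max(trials, key=key) if goal == "maximize" else min(trials, key=key)

trials = [
    {"params": {"lr": 0.01,  "batch": 32}, "metrics": {"accuracy": 0.91}},
    {"params": {"lr": 0.001, "batch": 64}, "metrics": {"accuracy": 0.94}},
    {"params": {"lr": 0.1,   "batch": 32}, "metrics": {"accuracy": 0.88}},
]

best = pick_best_trial(trials)
print(best["params"])  # → {'lr': 0.001, 'batch': 64}
```

With `goal="minimize"` the same helper serves loss-style objectives, mirroring the maximize/minimize objective types Katib supports.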
How to deploy a preferred version of a model into production, or to first push it to a model catalog so that a gatekeeper can test and select a preferred model before promoting it to production. The model catalog capability is unique to DKube and reduces the risk of a model reaching production before it is ready.
How to set up and execute Kubeflow pipelines through DKube's JupyterLab notebook, and how to manage or automate pipeline runs through the UI. This is one of the most popular features of Kubeflow, letting you build a multi-stage pipeline of data preparation, feature engineering, training, and production deployment driven by time triggers or other event triggers.
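A multi-stage pipeline of the kind described above might be expressed in the Kubeflow Pipelines DSL roughly as follows. This is a sketch assuming the kfp v1 SDK; the image names and commands are placeholders, not DKube artifacts.

```python
from kfp import dsl

@dsl.pipeline(
    name="prep-train",
    description="Illustrative two-stage pipeline: data prep, then training.",
)
def prep_train_pipeline(epochs: int = 5):
    # Placeholder images and commands; substitute your own containers.
    prep = dsl.ContainerOp(
        name="data-prep",
        image="example.registry/prep:latest",
        command=["python", "prep.py"],
    )
    train = dsl.ContainerOp(
        name="train",
        image="example.registry/train:latest",
        command=["python", "train.py"],
        arguments=["--epochs", epochs],
    )
    train.after(prep)  # run training only after data prep completes
```

Once defined, the pipeline can be compiled with `kfp.compiler.Compiler()` and uploaded, after which its runs can be scheduled or triggered from the UI.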
Want to learn how to monitor your models in production? The DKube platform integrates model monitoring into the overall system with DKube Monitor. It gives engineers and executives everything needed to see how well your models are meeting their business goals, and it supports a smooth workflow for improving them when necessary.
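One monitoring check of the kind a platform like DKube Monitor tracks can be sketched as flagging a model whose live accuracy over a recent window drops below a pre-set business threshold. The window size and threshold here are illustrative assumptions.

```python
# Minimal sketch of a production health check: alert when live accuracy
# over the most recent predictions falls below the business target.

def needs_attention(recent_outcomes, threshold=0.9):
    """recent_outcomes: list of booleans, True when a prediction was correct."""
    if not recent_outcomes:
        return False  # no data yet, nothing to flag
    accuracy = sum(recent_outcomes) / len(recent_outcomes)
    return accuracy < threshold

window = [True] * 85 + [False] * 15   # 85% correct over the last 100 predictions
print(needs_attention(window))        # → True: below the 90% target
```

In practice such a check would feed an alert or kick off a retraining workflow rather than just print a flag.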
Integrate with HPC/LSF Clusters
Learn how to integrate DKube with HPC/LSF clusters: configure the initial set-up, schedule preprocessing or training jobs (including Kubeflow pipeline jobs), and analyze results with MLflow-based model comparison metrics.
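Submitting a training job to an LSF cluster ultimately comes down to a `bsub` invocation. The sketch below assembles one from Python; the queue name and script path are illustrative, while `-q` (queue), `-n` (cores), and `-o` (job log, with `%J` expanding to the job ID) are standard LSF options.

```python
import subprocess

def build_bsub_command(script, queue="normal", cores=4, log="train.%J.log"):
    """Assemble an LSF bsub command line for a training script.

    The queue name "normal" and the script path are placeholders;
    substitute the values configured on your cluster.
    """
    return ["bsub", "-q", queue, "-n", str(cores), "-o", log,
            "python", script]

cmd = build_bsub_command("train.py")
print(" ".join(cmd))
# On a host with the LSF client installed, submit it with:
# subprocess.run(cmd, check=True)
```

The actual submission is commented out so the sketch stays runnable outside a cluster; DKube's integration handles this scheduling for you.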