Kubeflow Pipelines


Kubeflow Pipelines is a platform for building and deploying containerized machine learning workflows on Kubernetes. Kubeflow Pipelines makes it easy to implement production-grade machine learning pipelines without worrying about the low-level details of managing a Kubernetes cluster.

Kubeflow Pipelines is a core component of Kubeflow and is deployed automatically when Kubeflow is deployed.


Components of Kubeflow Pipelines

A Pipeline describes a machine learning workflow, where each component of the pipeline is a self-contained piece of code packaged as a Docker image. Each pipeline can be uploaded individually and shared on the Kubeflow Pipelines user interface (UI). A pipeline includes the definition of the inputs (parameters) required to run it, as well as the inputs and outputs of each component.

The Kubeflow Pipelines platform consists of:

  • A user interface (UI) for managing and tracking experiments, jobs, and runs.
  • An engine for scheduling multi-step ML workflows.
  • An SDK for defining and manipulating pipelines and components.
  • Notebooks for interacting with the system using the SDK. (Taken from: Overview of Kubeflow Pipelines)

Executing a Sample Pipeline

  1. Click on the pipeline named [Sample] Basic - Condition.

Select a simple pipeline.

  2. Click Start an experiment.

Create an experiment.

  3. Give the experiment a name.

Name the experiment.

  4. Give the run a name.

Name the run.

  5. Click on the run name to view the running pipeline.

Running pipeline.

Delete Resources

See the end of Deploying an End-to-End Machine Learning Solution on Kubeflow Pipelines to delete billable GCP resources.