product name change and typos
rsriniva committed Dec 6, 2023
1 parent f8a9c81 commit 96050e2
Showing 8 changed files with 32 additions and 33 deletions.
2 changes: 1 addition & 1 deletion README.md
Original file line number Diff line number Diff line change
@@ -1,6 +1,6 @@
# Deploying Machine Learning Models with Red Hat OpenShift Data Science

This course is the fifth in a series of 6 courses about Red Hat OpenShift Data Science (RHODS). This course teaches you how to deploy machine learning models in RHODS, and how to consume them from client applications.
This course is the fourth in a series of five courses about Red Hat OpenShift AI (RHOAI). This course teaches you how to deploy machine learning models in RHOAI, and how to consume them from client applications.

# Creating Course Content

2 changes: 1 addition & 1 deletion antora-playbook.yml
@@ -1,5 +1,5 @@
site:
title: Deploying Machine Learning Models with Red Hat OpenShift Data Science
title: Deploying Machine Learning Models with Red Hat OpenShift AI
start_page: rhods-deploy::index.adoc

content:
2 changes: 1 addition & 1 deletion antora.yml
@@ -1,5 +1,5 @@
name: rhods-deploy
title: Deploying Machine Learning Models with Red Hat OpenShift Data Science
title: Deploying Machine Learning Models with Red Hat OpenShift AI
version: 1.33
nav:
- modules/ROOT/nav.adoc
29 changes: 14 additions & 15 deletions modules/ROOT/pages/index.adoc
@@ -1,15 +1,14 @@
= Deploying Machine Learning Models with Red Hat OpenShift Data Science
= Deploying Machine Learning Models with Red Hat OpenShift AI
:navtitle: Home

Welcome to this quick course on the _Deploying Machine Learning Models with Red Hat OpenShift Data Science_.
This course is the fifth in a series of 6 courses about Red Hat OpenShift Data Science:
Welcome to this quick course on the _Deploying Machine Learning Models with Red Hat OpenShift AI_.
This course is the *fourth* in a series of *five* courses about Red Hat OpenShift AI:

1. https://redhatquickcourses.github.io/rhods-intro[Introduction to Red Hat OpenShift Data Science]
2. https://redhatquickcourses.github.io/rhods-admin[Red Hat OpenShift Data Science Administration]
3. Data Analysis and Visualization with Red Hat OpenShift Data Science (Under Development)
4. https://redhatquickcourses.github.io/rhods-model[Creating Machine Learning Models with Red Hat OpenShift Data Science]
5. Deploying Machine Learning Models with Red Hat OpenShift Data Science (_This course_)
6. https://redhatquickcourses.github.io/rhods-pipelines[Automation using Data Science Pipelines]
1. https://redhatquickcourses.github.io/rhods-intro[Introduction to Red Hat OpenShift AI]
2. https://redhatquickcourses.github.io/rhods-admin[Red Hat OpenShift AI Administration]
3. https://redhatquickcourses.github.io/rhods-model[Creating Machine Learning Models with Red Hat OpenShift Data Science]
4. Deploying Machine Learning Models with Red Hat OpenShift AI (_This course_)
5. https://redhatquickcourses.github.io/rhods-pipelines[Automation using Data Science Pipelines]

NOTE: After you have completed all the courses in the learning path, you can attempt the https://github.com/RedHatQuickCourses/rhods-qc-apps/tree/main/7.hands-on-lab["Hit the RHODS"] exercise, which tests your understanding of the concepts taught in all the courses.

@@ -24,24 +23,24 @@ The PTL team acknowledges the valuable contributions of the following Red Hat as

== Classroom Environment

There are two options, based on whether you are taking this course standalone (just this course), or as part of the full 6 course learning path where you installed RHODS on an OpenShift cluster in the second course in the learning path, _Red Hat OpenShift Data Science Administration_.
There are two options, based on whether you are taking this course standalone (just this course), or as part of the full five course learning path where you installed RHOAI on an OpenShift cluster in the second course in the learning path, _Red Hat OpenShift AI Administration_.

=== Option 1: Standalone (RHODS Pre-installed on OpenShift)
=== Option 1: Standalone (RHOAI Pre-installed on OpenShift)

You will use the https://demo.redhat.com/catalog?search=openshift+data+science&item=babylon-catalog-prod%2Fsandboxes-gpte.ocp4-workshop-rhods-base-aws.prod[Base RHODS on AWS] catalog item in the Red Hat Demo Platform (RHDP) to run the hands on exercises in this course.

This classroom has a pre-installed version of Red Hat OpenShift Data Science on OpenShift.

=== Option 2: Six Course Learning Path
=== Option 2: Five Course Learning Path

Continue using the https://demo.redhat.com/catalog?search=Red+Hat+OpenShift+Container+Platform+4.13+Workshop&item=babylon-catalog-prod%2Fopenshift-cnv.ocp413-wksp-cnv.prod[Red Hat OpenShift Container Platform 4.13 Workshop] catalog item from the _Red Hat OpenShift Data Science Administration_ course.
Continue using the https://demo.redhat.com/catalog?search=Red+Hat+OpenShift+Container+Platform+4.13+Workshop&item=babylon-catalog-prod%2Fopenshift-cnv.ocp413-wksp-cnv.prod[Red Hat OpenShift Container Platform 4.13 Workshop] catalog item from the _Red Hat OpenShift AI Administration_ course.

[TIP]
====
To prevent problems when allocating the workbench pods, make sure that your catalog item has been configured with `64Gi` as the worker memory size.
====

This classroom does *NOT* have RHODS pre-installed. You are expected to complete the _Red Hat OpenShift Data Science Administration_ course, where you install and configure a basic RHODS instance, and then continue with this course.
This classroom does *NOT* have RHOAI pre-installed. You are expected to complete the _Red Hat OpenShift AI Administration_ course, where you install and configure a basic RHOAI instance, and then continue with this course.

== Prerequisites

@@ -56,7 +55,7 @@ This classroom does *NOT* have RHODS pre-installed. You are expected to complete
The overall objectives of this course include:

* Deploy and serve machine learning models on OpenShift
* Model deployment with RHODS model serving
* Model deployment with RHOAI model serving
* Model serving authentication and authorization
* Troubleshoot deployed models
* Send inference requests to deployed models via REST API calls
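
The last objective can be previewed with a short client-side sketch. The endpoint URL, model name, and input tensor name below are hypothetical placeholders; model servers in RHOAI that use OpenVINO typically speak the KServe V2 REST inference protocol, which this payload shape follows.

```python
import json
import urllib.request


def build_v2_request(input_name, data, shape, datatype="FP32"):
    """Build a KServe V2 inference payload for a single input tensor."""
    return {
        "inputs": [
            {"name": input_name, "shape": shape, "datatype": datatype, "data": data}
        ]
    }


def infer(url, payload, token=None):
    """POST an inference request and return the parsed JSON response."""
    headers = {"Content-Type": "application/json"}
    if token:
        # Token-protected model servers expect a Bearer token.
        headers["Authorization"] = f"Bearer {token}"
    request = urllib.request.Request(
        url, data=json.dumps(payload).encode("utf-8"), headers=headers
    )
    with urllib.request.urlopen(request) as response:
        return json.loads(response.read())


# One iris sample (4 features); "X" is a placeholder input tensor name.
payload = build_v2_request("X", [5.1, 3.5, 1.4, 0.2], shape=[1, 4])
# infer("https://<model-route>/v2/models/iris/infer", payload)  # needs a deployed model
```

Later sections of the course cover the exact route and authentication details for a deployed model.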
2 changes: 1 addition & 1 deletion modules/chapter1/pages/index.adoc
@@ -1,4 +1,4 @@
= Model Serving in RHODS
= Model Serving in Red Hat OpenShift AI

== Introduction

14 changes: 7 additions & 7 deletions modules/chapter1/pages/section1.adoc
@@ -61,7 +61,7 @@ When deploying machine learning models, we need to deploy a container that serve

==== Train a model

Using a RHODS instance, let us train and deploy an example.
Using a RHOAI instance, let us train and deploy an example.

. In a data science project, create a `Standard Data Science` workbench.
Then, open the workbench to go to the JupyterLab interface.
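
The actual training code lives in the course repository; a minimal sketch of the train-and-pickle step, assuming scikit-learn and the iris dataset used elsewhere in this course, might look like this:

```python
import pickle

from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Load the iris dataset and hold out a test split.
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# Train a simple classifier; any scikit-learn estimator pickles the same way.
model = LogisticRegression(max_iter=1000)
model.fit(X_train, y_train)
print(f"test accuracy: {model.score(X_test, y_test):.2f}")

# Serialize the trained model so a serving container can load it later.
with open("model.pkl", "wb") as f:
    pickle.dump(model, f)
```

The resulting `model.pkl` file is the artifact that the serving application loads in the next steps.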
@@ -133,7 +133,7 @@ The pickle model that we previously exported can be used in a Flask application.

[IMPORTANT]
====
Although we are actually serving a model with Flask in the exercise, Flask is not considered part of the Model Serving feature. This example represents one way in which some customers decide to embed their models in containers, although RHODS provides for mechanisms that can make this process of serving a model a simpler process, when provided with the proper model formats.
Although we are actually serving a model with Flask in the exercise, Flask is not considered part of the Model Serving feature. This example represents one way in which some customers decide to embed their models in containers, although RHOAI provides for mechanisms that can make this process of serving a model a simpler process, when provided with the proper model formats.
====
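
A minimal sketch of such a Flask wrapper is shown below. It assumes a scikit-learn model pickled as `model.pkl` and an illustrative JSON payload shape; the application in the course repository is the reference implementation.

```python
import pickle

from flask import Flask, jsonify, request


def create_app(model_path="model.pkl"):
    """App factory: load the pickled model once at startup, not per request."""
    app = Flask(__name__)
    with open(model_path, "rb") as f:
        model = pickle.load(f)

    @app.route("/predict", methods=["POST"])
    def predict():
        # Expect a JSON body such as {"data": [[5.1, 3.5, 1.4, 0.2]]}.
        payload = request.get_json(force=True)
        prediction = model.predict(payload["data"])
        return jsonify({"prediction": prediction.tolist()})

    return app


# To serve the model inside the container, run:
#     create_app().run(host="0.0.0.0", port=8080)
```

Packaging this application and the pickle file into a container image is what the rest of the exercise walks through.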

. In your computer, create a new directory to save the source code of the web application.
@@ -300,18 +300,18 @@ There are automated and faster ways to perform these steps. In the following sec

=== RHOAI Model Serving Runtimes

In the previous example, we manually created a Model Server by sending the model to an image that can interpret the model and expose it for consumtion. In our example we used Flask.
In the previous example, we manually created a Model Server by sending the model to an image that can interpret the model and expose it for consumption. In our example we used Flask.

However, in OpenShift AI, you do not need to manually create serving runtimes.
By default, OpenShift AI includes a pre-configured model serving runtime, OpenVINO, which can load, execute, and expose models trained with TensorFlow and PyTorch.
However, in Red Hat OpenShift AI, you do not need to manually create serving runtimes.
By default, Red Hat OpenShift AI includes a pre-configured model serving runtime, OpenVINO, which can load, execute, and expose models trained with TensorFlow and PyTorch.
OpenVINO supports various model formats, such as the following ones:

- https://onnx.ai[ONNX]: An open standard for machine learning interoperability.
- https://docs.openvino.ai/latest/openvino_ir.html[OpenVino IR]: The proprietary model format of OpenVINO, the model serving runtime used in OpenShift Data Science.
- https://docs.openvino.ai/latest/openvino_ir.html[OpenVino IR]: The proprietary model format of OpenVINO, the model serving runtime used in OpenShift AI.

In order to leverage the benefits of OpenVINO, you must:

. Export the model in a format compatible with one of the available RHODS runtimes.
. Export the model in a format compatible with one of the available RHOAI runtimes.
. Upload the model to an S3 bucket
. Create a Data Connection to the S3 bucket containing the model
. Create or use one of the available serving runtimes in a Model Server configuration that specifies the size and resources to use while setting up an inference engine.
6 changes: 3 additions & 3 deletions modules/chapter1/pages/section2.adoc
@@ -40,13 +40,13 @@ oc get routes -n object-datastore | grep minio-api | awk '{print $2}'
+
[INFO]
====
Use this route as the S3 API endpoint. Basically, this is the URL that we will use when creating a data connection to the S3 in RHODS.
Use this route as the S3 API endpoint. This is the URL that we will use when creating a data connection to the S3 in RHOAI.
====

== Training The Model
We will use the iris dataset for this exercise.

. Using a JupyterLab workbench at RHODS, import the repository: https://github.com/RedHatQuickCourses/rhods-qc-apps.git
. Using a JupyterLab workbench at RHOAI, import the repository: https://github.com/RedHatQuickCourses/rhods-qc-apps.git
+
[TIP]
====
@@ -83,7 +83,7 @@ Make sure to create a new path in your bucket, and upload to such path, not to r

== Create A Data Connection

. In RHODS dashboard, create a project named **iris-project**.
. In the RHOAI dashboard, create a project named **iris-project**.

. In the **Data Connections** section, create a Data Connection to your S3 bucket
+
8 changes: 4 additions & 4 deletions modules/chapter1/pages/section3.adoc
@@ -1,8 +1,8 @@
= Creating a Custom Model Serving Runtime

A model-serving runtime provides integration with a specified model server and the model frameworks that it supports. By default, Red Hat OpenShift Data Science includes the OpenVINO Model Server runtime. However, if this runtime doesn’t meet your needs (it doesn’t support a particular model framework, for example), you might want to add your own, custom runtimes.
A model-serving runtime provides integration with a specified model server and the model frameworks that it supports. By default, Red Hat OpenShift AI includes the OpenVINO Model Server runtime. However, if this runtime doesn’t meet your needs (it doesn’t support a particular model framework, for example), you might want to add your own, custom runtimes.

As an administrator, you can use the OpenShift Data Science interface to add and enable custom model-serving runtimes. You can then choose from your enabled runtimes when you create a new model server.
As an administrator, you can use the OpenShift AI interface to add and enable custom model-serving runtimes. You can then choose from your enabled runtimes when you create a new model server.

== Prerequisite

@@ -16,12 +16,12 @@ This exercise will guide you through the broad steps necessary to deploy a custo

[NOTE]
====
While RHODS supports the ability to add your own runtime, it does not support the runtimes themselves. Therefore, it is up to you to configure, adjust and maintain your custom runtimes.
While RHOAI supports the ability to add your own runtime, it does not support the runtimes themselves. Therefore, it is up to you to configure, adjust and maintain your custom runtimes.
====

== Adding The Custom Runtime

. Log in to RHODS with a user who is part of the RHODS admin group
. Log in to RHOAI with a user who is part of the RHOAI admin group

. Navigate to the Settings menu, then Serving Runtimes
+
