Merge pull request #275 from RizaFarheen/main
Update orderfulfillment6.md
nhandt2021 authored Jan 16, 2023
2 parents 848d4c1 + e360e17 commit 31b0e72
Showing 5 changed files with 49 additions and 49 deletions.
26 changes: 13 additions & 13 deletions docs/codelab/orderfulfillment6.md
@@ -2,9 +2,9 @@
# Parallel Tasks and Subworkflows
# Order Fulfillment Codelab part 6

-You're running order fulfillment at Bob's Widgets, and at the start of this tutorial, it was a totally manual process. We've got 2 versions of the process running in Conductor, with error handling, and the ability to process multiple orders at once.
+You're running order fulfillment at Bob's Widgets, and at the start of this tutorial, it was a totally manual process. We've got 2 versions of the process running in Conductor, with error handling and the ability to process multiple orders at once.

-The sales team has learned that many of Bob's Widget's customers are buying multiple widgets and then sending then out again to valued customers. They know that offering 'drop-shipping' - or the ability for customers to upload a list of addresses, and to have Bob's Widgets mail them directly - would be a feature that our customers would pay extra for.
+The sales team has learned that many of Bob's Widgets' customers buy multiple widgets and then send them on to their own valued customers. They know that offering 'drop-shipping' - or the ability for customers to upload a list of addresses and to have Bob's Widgets mail them directly - would be a feature that our customers would pay extra for.

Our workflow creates the labels for multiple shipments - but currently only handles one address as input. So, this will require a new version of our workflow (but it will lean heavily on the existing work already completed).

@@ -13,15 +13,15 @@ Our workflow creates the labels for multiple shipments - but currently only hand

<p align="center"><img src="/content/img/codelab/OF5_5_loopworkflow.png" alt="adding the do-while loop" width="400" style={{paddingBottom: 40, paddingTop: 40}} /></p>

-In order to support drop-shipping while still supporting multiple shipments to a single address, we need to run the ```shipping_loop``` (and the internal ```widget_shipping```) tasks once per address. To do this, we'll utilize the [FORK](/content/docs/reference-docs/fork-task) System task. The Fork task creates a number of parallel task flows thcat can run simultaneously. We'll use a FORK to create address labels for multiple addresses at once.
+In order to support drop-shipping while still supporting multiple shipments to a single address, we need to run the ```shipping_loop``` (and the internal ```widget_shipping```) tasks once per address. To do this, we'll utilize the [FORK](/content/docs/reference-docs/fork-task) System task. The Fork task creates a number of parallel task flows that can run simultaneously. We'll use a fork to create address labels for multiple addresses at once.

## Forks

<p align="center"><iframe width="560" height="315" src="https://www.youtube.com/embed/01LG4qLeXw4" title="YouTube video player" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture" allowfullscreen></iframe></p>

Forks and Joins are system tasks that run inside the Conductor server. Forks split your workflow into multiple paths that can be run asynchronously. The JOIN task tells Conductor when to reconnect the paths and continue through the workflow.

-An example fork might look like:
+An example fork might look like this:
```
{
"name": "ship_multiple_fork",
@@ -46,11 +46,11 @@
]
}
```
-For space, the 2 forkTasks are left out, but imagine reusing the ```widget_shipping``` tasks in version 1, and then appending a unique value (in this case 1 & 2) to ensure each task has a unique reference. The workflow would look something like:
+For space, the 2 forkTasks are left out, but imagine reusing the ```widget_shipping``` tasks from version 1 and then appending a unique value (in this case, 1 & 2) to ensure each task has a unique reference. The workflow would look something like this:

<p align="center"><img src="/content/img/codelab/of4_forkexample.png" alt="version 2 regular fork" width="500" style={{paddingBottom: 40, paddingTop: 40}} /></p>

-Now, this is really great...but with a FORK, the number of 'tines' in the fork are defined at workflow definition. Since the number of addresses will change on each order, we need something more flexible, that can be defined at runtime. Luckily we have that - the [DYNAMIC_FORK](/content/docs/reference-docs/dynamic-fork-task). Dynamic forks determine the number of 'tines' at workflow runtime - which is exactly what we need to build our dropshipping application. This will provide the flexibility we need to support dropshipping of Bob's widgets.
+Now, this is great, but with a fork, the number of 'tines' in the fork is defined in the workflow definition. Since the number of addresses will change on each order, we need something more flexible that can be defined at runtime. Luckily we have that - the [DYNAMIC_FORK](/content/docs/reference-docs/dynamic-fork-task). Dynamic forks determine the number of 'tines' at workflow runtime - which is exactly what we need to build our dropshipping application. This will provide the flexibility we need to support dropshipping of Bob's widgets.

But - before we build our dynamic fork, we have a lot of housekeeping to take care of.

@@ -63,10 +63,10 @@ This is going to be a bit of work, so we'll start with creating a subworkflow.

<p align="center"><iframe width="560" height="315" src="https://www.youtube.com/embed/TUm0C3x_vYg" title="YouTube video player" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture" allowfullscreen></iframe></p>

-Each tine of our dynamic fork is to have a DO/WHILE loop with the ```widget_shipping``` task inside it. However, a dynamic forks can only reference ONE task. To make this work, we'll encase these two tasks inside a subworkflow.
+Each tine of our dynamic fork will have a DO/WHILE loop with the ```widget_shipping``` task inside it. However, a dynamic fork can only reference one task. To make this work, we'll encase these two tasks inside a subworkflow.


-Conductor has a system task called the [SUB_WORKFLOW](/content/docs/reference-docs/sub-workflow-task). It allows an entire workflow to be called in the place of a task. There are a number of reasons SUBWORKFLOWS are useful.
+Conductor has a system task called the [SUB_WORKFLOW](/content/docs/reference-docs/sub-workflow-task). It allows an entire workflow to be called in place of a task. There are a number of reasons subworkflows are useful.

* Placing a series of tasks in a Dynamic fork (our use case).
* Simplifying workflows when a series of tasks are reused multiple times.
@@ -136,13 +136,13 @@ To encase our two tasks into the Dynamic Fork, we'll create the following subwor
"inputTemplate": {}
}
```
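Most of the ```Shipping_loop_workflow``` definition is collapsed in this diff; only its tail is shown above. Purely as a hedged sketch of its overall shape - the loop condition, task reference names, and input parameters below are illustrative assumptions, not the codelab's exact values - the subworkflow wraps the DO_WHILE loop and the ```widget_shipping``` task roughly like this:

```json
{
  "name": "Shipping_loop_workflow",
  "description": "Ships n widgets to a single address",
  "version": 1,
  "schemaVersion": 2,
  "tasks": [
    {
      "name": "shipping_loop",
      "taskReferenceName": "shipping_loop_ref",
      "type": "DO_WHILE",
      "inputParameters": {
        "numberOfLabels": "${workflow.input.numberOfLabels}"
      },
      "loopCondition": "if ($.shipping_loop_ref['iteration'] < $.numberOfLabels) { true; } else { false; }",
      "loopOver": [
        {
          "name": "widget_shipping",
          "taskReferenceName": "widget_shipping_ref",
          "type": "SIMPLE",
          "inputParameters": {
            "name": "${workflow.input.name}",
            "street": "${workflow.input.street}"
          }
        }
      ]
    }
  ],
  "outputParameters": {
    "shipping_labels": "${shipping_loop_ref.output}"
  }
}
```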
-> Note: We updated the workflow ```outputParameters``` to output the JSON from the loop. This will allow us to take the output from all of the subworkflows and combine them in the output of the overall workflow.
+> Note: We updated the workflow ```outputParameters``` to output the JSON from the loop. This will allow us to take the output from all the subworkflows and combine them into the output of the overall workflow.
-> Also note: This new workflow must be added to the application logic under ```Applications``` in the playground - so that the application management knows that the workflow is authorized to run with the same Key/Secret. We did this in the 2nd page of this codelab [here's a link for a refresher](/content/docs/codelab/orderfulfillment2#workflow-and-task-permissions).
+> Also note: This new workflow must be added to the application logic under ```Applications``` in the playground - so that the application management knows that the workflow is authorized to run with the same Key/Secret. We did this on the 2nd page of this codelab; [here's a link for a refresher](/content/docs/codelab/orderfulfillment2#workflow-and-task-permissions).
<p align="center"><img src="/content/img/codelab/of6_subworkflow.png" alt="subworkflow for dynamic task" width="400" style={{paddingBottom: 40, paddingTop: 40}} /></p>

-Before implementing the Dynamic Fork, we can update our existing workflow to call our new Subworkflow in place of the loop/shipping widget. This is done by replacing the Do/while loop (and the embedded task) in the workflow with a subworkflow task. This task defines the workflow (and the version of the workflow to be called.
+Before implementing the Dynamic Fork, we can update our existing workflow to call our new subworkflow in place of the loop/shipping widget. This is done by replacing the Do/while loop (and the embedded task) in the workflow with a subworkflow task. This task defines the workflow (and the version of the workflow) to be called.

```JSON
{
@@ -166,7 +166,7 @@
}
```
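Since most of that snippet is collapsed in the diff, here is a hedged sketch of what the SUB_WORKFLOW task that replaces the loop might look like. The task name, reference name, and ```inputParameters``` are assumptions for illustration; ```subWorkflowParam.name``` matches the subworkflow created above:

```json
{
  "name": "shipping_loop_subworkflow",
  "taskReferenceName": "shipping_loop_subworkflow_ref",
  "type": "SUB_WORKFLOW",
  "inputParameters": {
    "name": "${workflow.input.name}",
    "street": "${workflow.input.street}",
    "numberOfLabels": "${workflow.input.numberOfLabels}"
  },
  "subWorkflowParam": {
    "name": "Shipping_loop_workflow",
    "version": 1
  }
}
```

The key piece is ```subWorkflowParam```, which tells Conductor which workflow (and which version of it) to start in place of this task.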

-Outwardly, the workflow will behave in exactly the same way - all the same tasks are called in exactly the same order. IF we look under the hood, however, we see that the ```Bobs_widget_fulfillment``` workflow creates a ```shipping_loop_workflow``` for the looping portion. So there are now 2 workflow executions (the parent and the one subworkflow) whenever the workflow is run.
+Outwardly, the workflow will behave exactly the same way - all the same tasks are called in exactly the same order. If we look under the hood, however, we see that the ```Bobs_widget_fulfillment``` workflow creates a ```shipping_loop_workflow``` for the looping portion. So there are now 2 workflow executions (the parent and the one subworkflow) whenever the workflow is run.

<p align="center"><img src="/content/img/codelab/of6_twoexecutions.png" alt="table showing 2 workflow executions" width="800" style={{paddingBottom: 40, paddingTop: 40}} /></p>

@@ -177,5 +177,5 @@ The ```shipping_loop_workflow``` and ```Bobs_widget_fulfillment``` executions lo
<p align="center"><img src="/content/img/codelab/of6_workflow.png" alt="new workflow" width="400" style={{paddingBottom: 40, paddingTop: 40}} /></p>


-Building this subworkflow is step one in getting our Dynamic task ready. In our next step, we must create the input data that the Dynamic task requires - and we'll us JQ TRansform tasks to accomplish that.
+Building this subworkflow is step one in getting our Dynamic task ready. In our next step, we must create the input data that the Dynamic task requires - and we'll use JQ Transform tasks to accomplish that.

14 changes: 7 additions & 7 deletions docs/codelab/orderfulfillment7.md
@@ -3,9 +3,9 @@

To improve our Widget Shipping platform and support dropshipping our widgets, we've decided to implement Dynamic Forks. Dynamic forks will allow our workflow to change dynamically at runtime to ship multiple widgets to multiple addresses.

-There are a few steps to get ready for the Dynamic fork. In our last step, we moved all of the tasks we wish to run simultaneously (for each address) into a subworkflow.
+There are a few steps to get ready for the Dynamic fork. In our last step, we moved all of the tasks we wished to run simultaneously (for each address) into a subworkflow.

-In this section, we'll jq to parse and re-format the data we have into the formats that are required for the Dynamic Fork to run. JQ is a powerful command line tool for JSON data manipulation. There is also a Conductor System [JQ transform](/content/docs/reference-docs/system-tasks/json-jq-transform-task) task that can do the same. It's an invaluable tool for developers who work with JSON. Luckily, it is also built as a System Task in Conductor, so we can do our JSON manipulation as a part of our workflow (with the only effort being "how do I do this in JQ"). Unfortunately, this can be like solving a regex, so it can take some patience.
+In this section, we'll use jq to parse and re-format the data we have into the formats that are required for the Dynamic Fork to run. JQ is a powerful command line tool for JSON data manipulation. There is also a Conductor System [JQ transform](/content/docs/reference-docs/system-tasks/json-jq-transform-task) task that can do the same. It's an invaluable tool for developers who work with JSON. Luckily, it is also built as a System Task in Conductor, so we can do our JSON manipulation as a part of our workflow (with the only effort being "how do I do this in JQ"). Unfortunately, this can be like solving a regex, so it can take some patience.

> Tip: JQ has a great [playground](https://jqplay.org/) to help you figure out your JQ queries.
@@ -20,7 +20,7 @@ There are 2 sets of JSON that the Dynamic Fork will need:
* ```dynamicForkTasksParam```: A list of all the workflows to run (this defines how many forks will be created).
* ```dynamicForkTasksInputParamName```: Input data for each of the Forks.

-The first set of data is a JSON array, where each input names all of the ```dynamicForkTasksParam``` (this defines the number of tines in the fork). It will a JSON array wth n entries. The name of the workflow called is ```subworkflowParam.name```, the ```taskReferenceName``` must be unique (here we increment the last value by one), and the ```type``` must be ```SUB_WORKFLOW```:
+The first set of data is the ```dynamicForkTasksParam``` - a JSON array that names each of the workflows to run (this defines the number of tines in the fork). It will be a JSON array with n entries. The name of the workflow called is ```subWorkflowParam.name```, the ```taskReferenceName``` must be unique (here we increment the last value by one), and the ```type``` must be ```SUB_WORKFLOW```:

```JSON
{ "dynamicTasks": [
@@ -76,7 +76,7 @@ Let's assume that this version of the workflow (this will be version 3) will hav
}
```
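The sample input itself is collapsed in this diff, so here is an illustrative stand-in with two addresses. The field names and values are made up for this sketch; the real sample lives in the codelab source:

```json
{
  "addressList": [
    {
      "name": "Customer One",
      "street": "123 Main Street",
      "city": "San Mateo",
      "zip": "94402",
      "numberOfLabels": 1
    },
    {
      "name": "Customer Two",
      "street": "456 Oak Avenue",
      "city": "Lexington",
      "zip": "40502",
      "numberOfLabels": 1
    }
  ]
}
```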

-JQ can tell us the length of this with the command ```.[] | length ```. This simply reads the array, and returns the length. Our Conductor task looks like:
+JQ can tell us the length of this with the command ```.[] | length ```. This simply reads the array and returns the length. Our Conductor task looks like this:

```json
{
@@ -101,7 +101,7 @@

For the input sample above, this task returns the value ```2```.
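Most of that task definition is collapsed above, so here is a minimal sketch of a JSON_JQ_TRANSFORM task running the ```.[] | length``` expression. The reference name matches the one used later in this section; the task name and the ```addresses``` input key are assumptions:

```json
{
  "name": "jq_address_count",
  "taskReferenceName": "jq_address_count_ref",
  "type": "JSON_JQ_TRANSFORM",
  "inputParameters": {
    "addresses": "${workflow.input.addressList}",
    "queryExpression": ".[] | length"
  }
}
```

For the two-address input above, ```${jq_address_count_ref.output.result}``` is ```2```, which is what the next task consumes.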

-Now that we know the length of the address JSON, we know how many dynamic forks to create. Our JQ query looks like:
+Now that we know the length of the address JSON, we know how many dynamic forks to create. Our JQ query looks like this:

```bash
reduce range(0,${jq_address_count_ref.output.result}) as $f (.; .dynamicTasks[$f].subWorkflowParam.name = \"Shipping_loop_workflow\" | .dynamicTasks[$f].taskReferenceName = \"shipping_loop_subworkflow_ref_\\($f)\" | .dynamicTasks[$f].type = \"SUB_WORKFLOW\")
@@ -211,13 +211,13 @@ Here is the JQ command we use to generate this JSON:

The input to the JQ is an empty array, but we also read in the ```workflow.input.addressList``` that is given to the workflow, and the list of dynamic forks we created in the JQ task above ```${jq_create_dynamictasks_ref.output.result}```.

-The reduce creates an JSON of the correct length, and then we name each parameter with the ```shipping_loop_subworkflow_ref_<n>```, and give it the value of the address parameters for the same index.
+The reduce creates a JSON of the correct length, and then we name each parameter with the ```shipping_loop_subworkflow_ref_<n>```, and give it the value of the address parameters for the same index.

```json
.dynamicTasksInput.\"shipping_loop_subworkflow_ref_\\($f)\" = .addresses[$f])
```

-When this is run, the resulting JSON matchers the required format. With these 3 JQ transforms, we've completed all of the prep work for our Dynamic task - creating a subworkflow, and formatting all of the input data for the task.
+When this is run, the resulting JSON matches the required format. With these 3 JQ transforms, we've completed all of the prep work for our Dynamic task - creating a subworkflow and formatting all of the input data for the task.
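For the two-address example used earlier, the generated input would look roughly like this. The address fields are the same illustrative ones as before; the exact shape depends on your address JSON:

```json
{
  "dynamicTasksInput": {
    "shipping_loop_subworkflow_ref_0": {
      "name": "Customer One",
      "street": "123 Main Street",
      "city": "San Mateo",
      "zip": "94402",
      "numberOfLabels": 1
    },
    "shipping_loop_subworkflow_ref_1": {
      "name": "Customer Two",
      "street": "456 Oak Avenue",
      "city": "Lexington",
      "zip": "40502",
      "numberOfLabels": 1
    }
  }
}
```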

We are now ready to define our Dynamic task, creating a dropshipping workflow for our widget shipping.
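As a preview, and assuming the JQ tasks above, a hedged sketch of that Dynamic task (plus the JOIN that gathers its results) could look like the following. The task names, the reference name of the third JQ task, and the exact output paths are assumptions, not the codelab's final definition:

```json
[
  {
    "name": "dropship_dynamic_fork",
    "taskReferenceName": "dropship_dynamic_fork_ref",
    "type": "FORK_JOIN_DYNAMIC",
    "inputParameters": {
      "dynamicTasks": "${jq_create_dynamictasks_ref.output.result.dynamicTasks}",
      "dynamicTasksInput": "${jq_create_dynamictasksinput_ref.output.result.dynamicTasksInput}"
    },
    "dynamicForkTasksParam": "dynamicTasks",
    "dynamicForkTasksInputParamName": "dynamicTasksInput"
  },
  {
    "name": "dropship_join",
    "taskReferenceName": "dropship_join_ref",
    "type": "JOIN"
  }
]
```

```dynamicForkTasksParam``` points at the list of subworkflows to spawn, ```dynamicForkTasksInputParamName``` points at the per-tine input we built with the last JQ transform, and the trailing JOIN waits for every spawned subworkflow before the workflow continues.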
