@@ -3,9 +3,26 @@ Flows are the high-level containers that encapsulate and orchestrate AI-powered

## Creating Flows

-Flows can be created using the `Flow` class or the `@flow` decorator.
+Flows can be created using the `Flow` class or the `@flow` decorator.

-### Using the `Flow` Class
+### The `@flow` Decorator
+
+The `@flow` decorator provides a convenient way to define a flow using a Python function.
+
+```python
+from controlflow import flow
+
+@flow
+def data_processing_flow():
+    data = load_data()
+    cleaned_data = clean_data(data)
+    insights = analyze_data(cleaned_data)
+    return insights
+```
+
+When using the `@flow` decorator, the decorated function becomes the entry point for the flow. The function can contain tasks, which are automatically executed when the flow is run. The `@flow` decorator also allows you to specify flow-level properties such as agents, tools, and context.
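As an illustrative sketch of passing flow-level properties through the decorator: the keyword arguments and the `Agent` constructor shown here are assumptions, mirroring the `Flow` properties described below rather than a confirmed signature.

```python
from controlflow import Agent, Task, flow

def word_count(text: str) -> int:
    # a plain Python function exposed to tasks as a tool
    return len(text.split())

editor = Agent(name="Editor")  # assumed Agent constructor

# assumption: the decorator forwards flow-level properties such as agents and tools
@flow(agents=[editor], tools=[word_count])
def editing_flow():
    draft = Task('Write a short paragraph about flows', result_type=str)
    return draft
```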
+
+### The `Flow` Class

The `Flow` class allows you to explicitly define a flow and its properties.

@@ -22,22 +39,34 @@ flow = Flow(

By creating a `Flow` instance, you can specify the name, description, agents, tools, and other properties of the flow. This approach provides full control over the flow definition and is particularly useful when you need to customize the flow's behavior or configure advanced settings.

-### Using the `@flow` Decorator
+#### Adding tasks to a flow

-The `@flow` decorator provides a convenient way to define a flow using a Python function.
+Tasks can be added to a flow object in two ways: by calling `Flow.add_task(task)` or by creating the tasks inside the `Flow` context manager.
+<CodeGroup>
+```python Using a flow context
+from controlflow import Flow, Task

-```python
-from controlflow import flow
+with Flow() as flow:
+    task = Task('Load data')
+```

-@flow
-def data_processing_flow():
-    data = load_data()
-    cleaned_data = clean_data(data)
-    insights = analyze_data(cleaned_data)
-    return insights
+```python Adding tasks imperatively
+from controlflow import Flow, Task
+
+flow = Flow()
+task = Task('Load data')
+flow.add_task(task)
```
+</CodeGroup>
+### Which Approach Should I Use?

-When using the `@flow` decorator, the decorated function becomes the entry point for the flow. The function can contain tasks, which are automatically executed when the flow is run. The `@flow` decorator also allows you to specify flow-level properties such as agents, tools, and context.
+<Tip>
+**tldr:** Prefer the `@flow` decorator for simplicity and conciseness. Use the `Flow` class only for advanced customization and configuration.
+</Tip>
+
+Both the `Flow` class and the `@flow` decorator offer ways to compartmentalize and structure your workflow. The choice between the two approaches depends on your preference and the complexity of the flow.
+
+In most cases, including this documentation, the `@flow` decorator is used for its simplicity and conciseness. As a user, you rarely need to interact with the flow object itself; its main job is to provide an isolated container for your tasks. By using the `@flow` decorator, you can cleanly define your workflow's inputs, logic, and outputs in a clear and readable manner. Moreover, calling the decorated function will automatically run the flow, making it easy to execute and test.

## Flow Properties

@@ -56,7 +85,9 @@ flow = Flow(

### Agents and Tools

-The `agents` and `tools` properties allow you to specify the AI agents and tools that are available throughout the flow. These agents and tools can be used by tasks within the flow to perform specific actions or computations.
+The `agents` and `tools` properties allow you to specify AI agents and tools that are available to tasks throughout the flow.
+
+Flow-level agents are used by tasks **unless** the tasks have their own agents assigned. Flow-level tools are used by tasks **in addition** to any tools they have defined.

```python
flow = Flow(
@@ -65,7 +96,6 @@ flow = Flow(
)
```
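For illustration, a minimal sketch of that precedence; it assumes tasks accept their own `agents` argument, and the agents themselves are hypothetical placeholders.

```python
from controlflow import Agent, Flow, Task

generalist = Agent(name="Generalist")  # hypothetical flow-level agent
reviewer = Agent(name="Reviewer")      # hypothetical task-level agent

with Flow(agents=[generalist]) as flow:
    # no agents assigned: this task falls back to the flow-level agent
    summary = Task('Summarize the data')
    # its own agent assigned: the task-level agent is used instead
    review = Task('Review the summary', agents=[reviewer], context=dict(summary=summary))
```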

-Agents and tools defined at the flow level are accessible to all tasks within the flow. However, tasks can also have their own specific agents and tools assigned to them.

### Context

@@ -84,118 +114,76 @@ The context can be accessed and modified by tasks and agents during the flow exe

## Running Flows

-Flows can be run using the `run()` method, which executes all of the tasks that were defined within the flow.
+To run a `@flow`-decorated function, simply call the function with appropriate arguments. The arguments are automatically added to the flow's context, making them visible to all tasks even if they aren't passed directly to that task's context. Any tasks returned from the flow are automatically resolved into their `result` values.
+
+To run a `Flow` instance, use its `run()` method, which executes all of the tasks that were defined within the flow. You can then access the results of individual tasks by referencing their `result` attribute, or by calling them (if they are `@task`-decorated functions).
+
<CodeGroup>
-
```python @flow decorator
@flow
-def data_processing_flow():
-    data = load_data()
-    cleaned_data = clean_data(data)
-    insights = analyze_data(cleaned_data)
-    return insights
-
-results = data_processing_flow()
+def item_flow():
+    price = Task('generate a price between 1 and 1000', result_type=int)
+    item = Task(
+        'Come up with a common item that has the provided price',
+        result_type=str,
+        context=dict(price=price)
+    )
+    return item
+
+# call the flow; the result is automatically resolved
+# as the result of the `item` task.
+item = item_flow()
```
```python Flow class
-with Flow() as data_processing_flow:
-    data = load_data()
-    cleaned_data = clean_data(data)
-    insights = analyze_data(cleaned_data)
-
-data_processing_flow.run()
-print(insights.result)
+with Flow() as item_flow:
+    price = Task('generate a price between 1 and 1000', result_type=int)
+    item = Task(
+        'Come up with a common item that has the provided price',
+        result_type=str,
+        context=dict(price=price)
+    )
+
+# run all tasks in the flow
+item_flow.run()
+# access the item task's result
+item.result
```
</CodeGroup>
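The examples above return a task directly. As a small additional sketch, an argument passed to the decorated function is placed in the flow's context, where a task can use it without receiving it explicitly; the task wording here is illustrative.

```python
from controlflow import Task, flow

@flow
def poem_flow(topic: str):
    # `topic` is added to the flow context automatically, so the task can
    # use it even though it isn't passed via the task's own `context`
    poem = Task('Write a two-line poem about the topic provided in the flow context', result_type=str)
    return poem

print(poem_flow(topic='the ocean'))
```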

-When a flow is run, ControlFlow orchestrates the execution of tasks, resolving dependencies, and managing the flow of data between tasks. The flow ensures that tasks are executed in the correct order and that the necessary context and results are propagated throughout the flow.
-
-## Flow Execution and Task Dependencies
-
-Flows in ControlFlow follow a structured execution model based on task dependencies. When a flow is run, ControlFlow analyzes the dependencies between tasks and determines the execution order.
-
-### Task Dependencies
+<Tip>
+**What happens when a flow is run?**

-Tasks within a flow can have dependencies on other tasks. These dependencies define the order in which tasks should be executed and ensure that tasks have access to the necessary data or results from previous tasks.
+When a flow is run, the decorated function is executed and any tasks created within the function are registered with the flow. The flow then orchestrates the execution of the tasks, resolving dependencies, and managing the flow of data between tasks. If the flow function returns a task, or a nested collection of tasks, the flow will automatically replace them with their final results.
+</Tip>
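A small sketch of the nested-collection behavior mentioned in the tip, assuming ordinary containers of tasks are resolved as described; the task wording is illustrative.

```python
from controlflow import Task, flow

@flow
def fact_flow():
    city = Task('Name a large city', result_type=str)
    population = Task(
        'Estimate the population of the provided city',
        result_type=int,
        context=dict(city=city),
    )
    # the returned dict of tasks is resolved into a dict of plain results
    return {'city': city, 'population': population}

print(fact_flow())
```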

-Dependencies can be specified using the `depends_on` property of the `Task` class or by passing tasks as arguments to other tasks.
+## Controlling Execution

-```python
-@flow
-def data_processing_flow():
-    raw_data = load_data()
-    cleaned_data = clean_data(raw_data)
-    insights = analyze_data(cleaned_data)
-    report = generate_report(insights)
-    return report
-```
+ControlFlow provides many mechanisms for determining how tasks are executed within a flow. So far, we've only looked at flows composed entirely of dependent tasks. These tasks form a DAG which is automatically executed when the flow runs.

-In this example, the `clean_data` task depends on the `load_data` task, the `analyze_data` task depends on the `clean_data` task, and the `generate_report` task depends on the `analyze_data` task. ControlFlow ensures that the tasks are executed in the correct order based on these dependencies.
+### Control Flow

-### Parallel Execution
+Because a flow function is a regular Python function, you can use standard Python control flow to determine when tasks are executed and in what order. At any point, you can manually `run()` any task in order to work with its result. Running a task inside a flow will also run any tasks it depends on.

-ControlFlow supports parallel execution of tasks that are independent of each other. When multiple tasks have no dependencies between them, they can be executed concurrently, improving the overall performance of the flow.
+In this flow, we flip a coin to determine which poem to write. The coin toss task is run manually, and the result is used to determine which poem task to return, using a standard Python `if` statement:

```python
@flow
-def parallel_flow():
-    task1 = process_data1()
-    task2 = process_data2()
-    task3 = process_data3()
-    results = combine_results(task1, task2, task3)
-    return results
-```
-
-In this example, `task1`, `task2`, and `task3` have no dependencies on each other and can be executed in parallel. ControlFlow automatically manages the parallel execution and ensures that the results are properly combined in the `combine_results` task.
-
-## Error Handling and Flow Control
-
-ControlFlow provides mechanisms for error handling and flow control within flows.
-
-### Error Handling
-
-Errors that occur during task execution can be handled using exception handling. By wrapping tasks in try-except blocks, you can catch and handle specific exceptions, providing appropriate error messages or fallback behavior.
+def conditional_flow():
+    coin_toss_task = Task('Flip a coin', result_type=['heads', 'tails'])
+    # manually run the coin-toss task
+    outcome = coin_toss_task.run()
+
+    # generate a different task based on the outcome of the toss
+    if outcome == 'heads':
+        poem = Task('Write a poem about Mt. Rushmore', result_type=str)
+    elif outcome == 'tails':
+        poem = Task('Write a poem about the Grand Canyon', result_type=str)
+
+    # return the poem task
+    return poem

-```python
-@flow
-def error_handling_flow():
-    try:
-        data = load_data()
-        cleaned_data = clean_data(data)
-        insights = analyze_data(cleaned_data)
-    except DataLoadError:
-        logger.error("Failed to load data")
-        insights = None
-    except DataCleaningError:
-        logger.error("Failed to clean data")
-        insights = None
-    return insights
+print(conditional_flow())
+# Upon granite heights, 'neath skies of blue,
+# Mount Rushmore stands, a sight to view.
+# ...
```
-
175
-
In this example, if an error occurs during the `load_data` or `clean_data` tasks, the corresponding exception is caught, an error message is logged, and the `insights` variable is set to `None`. This allows the flow to gracefully handle errors and continue execution.
176
-
177
-
### Flow Control
178
-
179
-
ControlFlow provides flow control mechanisms to conditionally execute tasks or repeat tasks based on certain conditions.
180
-
181
-
```python
182
-
@flow
183
-
defconditional_flow(condition):
184
-
data = load_data()
185
-
if condition:
186
-
cleaned_data = clean_data(data)
187
-
insights = analyze_data(cleaned_data)
188
-
else:
189
-
insights = analyze_raw_data(data)
190
-
return insights
191
-
```
192
-
193
-
In this example, the flow conditionally executes either the `clean_data` and `analyze_data` tasks or the `analyze_raw_data` task based on the value of the `condition` variable. This allows for dynamic flow execution based on runtime conditions.
194
-
195
-
## Conclusion
196
-
197
-
Flows in ControlFlow provide a powerful and flexible way to orchestrate AI-powered workflows. By defining flows using the `Flow` class or the `@flow` decorator, developers can create structured and organized workflows that manage tasks, agents, tools, and context.
198
-
199
-
Flows enable the execution of tasks in a defined order, resolving dependencies and allowing for parallel execution when possible. They also provide mechanisms for error handling and flow control, allowing for robust and adaptive workflow execution.
200
-
201
-
By leveraging flows in ControlFlow, developers can build complex and dynamic AI-powered applications that are maintainable, scalable, and aligned with business requirements. Flows provide a high-level abstraction for managing AI workflows, enabling developers to focus on the logic and objectives of their applications while ControlFlow handles the underlying orchestration and execution.
docs/introduction.mdx (1 addition, 4 deletions)
@@ -5,10 +5,7 @@ title: What is ControlFlow?
**ControlFlow is a framework for orchestrating agentic LLM workflows.**

<Note>
-An **agentic workflow** is a process that delegates at least some of its work
-to an LLM agent. An agent is an autonomous entity that is invoked repeatedly
-to make decisions and perform complex tasks. To learn more, see the [AI
-glossary](/glossary/agentic-workflow).
+An **agentic workflow** is a process that delegates at least some of its work to an LLM agent. An agent is an autonomous entity that is invoked repeatedly to make decisions and perform complex tasks. To learn more, see the [AI glossary](/glossary/agentic-workflow).
</Note>

LLMs are powerful AI models that can understand and generate human-like text, enabling them to perform a wide range of tasks. However, building applications with LLMs can be challenging due to their complexity, unpredictability, and potential for hallucinating or generating irrelevant outputs.
docs/reference/task-decorator.mdx (5 additions, 5 deletions)
@@ -44,14 +44,14 @@ When a task has `user_access` set to `True`, the AI agents are provided with a s
By default, `user_access` is set to `False`. It can be explicitly set to `True` using the `@task` decorator when the task requires human interaction.
</ParamField>

-<ParamField path="eager" type="bool">
-The `eager` parameter determines whether the task should be executed eagerly or lazily. It is a boolean flag that controls the execution behavior of the task.
+<ParamField path="lazy" type="bool">
+The `lazy` parameter determines whether the task should be executed eagerly or lazily. It is a boolean flag that controls the execution behavior of the task.

-When `eager` is set to `True` (default), the task is executed immediately when the decorated function is called. The task is run, and the result is returned synchronously.
+The default `lazy` behavior is determined by the global `eager_mode` setting in ControlFlow. Eager mode is enabled by default, which means that tasks are executed immediately. The `lazy` parameter allows you to override this behavior for a specific task.

-When `eager` is set to `False`, the task is not executed immediately. Instead, a `Task` instance is returned, representing the deferred execution of the task. The task can be run later using the `run()` or `run_once()` methods.
+When `lazy` is set to `True`, the task is not executed immediately. Instead, a `Task` instance is returned, representing the deferred execution of the task. The task can be run later using the `run()` or `run_once()` methods.

-By default, the `eager` parameter is set to the global setting (`True`). It can be explicitly specified using the `@task` decorator to override the global setting for a specific task.
+When `lazy` is set to `False` (default), the task is executed immediately when the decorated function is called. Setting `lazy=False` ensures the task is executed eagerly, even if the global `eager_mode` is disabled.
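As a quick sketch of the deferred behavior described above: the function, its docstring, and the product string are illustrative, and the assumption is that `lazy` is passed as a decorator argument exactly as documented here.

```python
from controlflow import task

@task(lazy=True)
def write_tagline(product: str) -> str:
    """Write a one-line tagline for the given product."""

# Because the task is lazy, calling the function returns a Task
# instance rather than running it immediately.
tagline_task = write_tagline("a solar-powered lantern")

# Run the deferred task later to produce its result.
result = tagline_task.run()
```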