Commit

merge

aaazzam committed Jun 13, 2024
2 parents 0116614 + fcd4a07 commit a067e29
Showing 15 changed files with 374 additions and 77 deletions.
153 changes: 153 additions & 0 deletions docs/guides/llms.mdx
@@ -0,0 +1,153 @@
---
title: Configuring LLMs
---

ControlFlow is optimized for workflows that are composed of multiple tasks, each of which can be completed by a different agent. One benefit of this approach is that you can use a different LLM for each task, or even for each agent assigned to a task.

ControlFlow will ensure that all agents share a consistent context and history, even if they are using different models. This allows you to leverage the relative strengths of different models, depending on your requirements.

## The default model

By default, ControlFlow uses OpenAI's GPT-4o model. GPT-4o is an extremely powerful and popular model that provides excellent out-of-the-box performance on most tasks. This does mean that to run an agent with no additional configuration, you will need to provide an OpenAI API key.
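For example, with the default configuration you can provide your key as an environment variable, using the standard variable name read by the OpenAI client:

```bash
export OPENAI_API_KEY=<your-api-key>
```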

## Selecting a different LLM

Every ControlFlow agent can be assigned a specific LLM. When instantiating an agent, you can pass a `model` parameter to specify the LLM to use.

ControlFlow agents can use any LangChain LLM class that supports chat-based APIs and tool calling. For a complete list of available models, settings, and instructions, please see LangChain's [LLM provider documentation](https://python.langchain.com/docs/integrations/chat/).

<Tip>
ControlFlow includes OpenAI and Azure OpenAI models by default. To use other models, you'll need to install the corresponding LangChain package first. See the provider's documentation for more information.
</Tip>


To configure a different LLM, follow these steps:
<Steps>
<Step title="Install required packages">
To use an LLM, first make sure you have installed the appropriate provider package. ControlFlow only includes `langchain_openai` by default. For example, to use an Anthropic model, first run:
```bash
pip install langchain_anthropic
```
</Step>
<Step title="Configure API keys">
You must provide the correct API keys and configuration for the LLM you want to use. These can be provided as environment variables or when you create the model in your script. For example, to use an Anthropic model, set the `ANTHROPIC_API_KEY` environment variable:

```bash
export ANTHROPIC_API_KEY=<your-api-key>
```
For model-specific instructions, please refer to the provider's documentation.
</Step>
<Step title="Create the model">
Begin by creating the LLM object in your script. For example, to use Claude 3 Opus:

```python
from langchain_anthropic import ChatAnthropic

# create the model
model = ChatAnthropic(model='claude-3-opus-20240229')
```
</Step>
<Step title="Pass the model to an agent">
Next, create an agent with the specified model:

```python
import controlflow as cf

# provide the model to an agent
agent = cf.Agent(model=model)
```
</Step>
<Step title='Assign the agent to a task'>
Finally, assign your agent to a task:

```python
# assign the agent to a task
task = cf.Task('Write a short poem about LLMs', agents=[agent])

# (optional) run the task
task.run()
```
</Step>
</Steps>

<Accordion title="Click here to copy the entire example script">

```python
import controlflow as cf
from langchain_anthropic import ChatAnthropic

# create the model
model = ChatAnthropic(model='claude-3-opus-20240229')

# provide the model to an agent
agent = cf.Agent(model=model)

# assign the agent to a task
task = cf.Task('Write a short poem about LLMs', agents=[agent])

# (optional) run the task
task.run()
```
</Accordion>

### Model configuration

In addition to choosing a specific model, you can also configure the model's parameters. For example, you can set the temperature for GPT-4o:

```python
import controlflow as cf
from langchain_openai import ChatOpenAI

model = ChatOpenAI(model='gpt-4o', temperature=0.1)
agent = cf.Agent(model=model)

assert agent.model.temperature == 0.1
```

## Changing the default model

### From a model object

To use any model as the default LLM, create the model object in your script and assign it to controlflow's `default_model` attribute. It will be used by any agent that does not have a model specified.

```python
import controlflow as cf
from langchain_anthropic import ChatAnthropic

# set the default model
cf.default_model = ChatAnthropic(
    model='claude-3-opus-20240229',
    temperature=0.1,
)

# check that the default model is loaded
assert cf.Agent('Marvin').model.model_name == 'claude-3-opus-20240229'
```
### From a string setting

If you don't need to configure the model object, you can set the default model using a string setting. The string must have the form `<provider>/<model name>`.


You can change this setting either with an environment variable or by modifying it in your script. For example, to use GPT 3.5 Turbo as the default model:

<CodeGroup>
```bash As an environment variable
export CONTROLFLOW_LLM_MODEL=openai/gpt-3.5-turbo
```

```python In your script
import controlflow as cf
# set the default model
cf.settings.llm_model = "openai/gpt-3.5-turbo"

# check that the default model is loaded
assert cf.Agent('Marvin').model.model_name == 'gpt-3.5-turbo'
```
</CodeGroup>


At this time, setting the default model via string is only supported for the following providers:
- `openai`
- `azure-openai`
- `anthropic`
- `google`
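Conceptually, the setting is just a provider prefix and a model name separated by the first slash. The helper below is a hypothetical illustration of that convention, not ControlFlow's actual parsing code:

```python
def parse_llm_setting(setting: str) -> tuple[str, str]:
    # split on the first slash only, in case a model name contains slashes
    provider, model_name = setting.split("/", 1)
    return provider, model_name

provider, model_name = parse_llm_setting("openai/gpt-3.5-turbo")
# provider == "openai", model_name == "gpt-3.5-turbo"
```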
59 changes: 59 additions & 0 deletions docs/introduction.mdx
@@ -15,6 +15,65 @@ LLMs are powerful AI models that can understand and generate human-like text, en

ControlFlow provides a structured and intuitive way to create sophisticated agentic workflows while adhering to traditional software engineering best practices. The resulting applications are observable, controllable, and easy to trust.

<CodeGroup>
```python Example: Restaurant recommendations
from controlflow import flow, Task
from pydantic import BaseModel


class Restaurant(BaseModel):
    name: str
    description: str


@flow
def restaurant_recs(n: int) -> list[Restaurant]:
    """
    An agentic workflow that asks the user for their location and
    cuisine preference, then recommends n restaurants based on their input.
    """

    # get the user's location
    location = Task("Get a location", user_access=True)

    # get the user's preferred cuisine
    cuisine = Task("Get a preferred cuisine", user_access=True)

    # generate the recommendations from the user's input
    recs = Task(
        f"Recommend {n} restaurants to the user",
        context=dict(location=location, cuisine=cuisine),
        result_type=list[Restaurant],
    )
    return recs


recs = restaurant_recs(n=3)
```
```python Result
# >> Agent: Hi! Could you please tell me your current location? Also,
# what type of cuisine are you in the mood for?

# >> User: I'm in DC looking for a cafe

# -------------------------------------------------

[
    Restaurant(
        name="Compass Coffee",
        description="A popular coffee shop known for its quality coffee and relaxed atmosphere.",
    ),
    Restaurant(
        name="The Wydown Coffee Bar",
        description="A stylish cafe offering specialty coffee, pastries, and a cozy environment.",
    ),
    Restaurant(
        name="Tryst Coffeehouse",
        description="A vibrant coffeehouse with great coffee, food options, and a welcoming ambiance.",
    ),
]
```
</CodeGroup>

## Design principles

ControlFlow's design is informed by a strong opinion: LLMs are powerful tools, but they are most effective when applied to small, well-defined tasks within a structured workflow. This approach mitigates many of the challenges associated with LLMs, such as hallucinations, biases, and unpredictable behavior, while also making it easier to debug, monitor, and control the application.
1 change: 1 addition & 0 deletions docs/mint.json
@@ -59,6 +59,7 @@
"guides/apis",
"guides/execution-modes",
"guides/dependencies",
"guides/llms",
"guides/orchestration",
"guides/deployment"
]
46 changes: 6 additions & 40 deletions docs/tutorial.mdx
@@ -356,46 +356,6 @@ docs_agent = cf.Agent(
What's the difference between a description and instructions? The description is a high-level overview of the agent's purpose and capabilities, while instructions provide detailed guidance on how the agent should complete a task. Agent descriptions can be seen by other agents, but instructions are private, which can affect how agents collaborate with each other.
</Tip>

### Choosing an LLM model

By default, agents use OpenAI's GPT-4o model, but you can specify a different model by passing a `model` parameter to the agent. ControlFlow is built on LangChain Core, so you can provide [any LangChain LLM](https://python.langchain.com/v0.1/docs/integrations/chat/) that supports the chat and tool calling APIs. Here are a few examples:

```python
import controlflow as cf
from langchain_openai import ChatOpenAI
from langchain_anthropic import ChatAnthropic
from langchain_google_genai import ChatGoogleGenerativeAI

# GPT-3.5-turbo
gpt35_agent = cf.Agent(
    name="GPT-3.5-turbo",
    model=ChatOpenAI(model="gpt-3.5-turbo"),
)

# GPT-4o with low temperature
gpt4o_low_temp_agent = cf.Agent(
    name="GPT-4o-low-temp",
    model=ChatOpenAI(model="gpt-4o", temperature=0.1),
)

# Claude 3
claude_agent = cf.Agent(
    name="Claude",
    model=ChatAnthropic(model='claude-3-opus-20240229'),
)

# Gemini Pro
gemini_agent = cf.Agent(
    name="Gemini",
    model=ChatGoogleGenerativeAI(model='gemini-pro'),
)
```

Note that ControlFlow only includes `langchain_openai` as a dependency by default. If you want to use other models, you'll need to install the corresponding LangChain package.

<Tip>
By assigning different agents to tasks, you can balance each model's strengths, latency, and cost to optimize your workflow.
</Tip>


### Assigning an agent to a task
@@ -474,3 +434,9 @@ In the above example, we also introduced the `instructions` context manager. Thi
- Assign an agent to a task by passing it to the task's `agents` parameter
- You can assign multiple agents to a task to have them collaborate
</Check>


## What's next?
- Read more about core concepts like [tasks](/concepts/tasks), [flows](/concepts/flows), and [agents](/concepts/agents)
- Understand ControlFlow's [workflow APIs](/guides/apis) and [execution modes](/guides/execution-modes)
- Learn how to use [different LLM models](/guides/llms)
33 changes: 33 additions & 0 deletions examples/restaurant_recs.py
@@ -0,0 +1,33 @@
from controlflow import Task, flow
from pydantic import BaseModel


class Restaurant(BaseModel):
    name: str
    description: str


@flow
def restaurant_recs(n: int) -> list[Restaurant]:
    """
    An agentic workflow that asks the user for their location and
    cuisine preference, then recommends n restaurants based on their input.
    """

    # get the user's location
    location = Task("Get a location", user_access=True)

    # get the user's preferred cuisine
    cuisine = Task("Get a preferred cuisine", user_access=True)

    # generate the recommendations from the user's input
    recs = Task(
        f"Recommend {n} restaurants to the user",
        context=dict(location=location, cuisine=cuisine),
        result_type=list[Restaurant],
    )
    return recs


if __name__ == "__main__":
    restaurant_recs(5)
25 changes: 12 additions & 13 deletions src/controlflow/__init__.py
@@ -1,5 +1,9 @@
from .settings import settings
from . import llm
import controlflow.llm

# --- Default model ---
# assign to controlflow.default_model to change the default model
from .llm.models import DEFAULT_MODEL as default_model

from .core.flow import Flow
from .core.task import Task
@@ -9,22 +9,17 @@
from .instructions import instructions
from .decorators import flow, task

# --- Default agent ---

from .core.agent import DEFAULT_AGENT

default_agent = DEFAULT_AGENT
del DEFAULT_AGENT

# --- Default history ---
# assign to controlflow.default_history to change the default history
from .llm.history import DEFAULT_HISTORY as default_history

from .llm.history import DEFAULT_HISTORY

default_history = DEFAULT_HISTORY
del DEFAULT_HISTORY

# --- Default agent ---
# assign to controlflow.default_agent to change the default agent
from .core.agent import DEFAULT_AGENT as default_agent

# --- Version ---
try:
from ._version import version as __version__
from ._version import version as __version__ # type: ignore
except ImportError:
__version__ = "unknown"
4 changes: 2 additions & 2 deletions src/controlflow/core/controller/controller.py
@@ -17,6 +17,7 @@
from controlflow.llm.handlers import PrintHandler, ResponseHandler, TUIHandler
from controlflow.llm.history import History
from controlflow.llm.messages import AIMessage, MessageType, SystemMessage
from controlflow.llm.tools import as_tools
from controlflow.tui.app import TUIApp as TUI
from controlflow.utilities.context import ctx
from controlflow.utilities.tasks import all_complete, any_incomplete
@@ -183,13 +184,12 @@ def _setup_run(self):
handlers.append(TUIHandler())
if controlflow.settings.enable_print_handler:
handlers.append(PrintHandler())

with ctx(controller_agent=agent):
# yield the agent payload
yield dict(
agent=agent,
messages=[system_message] + messages,
tools=tools,
tools=as_tools(tools),
handlers=handlers,
)
