Merge branch 'main' into run_until

jlowin authored Oct 9, 2024
2 parents 2a30d83 + 70ef1dc commit b6fc900
Showing 4 changed files with 70 additions and 5 deletions.
31 changes: 31 additions & 0 deletions docs/concepts/agents.mdx
@@ -4,6 +4,8 @@ description: The intelligent workers in your AI workflows.
icon: robot
---

import { VersionBadge } from '/snippets/version-badge.mdx'

Agents are the intelligent, autonomous entities that power your AI workflows in ControlFlow. They represent AI models capable of understanding instructions, making decisions, and completing tasks.

@@ -69,6 +71,35 @@

ControlFlow supports any LangChain LLM that supports chat and function calling. For more details on how to configure models, see the [LLMs guide](/guides/configure-llms).

```python
import controlflow as cf


agent1 = cf.Agent(model="openai/gpt-4o")
agent2 = cf.Agent(model="anthropic/claude-3-5-sonnet-20240620")
```

### LLM rules
<VersionBadge version="0.11.0" />

Each LLM provider may have different requirements for how messages are formatted or presented. For example, OpenAI permits system messages to be interspersed between user messages, but Anthropic requires them to be at the beginning of the conversation. ControlFlow uses provider-specific rules to properly compile messages for each agent's API.
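
To make the difference concrete, here is a simplified, hypothetical sketch of what an Anthropic-style compilation rule might do — move system messages to the front of the conversation. This is an illustration only, not ControlFlow's internal implementation:

```python
def compile_for_anthropic(messages: list[dict]) -> list[dict]:
    """Move system messages to the front, preserving relative order.

    A simplified illustration of a provider-specific message rule;
    not ControlFlow's actual compilation logic.
    """
    system = [m for m in messages if m["role"] == "system"]
    rest = [m for m in messages if m["role"] != "system"]
    return system + rest


messages = [
    {"role": "user", "content": "hi"},
    {"role": "system", "content": "be brief"},
]
# Anthropic-style ordering: system messages must come first
compiled = compile_for_anthropic(messages)
```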

For common providers like OpenAI and Anthropic, LLM rules can be automatically inferred from the agent's model. However, you can use a custom `LLMRules` object to override these rules or provide rules for non-standard providers.

Here is an example of how to tell the agent to use the Anthropic compilation rules with a custom model that can't be automatically inferred:

```python
import controlflow as cf

# note: this is just an example
llm_model = CustomAnthropicModel()

agent = cf.Agent(
    model=llm_model,
    llm_rules=cf.llm.rules.AnthropicRules(model=llm_model)
)
```

### Interactivity

By default, agents have no way to communicate with users. If you want to chat with an agent, set `interactive=True`; the agent will then communicate with users on the command line.
10 changes: 9 additions & 1 deletion src/controlflow/agents/agent.py
@@ -81,6 +81,11 @@ class Agent(ControlFlowModel, abc.ABC):
        description="The LangChain BaseChatModel used by the agent. If not provided, the default model will be used. A compatible string can be passed to automatically retrieve the model.",
        exclude=True,
    )
    llm_rules: Optional[LLMRules] = Field(
        None,
        description="The LLM rules for the agent. If not provided, the rules will be inferred from the model (if possible).",
    )

    _cm_stack: list[contextmanager] = []

    def __init__(self, instructions: str = None, **kwargs):
@@ -169,7 +174,10 @@ def get_llm_rules(self) -> LLMRules:
        """
        Retrieve the LLM rules for this agent's model
        """
        if self.llm_rules is None:
            return controlflow.llm.rules.rules_for_model(self.get_model())
        else:
            return self.llm_rules

    def get_tools(self) -> list["Tool"]:
        from controlflow.tools.input import cli_input
15 changes: 12 additions & 3 deletions src/controlflow/llm/rules.py
@@ -69,7 +69,16 @@ class AnthropicRules(LLMRules):
def rules_for_model(model: BaseChatModel) -> LLMRules:
    if isinstance(model, (ChatOpenAI, AzureChatOpenAI)):
        return OpenAIRules(model=model)
    if isinstance(model, ChatAnthropic):
        return AnthropicRules(model=model)

    try:
        from langchain_google_vertexai.model_garden import ChatAnthropicVertex

        if isinstance(model, ChatAnthropicVertex):
            return AnthropicRules(model=model)
    except ImportError:
        pass

    # catchall
    return LLMRules(model=model)
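
The dispatch above checks known providers first, probes optional integrations behind `try/except ImportError` so they stay optional dependencies, and falls back to generic rules. A stand-alone sketch of that pattern, with illustrative stand-in classes rather than the real LangChain imports:

```python
class LLMRules:
    """Generic fallback rules."""
    def __init__(self, model):
        self.model = model


class AnthropicRules(LLMRules):
    """Anthropic-specific message-formatting rules."""


class ChatAnthropic:
    """Stand-in for a known provider model class."""


def rules_for_model(model):
    # Known providers are matched directly.
    if isinstance(model, ChatAnthropic):
        return AnthropicRules(model=model)

    # Optional integrations are probed lazily, so the package
    # remains an optional dependency.
    try:
        from hypothetical_vertex_pkg import ChatAnthropicVertex  # illustrative
        if isinstance(model, ChatAnthropicVertex):
            return AnthropicRules(model=model)
    except ImportError:
        pass

    # Catch-all: generic rules for anything unrecognized.
    return LLMRules(model=model)
```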
19 changes: 18 additions & 1 deletion tests/agents/test_agents.py
@@ -7,7 +7,7 @@
from controlflow.events.base import Event
from controlflow.events.events import AgentMessage
from controlflow.instructions import instructions
from controlflow.llm.rules import AnthropicRules, LLMRules, OpenAIRules
from controlflow.orchestration.handler import Handler
from controlflow.tasks.task import Task

@@ -199,3 +199,20 @@ async def test_agent_run_async_with_handlers(self):

        assert len(handler.events) > 0
        assert len(handler.agent_messages) == 1


class TestLLMRules:
    def test_llm_rules_from_model_openai(self):
        agent = Agent(model=ChatOpenAI(model="gpt-4o-mini"))
        rules = agent.get_llm_rules()
        assert isinstance(rules, OpenAIRules)

    def test_llm_rules_from_model_anthropic(self):
        agent = Agent(model=ChatAnthropic(model="claude-3-haiku-20240307"))
        rules = agent.get_llm_rules()
        assert isinstance(rules, AnthropicRules)

    def test_custom_llm_rules(self):
        rules = LLMRules(model=None)
        agent = Agent(llm_rules=rules, model=ChatOpenAI(model="gpt-4o-mini"))
        assert agent.get_llm_rules() is rules
