Commit f7d8aa0

docs: extend tools docs (#170)
1 parent f1f3e44 commit f7d8aa0

File tree

1 file changed: +63 -31 lines changed

README.md

Lines changed: 63 additions & 31 deletions
@@ -145,6 +145,69 @@ final class CompanyName
}
```

#### Tool Return Value

In the end, the tool's response needs to be a string, but LLM Chain converts arrays, as well as objects that implement the `JsonSerializable` interface, to JSON strings for you. So you can return arrays or objects directly from your tool.
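
As an illustration (not part of this commit), here is a minimal sketch of a tool that returns an array; the class, tool name, and data are made up, and LLM Chain would serialize the returned array to a JSON string before passing it to the LLM:

```php
use PhpLlm\LlmChain\ToolBox\Attribute\AsTool;

// Hypothetical tool: the returned array is converted to JSON by LLM Chain.
#[AsTool('server_status', 'Returns the current status of the server.')]
final class ServerStatus
{
    /**
     * @return array{status: string, uptime: int}
     */
    public function __invoke(): array
    {
        return [
            'status' => 'ok',
            'uptime' => 42_000, // seconds; sample value
        ];
    }
}
```
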
#### Tool Methods

You can configure the method to be called by the LLM via the `#[AsTool]` attribute, and you can define multiple tools per class:

```php
use PhpLlm\LlmChain\ToolBox\Attribute\AsTool;

#[AsTool(name: 'weather_current', description: 'get current weather for a location', method: 'current')]
#[AsTool(name: 'weather_forecast', description: 'get weather forecast for a location', method: 'forecast')]
final readonly class OpenMeteo
{
    public function current(float $latitude, float $longitude): array
    {
        // ...
    }

    public function forecast(float $latitude, float $longitude): array
    {
        // ...
    }
}
```

#### Tool Parameters

LLM Chain generates a JSON Schema representation for all tools in the `ToolBox`, based on the `#[AsTool]` attribute, the method arguments, and the `@param` comments in the doc block. Additionally, JSON Schema supports validation rules, which are partially supported by LLMs like GPT.

To leverage this, configure the `#[ToolParameter]` attribute on the method arguments of your tool:

```php
use PhpLlm\LlmChain\ToolBox\Attribute\AsTool;
use PhpLlm\LlmChain\ToolBox\Attribute\ToolParameter;

#[AsTool('my_tool', 'Example tool with parameters requirements.')]
final class MyTool
{
    /**
     * @param string $name The name of an object
     * @param int $number The number of an object
     */
    public function __invoke(
        #[ToolParameter(pattern: '/([a-z0-1]){5}/')]
        string $name,
        #[ToolParameter(minimum: 0, maximum: 10)]
        int $number,
    ): string {
        // ...
    }
}
```

See the attribute class [ToolParameter](src/Chain/ToolBox/Attribute/ToolParameter.php) for all available options.

> [!NOTE]
> Please be aware that these rules are only converted into a JSON Schema for the LLM to respect; they are not validated by LLM Chain.
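
Because these constraints are not enforced by LLM Chain, one option is to validate the arguments inside the tool itself. The following is a hypothetical sketch (not part of this commit) that re-checks the same pattern and range manually:

```php
use PhpLlm\LlmChain\ToolBox\Attribute\AsTool;
use PhpLlm\LlmChain\ToolBox\Attribute\ToolParameter;

// Hypothetical sketch: the JSON Schema only guides the LLM, so the tool
// re-validates its input before doing any work.
#[AsTool('my_tool', 'Example tool that validates its own input.')]
final class MyValidatedTool
{
    public function __invoke(
        #[ToolParameter(pattern: '/([a-z0-1]){5}/')]
        string $name,
        #[ToolParameter(minimum: 0, maximum: 10)]
        int $number,
    ): string {
        if (1 !== preg_match('/([a-z0-1]){5}/', $name)) {
            return 'Error: name does not match the expected pattern.';
        }

        if ($number < 0 || $number > 10) {
            return 'Error: number must be between 0 and 10.';
        }

        // ... actual tool logic

        return sprintf('Received "%s" with number %d.', $name, $number);
    }
}
```
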
#### Code Examples (with built-in tools)

1. **Clock Tool**: [toolbox-clock.php](examples/toolbox-clock.php)
@@ -313,37 +376,6 @@ dump($response->getContent()); // returns an array
1. **Structured Output** (PHP class): [structured-output-math.php](examples/structured-output-math.php)
1. **Structured Output** (array): [structured-output-clock.php](examples/structured-output-clock.php)

### Tool Parameters

LLM Chain generates a JSON Schema representation for all tools in the `ToolBox`, based on the `#[AsTool]` attribute, the method arguments, and the doc block. Additionally, JSON Schema supports validation rules, which are partially supported by LLMs like GPT.

To leverage this, configure the `#[ToolParameter]` attribute on the method arguments of your tool:

```php
use PhpLlm\LlmChain\ToolBox\Attribute\AsTool;
use PhpLlm\LlmChain\ToolBox\Attribute\ToolParameter;

#[AsTool('my_tool', 'Example tool with parameters requirements.')]
final class MyTool
{
    /**
     * @param string $name The name of an object
     * @param int $number The number of an object
     */
    public function __invoke(
        #[ToolParameter(pattern: '/([a-z0-1]){5}/')]
        string $name,
        #[ToolParameter(minimum: 0, maximum: 10)]
        int $number,
    ): string {
        // ...
    }
}
```

> [!NOTE]
> Please be aware that these rules are only converted into a JSON Schema for the LLM to respect; they are not validated by LLM Chain.
### Response Streaming

Since LLMs usually generate a response word by word, most of them also support streaming the response using Server Side
