
sys_prompt support in Agent #938

Open · wants to merge 2 commits into main from pr938
Conversation

@ehhuang (Contributor) commented Feb 3, 2025

What does this PR do?

The current default system prompt for Llama 3.2 tends to over-index on tool calling and performs poorly when the prompt does not require tool calling.

This PR adds an option to override the default system prompt, and organizes tool-related configs into a new config object.

  • Addresses issue (#issue)

Test Plan

LLAMA_STACK_CONFIG=together pytest --inference-model=meta-llama/Llama-3.3-70B-Instruct -s -v tests/client-sdk/agents/test_agents.py::test_override_system_message_behavior
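The override described above can be sketched roughly as follows. This is a minimal, self-contained illustration of the idea, not the actual llama-stack API: the names `ToolConfig`, `override_system_prompt`, and `build_system_prompt` are assumptions made for the sketch.

```python
from dataclasses import dataclass, field
from typing import Optional

# Hypothetical sketch: tool-related settings grouped into their own
# config object instead of living directly on the agent config.
@dataclass
class ToolConfig:
    tool_choice: str = "auto"
    tool_prompt_format: Optional[str] = None

@dataclass
class AgentConfig:
    model: str
    instructions: str = ""
    # When set, this replaces the model's default system prompt entirely,
    # avoiding the tool-heavy default when no tool calling is needed.
    override_system_prompt: Optional[str] = None
    tool_config: ToolConfig = field(default_factory=ToolConfig)

def build_system_prompt(cfg: AgentConfig, default_prompt: str) -> str:
    # Prefer the explicit override; otherwise fall back to the default.
    return cfg.override_system_prompt or default_prompt

cfg = AgentConfig(
    model="meta-llama/Llama-3.3-70B-Instruct",
    override_system_prompt="You are a concise assistant. Do not call tools.",
)
print(build_system_prompt(cfg, default_prompt="<tool-heavy default>"))
```

The point of the grouping is that callers who never touch tools can ignore `tool_config` entirely and just set the override.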

Sources

Please link relevant resources if necessary.

Before submitting

  • This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
  • Ran pre-commit to handle lint / formatting issues.
  • Read the contributor guideline,
    Pull Request section?
  • Updated relevant documentation.
  • Wrote necessary unit or integration tests.

Stack created with Sapling. Best reviewed with ReviewStack.

@facebook-github-bot added the "CLA Signed" label (managed by the Meta Open Source bot) Feb 3, 2025
@ehhuang force-pushed the pr938 branch 6 times, most recently from f275609 to 2dcdca2 on February 3, 2025 20:26
ehhuang added a commit to meta-llama/llama-stack-client-python that referenced this pull request Feb 3, 2025
# What does this PR do?

Support meta-llama/llama-stack#938

@ehhuang ehhuang marked this pull request as ready for review February 3, 2025 20:36
ehhuang added a commit to meta-llama/llama-stack-client-python that referenced this pull request Feb 3, 2025
# What does this PR do?

Updates AgentConfig to support meta-llama/llama-stack#938

ehhuang added a commit to meta-llama/llama-stack-client-python that referenced this pull request Feb 3, 2025
# What does this PR do?

Updates AgentConfig to support meta-llama/llama-stack#938

## Test Plan

Tested in meta-llama/llama-stack#938
@ehhuang force-pushed the pr938 branch 2 times, most recently from da8adbc to f956e44 on February 3, 2025 23:14
# What does this PR do?

The current default system prompt for Llama 3.2 tends to over-index on tool calling and performs poorly when the prompt does not require tool calling.

This PR adds an option to override the default system prompt and organizes tool-related configs into a new config object.

## Test Plan

python -m unittest llama_stack.providers.tests.inference.test_prompt_adapter
# What does this PR do?

Adds an option to override the default system prompt and organizes tool-related configs into a new config object.

## Test Plan

LLAMA_STACK_CONFIG=together pytest --inference-model=meta-llama/Llama-3.3-70B-Instruct -s -v tests/client-sdk/agents/test_agents.py::test_override_system_message_behavior
ehhuang added a commit that referenced this pull request Feb 4, 2025
# What does this PR do?

The current default system prompt for Llama 3.2 tends to over-index on tool calling and performs poorly when the prompt does not require tool calling.

This PR adds an option to override the default system prompt and organizes tool-related configs into a new config object.

## Test Plan

python -m unittest llama_stack.providers.tests.inference.test_prompt_adapter
---
Stack created with [Sapling](https://sapling-scm.com). Best reviewed with [ReviewStack](https://reviewstack.dev/meta-llama/llama-stack/pull/937).
* #938
* __->__ #937