Fix stop sequences synchronization for Anthropic, Bedrock, and Gemini providers #3837
Summary
Fixes issue #3836 where stop sequences were not being sent to Anthropic, Bedrock, and Gemini APIs, causing massive token usage (10x higher costs).
Root cause:
`CrewAgentExecutor` sets `llm.stop` to control when the model should stop generating, but these three providers store stop sequences in a separate `stop_sequences` attribute that gets sent to the API. Since there was no synchronization between the two attributes, the API never received stop instructions, causing the model to generate entire conversations in a single response instead of stopping at `\nObservation:` boundaries.
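For illustration only, here is a minimal stand-in for the pre-fix behavior (the class name and `build_request_params` helper are hypothetical, not CrewAI code): assigning to `stop` never touches the `stop_sequences` value that is actually sent to the provider.

```python
# Simplified stand-in for the pre-fix behavior described above (not the real CrewAI class).
class ProviderCompletionBeforeFix:
    def __init__(self, stop_sequences: list[str] | None = None):
        self.stop: list[str] = []                    # attribute CrewAgentExecutor writes to
        self.stop_sequences = stop_sequences or []   # attribute included in the API request

    def build_request_params(self) -> dict:
        # Only stop_sequences is forwarded to the provider API.
        return {"stop_sequences": self.stop_sequences}


llm = ProviderCompletionBeforeFix()
llm.stop = ["\nObservation:"]       # what CrewAgentExecutor does
print(llm.build_request_params())   # {'stop_sequences': []} -- the API never sees the stop sequence
```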
Fix:
Added a `@property` getter/setter for the `stop` attribute in the `AnthropicCompletion`, `BedrockCompletion`, and `GeminiCompletion` classes to synchronize `stop` with `stop_sequences`. When `CrewAgentExecutor` sets `llm.stop`, it now properly updates `stop_sequences`, which is what gets sent to the API.
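A minimal sketch of the property pattern described above, assuming a hypothetical stand-in class rather than the real `AnthropicCompletion`/`BedrockCompletion`/`GeminiCompletion` (the normalization details are assumptions, not taken from the PR diff):

```python
# Sketch of the property-based fix: `stop` reads from and writes to `stop_sequences`,
# so whatever CrewAgentExecutor assigns is exactly what gets sent to the API.
class ProviderCompletionAfterFix:
    def __init__(self, stop_sequences: list[str] | None = None):
        self.stop_sequences = stop_sequences or []

    @property
    def stop(self) -> list[str]:
        return self.stop_sequences

    @stop.setter
    def stop(self, value: list[str] | str | None) -> None:
        # Normalize so a single string or None is also handled.
        if value is None:
            self.stop_sequences = []
        elif isinstance(value, str):
            self.stop_sequences = [value]
        else:
            self.stop_sequences = list(value)


llm = ProviderCompletionAfterFix()
llm.stop = ["\nObservation:"]
print(llm.stop_sequences)   # ['\nObservation:'] -- now part of the API request
```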
Changes:
- `AnthropicCompletion` (lib/crewai/src/crewai/llms/providers/anthropic/completion.py)
- `BedrockCompletion` (lib/crewai/src/crewai/llms/providers/bedrock/completion.py)
- `GeminiCompletion` (lib/crewai/src/crewai/llms/providers/gemini/completion.py)
- New test file (lib/crewai/tests/llms/test_stop_sequences_sync.py)

Review & Testing Checklist for Human
- `test_stop_sequences_sync.py` contains 18 tests across 3 providers. Confirm all pass.
- Verify end-to-end that the model now stops at `\nObservation:` instead of inventing full conversations.
- `stop` is set in `BaseLLM.__init__`, and then the child class also sets `stop_sequences`. Verify no issues with object construction.
- The `BedrockCompletion.stop` getter returns `list(self.stop_sequences)` (a new list), while Anthropic/Gemini return the same reference. Confirm this doesn't break anything in executor code that might rely on reference equality (see the illustration after this list).
- Other providers use `self.stop` directly (not `stop_sequences`), so they should be unaffected. Quick sanity check recommended.
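To make the reference-equality point concrete, here is a small hypothetical illustration (class names are made up; only the `list(...)` vs. direct-return behavior is taken from the checklist):

```python
# Simplified illustration of the getter difference noted in the checklist.
class SameReferenceGetter:              # Anthropic/Gemini style per the checklist
    def __init__(self):
        self.stop_sequences = ["\nObservation:"]

    @property
    def stop(self):
        return self.stop_sequences          # returns the stored list itself


class CopyGetter:                       # Bedrock style per the checklist
    def __init__(self):
        self.stop_sequences = ["\nObservation:"]

    @property
    def stop(self):
        return list(self.stop_sequences)    # returns a fresh list on every access


a, b = SameReferenceGetter(), CopyGetter()
print(a.stop is a.stop_sequences)   # True  -- identity preserved
print(b.stop is b.stop_sequences)   # False -- equal contents, different object
print(a.stop == b.stop)             # True  -- value equality holds either way
```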
Notes
All three providers had the same bug (separate `stop` and `stop_sequences` attributes without synchronization). All three are fixed in this PR.
Link to Devin run: https://app.devin.ai/sessions/4ba6b0786af24d0a93988371fe403744
Requested by: João (joao@crewai.com)