Fix: Apply additional_drop_params filtering across all LLM providers #3815
Summary
Fixes issue #3814, where the `additional_drop_params` parameter was being ignored when an LLM was used within an Agent. This parameter is critical for reasoning models (o1-mini, o4-mini) that don't support certain parameters such as "stop".

Root cause: The parameter filtering logic was missing from the `_prepare_completion_params()` methods across all LLM providers (the LiteLLM fallback path and the native providers for OpenAI, Azure, and Anthropic).

Solution:
- Added an `_apply_additional_drop_params()` helper method in the `BaseLLM` class (see the sketch below)
- Applied the filtering in `_prepare_completion_params()` for the LiteLLM fallback path and the native OpenAI, Azure, and Anthropic providers
- Kept the misspelled `drop_additionnal_params` parameter working for backward compatibility
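The diff itself isn't reproduced here, so the following is only a minimal sketch of how the helper could work, assuming `BaseLLM` stores the user-supplied list and each provider funnels its request parameters through the helper before dispatch; the real constructor signature and call sites may differ:

```python
class BaseLLM:
    """Trimmed to the parts this PR touches; the real class has more."""

    def __init__(self, additional_drop_params: list[str] | None = None) -> None:
        self.additional_drop_params = additional_drop_params or []

    def _apply_additional_drop_params(self, params: dict) -> dict:
        """Strip parameters the caller asked to drop (e.g. "stop" for o1-mini)."""
        if not self.additional_drop_params:
            return params
        return {
            key: value
            for key, value in params.items()
            if key not in self.additional_drop_params
        }

    def _prepare_completion_params(self, **params) -> dict:
        # Each provider builds its own params dict; the fix is to run that
        # dict through the helper before the request reaches the provider.
        return self._apply_additional_drop_params(params)
```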
Review & Testing Checklist for Human
- Manually test with a reasoning model (e.g. o1-mini) that rejects the "stop" parameter, both with and without `additional_drop_params=["stop"]` (a sketch of such a check follows this list)
- Verify that `drop_additionnal_params` (the misspelled variant) still works if it's being used in production

Why manual testing is important: The automated tests use extensive mocking and don't make actual API calls, so they verify the filtering logic but not end-to-end behavior with real LLM providers.
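As a hedged example of the kind of manual check the first checklist item asks for, assuming crewAI's public `LLM` class and its `call()` method accept these arguments:

```python
from crewai import LLM

# o1-mini rejects requests that include "stop"; before this fix the
# parameter leaked through even when listed in additional_drop_params.
llm = LLM(
    model="o1-mini",
    additional_drop_params=["stop"],
)

# A real round-trip against the provider, unlike the mocked unit tests.
print(llm.call("Reply with the single word: ok"))
```

Running the same call without `additional_drop_params` covers the other half of that checklist item.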
Notes