@devin-ai-integration devin-ai-integration bot commented Oct 30, 2025

Fix: Apply additional_drop_params filtering across all LLM providers

Summary

Fixes issue #3814, where the additional_drop_params parameter was ignored when an LLM was used within an Agent. This parameter is critical for reasoning models (e.g., o1-mini, o4-mini) that reject certain parameters such as "stop".

Root cause: The parameter filtering logic was missing from _prepare_completion_params() methods across all LLM providers (LiteLLM fallback path and native providers for OpenAI, Azure, Anthropic).

Solution:

  • Added centralized _apply_additional_drop_params() helper method in BaseLLM class
  • Applied filtering consistently in all provider implementations
  • Maintained backwards compatibility with misspelled drop_additionnal_params parameter
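The centralized helper can be sketched roughly as follows. This is a minimal illustration, not the actual BaseLLM implementation; the constructor signature and attribute names here are assumptions, and only the method name `_apply_additional_drop_params` comes from the PR:

```python
class BaseLLM:
    """Minimal sketch of the shared base class (illustrative only)."""

    def __init__(self, additional_drop_params=None, drop_additionnal_params=None):
        # Accept the misspelled legacy name for backwards compatibility;
        # the correctly spelled parameter wins if both are given.
        self.additional_drop_params = (
            additional_drop_params or drop_additionnal_params or []
        )

    def _apply_additional_drop_params(self, params: dict) -> dict:
        """Return a copy of params with user-requested keys removed."""
        if not self.additional_drop_params:
            return params
        return {
            k: v
            for k, v in params.items()
            if k not in self.additional_drop_params
        }
```

Each provider's `_prepare_completion_params()` then calls this helper on the assembled params dict just before the provider request is made.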

Review & Testing Checklist for Human

⚠️ Critical - Manual testing needed with actual API calls:

  • Test with a reasoning model (e.g., o1-mini) that rejects the "stop" parameter, both with and without additional_drop_params=["stop"]
  • Verify the fix works specifically when LLM is used within an Agent (the original bug scenario from issue #3814, "[BUG] drop_additionnal_params parameter ignored when working with agents")
  • Test that normal (non-reasoning) models still work correctly and aren't affected by this change
  • Confirm backwards compatibility: verify that drop_additionnal_params (misspelled variant) still works if it's being used in production

Why manual testing is important: The automated tests use extensive mocking and don't make actual API calls, so they verify the filtering logic but not end-to-end behavior with real LLM providers.
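For reference, the mocked tests follow roughly this pattern. The stub class below is a self-contained stand-in, not the real LLM/Agent classes from the repository, and the `call`/`completion` names are illustrative:

```python
from unittest.mock import MagicMock

class StubLLM:
    """Stand-in mimicking a provider with the centralized filter applied."""

    def __init__(self, additional_drop_params=None):
        self.additional_drop_params = additional_drop_params or []
        self.client = MagicMock()  # fake provider client; no real API call

    def _apply_additional_drop_params(self, params):
        return {k: v for k, v in params.items()
                if k not in self.additional_drop_params}

    def call(self, messages):
        params = {"model": "o1-mini", "messages": messages, "stop": ["\n"]}
        # The fix: filtering happens inside param preparation, so it also
        # applies when the LLM is driven by an Agent rather than directly.
        self.client.completion(**self._apply_additional_drop_params(params))

llm = StubLLM(additional_drop_params=["stop"])
llm.call([{"role": "user", "content": "hi"}])
_, kwargs = llm.client.completion.call_args
assert "stop" not in kwargs
assert kwargs["model"] == "o1-mini"
```

This verifies the filtering logic, but, as noted above, not the end-to-end behavior against a real provider.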

Notes

- Added centralized _apply_additional_drop_params helper method in BaseLLM
- Applied filtering in LiteLLM fallback path (_prepare_completion_params in llm.py)
- Applied filtering in Azure native provider (azure/completion.py)
- Applied filtering in OpenAI native provider (openai/completion.py)
- Applied filtering in Anthropic native provider (anthropic/completion.py)
- Added comprehensive tests covering both direct LLM usage and Agent usage
- Tests verify filtering works for single and multiple parameters
- Tests verify backwards compatibility with misspelled drop_additionnal_params
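In each provider, the call site follows the same pattern, sketched below. The class and method bodies here are hypothetical; only `_apply_additional_drop_params` and the `_prepare_completion_params` hook are named in the PR:

```python
# Hypothetical minimal base; see the PR for the real BaseLLM.
class BaseLLM:
    def __init__(self, model, stop=None, additional_drop_params=None):
        self.model = model
        self.stop = stop or []
        self.additional_drop_params = additional_drop_params or []

    def _apply_additional_drop_params(self, params):
        return {k: v for k, v in params.items()
                if k not in self.additional_drop_params}

class OpenAICompletion(BaseLLM):
    """Illustrative provider; the same pattern repeats in Azure/Anthropic."""

    def _prepare_completion_params(self, messages):
        params = {"model": self.model, "messages": messages, "stop": self.stop}
        # Filter last, so user-requested drops override provider defaults.
        return self._apply_additional_drop_params(params)
```

Applying the filter as the final step of param preparation is what makes the behavior consistent across the LiteLLM fallback path and the native providers.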

Fixes #3814

Co-Authored-By: João <joao@crewai.com>
@devin-ai-integration (Contributor, Author)

🤖 Devin AI Engineer

I'll be helping with this pull request! Here's what you should know:

✅ I will automatically:

  • Address comments on this PR. Add '(aside)' to your comment to have me ignore it.
  • Look at CI failures and help fix them

Note: I can only respond to comments from users who have write access to this repository.

⚙️ Control Options:

  • Disable automatic comment and CI monitoring

cursor[bot]: This comment was marked as outdated.

