chore(wren-ai-service): Exclude 'alias' from model parameters in LLM and Embedder processors #1353
Conversation
Walkthrough: This pull request updates the llm_processor and embedder_processor functions in wren-ai-service/src/providers/__init__.py so that the 'alias' key is excluded from each model's additional parameters, alongside the already-excluded 'model' and 'kwargs' keys.
Compare: 97ba669 to 5ec380d
Actionable comments posted: 0
🧹 Nitpick comments (1)
wren-ai-service/src/providers/__init__.py (1)
125-127: Consider adding unit tests for this parameter exclusion. While the code change is straightforward, it is worth adding unit tests verifying that 'alias' is correctly excluded from the generated model configurations, to guard against future regressions.
```python
def test_alias_exclusion_in_llm_processor():
    """Test that 'alias' is correctly excluded from model_additional_params in llm_processor."""
    entry = {
        "type": "llm",
        "provider": "test_provider",
        "models": [
            {
                "model": "test_model",
                "alias": "test_alias",
                "kwargs": {}
            }
        ]
    }
    result = llm_processor(entry)
    model_config = result["test_provider.test_alias"]
    assert "alias" not in model_config, "alias should be excluded from model configuration"


def test_alias_exclusion_in_embedder_processor():
    """Test that 'alias' is correctly excluded from model_additional_params in embedder_processor."""
    entry = {
        "type": "embedder",
        "provider": "test_provider",
        "models": [
            {
                "model": "test_model",
                "alias": "test_alias"
            }
        ]
    }
    result = embedder_processor(entry)
    model_config = result["test_provider.test_alias"]
    assert "alias" not in model_config, "alias should be excluded from model configuration"
```
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (1)
wren-ai-service/src/providers/__init__.py (2 hunks)
⏰ Context from checks skipped due to timeout of 90000ms (2)
- GitHub Check: pytest
- GitHub Check: Analyze (go)
🔇 Additional comments (3)
wren-ai-service/src/providers/__init__.py (3)
73-73: Correctly excludes 'alias' from model parameters. The change ensures that the 'alias' parameter is used only for model identification and is not included in the model_additional_params dictionary passed to the actual LLM provider implementation. This is consistent with how 'model' and 'kwargs' are handled and prevents potentially unwanted parameters from reaching the provider API.
127-127: Correctly excludes 'alias' from embedder model parameters. Similar to the LLM processor change, this ensures 'alias' is excluded from model_additional_params in the embedder processor. The change keeps the two processor functions consistent and prevents unwanted parameters from being passed to the embedder provider.
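To illustrate why leaking 'alias' would be a problem, here is a hedged sketch; FakeEmbedderProvider and its parameters are made up for this example and are not the project's actual provider classes. A provider that accepts only the keyword arguments it understands would reject an unexpected 'alias' key:

```python
class FakeEmbedderProvider:
    """Hypothetical provider that accepts only the params it understands."""
    def __init__(self, model: str, timeout: int = 60):
        self.model = model
        self.timeout = timeout

# With 'alias' filtered out, the remaining params are safe to forward.
params = {"timeout": 120}
provider = FakeEmbedderProvider("test_model", **params)  # works

# If 'alias' leaked through, the provider call would fail.
bad_params = {"alias": "test_alias", "timeout": 120}
try:
    FakeEmbedderProvider("test_model", **bad_params)
except TypeError as err:
    print(err)  # ... got an unexpected keyword argument 'alias'
```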
71-73: ❓ Verification inconclusive
Verify the alias usage in existing configurations.
The change looks good, but I'd recommend verifying that all existing configurations that use 'alias' continue to work correctly after this change. The 'alias' field is still used for identification purposes on line 71, but it will no longer appear in the final model configuration.
🏁 Script executed:
```bash
#!/bin/bash
# Search for any code or configuration files that might be using 'alias' with these models
echo "Looking for configuration files that might use the alias parameter:"
rg -l "'alias'|\"alias\"" --type yaml --type json
rg -l "\.get\(.*'alias'|\.get\(.*\"alias\"" --type python

echo "Checking if tests might be affected by this change:"
rg -l "model_additional_params" --type python
```

Length of output: 557
Below is an updated verification script that re-runs the search for "alias" usage in Python files using a file glob pattern. The original commands failed because ripgrep does not recognize `python` as a file type (the built-in type is `py`); the glob pattern avoids that error and helps uncover any alias retrievals (e.g. via the .get() method) that the original command missed.
Next Steps:
- Manually verify whether the occurrence of the alias field in wren-ai-service/src/pipelines/generation/utils/vega-lite-schema-v5.json is expected.
- Check the output of the script below to ensure no unexpected usage of the alias field occurs elsewhere in Python code.
- Confirm that the removal of alias from the final model configuration does not break any functionality in tests or configurations.

Action: Please review the new search results and confirm that all configurations referencing "alias" are either updated or unaffected by the change.
Review Priority: Verify that removing alias from the final configuration is safe in all contexts.
🏁 Script executed:
#!/bin/bash echo "Searching for alias usage in Python files:" rg -g '*.py' -l "\.get\(.*('alias'|\"alias\")" . echo "Searching for model_additional_params usage in Python files:" rg -g '*.py' -l "model_additional_params" .Length of output: 534
Action Required: Verify alias usage in external configuration files.
- Verification indicates that the only Python reference to alias extraction is in wren-ai-service/src/providers/__init__.py (as expected).
- A configuration file (wren-ai-service/src/pipelines/generation/utils/vega-lite-schema-v5.json) still contains the alias parameter. Please manually verify that its usage is correct and that removing alias from the final model configuration does not lead to unintended side effects.
- Confirm that tests referencing model_additional_params in the providers file and elsewhere remain valid.
This PR fixes an issue where the 'alias' parameter was being incorrectly included in the model's additional parameters for both LLM and Embedder processors. The fix ensures that 'alias' is properly excluded along with 'model' and 'kwargs' when processing model configurations.
Changes made (sketched below):
- Added 'alias' to the keys excluded when building model_additional_params in llm_processor.
- Applied the same exclusion in embedder_processor for consistency.
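Illustratively, this is the shape of the change: a sketch of the filtering logic, not the exact diff, and the sample model entry below is made up for demonstration.

```python
model = {"model": "test_model", "alias": "test_alias", "timeout": 120, "kwargs": {}}

# Before (sketch): 'alias' leaked into the params forwarded to the provider.
before = {k: v for k, v in model.items() if k not in ("model", "kwargs")}
print(before)  # {'alias': 'test_alias', 'timeout': 120}

# After (sketch): 'alias' is excluded alongside 'model' and 'kwargs'.
after = {k: v for k, v in model.items() if k not in ("model", "alias", "kwargs")}
print(after)  # {'timeout': 120}
```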
Summary by CodeRabbit
- Bug fix: the 'alias' field is no longer forwarded to LLM and embedder providers as a model parameter.