
Fix Prompt Override #1599

Open · wants to merge 4 commits into feature/api-v3
Conversation

@NolanTrem (Collaborator) commented Nov 16, 2024

Important

Add debugging print statements and improve prompt handling in retrieval_router.py, litellm.py, and prompt.py.

  • Debugging:
    • Add print statement in rag_app() in retrieval_router.py to log task_prompt_override.
    • Add print statements in _execute_task() and _execute_task_sync() in litellm.py to log arguments for completions.
  • Prompt Handling:
    • Modify get_cached_prompt() in prompt.py to handle prompt_override and input formatting, raising ValueError for missing inputs (sketched below this list).
  • Type Annotations:
    • Remove unused Union import in prompt.py.
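
For reference, here is a minimal sketch of the prompt-handling behavior described above. It assumes a simple key/value cache and a prompt_name → template mapping; PromptHandler, _templates, and the exact signature are illustrative and are not the actual prompt.py implementation.

```python
from typing import Optional


class PromptHandler:
    """Illustrative only; not the actual prompt.py implementation."""

    def __init__(self, prompt_cache, templates: dict):
        self._prompt_cache = prompt_cache  # assumed to expose get()/set()
        self._templates = templates        # prompt_name -> template string

    def get_cached_prompt(
        self,
        prompt_name: str,
        inputs: Optional[dict] = None,
        prompt_override: Optional[str] = None,
        bypass_cache: bool = False,
    ) -> str:
        # An override takes precedence over the stored template.
        template = prompt_override or self._templates[prompt_name]

        if inputs:
            try:
                return template.format(**inputs)
            except KeyError as e:
                # Surface missing inputs rather than returning a
                # partially formatted prompt.
                raise ValueError(
                    f"Missing input for prompt '{prompt_name}': {e}"
                ) from e

        if not prompt_override and not bypass_cache:
            # Only cache the raw template when it was not overridden.
            self._prompt_cache.set(prompt_name, template)
        return template
```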

This description was created by Ellipsis for acb0994. It will automatically update as commits are pushed.

@ellipsis-dev ellipsis-dev bot (Contributor) left a comment

❌ Changes requested. Reviewed everything up to 2ef8e93 in 39 seconds

More details
  • Looked at 88 lines of code in 3 files
  • Skipped 0 files when reviewing.
  • Skipped posting 1 drafted comment based on config settings.
1. py/core/providers/llm/litellm.py:57
  • Draft comment:
    Replace print with logger.debug for better logging practices. This also applies to line 71. (A minimal illustration follows below.)
  • Reason this comment was not posted:
    Marked as duplicate.
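
For illustration, this is the kind of change the draft comment asks for, using the standard library logging module. The function name and argument shape here are assumptions, not the actual litellm.py code.

```python
import logging

logger = logging.getLogger(__name__)


def _execute_task(task: dict) -> None:
    # Unlike print, logger.debug honors the configured log level and
    # handlers, so debug output can be silenced in production.
    logger.debug("Executing completion with args: %s", task.get("args"))
```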

Workflow ID: wflow_dMytHdjFmyFDAIu2


Want Ellipsis to fix these issues? Tag @ellipsis-dev in a comment. You can customize Ellipsis with 👍 / 👎 feedback, review rules, user-specific overrides, quiet mode, and more.

Review thread on py/core/main/api/v3/retrieval_router.py (outdated, resolved)
@ellipsis-dev ellipsis-dev bot (Contributor) left a comment

👍 Looks good to me! Incremental review on 16ec71e in 9 seconds

More details
  • Looked at 35 lines of code in 2 files
  • Skipped 0 files when reviewing.
  • Skipped posting 1 drafted comment based on config settings.
1. py/core/providers/llm/litellm.py:54
  • Draft comment:
    The PR description mentions adding print statements for debugging, but they seem to have been removed. Ensure this is intentional and not an oversight.
  • Reason this comment was not posted:
    Confidence changes required: 50%
    The PR description mentions adding print statements for debugging, but they are not present in the current diff. This might be an oversight or intentional removal.

Workflow ID: wflow_ayl32eHHHvMbqjCh


You can customize Ellipsis with 👍 / 👎 feedback, review rules, user-specific overrides, quiet mode, and more.

@ellipsis-dev ellipsis-dev bot (Contributor) left a comment

❌ Changes requested. Incremental review on 04ff668 in 19 seconds

More details
  • Looked at 41 lines of code in 1 file
  • Skipped 0 files when reviewing.
  • Skipped posting 0 drafted comments based on config settings.

Workflow ID: wflow_h1kdGNmwjnvMf06f


Want Ellipsis to fix these issues? Tag @ellipsis-dev in a comment. You can customize Ellipsis with 👍 / 👎 feedback, review rules, user-specific overrides, quiet mode, and more.


A review comment on the following hunk (Contributor):

```python
if not prompt_override and not bypass_cache:
    # Cache the template itself if no inputs
    self._prompt_cache.set(prompt_name, template)
```

Consider using cache_key instead of prompt_name when setting the cache to ensure consistency in cache operations (a small illustration follows the suggested change).

Suggested change:

```diff
- self._prompt_cache.set(prompt_name, template)
+ self._prompt_cache.set(cache_key, template)
```
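
Why the suggestion matters: if the key used when setting an entry differs from the key used when reading it back, the cached template is never found. A simplified sketch with a toy cache (the key format shown is hypothetical):

```python
class PromptCache:
    """Toy cache used only for this illustration."""

    def __init__(self):
        self._store = {}

    def set(self, key, value):
        self._store[key] = value

    def get(self, key, default=None):
        return self._store.get(key, default)


cache = PromptCache()
prompt_name = "rag_template"
cache_key = f"{prompt_name}:no_inputs"  # hypothetical key format

# Writing under prompt_name but reading under cache_key always misses:
cache.set(prompt_name, "You are a helpful assistant ...")
assert cache.get(cache_key) is None

# Using cache_key for both operations keeps reads and writes consistent:
cache.set(cache_key, "You are a helpful assistant ...")
assert cache.get(cache_key) is not None
```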

@ellipsis-dev ellipsis-dev bot (Contributor) left a comment

👍 Looks good to me! Incremental review on acb0994 in 15 seconds

More details
  • Looked at 55 lines of code in 1 file
  • Skipped 0 files when reviewing.
  • Skipped posting 1 drafted comment based on config settings.
1. py/core/providers/database/prompt.py:148
  • Draft comment:
    Consider raising a ValueError for missing inputs when prompt_override is used, to align with the PR description and ensure expected behavior.
  • Reason this comment was not posted:
    Decided after close inspection that this draft comment was likely wrong and/or not actionable:
    The comment suggests a change in behavior that is not reflected in the code. The current code handles a KeyError by returning the prompt_override, which seems intentional. Without the PR description, it's unclear if the behavior should be changed. The comment lacks strong evidence from the code itself to support its suggestion.
    I might be missing the context provided by the PR description, which could justify the comment. However, without that context, the comment seems speculative.
    The lack of context from the PR description makes it difficult to validate the comment. The code itself does not indicate a need for the suggested change.
    Delete the comment as it lacks strong evidence from the code to support the suggested change. (Both behaviors are sketched below for reference.)
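
For context, the two behaviors under discussion can be sketched as follows; both functions are assumed shapes for illustration, not the verbatim prompt.py code.

```python
def format_strict(template: str, inputs: dict) -> str:
    """Raise ValueError when the template references an input that is missing."""
    try:
        return template.format(**inputs)
    except KeyError as e:
        raise ValueError(f"Missing input: {e}") from e


def format_with_override_fallback(
    template: str, inputs: dict, prompt_override: str
) -> str:
    """Return the raw override if formatting fails (the behavior the reviewer
    observed in the current diff)."""
    try:
        return template.format(**inputs)
    except KeyError:
        return prompt_override
```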

Workflow ID: wflow_4ACAWwMNeiV62pqW


You can customize Ellipsis with 👍 / 👎 feedback, review rules, user-specific overrides, quiet mode, and more.
