Feat/custom OpenAI endpoint for compact #101
Open
KagaJiankui wants to merge 4 commits into zilliztech:main from
Conversation
…act_openai (Co-authored-by: KagaJiankui <58453886+KagaJiankui@users.noreply.github.com>)
Pull request overview
Adds configurable OpenAI-compatible endpoint and API key support to the compact summarization workflow, enabling users to point compact at custom/OpenAI-compatible LLM providers via CLI/config/env resolution.
Changes:
- Extended `CompactConfig` with `base_url` and `api_key`, including `env:` reference resolution via the existing config resolution.
- Added `--llm-base-url` / `--llm-api-key` CLI options and plumbed them through `cli.py → core.py → compact.py`.
- Updated the OpenAI compact client initialization to accept an explicit `base_url`/`api_key`, and added config tests for the new fields.
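The client-initialization change in the last bullet can be sketched as follows. `build_client_kwargs` is a hypothetical helper, not code from the PR; it assumes that when the overrides are empty strings the SDK's own `OPENAI_API_KEY`/`OPENAI_BASE_URL` defaults should apply:

```python
# from openai import OpenAI  # official SDK; OpenAI() accepts base_url= and api_key=

def build_client_kwargs(base_url: str, api_key: str) -> dict:
    """Only pass explicit overrides; empty strings fall through to SDK defaults."""
    kwargs = {}
    if base_url:
        kwargs["base_url"] = base_url
    if api_key:
        kwargs["api_key"] = api_key
    return kwargs

# client = OpenAI(**build_client_kwargs("https://my-llm-endpoint.com/v1", "sk-…"))
```

Building the kwargs dict conditionally avoids passing `base_url=""`, which would otherwise override the SDK's default endpoint with an invalid value.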
Reviewed changes
Copilot reviewed 5 out of 5 changed files in this pull request and generated no comments.
Summary per file:

| File | Description |
|---|---|
| `tests/test_config.py` | Adds tests for new `CompactConfig` fields, env resolution, and config set/get roundtrips. |
| `src/memsearch/core.py` | Extends `MemSearch.compact()` signature and passes base URL / API key down to compaction. |
| `src/memsearch/config.py` | Adds `base_url` / `api_key` to `CompactConfig`. |
| `src/memsearch/compact.py` | Threads base URL / API key into the OpenAI client construction for compact. |
| `src/memsearch/cli.py` | Adds new CLI flags and maps them into config overrides and compact invocation. |
Comments suppressed due to low confidence (1)
`tests/test_config.py:208`
- The test docstring says it resolves `env:` references for both `compact.api_key` and `compact.base_url`, but the test only uses an `env:` value for `api_key` (`base_url` is a literal URL). Either update the docstring to match the test, or add coverage for `base_url` using an `env:` reference as well.
```python
def test_compact_config_env_ref_resolved(tmp_path: Path, monkeypatch: pytest.MonkeyPatch):
    """resolve_config should resolve env: references in compact.api_key and compact.base_url."""
    monkeypatch.setenv("TEST_LLM_KEY", "sk-llm-from-env")
    cfg_file = tmp_path / "config.toml"
    save_config({
        "compact": {
            "api_key": "env:TEST_LLM_KEY",
            "base_url": "https://my-llm-endpoint.com",
        },
```
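Covering `base_url` with an `env:` reference as well, as the comment suggests, could look like the following self-contained sketch. `resolve_env_refs` is a hypothetical stand-in for the project's actual resolver, whose implementation is not shown in this review:

```python
import os

def resolve_env_refs(cfg: dict) -> dict:
    """Hypothetical stand-in for the config resolver: replaces string values
    of the form "env:VAR" with the contents of os.environ["VAR"]."""
    resolved = {}
    for key, value in cfg.items():
        if isinstance(value, dict):
            resolved[key] = resolve_env_refs(value)
        elif isinstance(value, str) and value.startswith("env:"):
            resolved[key] = os.environ.get(value[len("env:"):], "")
        else:
            resolved[key] = value
    return resolved

# Exercise env: resolution for both fields, per the review comment.
os.environ["TEST_LLM_KEY"] = "sk-llm-from-env"
os.environ["TEST_LLM_URL"] = "https://my-llm-endpoint.com"
cfg = resolve_env_refs({
    "compact": {
        "api_key": "env:TEST_LLM_KEY",
        "base_url": "env:TEST_LLM_URL",
    },
})
```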
This pull request adds support for configuring a custom base URL and API key for OpenAI-compatible LLM providers in the compact summarization workflow, making it easy to switch to cloud-hosted open-source LLMs and keeping the configuration symmetric across providers.
These new options can be set via CLI, configuration files, or environment variables, and are passed through all relevant layers of the application. The changes also include tests to ensure the new configuration fields work as expected.
CLI and configuration enhancements:
- Added `--llm-base-url` and `--llm-api-key` options to the CLI, allowing users to specify a custom OpenAI-compatible endpoint and API key for summarization tasks. These options are also mapped in the config key mapping and passed through all function layers (`cli.py`, `core.py`, `compact.py`).
- Updated the `CompactConfig` dataclass to include `base_url` and `api_key` fields with empty-string defaults, and ensured these fields are properly resolved from environment variables when specified using the `env:` syntax.

Core logic and API changes:
- Extended `MemSearch.compact()` to thread the new parameters (`base_url`, `api_key`) down to the OpenAI client, resolving environment references where necessary. The OpenAI client is now initialized with these values if provided.

Testing improvements:
- Added tests verifying that the new `base_url` and `api_key` fields are present in `CompactConfig`, that environment references are resolved correctly, and that config set/get roundtrips work for these fields.

These changes make it easier to use custom or cloud-hosted OpenAI-compatible LLM endpoints for summarization, improving flexibility and security for users who need to specify their own API credentials or endpoints.