
Feat/custom OpenAI endpoint for compact #101

Open
KagaJiankui wants to merge 4 commits into zilliztech:main from KagaJiankui:feat/custom-openai-endpoint-for-compact

Conversation


@KagaJiankui KagaJiankui commented Mar 4, 2026

This pull request adds support for configuring a custom base URL and API key for OpenAI-compatible LLM providers in the compact summarization workflow. This makes it easy to switch summarization to cloud-hosted open-source LLMs and brings the compact settings in line with the rest of the configuration.

These new options can be set via CLI, configuration files, or environment variables, and are passed through all relevant layers of the application. The changes also include tests to ensure the new configuration fields work as expected.

CLI and configuration enhancements:

  • Added --llm-base-url and --llm-api-key options to the CLI, allowing users to specify a custom OpenAI-compatible endpoint and API key for summarization tasks. These options are also mapped in the config key mapping and passed through all function layers (cli.py, core.py, compact.py).

  • Updated the CompactConfig dataclass to include base_url and api_key fields with empty string defaults, and ensured these fields are properly resolved from environment variables if specified using the env: syntax.
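
As a hedged sketch of the configuration side: the dataclass and field names below come from the PR description, but the resolver helper is illustrative, since the PR reuses the project's existing config-resolution machinery rather than defining a new function.

```python
import os
from dataclasses import dataclass


@dataclass
class CompactConfig:
    # New fields from this PR; empty strings mean "use the defaults".
    base_url: str = ""   # custom OpenAI-compatible endpoint
    api_key: str = ""    # literal key, or an "env:VAR" reference


def resolve_env_ref(value: str) -> str:
    """Resolve an 'env:VAR_NAME' reference to that variable's value;
    pass literal values through unchanged. (Illustrative helper name.)"""
    if value.startswith("env:"):
        return os.environ.get(value[len("env:"):], "")
    return value
```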

Core logic and API changes:

  • Modified the summarization logic to pass the new configuration options (base_url, api_key) down to the OpenAI client, resolving environment references where necessary. The OpenAI client is now initialized with these values if provided.
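
A minimal sketch of the client-construction step, assuming the official openai Python SDK: keyword arguments are built only for values that were actually set, so the SDK's own defaults (the OPENAI_API_KEY environment variable and the default API host) still apply when the options are omitted. The helper name is illustrative, not the PR's actual function.

```python
def compact_client_kwargs(base_url: str = "", api_key: str = "") -> dict:
    """Build kwargs for openai.OpenAI(**kwargs) from compact config.

    Empty strings are treated as "not set", so the SDK falls back to
    its defaults (OPENAI_API_KEY, the standard OpenAI endpoint).
    """
    kwargs = {}
    if base_url:
        kwargs["base_url"] = base_url
    if api_key:
        kwargs["api_key"] = api_key
    return kwargs


# Usage (sketch):
#   client = openai.OpenAI(**compact_client_kwargs(cfg.base_url, cfg.api_key))
```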

Testing improvements:

  • Added tests to verify that the new base_url and api_key fields are present in CompactConfig, that environment references are resolved correctly, and that config set/get roundtrips work for these fields.

These changes make it easier to use custom or cloud-hosted OpenAI-compatible LLM endpoints for summarization, improving flexibility and security for users who need to specify their own API credentials or endpoints.

Copilot AI and others added 4 commits March 4, 2026 04:19
Co-authored-by: KagaJiankui <58453886+KagaJiankui@users.noreply.github.com>
Copilot AI review requested due to automatic review settings March 4, 2026 04:51
Copilot AI left a comment


Pull request overview

Adds configurable endpoint and API key support to the compact summarization workflow, enabling users to point compact at custom OpenAI-compatible LLM providers via CLI flags, config files, or env: resolution.

Changes:

  • Extended CompactConfig with base_url and api_key, including env: reference resolution via existing config resolution.
  • Added --llm-base-url / --llm-api-key CLI options and plumbed them through cli.py → core.py → compact.py.
  • Updated OpenAI compact client initialization to accept an explicit base_url / api_key, and added config tests for the new fields.

Reviewed changes

Copilot reviewed 5 out of 5 changed files in this pull request and generated no comments.

Summary per file:

  • tests/test_config.py — Adds tests for the new CompactConfig fields, env: resolution, and config set/get roundtrips.
  • src/memsearch/core.py — Extends the MemSearch.compact() signature and passes the base URL / API key down to compaction.
  • src/memsearch/config.py — Adds base_url / api_key to CompactConfig.
  • src/memsearch/compact.py — Threads the base URL / API key into the OpenAI client construction for compact.
  • src/memsearch/cli.py — Adds the new CLI flags and maps them into config overrides and the compact invocation.
Comments suppressed due to low confidence (1)

tests/test_config.py:208

  • The test docstring says it resolves env: references for both compact.api_key and compact.base_url, but the test only uses an env: value for api_key (base_url is a literal URL). Either update the docstring to match the test, or add coverage for base_url using an env: reference as well.
def test_compact_config_env_ref_resolved(tmp_path: Path, monkeypatch: pytest.MonkeyPatch):
    """resolve_config should resolve env: references in compact.api_key and compact.base_url."""
    monkeypatch.setenv("TEST_LLM_KEY", "sk-llm-from-env")

    cfg_file = tmp_path / "config.toml"
    save_config({
        "compact": {
            "api_key": "env:TEST_LLM_KEY",
            "base_url": "https://my-llm-endpoint.com",
        },

