feat: add MiniMax as LLM provider (M2.7 default) #22

Open
octo-patch wants to merge 2 commits into moltlaunch:main from octo-patch:feature/add-minimax-provider

Conversation


@octo-patch octo-patch commented Mar 15, 2026

Summary

Add MiniMax as a first-class LLM provider using the OpenAI-compatible API adapter.

Changes

  • Add minimax provider option alongside Anthropic, OpenAI, and OpenRouter
  • Default model set to MiniMax-M2.7 (latest flagship with enhanced reasoning and coding)
  • Provider selection in setup wizard and settings page
  • README updated with MiniMax endpoint and default model info

Why

MiniMax offers an OpenAI-compatible API, so integration reuses the existing shared OpenAI adapter with no new dependencies. MiniMax-M2.7 is its latest flagship model, with improved reasoning and coding over M2.5.

Testing

  • All existing unit tests pass (7/7)
  • Integration tested with MiniMax API

Add MiniMax (api.minimax.io) as a fourth LLM provider option alongside
Anthropic, OpenAI, and OpenRouter. MiniMax uses an OpenAI-compatible API,
so it reuses the existing OpenAI adapter with no new dependencies.

Changes:
- config.ts: add "minimax" to provider union type and model defaults
- llm/index.ts: add minimax case to createLLMProvider() factory
- ui/pages/setup/LLMStep.tsx: add MiniMax to setup wizard provider list
- ui/pages/Settings.tsx: add MiniMax to settings provider dropdown
- README.md: document MiniMax in provider table and description
@octo-patch force-pushed the feature/add-minimax-provider branch from 6497393 to 5e40b30 on March 15, 2026 at 16:58
- Update default MiniMax model from M2.5 to M2.7 in config, UI, and docs
- MiniMax-M2.7 is the latest flagship model with enhanced reasoning and coding
@octo-patch octo-patch changed the title feat: add MiniMax as LLM provider feat: add MiniMax as LLM provider (M2.7 default) Mar 18, 2026