feat: add dynamic model selector with persistence for llama-swap workflows

  • Add ModelSelector component in ChatSidebar with dropdown interface
  • Implement models store with localStorage persistence and auto-selection (see the store sketch below)
  • Add ModelsService for /v1/models API integration with multi-model setups (see the service sketch below)
  • Inject selected model into chat completion requests (see the request sketch below)
  • Display model capabilities with badges (Vision, Audio, etc.)
  • Auto-refresh server props after model usage for accurate state
  • Normalize model display names (base filename + OpenAI-Compat name)

Particularly useful for llama-swap users managing multiple models dynamically; works both with llama-swap workflows and with a standalone llama-server.
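
A minimal sketch of the service side, assuming a TypeScript webui. llama-server exposes the OpenAI-compatible `GET /v1/models` endpoint; with llama-swap the list contains every configured model. The class name, `ModelEntry` shape beyond the standard fields, and `displayName` helper are illustrative, not the PR's exact API:

```ts
interface ModelEntry {
  id: string;          // e.g. "/models/Qwen2.5-7B-Instruct-Q4_K_M.gguf"
  object: string;      // "model"
  created?: number;
  owned_by?: string;
}

export class ModelsService {
  constructor(private baseUrl: string = '') {}

  /** Fetch the list of available models from the server. */
  async list(): Promise<ModelEntry[]> {
    const res = await fetch(`${this.baseUrl}/v1/models`);
    if (!res.ok) throw new Error(`GET /v1/models failed: ${res.status}`);
    const body = await res.json();
    return body.data ?? [];
  }
}

/** Normalize a model id to a display name: keep the base filename without its extension. */
export function displayName(id: string): string {
  const base = id.split(/[\\/]/).pop() ?? id;
  return base.replace(/\.gguf$/i, '');
}
```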
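A sketch of the persisted selection store, assuming Svelte's `writable` store; the storage key and `autoSelect` helper are hypothetical names:

```ts
import { writable } from 'svelte/store';

const STORAGE_KEY = 'selectedModel';

// Restore the previously selected model id, if any (guarded for SSR,
// where localStorage is unavailable).
const saved =
  typeof localStorage !== 'undefined' ? localStorage.getItem(STORAGE_KEY) : null;

export const selectedModel = writable<string | null>(saved);

// Persist every change so the selection survives page reloads.
selectedModel.subscribe((id) => {
  if (typeof localStorage === 'undefined') return;
  if (id) localStorage.setItem(STORAGE_KEY, id);
  else localStorage.removeItem(STORAGE_KEY);
});

/**
 * Auto-selection: keep the saved model if the server still offers it,
 * otherwise fall back to the first available one.
 */
export function autoSelect(available: string[]): void {
  selectedModel.update((current) =>
    current && available.includes(current) ? current : available[0] ?? null
  );
}
```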
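And a sketch of the request side: the payload shape follows the OpenAI-compatible `/v1/chat/completions` API that llama-server implements, while `sendChat`, the store import path, and the message type are illustrative:

```ts
import { get } from 'svelte/store';
import { selectedModel } from './models.store'; // hypothetical path

interface ChatMessage {
  role: 'system' | 'user' | 'assistant';
  content: string;
}

export async function sendChat(messages: ChatMessage[]): Promise<Response> {
  const model = get(selectedModel);

  const body: Record<string, unknown> = { messages, stream: true };
  // Only set "model" when one is selected: llama-swap uses it to route and swap,
  // while a standalone llama-server simply serves its single loaded model.
  if (model) body.model = model;

  return fetch('/v1/chat/completions', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(body),
  });
}
```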
