0.3.3
Big Changes:
- Addition of "Templated Prompts" page in the Interact section of the app
- Completions and Templated Prompts now stream text
- All requests to the LLM are logged and shown in the new Prompt Logs page
- Allow manual setting of stop strings in the Chat interface
More details:
- Improved error reporting during installation
- Improved error reporting when a model download fails
- Do not allow "eject" while a model is running
- Improvements to Interact (chat) page UI and code
- Hide some inference setting knobs in the Interact page and put them behind an "All Generation Settings" section
- Show context window size on the model details page
- Check and update Python dependencies when upgrading the API
Full changelog: v0.3.2...v0.3.3