Releases · transformerlab/transformerlab-app
0.3.7
What's Changed
- Add Mistral v0.3 and Aya 23 :)
- You can now preview all of a dataset! @safiyamak added pagination
- Models that are downloading will show their download status even if you leave and come back to the page!
- If you go to the Computer Tab, you can see all Python dependencies and their versions, along with more information about the current conda environment
- Add Data Store search feature by @safiyamak in #89
- Datasets that are downloaded are marked appropriately by @safiyamak in #94
- We added changes that should (hopefully) produce official Linux builds
- Remove the App Menu UI on Linux and Windows
- Tony continues to add lots of changes to Importing Models -- we will announce this once the entire set of features is complete and tested
- After upgrading your plugins, MLX should be upgraded to 0.14
New Contributors
- @safiyamak made their first contribution in #79
Full Changelog: v0.3.6...v0.3.7
API Changelog: transformerlab/transformerlab-api@v0.2.22...v0.2.23
0.3.6
This build updates the SDK to support updated routes.
When installing this update, please ensure you have also updated the API to at least v0.2.22.
Full Changelog: v0.3.5...v0.3.6
0.3.5
Patch release to fix an issue that was breaking the Windows installer.
0.3.4
What's Changed
- Highly requested feature: you can now stop a generation that is in progress
- Templated Prompts Page now lets you create your own prompt templates
- Improvements & bug fixes to Templated Prompts Page
- API page is simplified a tiny bit
- update deps + fix some TS by @rohannair in #76
- Revert "update deps + fix some TS" by @dadmobile in #78
New Contributors
- @rohannair made their first contribution in #76
Full Changelog: v0.3.3...v0.3.4
0.3.3
Big Changes:
- Addition of "Templated Prompts" page in the Interact section of the app
- Completions and Templated Prompts now stream text
- All requests to the LLM are logged and shown in the new Prompt Logs page
- Allow manual setting of stop strings in the Chat interface
More details:
- Improved error reporting during installation
- Improved error reporting when a model download fails
- Do not allow "eject" while a model is running
- Improvements to Interact (chat) page UI and code
- Hide some inference setting knobs in the Interact page and put them behind an "All Generation Settings" section
- Show context window in model details page
- Check and update Python dependencies when upgrading the API
Full Changelog: v0.3.2...v0.3.3
0.3.2
Patch to fix unresponsive Step 1 on Windows Local Connection install.
0.3.1
- Improve RAG error display
0.3.0
Initial preview of Retrieval Augmented Generation and Windows support.
0.2.14
- Improved, minimalistic UI
- Regenerate button: ability to regenerate last message from LLM
- Save the last experiment you were on, per server
- Fix to local auto-installer
Full Changelog: v0.2.13...v0.2.14
0.2.13
- The app now expects conda to be installed at ~/.transformerlab/miniconda3
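
As a rough illustration of what that path expectation means, here is a minimal TypeScript/Node sketch (not the app's actual startup code; the helper name `condaIsInstalled` is made up) that checks for conda at ~/.transformerlab/miniconda3 before launching the local API:

```ts
// Minimal sketch (hypothetical, not the app's actual code): verify that conda
// exists at the location this release expects before starting the local API.
import * as fs from "fs";
import * as os from "os";
import * as path from "path";

// Root path introduced in this release; older installs may use a different location.
const CONDA_ROOT = path.join(os.homedir(), ".transformerlab", "miniconda3");

function condaIsInstalled(): boolean {
  // conda lives under Scripts\conda.exe on Windows and bin/conda elsewhere.
  const condaBin =
    process.platform === "win32"
      ? path.join(CONDA_ROOT, "Scripts", "conda.exe")
      : path.join(CONDA_ROOT, "bin", "conda");
  return fs.existsSync(condaBin);
}

if (!condaIsInstalled()) {
  console.error(`conda not found at ${CONDA_ROOT}; please re-run the installer.`);
}
```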