Feature/add settings to chat model #1027

Merged
gecBurton merged 12 commits into feature/llm-selector from feature/add-settings-to-chat-model on Sep 9, 2024

Conversation

@gecBurton gecBurton commented Sep 7, 2024

Context

I want the front-end LLM selector to actually change the chat_backend; this value will be "stuck" to the Chat object.

Changes proposed in this pull request

  1. a new abstract model, AbstractAISettings, is created; it has the same chat_backend field as AISettings.
  2. AISettings now inherits from AbstractAISettings (i.e. no functional change).
  3. Chat also inherits from AbstractAISettings.
  4. all existing Chat.chat_backend instances are back-populated with the relevant AISettings.chat_backend values (see the data-migration sketch after this list).
  5. Chat.save is modified to populate the chat_backend field from chat.user.ai_settings (see the model sketch after this list).
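
A minimal sketch of what changes 1–3 and 5 might look like as Django models. The field types, the user.ai_settings relation, and everything apart from the inherited chat_backend field are assumptions for illustration, not the actual implementation:

```python
# Hedged sketch of changes 1-3 and 5. Field types, the ai_settings
# relation on the user model, and all details other than chat_backend
# moving onto an abstract base are assumptions.
from django.conf import settings
from django.db import models


class AbstractAISettings(models.Model):
    # Same chat_backend field as AISettings; concrete models inherit it.
    chat_backend = models.CharField(max_length=64, null=True, blank=True)
    temperature = models.FloatField(null=True, blank=True)

    class Meta:
        abstract = True


class AISettings(AbstractAISettings):
    # Unchanged apart from now inheriting its chat_backend field.
    label = models.CharField(max_length=50, unique=True)


class Chat(AbstractAISettings):
    user = models.ForeignKey(settings.AUTH_USER_MODEL, on_delete=models.CASCADE)
    name = models.CharField(max_length=1024)

    def save(self, *args, **kwargs):
        # "Stick" the user's current AI settings to the chat: if the
        # frontend did not supply a value, fall back to user.ai_settings.
        if self.chat_backend is None:
            self.chat_backend = self.user.ai_settings.chat_backend
        if self.temperature is None:
            self.temperature = self.user.ai_settings.temperature
        super().save(*args, **kwargs)
```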

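And a similarly hedged sketch of the back-population in change 4 as a standard Django data migration; the app label, migration dependency, and the exact ai_settings lookup are placeholders:

```python
# Hedged sketch of change 4: back-populate chat_backend on existing
# chats from each user's AISettings. App label, dependency, and the
# ai_settings lookup are placeholders, not the real migration.
from django.db import migrations


def backpopulate_chat_backend(apps, schema_editor):
    Chat = apps.get_model("redbox_core", "Chat")  # app label is an assumption
    for chat in Chat.objects.filter(chat_backend__isnull=True).iterator():
        chat.chat_backend = chat.user.ai_settings.chat_backend
        chat.save(update_fields=["chat_backend"])


class Migration(migrations.Migration):
    dependencies = [
        ("redbox_core", "0001_placeholder"),  # placeholder dependency
    ]

    operations = [
        migrations.RunPython(backpopulate_chat_backend, migrations.RunPython.noop),
    ]
```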
Guidance to review

Relevant links

Things to check

  • I have added any new ENV vars in all deployed environments
  • I have tested any code added or changed
  • I have run integration tests

@gecBurton gecBurton marked this pull request as draft September 7, 2024 15:21
@gecBurton gecBurton changed the base branch from main to feature/llm-selector September 7, 2024 18:47
@gecBurton gecBurton marked this pull request as ready for review September 9, 2024 11:14
- user: User = self.scope.get("user", None)
+ user: User = self.scope.get("user")
+ chat_backend = data.get("llm")
+ temperature = data.get("temperature")
gecBurton (Collaborator, Author) commented:

this will default to None, i.e. the frontend does not need to set this
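
For context, a hedged sketch of the consumer side this diff touches. Only the "llm" and "temperature" keys and the self.scope user lookup come from the diff above; the class name, import path, and chat-creation call are assumptions:

```python
# Hedged sketch of the websocket consumer handling, assuming a Django
# Channels AsyncJsonWebsocketConsumer. Everything except the "llm" and
# "temperature" keys and the scope lookup is illustrative.
from channels.generic.websocket import AsyncJsonWebsocketConsumer

from .models import Chat  # import path is an assumption


class ChatConsumer(AsyncJsonWebsocketConsumer):
    async def receive_json(self, data, **kwargs):
        user = self.scope.get("user")
        # Both default to None, so the frontend may omit them and
        # Chat.save() will fall back to user.ai_settings.
        chat_backend = data.get("llm")
        temperature = data.get("temperature")
        chat = await Chat.objects.acreate(
            user=user,
            chat_backend=chat_backend,
            temperature=temperature,
        )
```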

@gecBurton gecBurton merged commit 1618d54 into feature/llm-selector Sep 9, 2024
4 checks passed
@gecBurton gecBurton deleted the feature/add-settings-to-chat-model branch September 9, 2024 13:19
KevinEtchells added a commit that referenced this pull request Sep 9, 2024
* added ai-settings to chat model

* added migration test

* simplified

* Add LLM selector to the UI

* Send selected LLM back to Django (via form submit and websocket)

* Explicitly set value attributes on LLM selector, so "(default)" isn't passed on

* passing values to front end

* reduced the scale of change

* now updating

* added temperature

* added temperature to consumer

---------

Co-authored-by: George Burton <g.e.c.cburton@gmail.com>
Co-authored-by: Kevin Etchells <kevetchells@hotmail.com>