
Commit

add some information about used model
Max Kammler committed Feb 3, 2023
1 parent f24ca35 commit d7448b7
Showing 1 changed file with 1 addition and 0 deletions: README.md
@@ -107,6 +107,7 @@ You only need to do this if you want to contribute code to this package.
- The "storage"-folder contains all your encryption keys. If you delete it, you will loose access to all your encrypted messages.
- The bot replies in a thread. If you want to keep the context, you need to reply in this thread, or the bot will think it's a new conversation. "Threads" were until recently an experimental feature, so you may need to activate them in your client's settings (e.g. in Element under the "Labs" section).
- There is support for setting the context to work at the room level, the thread level, or both (threads fork the conversation from the main room).
- The `CHATGPT_MODEL` environment variable sets which model is used. As of writing, the default is the ChatGPT model from late 2022, which works fine; however, we can't tell whether OpenAI will remove that model at some point. If so, you can always change the variable to `text-davinci-003` or any other of the [supported models](https://platform.openai.com/docs/models/gpt-3), as shown in the example below. Keep in mind that these models are not free and will cost you OpenAI credits.
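
A minimal sketch of switching the model, assuming the bot reads its configuration from a `.env` file (the file name is an assumption; only the `CHATGPT_MODEL` variable and the `text-davinci-003` value come from the note above):

```
# .env (illustrative sketch, not the project's full configuration)
# CHATGPT_MODEL selects the OpenAI model; text-davinci-003 is one of the supported models
CHATGPT_MODEL=text-davinci-003
```
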
# FAQ
