This repository has been archived by the owner on Apr 24, 2024. It is now read-only.

Commit
doc: add faq
dsdanielpark authored Dec 27, 2023
1 parent d3c81b9 commit 1685eec
Showing 1 changed file with 1 addition and 1 deletion.
2 changes: 1 addition & 1 deletion documents/README_FAQ.md
@@ -145,7 +145,7 @@ Furthermore, since this package is an unofficial Python package that intercepts
- Short answer: It seems unlikely. It might be possible, but it requires experimentation; if you find a solution, contributions via pull request are welcome.
You can attempt to pin the session by reusing a session object, or try to lock the returned values with a context ID. Fundamentally, however, users need an option to fix the seed value, as OpenAI's ChatGPT provides, to address this issue. Bard currently exposes few options to users, even for temperature and basic settings, so this may take some time. To make the conversation persistent, you can 1) summarize the conversation and store it in a database, refreshing the queue roughly every 3-5 turns, and 2) send the summarized conversation and its responses to Bard along with each new question. Other models such as ChatGPT remember conversations through similar methods (with more diverse solutions).

- In conclusion, users cannot adjust the seed option in model inference, and some additional coding work is needed to remember the conversation.
+ In conclusion, users cannot adjust the seed option in model inference, and some additional coding work is needed to remember the conversation. However, reusing a session made it possible to retrieve previous responses, which showed some effectiveness. Maintaining full context as GPT does would require a large database and substantial resources, and even models like OpenAI's GPT or Meta's LLaMA-2 struggle to answer consistently. (See LLaMA-2's ghost attention and some appendix examples; making a model operate as a single persona is difficult and costly, so general models like Bard or GPT should not be expected to behave like specialized counselors.)

If anyone has made progress on this, they are welcome to contribute.
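The summarize-and-resend pattern described in the FAQ text above can be sketched roughly as follows. This is a minimal illustration, not part of this package's API: `ask_model` is a hypothetical placeholder for any chat call (Bard, ChatGPT, or another LLM endpoint), and the summarization threshold is an assumed default matching the "every 3-5 turns" suggestion.

```python
from typing import Callable, List

def ask_with_memory(
    question: str,
    history: List[str],
    ask_model: Callable[[str], str],
    summarize_every: int = 4,  # condense roughly every 3-5 turns (assumed default)
) -> str:
    """Prepend a running summary of past turns to each new question.

    `ask_model` is a hypothetical stand-in for any chat API call;
    it takes a prompt string and returns the model's answer.
    """
    if len(history) >= summarize_every:
        # Replace the accumulated turn log with a single summary entry,
        # keeping the context payload small (could also be persisted in a DB).
        summary = ask_model("Summarize this conversation briefly: " + " ".join(history))
        history[:] = [summary]
    # Transmit the (possibly summarized) context together with the new question.
    prompt = "Context: " + " ".join(history) + "\nQuestion: " + question
    answer = ask_model(prompt)
    history.append(f"Q: {question} A: {answer}")
    return answer
```

In practice the `history` list would live in a database keyed by user or session, so that the summarized context survives across processes.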

