Add channel history into the prompt #90
Replies: 1 comment 1 reply
-
Thanks for the kind words! I've experimented with adding RAG as well. I'd want it to be optional, since sometimes you don't want extra messages pulled in. The whole point of llmcord's reply-based chat system is to give the user total control over the conversation context with no extra bloat. I've tried more complex RAG implementations using LangChain where, rather than just pulling in the most recent X messages from the channel, it dynamically searches the entire channel for relevant messages based on the current conversation. Ultimately I couldn't get to something I'm happy with. It just felt too complex and imperfect. Another idea I had is to pull in recent messages from any user you @. So for example you could do "@bot, what have @john and @paul been talking about?" and it would pull in all recent messages from john and paul. There are issues with this too, though. Honestly I feel like I'll never add RAG or anything similar to llmcord, unless someone surprises me with a PR that does it super perfectly. However, I always encourage people to fork llmcord and try it themselves! Glad you were able to.
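For reference, a rough sketch of how that @-mention idea could look with discord.py. This is not llmcord's actual code; the function and parameter names are illustrative only:

```python
# Hypothetical sketch of the "@-mention" idea: when the trigger message mentions
# other users, pull their recent messages from the channel as extra context.
import discord

async def fetch_mentioned_users_context(trigger_msg: discord.Message, limit: int = 200) -> str:
    # Users mentioned in the trigger message, excluding bots
    targets = {user.id for user in trigger_msg.mentions if not user.bot}
    if not targets:
        return ""

    lines = []
    # Scan the most recent `limit` channel messages for those users
    async for msg in trigger_msg.channel.history(limit=limit):
        if msg.author.id in targets and msg.id != trigger_msg.id:
            lines.append(f"{msg.author.display_name}: {msg.content}")

    # history() yields newest first, so reverse into chronological order
    lines.reverse()
    return "\n".join(lines)
```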
-
Hey Jakob,
Really cool repo, and I was able to get it to work. I wanted to implement RAG, but it turns out it's quite complicated and unreliable depending on how you embed and search Discord messages, so I came up with the next best idea: feed the model the last X messages from the channel as context. For example, I was using the deepseek-r1 model, so it could theoretically take around 100k words (128k context size).
I made a fork as a POC of this idea, but since I used an LLM to write the code it ended up differing quite a bit from the original, so I don't trust it enough to merge (it does work on its own). Just food for thought on adding this feature as a way to make the model more context-aware. See the sketch below the link for the general shape of the approach.
https://github.com/paddelcourt/llmcord
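A minimal sketch of the channel-history approach described above, assuming discord.py and the OpenAI-style chat message format llmcord uses. Names are illustrative and the fork's actual code differs:

```python
# Grab the last N messages from the channel and fold them into the prompt
# as a single system-style context block.
import discord

async def build_channel_context(channel: discord.abc.Messageable, max_messages: int = 100) -> list[dict]:
    history = []
    async for msg in channel.history(limit=max_messages):
        if msg.content:
            history.append(f"{msg.author.display_name}: {msg.content}")
    history.reverse()  # history() yields newest first

    context_block = "Recent channel messages:\n" + "\n".join(history)
    # Prepend this as a system message ahead of the reply chain
    return [{"role": "system", "content": context_block}]
```

With a large context window the whole block can simply be prepended to the message list sent to the model, at the cost of pulling in channel noise the user didn't explicitly ask for.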