Adds ADR for semantic query processing flow #947
Conversation
I support the decision, both as stated directly (doc-only, separate service) and implicitly (the possibility of moving to AWS OpenSearch Serverless). Bi-encoding does have some advantages, but the promise of higher efficiency and lower latency far outweighs them in my opinion.
I also feel more comfortable processing semantic queries outside of OpenSearch. The additional maintenance is worth considering, but hopefully the tool will be simple enough that it won't be too onerous.
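To make the trade-off concrete: in this flow, documents are embedded once ahead of time, and at query time only the query string needs encoding before ranking reduces to a vector-similarity scan. The sketch below illustrates that shape only; `embed` is a hypothetical toy stand-in (a real service would call an actual embedding model), and the document strings are illustrative.

```python
import math

# Hypothetical stand-in for an embedding model; a real service would call
# an actual model here. The 3-dim "embedding" is a toy for illustration.
def embed(text: str) -> list[float]:
    vowels = sum(text.lower().count(v) for v in "aeiou")
    return [float(len(text)), float(vowels), float(len(text.split()))]

def cosine(a: list[float], b: list[float]) -> float:
    # Standard cosine similarity between two vectors.
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

# Documents are embedded once, ahead of time; only the query is encoded
# at request time, so each request costs one model call plus a vector scan.
doc_vectors = {doc: embed(doc) for doc in ["semantic search", "catalog records"]}
query_vec = embed("semantic query")
scores = {doc: cosine(query_vec, vec) for doc, vec in doc_vectors.items()}
```

In a deployment like the one the ADR describes, the similarity scan would be handled by OpenSearch's kNN query rather than computed in the service itself; the separate service's job is only the query-encoding step.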
Thanks for writing this up. 👍
matt-bernhardt left a comment
I support the decisions being described here. I'm curious about the inability to deploy ML models within a serverless context (although I haven't looked into that at all, and it makes a certain amount of sense) - but as Isra mentioned, I like having more control over more parts of this platform.
Certainly as an initial position, what's described here makes sense. If it turns out we need something different later, our experience will guide our future selves.
Why are these changes being introduced:
* Documentation in the repository of the decision to implement a semantic query processing flow in the application, and the rationale for doing so, will help us remember in the future why we made this decision.

Relevant ticket(s):
* https://mitlibraries.atlassian.net/browse/USE-424

How does this address that need:
* Provides context, the decision, and the consequences of this decision.
Developer
- …our guide and all issues introduced by these changes have been resolved or opened as new issues (link to those issues in the Pull Request details above)

Code Reviewer
- …(not just this pull request message)
Requires database migrations? NO
Includes new or updated dependencies? NO