Hi there.
Would it be possible to add a feature to stream over a database and return only the entries matching a supplied filter?
My problem is huge databases. We have a DB that is roughly 4 GB in its decoded state, and it cannot stay in RAM the whole time.
The size actually comes from the notes fields. There are multiple mappings between the different ID systems we use, and the file has grown over time. Splitting it would probably not work either, because we would either need to remember which entry lives in which DB or load all of the DBs into the program, which would result in the same problem.
I know this isn't really the most usual use case for a KeePass DB.
KeePass is generally not suitable for storing large datasets. To slice through the encrypted content, one would need to orchestrate reading the content blocks with streaming decryption and decompression, and on top of that use a lenient XML parser with extra logic to merge partial data chunks. This is a fairly non-trivial endeavour, and it would still be slow if multiple queries were required 🤔
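Purely for illustration, here is a rough Python sketch of that kind of streaming pipeline, under the assumption that some earlier step already yields decrypted but still gzip-compressed byte chunks of the inner XML (the KDBX block decryption itself is out of scope). The function and parameter names are invented for the example, and it ignores details such as `History` sub-entries.

```python
import zlib
import xml.etree.ElementTree as ET

def filter_entries(compressed_chunks, predicate):
    """Stream-filter <Entry> elements from an XML payload delivered in chunks.

    `compressed_chunks`: iterable of decrypted, gzip-compressed byte blocks.
    `predicate`: callable taking a {key: value} dict of the entry's String
    fields and returning True for entries that should be kept.
    Only matching entries are retained in memory.
    """
    inflater = zlib.decompressobj(zlib.MAX_WBITS | 16)  # gzip-wrapped stream
    parser = ET.XMLPullParser(events=("end",))
    matches = []

    for chunk in compressed_chunks:
        parser.feed(inflater.decompress(chunk))
        for _, elem in parser.read_events():
            if elem.tag == "Entry":
                fields = {
                    kv.findtext("Key"): kv.findtext("Value")
                    for kv in elem.findall("String")
                }
                if predicate(fields):
                    matches.append(fields)
                elem.clear()  # discard this entry's children; only matches stay around

    parser.feed(inflater.flush())  # drain any remaining decompressed bytes
    return matches
```

Even with something like this, every query would re-read and re-decrypt the whole file, which is why repeated lookups stay slow.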
I don't know the context, but a possible solution would be to migrate to a different format, perhaps something like SQLite with partially encrypted data.
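A hypothetical sketch of that idea (the schema and function names are invented, and it relies on the third-party `cryptography` package): searchable columns stay in plaintext so SQLite can do the filtering, while sensitive fields are encrypted per row and decrypted only for the rows a query actually returns.

```python
import sqlite3
from cryptography.fernet import Fernet  # third-party `cryptography` package

def open_store(path: str, key: bytes):
    # Plaintext `title` column is filterable by SQLite; `notes_enc` holds
    # the encrypted payload and is never decrypted during the search itself.
    db = sqlite3.connect(path)
    db.execute(
        "CREATE TABLE IF NOT EXISTS entries ("
        "id INTEGER PRIMARY KEY, title TEXT, notes_enc BLOB)"
    )
    return db, Fernet(key)

def add_entry(db, fernet, title: str, notes: str):
    db.execute(
        "INSERT INTO entries (title, notes_enc) VALUES (?, ?)",
        (title, fernet.encrypt(notes.encode())),
    )
    db.commit()

def find_entries(db, fernet, title_like: str):
    # The filter runs inside SQLite; only the matching rows get decrypted.
    rows = db.execute(
        "SELECT title, notes_enc FROM entries WHERE title LIKE ?",
        (title_like,),
    )
    return [(title, fernet.decrypt(blob).decode()) for title, blob in rows]

# Example: db, f = open_store("mappings.db", Fernet.generate_key())
```

The trade-off is that whatever lands in the plaintext columns (here the titles) is no longer protected, so only non-sensitive lookup keys should go there.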