Open kl11n opened 5 months ago
Hello,
this is a quite unusual situation for KeePass. Does this size come from huge binary attachments or from entry fields?
Hi,
The size actually comes from the notes fields. There are multiple mappings between the different ID systems we are using, and the file has grown over time. Splitting it probably would not work either, because we would need to remember which entry lives in which database, or load all of the databases into the program, which would lead to the same problem.
I know this isn't really the most usual use case for a KeePass DB.
KeePass is generally not suitable for storing large datasets. To slice through encrypted content, one would need to orchestrate reading content blocks with streaming decryption and decompression, and on top of that use a lenient XML parser with extra logic to merge partial data chunks. This is a fairly non-trivial endeavour that would still be slow if multiple queries were required 🤔
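To make the layering concrete, here is a minimal sketch of such a pipeline in Python. It is not KeePass code: the XOR step is only a stand-in for real block decryption (KDBX actually uses AES or ChaCha20 with an authenticated block structure), and `iter_titles` is a hypothetical name. The point is how the three stages, block decryption, streaming inflation, and incremental XML parsing, chain together without ever holding the whole decoded file in memory:

```python
import zlib
import xml.etree.ElementTree as ET

def iter_titles(blocks, key):
    # Layered streaming pipeline: per-block "decryption" -> streaming
    # inflation -> incremental XML pull parsing, yielding each entry as
    # soon as its subtree is complete.
    inflater = zlib.decompressobj()
    parser = ET.XMLPullParser(events=("end",))
    for block in blocks:
        plain = bytes(b ^ key for b in block)  # XOR stands in for real decryption
        parser.feed(inflater.decompress(plain))
        for _event, elem in parser.read_events():
            if elem.tag == "Entry":
                yield elem.findtext("Title")
                elem.clear()                   # keep memory usage flat
    parser.feed(inflater.flush())
    for _event, elem in parser.read_events():
        if elem.tag == "Entry":
            yield elem.findtext("Title")

# Build a tiny fake "database": XML -> deflate -> XOR -> fixed-size blocks.
xml_doc = b"<Root>" + b"".join(
    b"<Entry><Title>e%d</Title></Entry>" % i for i in range(3)
) + b"</Root>"
cipher = bytes(b ^ 0x5A for b in zlib.compress(xml_doc))
blocks = [cipher[i:i + 16] for i in range(0, len(cipher), 16)]

titles = list(iter_titles(blocks, 0x5A))
print(titles)  # -> ['e0', 'e1', 'e2']
```

Even with this structure, every query still has to decrypt and inflate the file front to back, which is why repeated lookups stay slow.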
I don't know the context, but a possible solution might be to migrate to a different format, something like SQLite with partially encrypted data.
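A rough sketch of what "SQLite with partially encrypted data" could mean for the ID-mapping use case: keep the lookup keys in plaintext, indexed columns so queries are fast, and encrypt only the bulky sensitive payload. The schema, the `seal` helper, and the XOR cipher are all illustrative assumptions (a real design would use a proper cipher such as AES-GCM from the `cryptography` package):

```python
import sqlite3

def seal(data: bytes, key: int) -> bytes:
    # XOR placeholder for a real cipher; XOR is symmetric, so the same
    # function also decrypts.
    return bytes(b ^ key for b in data)

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE mapping (
        ext_id TEXT PRIMARY KEY,  -- lookup key, plaintext and indexed
        int_id TEXT NOT NULL,     -- second ID system, plaintext
        notes  BLOB               -- bulky sensitive payload, stored encrypted
    )
""")
conn.executemany(
    "INSERT INTO mapping VALUES (?, ?, ?)",
    [("A-1", "X-9", seal(b"long notes for A-1", 0x5A)),
     ("B-2", "Y-7", seal(b"long notes for B-2", 0x5A))],
)

# Only the matching row is read and decrypted; the rest of the notes
# never enter RAM.
(blob,) = conn.execute(
    "SELECT notes FROM mapping WHERE ext_id = ?", ("A-1",)
).fetchone()
print(seal(blob, 0x5A).decode())  # -> long notes for A-1
```

The trade-off versus a KeePass DB is that the lookup columns are visible on disk; only the notes payload stays confidential.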
Hi there.
Would it be possible to add a feature that streams over a database and returns only the entries matching a supplied filter? My problem is huge databases: we have a DB that is roughly 4 GB in decoded state, and it cannot stay in RAM the whole time.
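For the filtering part on its own, ignoring the decryption layer, the shape of such a feature might look like the following sketch. `filter_entries` is a hypothetical function, not an existing KeePass API; it scans the XML incrementally with `xml.etree.ElementTree.iterparse` and discards each entry after testing it, so memory use stays roughly constant regardless of database size:

```python
import io
import xml.etree.ElementTree as ET

def filter_entries(stream, predicate):
    # Constant-memory scan: each entry is inspected when its closing tag
    # arrives, then its subtree is freed.
    for _event, elem in ET.iterparse(stream, events=("end",)):
        if elem.tag == "Entry":
            if predicate(elem):
                yield elem.findtext("Title")
            elem.clear()  # free the processed subtree

xml_doc = "<Root>" + "".join(
    f"<Entry><Title>t{i}</Title><Notes>{'x' * i}</Notes></Entry>"
    for i in range(5)
) + "</Root>"

hits = list(filter_entries(io.BytesIO(xml_doc.encode()),
                           lambda e: len(e.findtext("Notes")) >= 3))
print(hits)  # -> ['t3', 't4']
```

The catch, as noted above, is that the real file is encrypted and compressed, so every filtered query would still have to stream-decrypt the whole 4 GB.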