Open · prog20901 opened this issue 4 years ago
Hi @prog20901,
Since lunr.js is an in-memory search index, I think 1GB of data would probably be too much for a user's browser to handle. Lunr's serialization and deserialization operates entirely on the level of the index (rather than individual terms in the inverted index), so as far as I know, it wouldn't be possible to only load parts of the index into memory as needed. Have you considered a search database such as ElasticSearch for your needs?
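To illustrate the all-or-nothing nature of that deserialization, here is a toy sketch. It uses a plain object shaped like an inverted index rather than lunr itself (lunr's actual round-trip is `JSON.stringify(idx)` and `lunr.Index.load(json)`), so the structure here is illustrative, not lunr's internal format:

```javascript
// A tiny inverted index: term -> list of document ids.
// (Toy stand-in for a serialized search index; not lunr's real format.)
const invertedIndex = {
  search: [1, 3],
  engine: [1],
  offline: [2, 3],
};

// Serializing produces one JSON string covering the whole index...
const serialized = JSON.stringify(invertedIndex);

// ...and JSON.parse must consume the entire string before any term is
// usable -- there is no way to deserialize just the "offline" posting
// list on demand. This is why the whole index must fit in memory.
const loaded = JSON.parse(serialized);

console.log(loaded.offline); // [2, 3] -- available only after a full parse
```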
Yes, but I wanted to make it serverless, so the user doesn't need anything except the browser. Maybe we could use IndexedDB, PouchDB, or WatermelonDB. Can you please integrate any of these for storing and retrieving the index?
I mean, that's certainly possible - I think it would probably be a substantial amount of work, though, and I don't know how performant it would be.
Let's back up a step, though - what are you trying to build that requires a 1GB full text search index that lives entirely within a user's browser? Maybe we can try to find another solution!
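For what it's worth, a minimal sketch of what such an integration might look like follows. A plain `Map` stands in for IndexedDB (the real API is browser-only and asynchronous), and `saveIndex`/`loadIndex` are made-up helper names, not lunr or IndexedDB API:

```javascript
// Hypothetical sketch: persist a serialized index to a key-value store.
const store = new Map(); // stand-in for an IndexedDB object store

async function saveIndex(key, index) {
  // In a browser this would be an IndexedDB put() inside a transaction.
  store.set(key, JSON.stringify(index));
}

async function loadIndex(key) {
  // The entire serialized index comes back as one value -- so peak
  // memory is still roughly the size of the whole index, not one term.
  const raw = store.get(key);
  return raw === undefined ? null : JSON.parse(raw);
}

// Usage: round-trip a toy inverted index through the store.
const toyIndex = { hello: [1], world: [1, 2] };
saveIndex('my-index', toyIndex)
  .then(() => loadIndex('my-index'))
  .then((restored) => console.log(restored.world)); // [1, 2]
```

Note this only changes *where the index is persisted*; it doesn't avoid loading the full index into memory before searching, which is the underlying concern with 1GB of data.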
Did anyone solve this, or find an alternative search engine with lunr's simplicity that can read/write a database? Our context is a server intended to run offline where internet access is poor; it runs in JavaScript, could easily have gigabytes of content, and doesn't have gigabytes of RAM.
Hi,
I saw that an early version supported storing the index (Store to levelDB or any other db system #189), but it was dropped due to slowness.
1) How much memory would we need to index and load 1 GB of data?
2) If I store a 1 GB index in IndexedDB, WatermelonDB, or PouchDB, will retrieval be slow?
3) On retrieval, do we need an equivalent amount of RAM?
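One rough way to approach question 1 yourself is to index a small sample, measure the serialized index size relative to the raw text, and extrapolate. The sketch below does this with a toy indexer (made-up `buildToyIndex`, not lunr); the resulting ratio is illustrative only, since real indexes store extra metadata such as term frequencies and field boosts:

```javascript
// Back-of-the-envelope estimate of index size vs. raw text size.
function buildToyIndex(text) {
  // Toy inverted index: word -> list of positions. Real indexes store
  // more metadata, so real-world ratios are often higher.
  const index = {};
  text.toLowerCase().split(/\W+/).forEach((word, pos) => {
    if (!word) return;
    (index[word] = index[word] || []).push(pos);
  });
  return index;
}

const sample = 'offline search engine for large text files '.repeat(1000);
const rawBytes = Buffer.byteLength(sample);
const indexBytes = Buffer.byteLength(JSON.stringify(buildToyIndex(sample)));
const ratio = indexBytes / rawBytes;

// Extrapolating: 1 GB of text implies very roughly ratio * 1 GB of
// serialized index, plus parsed-object overhead once loaded into memory.
console.log(`index/raw ratio ≈ ${ratio.toFixed(2)}`);
```

Whatever the ratio comes out to on your own corpus, the key point for question 3 is that the parsed index lives entirely in RAM, so you should budget at least that much memory plus JavaScript object overhead.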
Do you have any idea how slow it was, and how much data was tested, in that first version?
Please advise; we want to build a search engine over at most 1 GB of .txt files, which need to be stored and searched by keyword with type-ahead.
Thanks.