snoww / loa-logs

LOA Logs - Modern DPS Meter for Lost Ark
GNU General Public License v3.0

Improve Full Text Searching #108

Closed anyduck closed 1 month ago

anyduck commented 1 month ago

Features

Benchmark

Benchmarking environment: an 800MiB encounters.db file, thanks to b14ckout for providing it. The executable was compiled in release mode, with each query run only once 😐. It was run on a laptop against both an external SSD connected via USB3 and an internal NVMe SSD.
| Method | Refresh | Search "cyn" | Search "red" | Class "Bard" | Migration |
| --- | --- | --- | --- | --- | --- |
| FTS5 USB3 | 2ms | 8ms | 3ms | 7ms | 20.7s |
| FTS5 NVMe | 1ms | 12ms | 1ms | 13ms | 14.0s |
| LIKE USB3 | 312ms | 381ms | 348ms | 455ms | 0.2s |
| LIKE NVMe | 244ms | 308ms | 289ms | 244ms | 2.3s |
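For reference, a minimal rusqlite sketch of the two query shapes being compared; the schema and column names are assumptions (only encounter_preview appears elsewhere in this thread), and FTS5 availability assumes rusqlite is built with it (e.g. the bundled feature):

```rust
use rusqlite::{params, Connection, Result};

fn main() -> Result<()> {
    let conn = Connection::open_in_memory()?;

    // Assumed schema: a plain table plus an external-content FTS5 index over it.
    conn.execute_batch(
        "CREATE TABLE encounter_preview (id INTEGER PRIMARY KEY, boss_name TEXT, players TEXT);
         CREATE VIRTUAL TABLE encounter_search USING fts5(
             boss_name, players, content='encounter_preview', content_rowid='id'
         );",
    )?;

    // LIKE must scan every row of encounter_preview on each search.
    let mut like = conn
        .prepare("SELECT id FROM encounter_preview WHERE players LIKE '%' || ?1 || '%'")?;

    // FTS5 answers the same prefix search from its inverted index.
    let mut fts = conn
        .prepare("SELECT rowid FROM encounter_search WHERE encounter_search MATCH ?1 || '*'")?;

    let slow: Vec<i64> = like.query_map(params!["cyn"], |r| r.get(0))?.collect::<Result<_>>()?;
    let fast: Vec<i64> = fts.query_map(params!["cyn"], |r| r.get(0))?.collect::<Result<_>>()?;
    assert_eq!(slow, fast); // both empty here; the difference is speed at scale
    Ok(())
}
```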

Problems

Notes

snoww commented 1 month ago

hi thanks for the pr.

its probably a good idea to just store everything in a compressed json blob (lol), that way its easier to handle changes to the encounter structure. or maybe a more robust db migration system, but im too lazy for that.
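A minimal sketch of that blob idea, assuming serde_json and flate2 (neither is confirmed to be in the project) and an invented Encounter struct:

```rust
use flate2::{read::GzDecoder, write::GzEncoder, Compression};
use serde::{Deserialize, Serialize};
use std::io::{Read, Write};

// Hypothetical stand-in for the real encounter structure.
#[derive(Serialize, Deserialize)]
struct Encounter {
    boss_name: String,
    duration_ms: u64,
}

// Serialize to JSON, then gzip it before storing as a BLOB column.
fn to_blob(e: &Encounter) -> Result<Vec<u8>, Box<dyn std::error::Error>> {
    let json = serde_json::to_vec(e)?;
    let mut enc = GzEncoder::new(Vec::new(), Compression::default());
    enc.write_all(&json)?;
    Ok(enc.finish()?)
}

// The reverse path when loading; serde_json ignores unknown fields by
// default, which is what makes struct changes cheaper than ALTER TABLE.
fn from_blob(blob: &[u8]) -> Result<Encounter, Box<dyn std::error::Error>> {
    let mut json = Vec::new();
    GzDecoder::new(blob).read_to_end(&mut json)?;
    Ok(serde_json::from_slice(&json)?)
}

fn main() -> Result<(), Box<dyn std::error::Error>> {
    let e = Encounter { boss_name: "Brelshaza".into(), duration_ms: 600_000 };
    let back = from_blob(&to_blob(&e)?)?;
    assert_eq!(back.boss_name, e.boss_name);
    Ok(())
}
```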

i've also thought about extracting the buff descriptions along with their buff source skill into a separate table and fetching them when we need them, cus right now they're duplicated many many times. but thats another issue on its own.
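Purely as an illustration of that dedup, with every table and column name invented here:

```rust
use rusqlite::Connection;

fn main() -> rusqlite::Result<()> {
    let conn = Connection::open_in_memory()?;
    // Buff text lives once in buff_info; encounters reference it by id
    // instead of repeating the description on every row.
    conn.execute_batch(
        "CREATE TABLE buff_info (
             id           INTEGER PRIMARY KEY,
             source_skill TEXT NOT NULL,
             description  TEXT NOT NULL
         );
         CREATE TABLE encounter_buff (
             encounter_id INTEGER NOT NULL,
             buff_id      INTEGER NOT NULL REFERENCES buff_info(id)
         );",
    )?;
    Ok(())
}
```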

the pr looks good overall, ill test it out when i get the chance. i'll wait on ur update after you read the comment.

anyduck commented 1 month ago

> `pub fn insert_data()`

Oops, I forgot that I deleted a bunch of files to make it compile

> migration in a different thread

I tried using an example from the splashscreen docs, but it wasn't trivial to make it work, so I'll just hide the windows instead.
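A sketch of the hide-the-windows route under Tauri 1.x assumptions; the window label and setup-hook placement are guesses, not the PR's actual code:

```rust
use tauri::Manager;

fn main() {
    tauri::Builder::default()
        .setup(|app| {
            // Keep the UI hidden while the migration runs; call .show()
            // once the migration thread finishes.
            if let Some(win) = app.get_window("main") {
                win.hide()?;
            }
            Ok(())
        })
        .run(tauri::generate_context!())
        .expect("error while running tauri application");
}
```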

> more robust db migration system

That would be a great addition, because I broke the current one. I knew I shouldn't have touched it that much 😢 Is it okay with you to bring in a dependency to do the job? rusqlite_migration looks okay
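For reference, the basic shape of that crate; the schema steps below are placeholders, not the meter's real migrations:

```rust
use rusqlite::Connection;
use rusqlite_migration::{Migrations, M};

fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Each M::up is one schema version; the crate records progress in
    // SQLite's user_version pragma, so reruns only apply unseen steps.
    let migrations = Migrations::new(vec![
        M::up("CREATE TABLE encounter (id INTEGER PRIMARY KEY);"),
        M::up("CREATE VIRTUAL TABLE encounter_search USING fts5(boss_name, players);"),
    ]);

    let mut conn = Connection::open("encounters.db")?;
    migrations.to_latest(&mut conn)?;
    Ok(())
}
```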

> extracting the buff descriptions and buff source skill into a separate table

It looks like that data is already loaded into memory from Skill.json and SkillBuff.json. Do you want to extract it into a new table because the data changes between Lost Ark patches?

Edit: Looking at the meter-data commit history, there are usually just name and description changes; sometimes a new group is added to a skill.

anyduck commented 1 month ago

I wasn't able to test bd0e57a, but everything else should be working

snoww commented 1 month ago

hi sorry for the late response, been busy with irl stuff.

i sent an invite for meter-core-rs so you can test inserting into the db, but keep in mind to keep it private.

feel free to try out adding crates to see if its an improvement or not, we could also keep it as is if it makes the whole thing simpler.

i think the skill.json and skillbuff.json change is out of scope of this pr so we can ignore that for now.

anyduck commented 1 month ago

It's ready for a review

snoww commented 1 month ago

im also wondering whether we should run the migration on another thread, because currently it blocks the meter from listening to any packets while the migration runs. hmm.

anyduck commented 1 month ago

This won't work, because hidden windows can still run javascript and logs_windows will query the DB. We need some global mutex for them to wait on, either in the root layout.svelte or in the DB-related tauri commands.
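One possible shape for that gate, as a toy (names and placement invented): the migration takes the write half of a tokio RwLock before anything else runs, and every DB command takes the read half first.

```rust
use std::{sync::Arc, time::Duration};
use tokio::sync::RwLock;

#[tokio::main]
async fn main() {
    let gate: Arc<RwLock<()>> = Arc::new(RwLock::new(()));

    // Take the write half *before* spawning, so no command can sneak in first.
    let exclusive = gate.clone().write_owned().await;
    let migration = tokio::task::spawn_blocking(move || {
        let _held = exclusive; // held for the whole migration
        std::thread::sleep(Duration::from_secs(1)); // stand-in for the schema work
    });

    // A DB command racing the migration parks here instead of panicking.
    let _open = gate.read().await;
    // ... query encounter_preview ...
    migration.await.unwrap();
}
```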

snoww commented 1 month ago

im not 100% sure but wouldn't it just not load the encounters until the migration is finished? since the query will have to wait for the migration to finish anyways?

anyduck commented 1 month ago

Eh, weird. It panicked when I tried this 🤔

anyduck commented 1 month ago

Well LGTM. Maybe I forgot to revert some other code when trying this

snoww commented 1 month ago

really? i guess i only ran it in debug mode, maybe it panics in release mode, bummer, guess i need to remove the task

anyduck commented 1 month ago

It works now for some reason

anyduck commented 1 month ago

Nah, there is a race condition. This code panics if JS tries to query the encounter_preview table before it is created. I added this condition in 208d20f:
https://github.com/snoww/loa-logs/blob/4f3b0463583c47f7151c6d054545c88aaf85ea6a/src-tauri/src/main.rs#L373
So now the migration code usually wins the race, but before that it used to panic in this code:
https://github.com/snoww/loa-logs/blob/4f3b0463583c47f7151c6d054545c88aaf85ea6a/src-tauri/src/main.rs#L719
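Reduced to a toy, the failure mode is just this (table name from the thread; the unwrap is presumed to mirror the original call site):

```rust
use rusqlite::Connection;

fn main() {
    let conn = Connection::open_in_memory().unwrap();
    // No migration has run yet, so the table doesn't exist: prepare()
    // returns Err("no such table: encounter_preview"), and unwrapping
    // a result like this is where the panic comes from.
    let result = conn.prepare("SELECT id FROM encounter_preview LIMIT 1");
    assert!(result.is_err());
}
```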

anyduck commented 1 month ago

How about starting this thread before running the migration in the main thread?
https://github.com/snoww/loa-logs/blob/4f3b0463583c47f7151c6d054545c88aaf85ea6a/src-tauri/src/main.rs#L196-L200

snoww commented 1 month ago

since that thread is blocking, doesn't it mean anything past it won't get executed?

anyduck commented 1 month ago

tokio::task::spawn_blocking runs the closure on a separate blocking thread, not on the async worker pool that tokio::task::spawn uses, and it returns a JoinHandle immediately instead of blocking the caller.

> anything past it won't get executed

You are thinking of tokio's Runtime::block_on
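A tiny, project-independent demo of that difference: spawn_blocking hands back a JoinHandle immediately, so the caller only waits if it awaits.

```rust
use std::time::Duration;

#[tokio::main]
async fn main() {
    // The closure runs on a dedicated blocking thread, not an async worker.
    let handle = tokio::task::spawn_blocking(|| {
        std::thread::sleep(Duration::from_secs(2)); // stand-in for the migration
        "migration done"
    });

    println!("still listening for packets..."); // prints immediately

    // Only awaiting the handle waits for the blocking work to finish.
    println!("{}", handle.await.unwrap());
}
```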

anyduck commented 1 month ago

But there'll still be a race condition for saving data to the DB, so we need a mutex on that no matter how we try to parallelize it.

snoww commented 1 month ago

i think we'll just leave it sync and blocking