FINDarkside / TLD-Save-Editor

Edit The Long Dark save files
http://www.moddb.com/mods/the-long-dark-save-editor-2/downloads
MIT License

Load&save in separate thread #37

FINDarkside opened this issue 4 years ago

FINDarkside commented 4 years ago

Loading big saves causes the UI to hang for a while, so loading the save should probably be done in a separate thread. At the very least, the save button and the save selection dropdown need to be disabled while a save or load is in progress.
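
A minimal sketch of what that could look like in WPF code-behind, assuming hypothetical `LoadSaveFile`/`ApplyToUi` helpers and placeholder control names (not the editor's actual code):

```csharp
// Sketch only: control names and the LoadSaveFile/ApplyToUi helpers are placeholders.
private async void SaveSelectionDropdown_SelectionChanged(object sender, SelectionChangedEventArgs e)
{
    var path = (string)SaveSelectionDropdown.SelectedValue;

    // Disable the controls that must not be touched while loading.
    SaveButton.IsEnabled = false;
    SaveSelectionDropdown.IsEnabled = false;
    try
    {
        // The heavy JSON parsing runs on a thread-pool thread, so the UI stays responsive.
        var save = await Task.Run(() => LoadSaveFile(path));
        ApplyToUi(save); // continues on the UI thread after the await
    }
    finally
    {
        SaveButton.IsEnabled = true;
        SaveSelectionDropdown.IsEnabled = true;
    }
}
```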

Pt-Djefferson commented 4 years ago

Almost all UI elements should be disabled during a load or save, except the tab labels. A progress bar could also be added to the status bar.

FINDarkside commented 4 years ago

It'd be ideal not to disable controls while saving, though. But since the bottleneck shouldn't actually be writing the save to disk, that would be quite problematic, as obviously no changes made after clicking save should be applied. Doing a deep clone of the save data that could be handed to the other thread wouldn't be trivial either, because the extraFields dictionary in DynamicSerializable uses the object instance as its key.
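
A tiny illustration of why an instance-keyed dictionary and a deep clone don't mix (hypothetical names, not the real DynamicSerializable API):

```csharp
using System;
using System.Collections.Generic;

class SaveNode { public int Value; }

static class Demo
{
    // Stand-in for extraFields: unknown JSON properties kept per object *instance*
    // so they can be written back unchanged on save.
    static readonly Dictionary<object, string> ExtraFields = new Dictionary<object, string>();

    static void Main()
    {
        var original = new SaveNode { Value = 1 };
        ExtraFields[original] = "{\"m_Unknown\":42}";

        // A deep clone handed to a background save thread is a different instance,
        // so the reference-keyed lookup no longer finds its extra fields.
        var clone = new SaveNode { Value = original.Value };
        Console.WriteLine(ExtraFields.ContainsKey(original)); // True
        Console.WriteLine(ExtraFields.ContainsKey(clone));    // False
    }
}
```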

There's probably a lot of stuff to optimize in DynamicSerializable anyway; at least a lot of the reflection work could be cached. I haven't done any profiling to check whether that's even one of the bottlenecks, though.

FINDarkside commented 4 years ago

Took a look at the performance profile: most of the time is spent in JObject.Parse and JToken.ToObject. If performance really were a big problem, writing a custom JsonTextReader would probably speed things up.
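
For reference, a sketch of the two shapes being compared (`SaveData` is a placeholder type, not the editor's real model): parsing into a JObject and then calling ToObject builds a full intermediate token tree, while deserializing straight from a JsonTextReader skips it, and a hand-written reader tuned to the save layout could go further still.

```csharp
using System.IO;
using Newtonsoft.Json;
using Newtonsoft.Json.Linq;

class SaveData { /* placeholder model */ }

static class Loading
{
    // Token-tree shape: parse everything into a JObject, then map the tree onto objects.
    static SaveData LoadViaJObject(string json) =>
        JObject.Parse(json).ToObject<SaveData>();

    // Streaming shape: deserialize directly from the reader, no JObject in between.
    static SaveData LoadViaReader(TextReader text)
    {
        using (var reader = new JsonTextReader(text))
            return new JsonSerializer().Deserialize<SaveData>(reader);
    }
}
```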

Pt-Djefferson commented 4 years ago

Alternatively, we could read (and write) only the necessary data from the text with precompiled regexes, and not all of it at once but partially, depending on the focused tab. It wouldn't be as nice as it is now, but it would be fast.

FINDarkside commented 4 years ago

Managed to reduce the load time by ~30% with a couple of simple changes. JObject.Parse remains the bottleneck when loading saves, though. It doesn't help that the save file is a hot mess of nested serialized JSON strings and JSON strings that are compressed, stored as byte arrays, and then converted back to JSON 😄
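
Roughly the kind of unwrapping that format forces (the field names and the `CLZF.lzf_decompress` helper here are assumptions for illustration, not the actual save layout):

```csharp
using System.IO;
using System.Text;
using Newtonsoft.Json.Linq;

// Field names and CLZF.lzf_decompress are assumed for illustration only.
JObject outer = JObject.Parse(File.ReadAllText(savePath));

// One field holds a whole JSON document serialized into a string...
JObject inner = JObject.Parse((string)outer["m_SerializedInventory"]);

// ...another holds JSON that was LZF-compressed and stored as a byte array,
// so it has to be decompressed and parsed yet again.
byte[] compressed = (byte[])outer["m_CompressedRegionData"];
JObject region = JObject.Parse(Encoding.UTF8.GetString(CLZF.lzf_decompress(compressed)));
```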

> Alternatively, we could read (and write) only the necessary data from the text with precompiled regexes, and not all of it at once but partially, depending on the focused tab. It wouldn't be as nice as it is now, but it would be fast.

I'm not sure if I understood you correctly, but editing JSON with regex is pretty problematic, as JSON isn't a regular language and therefore can't be fully parsed with regex. I also don't think splitting the work by tab would bring any noticeable difference; I'd guess the biggest problem is the initial deserialization and serialization, since it can involve 5 MB of byte arrays. No matter what we do, we need to turn big byte arrays into JSON.

Just profiled saving, and JSON.net is the bottleneck there as well, so at least it's not DynamicSerializable adding significant slowdown. I'll check whether there are some easy optimizations, though.

FINDarkside commented 4 years ago

Reduced save time by over 60% (in my example save) with d7a8b61bcaa73971bea344cba18631636e774767. It would probably be worth doing the same thing with all primitive arrays and lists, but byte arrays were obviously the biggest problem.
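
For illustration, one way to special-case byte arrays is a custom converter that emits a single base64 value instead of going through a generic element-by-element path (a sketch of the idea only, not necessarily what the commit above does):

```csharp
using System;
using Newtonsoft.Json;

class ByteArrayConverter : JsonConverter<byte[]>
{
    public override void WriteJson(JsonWriter writer, byte[] value, JsonSerializer serializer)
    {
        // JsonWriter encodes the whole array as one base64 string in a single call.
        writer.WriteValue(value);
    }

    public override byte[] ReadJson(JsonReader reader, Type objectType, byte[] existingValue,
        bool hasExistingValue, JsonSerializer serializer)
    {
        if (reader.Value is byte[] bytes) return bytes;            // some readers surface bytes directly
        if (reader.Value is string s) return Convert.FromBase64String(s);
        return null;                                               // null token
    }
}
```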

FINDarkside commented 4 years ago

Reduced save time by a further ~25% by caching the results of GetCustomAttribute. eef9c2700b9cc3612ec119233a3ae339021e9205
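
A minimal sketch of that kind of caching, keyed by member and attribute type (illustrative only, not the actual change in the commit):

```csharp
using System;
using System.Collections.Concurrent;
using System.Reflection;

static class AttributeCache
{
    private static readonly ConcurrentDictionary<(MemberInfo Member, Type Attr), Attribute> Cache =
        new ConcurrentDictionary<(MemberInfo, Type), Attribute>();

    // The same members are inspected on every save/load, so the reflection result
    // (including null, i.e. "no such attribute") is computed once and reused.
    public static T Get<T>(MemberInfo member) where T : Attribute =>
        (T)Cache.GetOrAdd((member, typeof(T)), key => key.Member.GetCustomAttribute(key.Attr));
}
```

Hot-path calls to `member.GetCustomAttribute<T>()` would then go through `AttributeCache.Get<T>(member)` instead.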

At the moment most of the save time is spent in JsonTextWriter.WriteValue, CLZF.lzf_compress, and JsonConvert.SerializeObject, so there probably aren't any big (trivial) gains left.

FINDarkside commented 4 years ago

Caching Type.GetFields and Type.GetProperties doesn't seem to give any noticeable improvement.