samiskin / redux-electron-store

A redux store enhancer that allows automatic synchronization between electron processes
MIT License

Large Store Performance #28

Closed jamesbillinger closed 8 years ago

jamesbillinger commented 8 years ago

Wonderful library - I've really enjoyed my time with electron and it's libraries like this that make it really stand out.

That said...I've noticed that when I load data into my store, initial load of new BrowserWindows takes a pretty big hit. For example:

That last one is really 10 seconds. At this point I just have my employees table loaded into my store - a fraction of what my total app will have.

I'm just curious if anyone else has run into this or if there are any ideas. Is this just due to the speed/throughput of the IPC? If so, I'll probably be better off using separate stores and using a service worker to cache the API results. Are there other options?

I can provide code samples, but it's all pretty straightforward and works well aside from the performance.

Thanks

samiskin commented 8 years ago

Hmm, I haven't tested with really large stores, so there may be places where things could be improved.

If you'd like to help investigate a better way, here's how I get the initial data for the store:

// In the main process
store = storeCreator(reducer, initialState);
global[globalName] = store;

// In the renderer
let browserStore = remote.getGlobal(globalName);
let storeData = browserStore.getState();

// excludeUnfilteredState is for dev, where it's sometimes helpful to operate under
// the assumption that anything that doesn't pass your filter should be undefined,
// rather than the default value
let filteredStoreData = excludeUnfilteredState ? fillShape(storeData, filter) : storeData;

// lodash's cloneDeep is used because remote'd objects are handled in a unique way
// (they break redux-immutable-state-invariant); stateTransformer is a no-op by default
let preload = stateTransformer(cloneDeep(filteredStoreData));

// This allows the renderers to have their own state as well as the state from the
// main process.  It's a merge by reference, not a deep merge.
let newInitialState = objectMerge(initialState || reducer(undefined, { type: null }), preload);
Could you try those individual steps to see where the time is spent? You can skip the filter and stateTransformer parts since those aren't needed and aren't enabled by default.
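
For example, one quick way to time each step (a rough sketch using the names from the snippet above, with the filter and stateTransformer skipped):

console.time('getGlobal');
let browserStore = remote.getGlobal(globalName);
console.timeEnd('getGlobal');

console.time('getState');
let storeData = browserStore.getState();
console.timeEnd('getState');

console.time('cloneDeep');
let preload = cloneDeep(storeData); // filter/stateTransformer skipped for this test
console.timeEnd('cloneDeep');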

samiskin commented 8 years ago

Based on https://github.com/electron/electron/issues/1948, it looks like, if this library can't be improved enough, it may be better to transfer large amounts of data to the renderer by means other than IPC/remote.

Another thing you could do is keep a pool of one or more windows that are initialized but hidden, along the lines of the sketch below.
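
A rough sketch of that pooling idea (the entry-point URL is a placeholder):

const { BrowserWindow } = require('electron');

// Pre-create a hidden window so the expensive initial state load happens ahead of time.
let pooledWindow = new BrowserWindow({ show: false });
pooledWindow.loadURL(`file://${__dirname}/index.html`); // placeholder entry point

// When a window is actually requested, show the pre-warmed one and refill the pool.
function openWindow() {
  const win = pooledWindow;
  pooledWindow = new BrowserWindow({ show: false });
  pooledWindow.loadURL(`file://${__dirname}/index.html`);
  win.show();
  return win;
}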

jamesbillinger commented 8 years ago

I added some logging:

electronRendererEnhancer.js?da9f:82 remote.getGlobal took: 1ms
electronRendererEnhancer.js?da9f:90 browserStore.getState took: 2ms
electronRendererEnhancer.js?da9f:93 excludeUnfilteredState took: 0ms
electronRendererEnhancer.js?da9f:96 stateTransformer took: 7885ms
electronRendererEnhancer.js?da9f:99 objectMerge took: 1ms

So that tells me that it's actually the lodash cloneDeep that is taking the time...not the transfer across the IPC. I assume that's how you would interpret those results?

If I understand this correctly, we're not actually using a single shared store...we are actually using multiple cloned copies of stores which are kept in sync. Hence the need for the "filter". I should have put that together sooner. I'm curious if it actually clones the actions - duplicating the processing against each store, or if it simply syncs the resulting changes to the state?

Either way, I think I'm either going to have to rely on the filter (which works well to improve performance) or maintain unique state in each renderer. Let me know if you have any further thoughts.

Thanks again.

samiskin commented 8 years ago

Thanks for doing the measurements!

When I first made this, not using cloneDeep appeared to give me an object with many getters rather than a raw JSON object, and it would transfer each object over as I referenced it. My theory was that more and more of the state is sent over IPC as cloneDeep references each part.
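
If that theory is right, it should be visible by timing access to a single large slice (a rough sketch; employees stands in for any large slice of your state):

// Each property access on a remote'd object is a synchronous IPC round trip,
// so cloneDeep pays that cost for every node it visits.
let remoteState = remote.getGlobal(globalName).getState();

console.time('one-slice');
JSON.stringify(remoteState.employees); // forces just this subtree across IPC
console.timeEnd('one-slice');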

The way this library works is that:

  1. The main process creates its store, then saves it into a global
  2. When a window is created, it reads in that global's state and copies it in as its initial state
  3. The window then registers itself with the main process along with a "filter"
  4. Action occurs in renderer/main process
  5. If in renderer, action is run through the reducer and forwarded to the main process
  6. The main process runs the action through the reducer.
  7. The main process compares its state prior to the reduction with the new state and, using reference checks (hence the need for immutable data), determines what data changed (sketched below)
  8. The main process then iterates through each registered renderer. If the changed data is mentioned at all in that renderer's filter, the main process sends over IPC an action with data: { updated: {...}, deleted: {...} }
  9. The renderers that receive that action will then merge in that data, thereby staying in sync with the main process
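
A rough sketch of steps 7 and 8 (the names here are illustrative, not the library's actual internals):

// Step 7: reference-equality diff between the old and new top-level state slices.
function shallowDiff(oldState, newState) {
  const updated = {};
  const deleted = {};
  for (const key of Object.keys(newState)) {
    if (oldState[key] !== newState[key]) updated[key] = newState[key];
  }
  for (const key of Object.keys(oldState)) {
    if (!(key in newState)) deleted[key] = true;
  }
  return { updated, deleted };
}

// Step 8: broadcast only to renderers whose filter mentions a changed key.
const { updated, deleted } = shallowDiff(oldState, newState);
registeredRenderers.forEach(({ webContents, filter }) => {
  const relevant = Object.keys(updated).concat(Object.keys(deleted))
    .some((key) => filter === true || filter[key]);
  if (relevant) {
    webContents.send('redux-sync', { data: { updated, deleted } });
  }
});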

So to answer your question: the actual reduction runs on the renderer that dispatched the action and on the main process, but all other renderers only have to merge in a payload describing what data actually changed.

The filter was implemented so that renderers only receive updates for the data they specifically care about.

With your situation in mind, I wonder if the excludeUnfilteredState option should default to true. For a large initial payload, it would limit the IPC calls to only the data that passes through the filter. I added it for the argument of "I don't want to accidentally reference data that I'm not keeping in sync, so if it's undefined I won't make that mistake", but since I didn't want to force that on people, I didn't enable it by default.
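
For reference, opting into both in a renderer would look roughly like this (the filter shape is just an example for the employees case above):

import { createStore } from 'redux';
import { electronEnhancer } from 'redux-electron-store';

// Only sync the employees slice; all unfiltered state stays undefined in this renderer.
let enhancer = electronEnhancer({
  filter: { employees: true },
  excludeUnfilteredState: true
});

let store = createStore(reducer, initialState, enhancer);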

Since it looks like the main culprit is IPC, I'll close this issue, but please feel free to comment with any new questions or discoveries 👍.