Closed matthewdking closed 5 years ago
Thanks @matthewdking. Do you know the root cause of the performance problem? Is it that `object-path-immutable` itself was very expensive on the given structure, or was its use causing something else unnecessary to happen?
I know you mentioned a lot of calls to fetch users by their IDs on message receipt. Was that triggered by this, or was it a red herring?
So from my understanding the flow is something along the lines of:

- `users_by_ids`, which is what I spoke to you about, but these calls are ~120ms
- `set` from `object-path-immutable`: `addMessage` is being called multiple times on subscribe (due to lots of rooms with lots of messages) and this seems to be causing it to slow down, due to `set` taking ~3000ms
I had a quick look at the code for `set` this morning, which calls another function called `changeImmutable`, and this looks to be doing some recursion. So I would imagine that when the `messages` key in state gets really big (lots of rooms and messages) the performance would take a hit. Disclaimer: that was a five-minute glance and I don't actually know what that code does... https://github.com/mariocasciaro/object-path-immutable/blob/master/index.js#L125-L152
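To illustrate why that recursion could get expensive: a copy-on-write deep set shallow-copies every object along the path on each call, so the cost of one `set` grows with the number of keys at each level. This is a hypothetical sketch of the pattern, not the library's actual code:

```javascript
// Hypothetical sketch of a copy-on-write deep set, similar in spirit to
// what object-path-immutable's changeImmutable appears to do.
function immutableSet(obj, path, value) {
  if (path.length === 0) return value
  const [head, ...rest] = path
  // Every level along the path is shallow-copied on every call, so a huge
  // `messages` object gets fully re-copied for each individual set.
  return { ...obj, [head]: immutableSet(obj[head] || {}, rest, value) }
}

const state = { rooms: { general: { messages: { 1: { text: 'old' } } } } }
const next = immutableSet(state, ['rooms', 'general', 'messages', '2'], { text: 'hi' })
// `state` is untouched; `next` shares unchanged branches with it.
```

Calling this once per message on a large state is where an O(n²) blow-up could come from.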
Can you clarify the need to make calls to `users_by_ids` for every message? It seems excessive. Is there a way to structure it so it does one request to get all the users in one go for that room?
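One way the "one request per room" idea could look, sketched with a hypothetical `fetchUsersByIds` wrapper around the `users_by_ids` endpoint (names and shapes assumed, not the SDK's API):

```javascript
// Hypothetical batching of user fetches: one users_by_ids call per room
// instead of one call per message. fetchUsersByIds stands in for the API.
async function fetchSenders(messages, fetchUsersByIds) {
  // Collect the distinct sender IDs, then fetch them in a single request.
  const ids = [...new Set(messages.map(m => m.senderId))]
  const users = await fetchUsersByIds(ids)
  return new Map(users.map(u => [u.id, u]))
}

// Usage: three messages from two senders produce one request for two users.
const messages = [
  { id: 1, senderId: 'a' },
  { id: 2, senderId: 'b' },
  { id: 3, senderId: 'a' },
]
fetchSenders(messages, ids => Promise.resolve(ids.map(id => ({ id })))).then(byId => {
  console.log(byId.size) // 2
})
```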
I just noticed the docs are missing the `senderId` property on the message properties. https://docs.pusher.com/chatkit/reference/javascript#message-properties
Ah, then I think I understand: if `object-path-immutable` is maintaining a persistent data structure, then the initial load will be creating a lot of immediately abandoned copies of the structure. That could be slow and stress the GC out.
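To make the "abandoned copies" point concrete: applying one immutable update per message during initial load allocates a fresh intermediate state for every message, all but the last discarded straight away, whereas a batched update copies once. A rough sketch with an assumed state shape (not the SDK's real structure):

```javascript
// Sketch: per-message immutable updates vs one batched update.
// Assumed state shape { [roomId]: { [messageId]: message } }.
function addMessage(state, roomId, msg) {
  return {
    ...state,
    [roomId]: { ...state[roomId], [msg.id]: msg },
  }
}

const msgs = Array.from({ length: 1000 }, (_, i) => ({ id: i, text: 'm' + i }))

// Per-message: 1000 intermediate states, each re-copying the growing room
// object, and 999 of them become garbage immediately.
let perMessage = { room1: {} }
for (const m of msgs) perMessage = addMessage(perMessage, 'room1', m)

// Batched: copy the room once, then fill the fresh copy in place.
function addMessages(state, roomId, batch) {
  const room = { ...state[roomId] }
  for (const m of batch) room[m.id] = m
  return { ...state, [roomId]: room }
}
const batched = addMessages({ room1: {} }, 'room1', msgs)
```

Both end in the same state; only the allocation pattern differs.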
> Can you clarify the need to make calls to `users_by_ids` for every message? It seems excessive. Is there a way to structure it so it does one request to get all the users in one go for that room?
It should be called once for all members of the room when the membership subscription gets the initial state. Then it should be called once for each message, but only hit the network if the user wasn't fetched in the initial batch (basically, if they are no longer a member of the room).
However, I suspect that if the message subscription completes first and delivers a bunch of messages into the SDK before we receive the members list, then we hit a degenerate case there, and possibly even race several calls for each user entity because we don't track which we're already fetching. I'll follow up on that.
Thanks very much for the PR.
FYI: https://github.com/pusher/chatkit-client-js/blob/master/src/user-store.js#L40-L44
We do make sure we don't race several requests for the same user, at least.
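The pattern behind that guarantee, as I read the linked lines, is caching the in-flight promise per user ID so concurrent callers share one request. A minimal sketch of the idea (hypothetical names; the real code is at the link above):

```javascript
// Minimal in-flight request de-duplication, keyed by user ID.
// fetchUser is a stand-in for the real network call.
function makeUserStore(fetchUser) {
  const pending = new Map()
  return function getUser(id) {
    if (pending.has(id)) return pending.get(id) // share the in-flight request
    const p = fetchUser(id)
    pending.set(id, p)
    return p
  }
}

// Usage: two concurrent calls for the same ID trigger only one fetch.
let calls = 0
const getUser = makeUserStore(id => {
  calls++
  return Promise.resolve({ id })
})
Promise.all([getUser('alice'), getUser('alice')]).then(() => {
  console.log(calls) // 1
})
```

A production version would also evict or refresh entries; this sketch keeps every promise forever.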
Thanks for clarifying a couple of points there, that is really useful.
> We do make sure we don't race several requests for the same user, at least.
Yeah that helps!
I used this code as an example when creating an app; however, when we had a considerable number of users and messages in rooms, we found that there was a huge lag on subscription.
After a little bit of digging we found the `addMessage` function was taking approx 3000ms, and that `set` from `object-path-immutable`, which was used to update a deeply nested object in state, seemed to be causing the problem.

I've rewritten `addMessage`, `isTyping` and `notTyping` without object-path-immutable. A quick look at `addMessage` in the performance dev tools saw the time drop a fair amount, to ~4ms. I haven't got around to testing the functions for typing, so if someone could look into that it would be awesome. I'm not actually sure they need changing 🤷♂️
I guess that update pattern could be factored out and reused, as they all follow the same shape, but I'm not sure it would actually help with readability.
@mdpye could you take a look and give me a shout if you have feedback or changes?
EDIT: It's worth mentioning that the `notTyping` action used to unset/delete the key in the typing object, but now sets it to `false`. This is probably not ideal. And I'm still unsure whether they actually need changing from what they were before anyway.