Yellow-Dog-Man / Resonite-Issues

Issue repository for Resonite.
https://resonite.com

Unpacking of large ProtoFlux node groups causes large lag spike in sessions with multiple users #1482

Open JackTheFoxOtter opened 6 months ago

JackTheFoxOtter commented 6 months ago

Describe the bug?

Whenever working on large ProtoFlux node groups in sessions with many users present, the act of unpacking causes a significant performance hitch for everyone in the world. The problem appears to be worse the more users are in the session.

I have a hunch that this is actually caused by multiple clients each trying to unpack the node group themselves, as the hitch is barely noticeable in sessions where I'm the only user. In situations with higher network latency, I've also seen the nodes already unpacked on my end, only for the big lag to hit a couple of moments later, with the visuals briefly disappearing before re-appearing.

Potentially related to #375

To Reproduce

Be in a session with multiple users and unpack large ProtoFlux groups. Note how the unpacking is significantly heavier than unpacking the same code in a world without other users.

Expected behavior

Unpacking of ProtoFlux nodes should not be significantly heavier when multiple users are around.

Screenshots

No response

Resonite Version Number

2024.3.12.87

What Platforms does this occur on?

Windows

What headset if any do you use?

Valve Index

Log Files

N/A

Additional Context

This was especially painful while working on Cloudscape Harvest for this year's MMC, which always had multiple users in the world. It got to the point where I had to "warn" people in advance that I was about to unpack code, as everyone would freeze for a couple of seconds, which isn't very desirable.

Reporters

JackTheFoxOtter

JackTheFoxOtter commented 6 months ago

Attaching logs for this, hope they will help.

Started the game, made a new grid space, invited Mars, spawned an example object, unpacked some ProtoFlux, waited a bit, shut the game down. This was only with two players in the world, where the unpacking performance is still pretty manageable, but if there is an issue that's intensified by player count, the logs should hopefully have enough info to track it down.

My Log: J4-C - 2024.3.12.87 - 2024-03-12 19_07_18.log

Mars's Log: DESKTOP-K2H0L1O_-2024.3.12.87-_2024-03-12_19_07_17.log

shiftyscales commented 6 months ago

This issue was referenced in #607 as well.

Is this issue only observable while in a multi-user session, @JackTheFoxOtter? Even when unpacking the exact same ProtoFlux?

Does the kind or complexity of the nodes matter in any meaningful way? E.g. would hundreds of unconnected nodes exhibit the same issue/poor scaling with the number of users present in the session?

TisFoolish commented 6 months ago

I've experienced the same issue and did a few small, unscientific tests before getting discouraged and stopping. This is what I remember from my preliminary testing months ago:

  1. The issue is only observable in a multi-user session.
  2. The length of the hitch increases if the number of nodes unpacked or the number of users increases (see the timing sketch after this list).
     a. Increasing just the number of nodes seems to make the hitch scale linearly.
     b. Increasing the user count also seems to scale linearly, but makes a much bigger difference.
     c. Increasing both does as expected, and the hitch gets much longer.
  3. Unpacking locally disabled flux had no bearing on the hitch, but that was tested months ago and might have changed with the build that properly disables all disabled flux.
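
For anyone who wants to put actual numbers on the hitch instead of judging it by feel, below is a rough sketch of a timing patch built on ResoniteModLoader and Harmony. The target names `FrooxEngine.ProtoFluxTool` and `Unpack` are purely hypothetical placeholders; the real type and method that perform the unpack would have to be looked up in the game's assemblies first.

```csharp
// Rough timing sketch (not a fix), assuming ResoniteModLoader + Harmony are available.
// "FrooxEngine.ProtoFluxTool" and "Unpack" are hypothetical placeholders for whichever
// type/method actually performs the unpack; replace them with the real names before use.
using System.Diagnostics;
using System.Reflection;
using HarmonyLib;
using ResoniteModLoader;

public class UnpackTimer : ResoniteMod
{
    public override string Name => "UnpackTimer";
    public override string Author => "example";
    public override string Version => "0.1.0";

    public override void OnEngineInit() =>
        new Harmony("dev.example.UnpackTimer").PatchAll();

    [HarmonyPatch]
    internal static class UnpackTimingPatch
    {
        // Resolve the (assumed) target at patch time so the names are easy to swap out.
        static MethodBase TargetMethod() =>
            AccessTools.Method(
                AccessTools.TypeByName("FrooxEngine.ProtoFluxTool"), // hypothetical type
                "Unpack");                                           // hypothetical method

        // Harmony passes __state from the prefix to the postfix of the same call,
        // so the stopwatch measures just the unpack itself.
        static void Prefix(out Stopwatch __state) => __state = Stopwatch.StartNew();

        static void Postfix(Stopwatch __state) =>
            Msg($"Unpack took {__state.ElapsedMilliseconds} ms"); // mod loader's log helper
    }
}
```

With something like that in place, each unpack would log one duration, which would make the node-count and user-count scaling described above much easier to chart.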
JackTheFoxOtter commented 6 months ago

This issue was referenced in #607 as well.

Missed that one when I was searching for it, but I'm also pretty sure it's a bug, not a missing feature.

Is this issue only observable while in a multi-user session, @JackTheFoxOtter? Even when unpacking the exact same ProtoFlux?

From what I've seen, yes. During Cloudscape Harvest I frequently worked on node groups of >150 nodes, and whenever I was alone in the world I barely had problems; when we had the entire team around, it was kind of unbearable.

Does the kind or complexity of the nodes matter in any meaningful way? E.g. would hundreds of unconnected nodes exhibit the same issue/poor scaling with the number of users present in the session?

I can't say for certain, since my nodes are usually connected to each other. But I see where you're coming from. When unpacking, the node graph shouldn't need to be rebuilt, at least I wouldn't think so. But even if it were, since building happens locally, my lag spike shouldn't get worse the more people are around. I'm pretty sure there is an issue somewhere on the visual-generation side.

Banane9 commented 6 months ago

This issue was referenced in #607 as well.

Missed that one when I was searching for it, but I'm also pretty sure it's a bug, not a missing feature.

That issue is a lot more broad and vague in general; it also mentions that moving a lot of nodes around causes performance issues, which has since been fixed in #1090 and #1104.

Banane9 commented 6 months ago

During Cloudscape Harvest I frequently worked on node groups of >150 nodes, and whenever I was alone in the world I barely had problems; when we had the entire team around, it was kind of unbearable.

Did you notice any difference between hosting and just being a user in the world? From what I can tell from the code, only the unpacking user should generate the visuals. You could test this with a small mod too (making the ProtoFlux node's EnsureVisual do nothing); a rough sketch is below.
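
For reference, a minimal sketch of what such a test mod could look like with ResoniteModLoader and Harmony. The fully-qualified type name `FrooxEngine.ProtoFlux.ProtoFluxNode` used below is an assumption and would need to be verified against the actual assemblies; only the method name `EnsureVisual` comes from the comment above.

```csharp
// Sketch of a "skip EnsureVisual" test mod, assuming ResoniteModLoader + Harmony.
// The type name below is an unverified assumption; only "EnsureVisual" is taken
// from the discussion in this issue.
using System.Reflection;
using HarmonyLib;
using ResoniteModLoader;

public class NoProtoFluxVisuals : ResoniteMod
{
    public override string Name => "NoProtoFluxVisuals";
    public override string Author => "example";
    public override string Version => "0.1.0";

    public override void OnEngineInit() =>
        new Harmony("dev.example.NoProtoFluxVisuals").PatchAll();
}

[HarmonyPatch]
internal static class EnsureVisualSkipPatch
{
    // Resolve the target at patch time so the assumed type name is easy to adjust.
    static MethodBase TargetMethod() =>
        AccessTools.Method(
            AccessTools.TypeByName("FrooxEngine.ProtoFlux.ProtoFluxNode"), // assumed name
            "EnsureVisual");

    // Returning false from a Harmony prefix skips the original method entirely,
    // i.e. the unpacking user would no longer generate node visuals.
    static bool Prefix() => false;
}
```

If `EnsureVisual` turns out to have overloads, the `AccessTools.Method` call would need explicit parameter types to pick the right one.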

shiftyscales commented 6 months ago

Missed that one when I was searching for it, but I'm also pretty sure it's a bug, not a missing feature.

I agree. I just hadn't gotten around to triaging/processing that issue at the time; I only just found it.

Thanks for the additional testing and info, @TisFoolish.

It would be nice to have more detail around the third point you raised. Whether unpacking on a slot that's either fully disabled or enabled for just one person behaves differently (and whether it matters if that one person is the host or another client) would be other useful info to know.