Yellow-Dog-Man / Resonite-Issues

Issue repository for Resonite.
https://resonite.com

Dash facets periodically take forever to load #1531

Open ProbablePrime opened 5 months ago

ProbablePrime commented 5 months ago

Describe the bug?

Periodically, Resonite's dash will take forever to load. During this time the dash appears to re-generate itself, sometimes leading to squished or stretched facets as they load. It's quite frustrating, especially when you're late for the MMC 2024 award show!

Looking into this, it seems it's related to loading facets. Other endpoints seem to fail too (see the log file), but the most impactful is facets, because it makes the dash unresponsive.

My logs are spammed with:

05:35:03.575 (138 FPS)  Exception running GET request to https://api.resonite.com/groups/G-Resonite/records/root/Inventory/Facets/Templates/OnlineStatus. Remaining retries: 10. Elapsed: 34.20s
System.Threading.Tasks.TaskCanceledException: A task was canceled.
  at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess (System.Threading.Tasks.Task task) [0x00026] in <9577ac7a62ef43179789031239ba8798>:0 
  at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification (System.Threading.Tasks.Task task) [0x00028] in <9577ac7a62ef43179789031239ba8798>:0 
  at System.Runtime.CompilerServices.TaskAwaiter.ValidateEnd (System.Threading.Tasks.Task task) [0x00008] in <9577ac7a62ef43179789031239ba8798>:0 
  at System.Runtime.CompilerServices.TaskAwaiter`1[TResult].GetResult () [0x00000] in <9577ac7a62ef43179789031239ba8798>:0 
  at System.Net.Http.HttpClientHandler+<SendAsync>d__64.MoveNext () [0x004f5] in <7ebf3529ba0e4558a5fa1bc982aa8605>:0 
--- End of stack trace from previous location where exception was thrown ---
  at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw () [0x0000c] in <9577ac7a62ef43179789031239ba8798>:0 
  at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess (System.Threading.Tasks.Task task) [0x0001a] in <9577ac7a62ef43179789031239ba8798>:0 
  at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification (System.Threading.Tasks.Task task) [0x00028] in <9577ac7a62ef43179789031239ba8798>:0 
  at System.Runtime.CompilerServices.TaskAwaiter.ValidateEnd (System.Threading.Tasks.Task task) [0x00008] in <9577ac7a62ef43179789031239ba8798>:0 
  at System.Runtime.CompilerServices.ConfiguredTaskAwaitable`1+ConfiguredTaskAwaiter[TResult].GetResult () [0x00000] in <9577ac7a62ef43179789031239ba8798>:0 
  at System.Net.Http.HttpClient+<SendAsyncWorker>d__47.MoveNext () [0x000ca] in <7ebf3529ba0e4558a5fa1bc982aa8605>:0 
--- End of stack trace from previous location where exception was thrown ---
  at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw () [0x0000c] in <9577ac7a62ef43179789031239ba8798>:0 
  at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess (System.Threading.Tasks.Task task) [0x0001a] in <9577ac7a62ef43179789031239ba8798>:0 
  at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification (System.Threading.Tasks.Task task) [0x00028] in <9577ac7a62ef43179789031239ba8798>:0 
  at System.Runtime.CompilerServices.TaskAwaiter.ValidateEnd (System.Threading.Tasks.Task task) [0x00008] in <9577ac7a62ef43179789031239ba8798>:0 
  at System.Runtime.CompilerServices.ConfiguredTaskAwaitable`1+ConfiguredTaskAwaiter[TResult].GetResult () [0x00000] in <9577ac7a62ef43179789031239ba8798>:0 
  at SkyFrost.Base.ApiClient+<RunRequest>d__44`1[T].MoveNext () [0x001d8] in <58cd737cddfd4b6a9ad3f20762b6a068>:0 
05:35:03.576 (138 FPS)  Exception running GET request to https://api.resonite.com/groups/G-Resonite/records/root/Inventory/Facets/Templates/CustomNameplateVisibility. Remaining retries: 10. Elapsed: 34.20s
System.Threading.Tasks.TaskCanceledException: A task was canceled.
  at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess (System.Threading.Tasks.Task task) [0x00026] in <9577ac7a62ef43179789031239ba8798>:0 
  at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification (System.Threading.Tasks.Task task) [0x00028] in <9577ac7a62ef43179789031239ba8798>:0 
  at System.Runtime.CompilerServices.TaskAwaiter.ValidateEnd (System.Threading.Tasks.Task task) [0x00008] in <9577ac7a62ef43179789031239ba8798>:0 
  at System.Runtime.CompilerServices.TaskAwaiter`1[TResult].GetResult () [0x00000] in <9577ac7a62ef43179789031239ba8798>:0 
  at System.Net.Http.HttpClientHandler+<SendAsync>d__64.MoveNext () [0x004f5] in <7ebf3529ba0e4558a5fa1bc982aa8605>:0 
--- End of stack trace from previous location where exception was thrown ---
  at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw () [0x0000c] in <9577ac7a62ef43179789031239ba8798>:0 
  at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess (System.Threading.Tasks.Task task) [0x0001a] in <9577ac7a62ef43179789031239ba8798>:0 
  at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification (System.Threading.Tasks.Task task) [0x00028] in <9577ac7a62ef43179789031239ba8798>:0 
  at System.Runtime.CompilerServices.TaskAwaiter.ValidateEnd (System.Threading.Tasks.Task task) [0x00008] in <9577ac7a62ef43179789031239ba8798>:0 
  at System.Runtime.CompilerServices.ConfiguredTaskAwaitable`1+ConfiguredTaskAwaiter[TResult].GetResult () [0x00000] in <9577ac7a62ef43179789031239ba8798>:0 
  at System.Net.Http.HttpClient+<SendAsyncWorker>d__47.MoveNext () [0x000ca] in <7ebf3529ba0e4558a5fa1bc982aa8605>:0 
--- End of stack trace from previous location where exception was thrown ---
  at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw () [0x0000c] in <9577ac7a62ef43179789031239ba8798>:0 
  at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess (System.Threading.Tasks.Task task) [0x0001a] in <9577ac7a62ef43179789031239ba8798>:0 
  at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification (System.Threading.Tasks.Task task) [0x00028] in <9577ac7a62ef43179789031239ba8798>:0 
  at System.Runtime.CompilerServices.TaskAwaiter.ValidateEnd (System.Threading.Tasks.Task task) [0x00008] in <9577ac7a62ef43179789031239ba8798>:0 
  at System.Runtime.CompilerServices.ConfiguredTaskAwaitable`1+ConfiguredTaskAwaiter[TResult].GetResult () [0x00000] in <9577ac7a62ef43179789031239ba8798>:0 
  at SkyFrost.Base.ApiClient+<RunRequest>d__44`1[T].MoveNext () [0x001d8] in <58cd737cddfd4b6a9ad3f20762b6a068>:0 

As you can see, it's throughout my log files (these are from the Steam version and not my development copy): image There's a full file attached, of course.

To Reproduce

The only reliable way to reproduce it seems to be restarting Resonite a lot, which I do because I'm testing things.

Expected behavior

Resonite loads more smoothly in this area; facets do not time out.

Screenshots

No response

Resonite Version Number

Latest

What Platforms does this occur on?

Windows

What headset if any do you use?

Index

Log Files

ALFRED - 2024.3.12.1169 - 2024-03-23 10_36_26.log

Additional Context

There's a PR here: https://github.com/Yellow-Dog-Man/FrooxEngine/pull/625

It makes the log message not spam the whole stack trace, as it's not that useful.
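
For illustration, a minimal sketch of that idea, assuming the goal is a single summary line per failed attempt (the method and parameter names here are placeholders, not the actual FrooxEngine code):

```csharp
// Hypothetical sketch only: summarise a failed/retried request in one line
// instead of printing the full stack trace, which adds little for a timeout.
using System;

static class RequestLogging
{
    // 'log' stands in for whatever logging call the engine actually uses.
    public static void LogRequestFailure(Action<string> log, string url,
        int remainingRetries, TimeSpan elapsed, Exception ex)
    {
        log($"Exception running GET request to {url}: " +
            $"{ex.GetType().Name}: {ex.Message}. " +
            $"Remaining retries: {remainingRetries}. Elapsed: {elapsed.TotalSeconds:F2}s");
    }
}
```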

Reporters

Me

ProbablePrime commented 5 months ago

Our default timeout for HTTP requests is 30 seconds, so the logs make sense as to why the tasks are being cancelled, but I'd love to know why these requests take so long sometimes.

Is it a server issue? Is it rate limiting?

etc.
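
For context, here's a minimal standalone sketch (not Resonite's actual client code) of how an HttpClient timeout surfaces as the TaskCanceledException seen in the logs; the 30-second value mirrors the default mentioned above:

```csharp
using System;
using System.Net.Http;
using System.Threading.Tasks;

class TimeoutDemo
{
    static async Task Main()
    {
        // Assumed setup: a client configured with the 30-second default timeout.
        using (var client = new HttpClient { Timeout = TimeSpan.FromSeconds(30) })
        {
            try
            {
                // Any request that takes longer than the timeout is cancelled...
                var response = await client.GetAsync(
                    "https://api.resonite.com/groups/G-Resonite/records/root/Inventory/Facets/Templates/OnlineStatus");
                Console.WriteLine(response.StatusCode);
            }
            catch (TaskCanceledException ex)
            {
                // ...and on this runtime the timeout is reported as a cancellation,
                // hence "A task was canceled." in the log entries above.
                Console.WriteLine($"Request timed out: {ex.Message}");
            }
        }
    }
}
```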

ProbablePrime commented 5 months ago

Here's what it looks like when this happens: image

ProbablePrime commented 5 months ago

I added one of the facets to our status page internals; this was to isolate the client, and because it took a minute to add.

image

This does show spikes; I wonder if they'll occur on all of the facets?

shiftyscales commented 5 months ago

When you say forever, do you mean 'a long time' or literally 'forever', @ProbablePrime? I've had my dash take a while to load, and usually when it does, it seems to correlate with the local home's progress in being loaded. This makes me think it might be a thread/resource starvation issue, as Frooxius indicated in comments on #732.

If the dash freezes while loading and other heavy stuff is happening - that's likely due to thread/resource starvation. However this should mean it will recover eventually, once the things are processed.

I have noticed similar unresponsiveness while joining sessions, so I suspect that in this case it's just that something is causing high load/resource starvation, which causes it to take longer than normal to load.

ProbablePrime commented 5 months ago

Thank you for asking for some more specificity. I took a look through the logs to get an idea.

Given a single facet (https://api.resonite.com/groups/G-Resonite/records/root/Inventory/Facets/Templates/CustomNameplateVisibility),

we can find 3 occurrences of it timing out:

That's roughly 400 seconds for a single facet, and it doesn't cover the eventual successful request, which could itself take up to 30 seconds. So at worst around 430 seconds, or roughly 7 minutes, to load the CustomNameplateVisibility facet.

If we multiply that by the facets available on the dash (let's say roughly ten), and account for some of them requesting in parallel, we get around 5-10 minutes when this occurs.

Searching through the log directory for my Steam install, there are 140 files; based on the earlier search (see image), 24 of them show these errors, which is roughly 17%.

So this happens on roughly 17% of launches on my local install. My 140 files go back to January 2024.
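
As a back-of-the-envelope check of those figures (illustrative assumptions only; the real retry behaviour lives in SkyFrost.Base.ApiClient):

```csharp
using System;

class DashLoadEstimate
{
    static void Main()
    {
        double worstCasePerFacet = 430; // seconds for one facet, per the estimate above
        int facetsOnDash = 10;          // rough count of facets on the dash

        // If the facets mostly request in parallel, the wall-clock time is
        // dominated by the slowest facet; a fully serial load would be far worse.
        Console.WriteLine($"Parallel: ~{worstCasePerFacet / 60:F0} min");
        Console.WriteLine($"Serial:   ~{worstCasePerFacet * facetsOnDash / 60:F0} min");
    }
}
```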

ProbablePrime commented 5 months ago

Some files provided by Raidriar: a video and a log file of this occurring: https://i.probableprime.co.uk/240324-PdXIzgJQK0RfjMXj17dS.mp4 (download it; GitHub can't embed this and neither can Chrome). Thanks, Raid, for using cutting-edge (awesome) video codecs.

THE7800_-2024.3.12.1169-_2024-03-24_19_29_52.log

shiftyscales commented 5 months ago

Interesting. I don't think I've ever had it take that long, so I don't believe I've ever experienced this issue.

ProbablePrime commented 5 months ago

image

Response time graph over 1 week.

shiftyscales commented 4 months ago

@ProbablePrime - I seem to have inadvertently found a possible cause for this issue.

While I was loading Resonite, I accidentally pulled out my ethernet cable. Launching Resonite without an internet connection seemingly never finishes logging the user in, and as a consequence, several tabs are broken/empty.

We're probably going to need to do a pass on things at some point to ensure that Resonite can be used properly, even when offline.

https://github.com/Yellow-Dog-Man/Resonite-Issues/assets/54213390/6ec2075b-01e6-4a73-aa2a-4a62d4588e80

Frooxius commented 4 months ago

That's a separate issue. The facets and screens are loaded from online templates.

If you have no connection, they just won't load, because there's no way to retrieve them.

The problem here is why they take a while when there is a connection.

charlie-sans commented 2 months ago

Shouldn't Resonite come with precached items like the dash and its related items?
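
For illustration, a rough sketch of what that could look like (hypothetical, not how Resonite actually loads facet templates): serve a locally cached copy first so the dash never blocks on the network, and refresh from the cloud in the background.

```csharp
// Hypothetical cache-first template loader; names and paths are placeholders.
using System;
using System.IO;
using System.Net.Http;
using System.Threading.Tasks;

class FacetTemplateLoader
{
    private static readonly HttpClient Client = new HttpClient
    {
        Timeout = TimeSpan.FromSeconds(30)
    };

    public static async Task<string> LoadTemplateAsync(string url, string cachePath)
    {
        // Use the cached copy immediately if one exists, so the dash never
        // waits on a slow or unavailable network.
        if (File.Exists(cachePath))
        {
            // Kick off a background refresh; failures just keep the old copy.
            _ = RefreshCacheAsync(url, cachePath);
            return await File.ReadAllTextAsync(cachePath);
        }

        // No cached copy yet: fetch from the cloud and store it for next time.
        var data = await Client.GetStringAsync(url);
        await File.WriteAllTextAsync(cachePath, data);
        return data;
    }

    private static async Task RefreshCacheAsync(string url, string cachePath)
    {
        try
        {
            var data = await Client.GetStringAsync(url);
            await File.WriteAllTextAsync(cachePath, data);
        }
        catch (Exception)
        {
            // Offline or slow: keep using the stale cached template.
        }
    }
}
```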

TisFoolish commented 2 months ago

Ever since the change that made it so Res skips loading the default facets and instead only loads the account's facets, I've not had this issue.

shiftyscales commented 2 months ago

Have you observed this issue occurring of late for yourself, @ProbablePrime?

I've still never encountered this one myself, aside from the conditions shown above where my internet momentarily disconnects, which I misconstrued as being this issue.

ProbablePrime commented 1 month ago

Lately I've been working with customized dash templates, alternative loading schemes for the dash, etc., so I haven't had a chance to see if it is still occurring.

If it is, I have some ideas based on my work to fix it, so stand by for now.

jae1911 commented 1 month ago

> Lately I've been working with customized dash templates, alternative loading schemes for the dash, etc., so I haven't had a chance to see if it is still occurring.

I guess this would also help with #2456?

ProbablePrime commented 1 month ago

Potentially, I'll drop some comments over there.