amir20 / dozzle

Realtime log viewer for Docker containers.
https://dozzle.dev/
MIT License
5.7k stars 287 forks

multiple concurrent /logs request #3113

Closed FrancYescO closed 1 month ago

FrancYescO commented 2 months ago

🔍 Check for existing issues

How is Dozzle deployed?

Standalone Deployment

📦 Dozzle version

8.0.5

🐛 Describe the bug / provide steps to reproduce it

when trying to load older logs, multiple concurrent /logs requests are fired

this will lead to:

💻 Environment

-

📸 If applicable, add screenshots to help explain your bug

(screenshot)

📜 If applicable, attach your Dozzle logs. You may need to enable debug mode. See https://dozzle.dev/guide/debugging.

No response

amir20 commented 2 months ago

Why are you making my life difficult 😂

FrancYescO commented 2 months ago

Brace yourself, because I have another bug/performance improvement ready to report, related to what you explained here https://github.com/amir20/dozzle/issues/2993#issuecomment-2137661140 and the magic "300" number... I don't know if something recently changed in browsers, but if this loadMore request gets big (>100 kB?), the WebUI freezes indefinitely for some of my containers. (I was looking for a way to optimize this, and then this bug came out.)

amir20 commented 2 months ago

I haven't made a lot of changes in the UI in the last month. Mostly focused on agents and swarm mode. I remember there was another issue similar to this one that I thought I fixed. I'll have to first see if I can reproduce it, and then try to remember whether the fix did something else.

amir20 commented 2 months ago

Oh yeah, the comment you linked is the one I was thinking of. I am trying to investigate some other bugs right now, though. Will keep this open in case someone else can help.

FrancYescO commented 2 months ago

To help you reproduce: on pretty much any container (that has at least a little older log history to load), Chrome's throttling option can be useful.

(screenshot)

amir20 commented 2 months ago

Hmm, I think https://github.com/amir20/dozzle/issues/2993 only fixed the case where the container changed. Right? I am able to reproduce this, so that's half the problem.

FrancYescO commented 2 months ago

Yes, I can confirm that pending requests are cancelled when switching containers, but not when staying in the same one.

amir20 commented 2 months ago

Seems like the solution is to have a flag set while loading and abort if the flag is true. I did that at #3118. Can you test amir20/dozzle:pr-3118?

I didn't spend a whole lot of time on this, so it might need a little more regression testing.
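The "flag set while loading" guard described above can be sketched as follows. This is a hypothetical illustration, not Dozzle's actual code (which lives in assets/composable/eventStreams.ts); the names `fetchingInProgress` and `loadOlderLogs` are borrowed from the discussion:

```typescript
// Guard against concurrent /logs requests: skip a new fetch while one
// is in flight, and reset the flag when the request settles.
let fetchingInProgress = false;

async function loadOlderLogs(
  fetchFn: (signal: AbortSignal) => Promise<string[]>,
): Promise<string[] | null> {
  if (fetchingInProgress) {
    // A previous request is still running; bail out instead of
    // stacking another concurrent /logs call.
    return null;
  }
  fetchingInProgress = true;
  const controller = new AbortController();
  try {
    return await fetchFn(controller.signal);
  } finally {
    // Reset even on error or abort, otherwise the guard would block
    // all future loads until a page refresh.
    fetchingInProgress = false;
  }
}
```

The `finally` reset matters: if the flag were only cleared on success, any failed request would permanently wedge the loader.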

FrancYescO commented 2 months ago

Hmm, it seems I'm able to reproduce this with pr-3118 too... looking at the code, I'm not sure why... maybe the fetchingInProgress flag is reinitialized somehow.

FrancYescO commented 2 months ago

By the way, I see some improvements that could be made in the parsing. Unfortunately I'd have to find a way to fire up the dev environment to test, but the most evident one is that the API returns a JSON array that isn't valid JSON ({}\n{}\n instead of [{},{}]). That forces you to use the .text() function, treat everything as text, and call JSON.parse yourself; this could be avoided entirely by having the API return valid JSON.

amir20 commented 2 months ago

> Hmm, it seems I'm able to reproduce this with pr-3118 too... looking at the code, I'm not sure why... maybe the fetchingInProgress flag is reinitialized somehow.

That's weird. I wasn't able to reproduce it. Can you maybe record a video?

If I return a huge JSON payload, it might actually be slower. But when loading incrementally I only parse one line at a time, so I am using the same code.
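The newline-delimited format under discussion ({}\n{}\n rather than [{},{}]) is typically handled by splitting on newlines and parsing each line independently. A minimal sketch, assuming a generic event shape (not Dozzle's actual schema):

```typescript
// Parse a newline-delimited JSON (NDJSON) body: one JSON object per line.
interface LogEvent {
  [key: string]: unknown;
}

function parseNdjson(body: string): LogEvent[] {
  const events: LogEvent[] = [];
  for (const line of body.split("\n")) {
    if (line.trim() === "") continue; // tolerate a trailing newline
    events.push(JSON.parse(line) as LogEvent);
  }
  return events;
}
```

The upside of this format is that it also works when streaming: each line is parseable as soon as it arrives, whereas a single JSON array can only be parsed once the whole payload is buffered.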

FrancYescO commented 2 months ago

Don't ask me why, but it seems to be working now (pretty sure it was caused by the browser cache).

But I still see something weird: if you try to replicate the bug, requests will not fire, but you lose the loading status in the WebUI (the infinity symbol turns into "0%" and the spinner on top of the messages disappears, even though the request is still ongoing in the background).

> If I return a huge JSON payload, it might actually be slower. But when loading incrementally I only parse one line at a time, so I am using the same code.

This is surely where I hit the performance issue, as my loadMore responses are often pretty huge.

FrancYescO commented 2 months ago

Just to give some sizes for when I say "huge":

(screenshot)

At least in this case it just fails instead of freezing the whole WebUI trying to parse :D :

(screenshot)

(But the UI ends up equally broken: whenever I get this stack size error, I lose the ability to switch containers until a refresh.)
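A "Maximum call stack size exceeded" error on a large payload is the classic symptom of spreading a very large array into a variadic call (e.g. `messages.push(...newLines)`): every element becomes a function argument, and JS engines cap the argument count. Whether Dozzle hits exactly this path is an assumption; the usual workaround is to append in bounded chunks:

```typescript
// Append items to target in fixed-size chunks so the spread never
// exceeds the engine's argument-count limit.
function pushChunked<T>(target: T[], items: T[], chunkSize = 10_000): void {
  for (let i = 0; i < items.length; i += chunkSize) {
    target.push(...items.slice(i, i + chunkSize));
  }
}
```

Crucially, this keeps the existing array instance (and any reactivity bound to it) instead of replacing it, while still tolerating hundreds of thousands of log lines.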

amir20 commented 1 month ago

Sorry been away for a few days.

> But I still see something weird: if you try to replicate the bug, requests will not fire, but you lose the loading status in the WebUI (the infinity symbol turns into "0%" and the spinner on top of the messages disappears, even though the request is still ongoing in the background).

I think I know what is causing this. Will try to fix it.

> Just to give some sizes for when I say "huge":

WOW! That's a lot of data. I have tested with dozens of logs per second. The logic for fetching logs for a time period is at https://github.com/amir20/dozzle/blob/565bfa302ad72f162fc92ae28796d17f856f07a4/assets/composable/eventStreams.ts#L192-L195

There might be a bug.

Would you mind opening a performance bug and providing as many details as possible? E.g. how many logs per second, what kind of logs, etc.

I think this is something that should be fixed, but it's not related to this bug.

amir20 commented 1 month ago

OK check master tomorrow and let me know if some of these bugs have been fixed. @FrancYescO