Open Dziggify opened 2 months ago
I'm in exactly the same boat, though new to Moonlight so can't really weigh in.
Streaming via ethernet to an Xbox Series X running Moonlight, getting around 14-15.5ms on average. Feels playable but am quite surprised when comparing my results to what others indicate they get on other devices.
Simply put, the avg rendering time is NOT the decode time but the time end-to-end from frame to presentation, and is bound to the input fps (hence you get 16ms on 60fps and 8ms on 120fps)
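A quick back-of-the-envelope sketch (plain Python, not Moonlight code) of why that stat tracks the stream fps: the per-frame budget is just 1000 ms divided by the frame rate, so an end-to-end measurement bound to frame presentation will hover near that number.

```python
# Frame pacing arithmetic: the end-to-end frame-to-presentation time is
# capped by the stream's frame interval, not by decode speed alone.
def frame_interval_ms(fps: float) -> float:
    """Time budget per frame in milliseconds for a given stream fps."""
    return 1000.0 / fps

for fps in (60, 120):
    print(f"{fps} fps -> {frame_interval_ms(fps):.2f} ms per frame")
# 60 fps -> 16.67 ms per frame
# 120 fps -> 8.33 ms per frame
```

That matches the numbers above: ~16ms at 60fps and ~8ms at 120fps, regardless of how fast the decoder itself is.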
Thank you, that's nicely explained! Lines up with what I was told on a Reddit thread I made yesterday asking about this.
If you can please forgive the likely stupid question - if my TV is 60Hz (my monitor's 144Hz, I think), is it still worthwhile setting things to 120fps? From what I could see there didn't seem to be a difference, but does that 8ms equate to genuinely faster reaction times?
On an Xbox Series S over wired ethernet, I got about 15ms. (When it first connected, the stats showed 100ms-ish or even more, but after settling in it stabilized at around 15-16ms.) It's not bad, but I feel I should be getting better latency on an Xbox with a wired connection. I'm getting about 8ms on my tablet over 5GHz WiFi, for example. I tested different resolutions but it doesn't make a difference.