Open horaciobp opened 5 years ago
Just to answer my own question: the Shield decode time shows as 1ms. It feels way better when playing FPS games (you can feel the aim overshoot with the Fire TV 4K). The Fire TV looks OK for slower games and playable for FPS (you can adapt, but the lag feeling is always there). The big plus for the Fire TV is the support for Dolby Vision, and it may be just my impression, but I feel it looks way better than HDR10. Another important point is that if you switch to H.264, the Fire TV decode time is in the 30ms ballpark as well (I don't know if it's using software decode or if this is the hardware decode time), so there is no way to reduce that, at least for now. I'm sticking with the Shield.
Were you using an Ethernet cable with the Fire Stick 4K?
I have 2 different Fire TVs, a 4K and a normal Stick (both using WiFi AC), and both have the same issue. I think there's something on the Fire TVs that adds too much delay for some reason. I tried on a PC and even an Android phone via WiFi with no issues, but the Fire TV is a pain for me too. Hope they can find the problem with them.
Yes, I was using an Ethernet cable. In the end I returned the Fire Stick and bought the Nvidia Shield.
This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs.
I can confirm this is still happening. I own a dozen Fire TV devices. I tested the latest 4K stick, both Wifi and ethernet using the USB adapter. I also tested the original Fire TV box, the 2nd gen box, the 3rd gen pendant, and a 2nd gen stick. All of them show ~23ms decode time.
Same server, same Moonlight settings, same games: I get under 5ms with an Nvidia Shield.
It’s a hardware limitation. The Fire TV decoders are slower than modern Exynos and Snapdragon devices.
If that's true, then Amazon really screwed themselves as far as Luna is concerned. I'm a little skeptical this is the cause, given that the different Fire TV devices have different SoCs with very different performance specs, yet they all show a 23ms delay in both H.264 and H.265 (when available).
There's also a YouTube video of a guy setting up Moonlight on a Fire Stick, and it has a 23ms delay; then there's a jump cut in the video and it's magically 6ms.
So I tried just about every resolution. 480p/3Mbps has the exact same delay (23ms) as 1440p/40Mbps. I just don't see how this can be a physical hardware limitation given that fact. Is it possible the Fire OS audio post-processing is causing this?
The different Fire TV hardware does indeed vary. The early Qualcomm-based Fire TVs have different latency characteristics than the later Rockchip and Amlogic decoders.
HEVC does perform better than H.264 on the pendant Fire TV, which is why Moonlight uses it by default.
Remember that there’s more to hardware than just the physical silicon. The drivers also play a big role. Some drivers support low latency mode (based on profile detection or undocumented OMX/C2 parameters), while others don’t. Some drivers will not produce an output frame until they get some number of additional input frames.
Moonlight tries to patch up the SPS to hit the fast path on drivers/hardware where such optimizations are known (and requests low latency decode using the Android 11 API), but without documentation it’s not possible to do across the board.
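For illustration, here's roughly what requesting low-latency decode via the Android 11 API looks like (a minimal sketch, not Moonlight's actual code; the `surface` variable is a placeholder for an output `Surface` you already hold):

```java
import android.media.MediaCodec;
import android.media.MediaFormat;
import android.os.Build;

// Sketch: configure a video decoder and request low-latency mode on
// Android 11+ via the official MediaFormat key. Older OS versions and
// vendor drivers may ignore the hint entirely.
MediaFormat format = MediaFormat.createVideoFormat(
        MediaFormat.MIMETYPE_VIDEO_HEVC, 1920, 1080);
if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.R) {
    // KEY_LOW_LATENCY ("low-latency") was added in API 30 (Android 11)
    format.setInteger(MediaFormat.KEY_LOW_LATENCY, 1);
}
MediaCodec decoder =
        MediaCodec.createDecoderByType(MediaFormat.MIMETYPE_VIDEO_HEVC);
decoder.configure(format, surface, null, 0);
```

Whether the hint actually does anything is entirely up to the vendor's driver, which is why some devices need the undocumented parameters mentioned above.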
Thanks. Yeah a driver limitation would certainly explain the behavior I've seen.
I'm still flabbergasted by that YouTube video, it's the only thing that gave me hope this could be resolved. I have these things all over the house and they now properly support Xbox/PS4 controllers. Bummer.
I am pretty sure Moonlight sticks to H.264 on my 4K Fire Stick from 2020, whatever option I choose. I am not sure why this happens, though. Can anyone try forcing H.265 and display the settings overlay during the stream? (My host PC runs a GTX 1650, so it definitely supports H.265.)
I also get decoding times around 25ms btw.
I have the pendant and I can confirm that H.265 no longer works with Moonlight (it worked initially; now it just returns a blank screen). I always assumed it was some app update.
GameStream HEVC does not decode properly on Amlogic SoCs anymore. The codec claims to accept the HEVC data without complaining, but it never produces an output frame (as you've seen with the black screen). I've reproduced the issue on Fire TV 4K and the ADT-2, so it's not specific to Amazon devices.
I haven't tracked down the cause yet because there are a ton of variables (could be GFE changes, GeForce driver changes, Amlogic BSP/driver changes, Moonlight changes, or a combination of any of these). In the meantime, I have set HEVC to off by default on Amlogic devices to fall back to H.264 which works fine.
I did some basic regression testing with an old version of GFE and an old known working Moonlight build (the build where I first enabled HEVC on the Fire TV 3) and still received the black screen, so it doesn't appear as simple as a GFE change or Moonlight regression. It will require much more investigation.
I also tried on my Galaxy S21 and it was still doing H.264 (it shows AVC as the decoder). I have also seen somewhere that this was a specific bug for the GTX 1650, but I can't find it anymore. I will soon get an Nvidia Shield (2019 model), so I will be able to check whether H.265 works there with the exact same setup.
Yes, this is by design. There's a complex set of tradeoffs that Moonlight balances when making a codec selection.
If you have a Qualcomm SoC and you're streaming over a non-metered connection (unmetered WiFi, Ethernet) but not with HDR or over 4K, you will get H.264 because the Qualcomm H.264 codec supports reference picture invalidation which better conceals packet loss.
If you have a Qualcomm SoC and you're streaming over a metered connection (LTE) or using HDR or over 4K, you will get HEVC because the enhanced efficiency of HEVC will let you stream at lower bitrate with similar image quality. HEVC is also required to stream HDR or resolutions above 4K.
Likewise, there are some HEVC codecs that work totally fine with GameStream HEVC (Exynos, some Rockchip and MediaTek, Qualcomm, Nvidia) while others do not. Assuming you have one of these SoCs and the above rules do not apply, Moonlight will prefer HEVC if available.
All of this automatic logic can be overridden by changing the H.265 options in Moonlight. The logic also depends on your host GPU supporting HEVC encoding obviously.
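The rules above can be sketched roughly like this (my simplification of this comment, not the actual Moonlight source; the real logic also checks decoder capabilities and user overrides):

```java
// Rough sketch of the codec-selection tradeoffs described above.
public class CodecChooser {
    public static String choose(boolean qualcommSoc, boolean metered,
                                boolean hdr, boolean above4K,
                                boolean hevcDecoderKnownGood) {
        // HDR and resolutions above 4K require HEVC outright.
        if (hdr || above4K) return "HEVC";
        if (qualcommSoc) {
            // On unmetered links, Qualcomm's H.264 reference picture
            // invalidation conceals packet loss better, so prefer H.264;
            // on metered links, HEVC's efficiency wins.
            return metered ? "HEVC" : "H.264";
        }
        // Otherwise prefer HEVC only on SoCs known to handle it well.
        return hevcDecoderKnownGood ? "HEVC" : "H.264";
    }
}
```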
I just tried on my LTE connection on my Exynos S21, and I still see OMX.Exynos.avc.dec as the decoder. Also, when trying to connect from my Ubuntu laptop using the Moonlight Ubuntu app and forcing it to use HEVC, I see a small notification before it connects saying that my GPU does not support HEVC and that a 900-series Maxwell or newer is required. And I use a GTX 1650 Turing GPU... As I mentioned before, it seems to be an Nvidia problem.
Here is the other issue https://github.com/moonlight-stream/moonlight-qt/issues/275
Has anyone tested the new Fire Stick 4k MAX - https://developer.amazon.com/docs/fire-tv/device-specifications-fire-tv-streaming-media-player.html
Curious if this is likely to still have the same decoding latency issue as the original 4k Fire Stick.
It's possibly worse. I just bought a new Fire Stick 4K Max and I got an average of 335ms of decoding latency at 1080p@60fps, which is completely abysmal.
I am trying to figure out if I did something wrong.
It should be unrelated, but I see that even though my PC correctly connects to the 5GHz network on channel 44, the Fire Stick keeps connecting to 2.4GHz on channel 10. The PC, router, and Fire Stick are all WiFi 6 compatible. Yes, I could split the SSID, but this should not happen. Then again, the average network latency is just 9ms, so it should not really matter.
My disappointment is immeasurable and my day is ruined.
@emanuelecasadio my friend, I was fishing for info; I was about to pull the trigger on a Fire Stick 4K Max to stream games because the Black Friday deal was good. I definitely won't, thanks to your feedback.
@emanuelecasadio found a solution? I'm astonished that it's that bad, since the 4K MAX is supposed to have an upgraded GPU.
Not yet, but I am starting to think that my device has some issues. I experience strange glitches even while playing Disney+ content, for instance sudden loud white noise and yellowish square boxes.
I have re-run some Moonlight tests while connected to the 5GHz network. Now I get 110ms average decoding latency, BUT nearly 70% of packets are dropped at 10Mbps, which is a total disaster.
I have -44dB attenuation and a clear antenna-to-antenna line of sight, so it really makes no sense.
I wanted to make sure it was not a router issue, so I tried using my 2017 MacBook Pro connected to the TV instead, and that works very well. So something is wrong on the Fire TV side.
I'm testing with a Fire TV 4K Max at the moment, and my decoding latency is at ~56ms with HEVC and at ~23ms with AVC.
This is sad. I get 24ms with my old Fire TV 4K Stick :(
I have comparable performance to @SircasticFox: streaming at 1080p 60fps, I get 21-22ms with AVC and about 45-50ms with HEVC. This is sad; I got the Fire Stick for casual gaming too, but it feels unplayable right now. I can only play slow-paced games such as Death Stranding.
4K Max here; I get 38ms with HEVC and 21ms with H.264.
Occasionally I get 6ms, though, but only for a minute; then it ramps up again. So something is kicking in in the background. It's Amazon, so maybe even something evil like Alexa directly spying on the sound output and introducing some lag there. It would be nice if someone with LineageOS on their Fire Stick could test it.
Fire TV 4K Max user here. I also get high latency, around 40ms. In the device's display settings there is a low-latency game mode switch, and the Amazon specs site https://developer.amazon.com/docs/fire-tv/device-specifications-fire-tv-streaming-media-player.html also mentions "Plus switch to low-lag cloud gaming with Auto Low Latency Mode support."
I don't know if that switch also impacts video decoding latency or just some HDMI output latency. The question would also be what API is used to activate it.
I was able to disassemble binaries in the Luna APK and finally figured out what magic they're using to get lower latency. It turns out there's an undocumented MediaTek-specific option you can pass in the MediaFormat object called "vdec-lowlatency". If you set this value when you're configuring the decoder, the MediaTek decoder switches into a higher performance mode that reduces latency significantly.
I have implemented this change in https://github.com/moonlight-stream/moonlight-android/commit/6f9021a5e6f002b661379d065f88485db1885564. In my testing on my Fire TV 4K Max, I'm seeing decode latency at 1080p hovering around 8ms now, even when streaming in HDR. It also improves performance on my Fire HD 8 tablet. There's a good chance most MediaTek-based Android devices will see a benefit from this change.
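The change boils down to passing the vendor key when configuring the decoder, roughly like this (a sketch, see the linked commit for the real implementation; `mimeType`, `width`, `height`, `videoDecoder`, and `renderSurface` are placeholders):

```java
import android.media.MediaFormat;

// Sketch: enable the undocumented MediaTek low-latency decoder mode.
MediaFormat format =
        MediaFormat.createVideoFormat(mimeType, width, height);
// "vdec-lowlatency" is the vendor-specific option found in the Luna
// APK; setting it to 1 switches the MediaTek decoder into its
// higher-performance, lower-latency mode. Drivers that don't
// recognize the key simply ignore it.
format.setInteger("vdec-lowlatency", 1);
videoDecoder.configure(format, renderSurface, null, 0);
```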
Here's a test build if you want to try and report your results: https://drive.google.com/file/d/12og9HoIkZ_bVLasQQBsRmhnA51pUd9Rw/view?usp=sharing
Wow, that’s a huge difference in the latency. Much better experience in gaming. I also get around 8 ms. Good work, Thank you
Great work @cgutman Curious if 8ms is the best you'll ever get with this decoder, or they've optimised it even further for their own service (Amazon Luna).
I'm seeing similar excellent performance with this test build. Thanks so much @cgutman!
After some testing on my Fire TV 3, I was able to determine that Amazon has made some custom Android modifications that make "vdec-lowlatency" work on Amlogic devices too. Not only does this option reduce latency to 8ms, it even fixes the issue with the Fire TV 3's HEVC decoder not outputting any frames. It really is a magic option!
Here's another build that should perform the same as the previous build on MediaTek devices but includes the latency improvements for Amlogic devices (like the Fire TV 3 or Fire TV Cube): https://drive.google.com/file/d/1Li4o7-CWudm2PEFs3ZxsVpk6Yvh1wM1P/view?usp=sharing
8ms is probably the best we'll get out of it. That's incredibly good for an Android device. It roughly matches the performance of current flagship Exynos, Qualcomm, and Google Tensor SoCs. Only an Nvidia Shield can really top it.
I get 3-4ms average video decoding latency using the frame pacing option set to Balanced with an FPS limit, so even better. The feeling is pretty comparable to a PC directly connected over HDMI (compared by switching the TV's input sources between the Fire TV and the PC).
Thanks a lot! I get around 5ms decoding latency and about 3ms network latency with the test build at 1080p HEVC. It's still easy to feel the lag, but to play a bit of Mario Kart on the TV it's enough, I guess.
Edit: I also get 5ms decode time with H.264 on this build.
Awesome, thanks a lot for this fix. This makes the Fire TV 4K Max really the most capable and cheapest Moonlight client. With this low latency, shooters become very playable. Why spend a lot of money on a Shield when the Fire TV 4K Max has better hardware for a fraction of the price? I'm playing at 4K60 with 5ms latency.
This also enables "Instant game response" on my LG TV which is supposed to further reduce latency.
Has anyone tested this new build on a CCWGTV (Chromecast with Google TV)?
Looks like these changes made it into the latest release of Moonlight on the Play Store, I gave it a try and got roughly 10ms decode time on a CCWGTV. Felt like maybe 60ms total latency, probably due to the cheap TV I'm using. Definitely much better than it was before.
With CCWGTV the normal latency on version 10.3 is around 10ms with AVC/H.264 and 20ms with HEVC 👌
Yep, CCWGTV should be working well out of the box on v10.4. We now default to H.264 on CCWGTV at 1080p or below because it's lower latency. 4K streams will use HEVC because the Amlogic H.264 decoder isn't capable of handling anything beyond 1080p.
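That default can be sketched as a simple resolution check (my paraphrase of this comment, not the actual Moonlight source):

```java
// Sketch of the CCWGTV codec default described above: H.264 at or
// below 1080p for lower latency, HEVC above that because the Amlogic
// H.264 decoder can't handle anything beyond 1080p.
public class CcwgtvCodecDefault {
    public static String defaultCodec(int width, int height) {
        return (width <= 1920 && height <= 1080) ? "H.264" : "HEVC";
    }
}
```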
Did anyone notice any improvement on the Xiaomi Mi Box S 4K? As far as I can see it is an Amlogic S905L device, but I didn't notice any change. :(
Any difference between the Fire TV 4K Max and the Fire TV Cube 4K 2nd gen in Moonlight performance (at 4K)?
Since recently it's not streaming at all anymore; I always get a decoder crash on my 4K Max. Is it still working for you guys? Could a Fire Stick firmware update have caused it, maybe?
Edit: it seems I needed to uninstall this custom version and then reinstall from the Amazon store. The fix is already merged.
The Fire Cube 2nd gen works great! I'm playing from Germany on a gaming PC in the UK using GameStream + Moonlight, with the Cube connected via WiFi, and no latency is noticeable (at least not for Call of Duty, Serious Sam, and Mario Kart).
Does the decoder in the Firestick 4K Max have the ability to go beyond 1080p while maintaining the lower latency or are both it and CCWGTV limited in this way? Thanks!
I believe the 4K Max does stay low latency above 1080p, but it's been a while since I've tried it.
Hello. I have 2 Fire Sticks; one of them is a Max. The latency problem still remains. Even if it sometimes goes down to 9-10ms, that is still too much. I tested Moonlight on an old Windows 10 tablet that can't even run Moonlight with HEVC, and the latency was nonexistent. So why is there still so much latency on Fire TV Sticks? I've tried your 2 modified Moonlight builds, but neither of them can connect to the host PC.
Probably a bad WLAN spot; mine is 4.5ms.
I'm getting ~16ms average decode time on the Fire Stick 4K; resolution seems to have no impact. I'm running the current latest version of Moonlight, v12.0, downloaded from the Amazon app store. 🤔 I wonder why the difference; it seems like @cgutman's changes got merged and released, as far as I can tell?
Yes, it's merged. This thread should be closed.
Dear all,
First of all, I would like to thank you guys for the amazing job; I've been enjoying the iOS and Windows clients a lot. I actually have two questions, one regarding the Fire Stick and the second regarding the Windows client.
With the Fire Stick I've been seeing decode latency of 30ms (1440p/60Hz/HDR on), which adds a lot of input lag. Is this normal, or is there a way to reduce it? If not, does anyone know what the decode latency is on the Shield? (I would then return the Fire Stick and get a Shield.) On my laptop the decode latency is 0.5ms, but there is no HDR. Any plans to add that to the Windows client?
Again thanks for the amazing software.