adamrehn / pixel-streaming-linux

Issue tracker repository for Pixel Streaming for Linux
https://adamrehn.com/articles/pixel-streaming-in-linux-containers/

More effective use of NVENC for Pixel Streaming. #45

Closed: lukehb closed this issue 3 years ago

lukehb commented 3 years ago

There is some speculation from our side and from community members that the current use of NVENC for Pixel Streaming may not be configured for ideal usage. The purpose of this issue is to discuss and propose alternative usages and configurations of NVENC for Pixel Streaming that may provide either more configurability or better defaults.

lukehb commented 3 years ago

I will address some comments that @sasmaster wrote in https://github.com/adamrehn/pixel-streaming-linux/issues/31#issuecomment-825734835

I was working on integrating the NVENC functionality for video file export (rendering the Sequencer to MP4). The first thing I did was to check Epic's NVENC implementation inside the AVEncoder module. It doesn't look good.

The NVENC impl inside AVEncoder is mostly written to cater to the WebRTC use-case, which is quite different from typical offline video encoding. For example, you mentioned rendering to MP4; as you have found, the defaults they have chosen in AVEncoder are pretty terrible for that, because they are trying to encode as fast as possible for a real-time streaming use case. That all said, I'm certainly not saying that the configurations Epic have chosen are ideal (hence, this issue), haha.

I will also mention that my first impression when I read AVEncoder was, from the sounds of it, similar to yours: I thought the implementation didn't look like it was picking ideal settings. However, on further inspection many of the configuration options they have picked are dictated by WebRTC, such as choosing the baseline profile (because this is the only H.264 profile guaranteed to be supported by devices using WebRTC). Additionally, many of the hacky things they are doing in AVEncoder are likely a side effect of trying to work around issues they were having with WebRTC M70, which as discussed in #31 shouldn't be a problem anymore, so when we move to a new WebRTC version another benefit is that we can remove a lot of those hacks from the encoder.
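
To make that concrete, this is roughly what the WebRTC constraints force onto the H.264 configuration if you drive the raw NVENC API directly (just a sketch of the idea, not the actual AVEncoder code):

// Sketch only: H.264 settings the WebRTC use-case more or less dictates,
// expressed against the raw NVENC API rather than Epic's AVEncoder wrapper.
#include "nvEncodeAPI.h"

void ConfigureForWebRTC(NV_ENC_CONFIG& EncodeConfig)
{
  // Baseline is the only H.264 profile every WebRTC endpoint is guaranteed to decode,
  // so anything fancier (Main/High) risks failing negotiation on some clients.
  EncodeConfig.profileGUID = NV_ENC_H264_PROFILE_BASELINE_GUID;

  // IPP GOP pattern, i.e. no B-frames: they add latency and baseline doesn't allow them anyway.
  EncodeConfig.frameIntervalP = 1;

  NV_ENC_CONFIG_H264& H264Config = EncodeConfig.encodeCodecConfig.h264Config;

  // Repeat SPS/PPS with every IDR so a client that joins mid-stream can start decoding.
  H264Config.repeatSPSPPS = 1;

  // Baseline profile only permits CAVLC entropy coding.
  H264Config.entropyCodingMode = NV_ENC_H264_ENTROPY_CODING_MODE_CAVLC;
}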

I tried its output and the HD quality was very average. They use CONST QP by default.

From memory, when they actually do Pixel Streaming they configure it to use CBR by default, which, as per NVIDIA's recommendations for game streaming, is the correct rate control mode to use.
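
In raw NVENC terms that boils down to something like the following (again only a sketch, not the AVEncoder code; the single-frame VBV sizing is NVIDIA's usual low-latency advice):

// Sketch only: CBR rate control for a real-time stream, roughly per NVIDIA's
// game-streaming guidance, expressed against the raw NVENC API.
#include "nvEncodeAPI.h"
#include <cstdint>

void ConfigureRateControl(NV_ENC_CONFIG& EncodeConfig, uint32_t TargetBitrateBps, uint32_t Framerate)
{
  NV_ENC_RC_PARAMS& RcParams = EncodeConfig.rcParams;

  RcParams.rateControlMode = NV_ENC_PARAMS_RC_CBR;
  RcParams.averageBitRate  = TargetBitrateBps;
  RcParams.maxBitRate      = TargetBitrateBps;

  // Keep the VBV buffer around one frame's worth of data so the encoder can't
  // build up a latency-inducing backlog when the scene gets busy.
  RcParams.vbvBufferSize   = TargetBitrateBps / Framerate;
  RcParams.vbvInitialDelay = RcParams.vbvBufferSize;
}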

I also found this:

if (Config.bFillerDataHack)
{
  // Bitrate generated by NVENC is not very stable when the scene doesn't have a lot of movement
  // outputPictureTimingSEI enables the filling data when using CBR so that the bitrate generated
  // is much closer to the requested one and bandwidth estimation algorithms can work better.
  // Otherwise in a static scene it can send 50kbps when configuring 300kbps and it will never ramp up.
  if (NvEncConfig.rcParams.averageBitRate < 5000000)
  {
    NvEncConfig.encodeCodecConfig.h264Config.outputPictureTimingSEI = 1;
    NvEncConfig.rcParams.enableMinQP = 0;
  }
  else
  {
    NvEncConfig.encodeCodecConfig.h264Config.outputPictureTimingSEI = 0;
    NvEncConfig.rcParams.enableMinQP = 1;
    NvEncConfig.rcParams.minQP = { 20, 20, 20 };
  }
}

I don't understand what Config.bFillerDataHack is, but you can't use bitrate settings in Const QP mode, as those get ignored.

Pretty sure this hack was an attempt to overcome some bitrate degradation problems in WebRTC M70. I think the idea is that WebRTC would request a certain bitrate, and if the scene had little inter-frame movement the encoder would undershoot that request by a wide margin; WebRTC would then refuse to raise the quality because it assumed the encoding machine could not handle a higher bitrate, so users would sometimes get stuck at a low bitrate simply because NVENC was being efficient. The workaround they tried was to transmit outputPictureTimingSEI, which I believe they thought would pad out the transmitted frames. It's a pretty dirty hack and I'm not sure it is necessary if we do a WebRTC upgrade.
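
For what it's worth, I believe newer versions of the Video Codec SDK expose a dedicated filler-data flag that does what this hack was reaching for, so if we upgrade the SDK the workaround could probably be replaced with something along these lines (a sketch; I haven't double-checked which SDK version introduced the flag):

// Sketch only: letting the encoder pad the bitstream itself instead of abusing the
// picture timing SEI. Assumes a Video Codec SDK recent enough to have this flag.
#include "nvEncodeAPI.h"

void EnableBitratePadding(NV_ENC_CONFIG& EncodeConfig)
{
  // Only meaningful under CBR: the encoder inserts filler NAL units so the produced
  // bitrate tracks the requested one, which keeps WebRTC's bandwidth estimator from
  // concluding that the connection can't sustain a higher bitrate.
  EncodeConfig.encodeCodecConfig.h264Config.enableFillerDataInsertion = 1;
}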

I don't know what the requirements are from the WebRTC side, but VBR would provide much better results. CBR can be fine as well, but will result in wasted bitrate. The I-frame interval is usually what matters for video streaming.

Based on NVIDIA's recommendations, I think CBR will be better for a low-latency use case like Pixel Streaming, even if VBR gives better quality (or a lower bitrate). I also believe CBR encode times must be faster.

I ended up implementing a separate version of the NVENC encoder using the latest Video SDK (it also provides a better presets API and support for 4K and HEVC), since their encoder setup doesn't look good to me.

If you think it produced better results for Pixel Streaming and you are able to share it, then please do :)
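
For anyone else following along, my understanding of the newer presets API is roughly the following (a sketch only; which preset and tuning values actually suit Pixel Streaming would need testing):

// Sketch only: the SDK 10+ style of choosing a preset, where a P1..P7 speed/quality
// preset is combined with a tuning info value instead of the old LOW_LATENCY_* GUIDs.
#include "nvEncodeAPI.h"

void ConfigureNewStylePreset(NV_ENC_INITIALIZE_PARAMS& InitParams)
{
  InitParams.presetGUID = NV_ENC_PRESET_P4_GUID;
  InitParams.tuningInfo = NV_ENC_TUNING_INFO_ULTRA_LOW_LATENCY;

  // The matching NV_ENC_CONFIG is then fetched with nvEncGetEncodePresetConfigEx()
  // and tweaked (rate control, GOP length, etc.) before nvEncInitializeEncoder().
}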

I have been working with the NVIDIA Video SDK a lot outside Unreal, so if you need any help there let me know. I am also interested in seeing Pixel Streaming working well; I need it on Linux ;)

We would like to get as close as possible to the best default NVENC settings for Pixel Streaming that will work for the most people out of the box. WebRTC imposes some limitations, but I think a good approach might be for us to create some new presets for Pixel Streaming encoding, such as: