Closed: shi-yan closed this issue 3 years ago.
Yes, the 4.23 version of Pixel Streaming for Linux supports offscreen rendering using OpenGL. No additional flags are required, the Unreal Engine will detect the offscreen environment automatically.
When the 4.25 version of Pixel Streaming for Linux is released it will support offscreen rendering using Vulkan. For this version you will need to use the -RenderOffscreen flag.
If by "RTX rendering" you mean raytracing then it is worth noting that the Unreal Engine does not currently support raytracing under Linux, regardless of whether OpenGL or Vulkan is used. This limitation is not specific to Pixel Streaming but is a general limitation of the Unreal Engine itself, and must be addressed upstream by Epic Games.
Thank you very much for the reply. Yes, I indeed want to try raytracing. Thank you for the information!
Unfortunately, I ran into a segfault when I enable the OpenGL context.
Assertion failed: PlatformOpenGLCurrentContext(OpenGLRHI->PlatformDevice) == CONTEXT_Shared [File:/home/shiy/UnrealEngine-4.23.1-pixelstreaming/Engine/Source/Runtime/OpenGLDrv/Private/OpenGLViewport.cpp] [Line: 241]
[2020.12.21-03.49.52:814][ 0]LogCore: Error: appError called: Assertion failed: PlatformOpenGLCurrentContext(OpenGLRHI->PlatformDevice) == CONTEXT_Shared [File:/home/shiy/UnrealEngine-4.23.1-pixelstreaming/Engine/Source/Runtime/OpenGLDrv/Private/OpenGLViewport.cpp] [Line: 241]
Signal 11 caught.
Malloc Size=65538 LargeMemoryPoolOffset=65554
CommonUnixCrashHandler: Signal=11
Malloc Size=65535 LargeMemoryPoolOffset=131119
Malloc Size=89744 LargeMemoryPoolOffset=220880
[2020.12.21-03.49.52:828][ 0]LogCore: === Critical error: ===
Unhandled Exception: SIGSEGV: invalid attempt to write memory at address 0x0000000000000003
[2020.12.21-03.49.52:828][ 0]LogCore: Assertion failed: PlatformOpenGLCurrentContext(OpenGLRHI->PlatformDevice) == CONTEXT_Shared [File:/home/shiy/UnrealEngine-4.23.1-pixelstreaming/Engine/Source/Runtime/OpenGLDrv/Private/OpenGLViewport.cpp] [Line: 241]
0x000000000269abe2 teststreaming!FGenericPlatformMisc::RaiseException(unsigned int) [/home/shiy/UnrealEngine-4.23.1-pixelstreaming/Engine/Source/Runtime/Core/Private/GenericPlatform/GenericPlatformMisc.cpp:413]
0x00000000029905c6 teststreaming!FUnixErrorOutputDevice::Serialize(char16_t const*, ELogVerbosity::Type, FName const&) [/home/shiy/UnrealEngine-4.23.1-pixelstreaming/Engine/Source/Runtime/Core/Private/Unix/UnixErrorOutputDevice.cpp:70]
0x000000000280172b teststreaming!FOutputDevice::LogfImpl(char16_t const*, ...) [/home/shiy/UnrealEngine-4.23.1-pixelstreaming/Engine/Source/Runtime/Core/Private/Misc/OutputDevice.cpp:71]
0x00000000027b8354 teststreaming!AssertFailedImplV(char const*, char const*, int, char16_t const*, __va_list_tag*) [/home/shiy/UnrealEngine-4.23.1-pixelstreaming/Engine/Source/Runtime/Core/Private/Misc/AssertionMacros.cpp:101]
0x00000000027b81e4 teststreaming!FDebug::CheckVerifyFailedImpl(char const*, char const*, int, char16_t const*, ...) [/home/shiy/UnrealEngine-4.23.1-pixelstreaming/Engine/Source/Runtime/Core/Private/Misc/AssertionMacros.cpp:442]
0x0000000003d79545 teststreaming!FOpenGLViewport::FOpenGLViewport(FOpenGLDynamicRHI*, void*, unsigned int, unsigned int, bool, EPixelFormat) [/home/shiy/UnrealEngine-4.23.1-pixelstreaming/Engine/Source/Runtime/OpenGLDrv/Private/OpenGLViewport.cpp:241]
0x0000000003d783d1 teststreaming!FOpenGLDynamicRHI::RHICreateViewport(void*, unsigned int, unsigned int, bool, EPixelFormat) [/home/shiy/UnrealEngine-4.23.1-pixelstreaming/Engine/Source/Runtime/OpenGLDrv/Private/OpenGLViewport.cpp:71]
0x000000000414ce9a teststreaming!FSlateRHIRenderer::CreateViewport(TSharedRef<SWindow, (ESPMode)0>) [/home/shiy/UnrealEngine-4.23.1-pixelstreaming/Engine/Source/Runtime/SlateRHIRenderer/Private/SlateRHIRenderer.cpp:334]
0x000000000414dad9 teststreaming!FSlateRHIRenderer::UpdateFullscreenState(TSharedRef<SWindow, (ESPMode)0>, unsigned int, unsigned int) [/home/shiy/UnrealEngine-4.23.1-pixelstreaming/Engine/Source/Runtime/SlateRHIRenderer/Private/SlateRHIRenderer.cpp:429]
0x0000000002d3fc60 teststreaming!SWindow::SetWindowMode(EWindowMode::Type) [/home/shiy/UnrealEngine-4.23.1-pixelstreaming/Engine/Source/Runtime/SlateCore/Private/Widgets/SWindow.cpp:2010]
0x000000000494522a teststreaming!UGameEngine::CreateGameWindow() [/home/shiy/UnrealEngine-4.23.1-pixelstreaming/Engine/Source/Runtime/Engine/Private/GameEngine.cpp:571]
0x000000000446ced4 teststreaming!FDefaultGameMoviePlayer::Initialize(FSlateRenderer&, TSharedPtr<SWindow, (ESPMode)0>) [/home/shiy/UnrealEngine-4.23.1-pixelstreaming/Engine/Source/Runtime/MoviePlayer/Private/DefaultGameMoviePlayer.cpp:187]
0x00000000021e4495 teststreaming!FEngineLoop::PreInit(char16_t const*) [/home/shiy/UnrealEngine-4.23.1-pixelstreaming/Engine/Source/Runtime/Launch/Private/LaunchEngineLoop.cpp:2654]
0x00000000021f409c teststreaming!GuardedMain(char16_t const*) [/home/shiy/UnrealEngine-4.23.1-pixelstreaming/Engine/Source/Runtime/Launch/Private/Launch.cpp:131]
0x00000000057b1ffd teststreaming!CommonUnixMain(int, char**, int (*)(char16_t const*), void (*)()) [/home/shiy/UnrealEngine-4.23.1-pixelstreaming/Engine/Source/Runtime/Unix/UnixCommonStartup/Private/UnixCommonStartup.cpp:239]
0x00007f1a7ba56bf7 libc.so.6!__libc_start_main(+0xe6)
0x00000000021df029 teststreaming!_start()
[2020.12.21-03.49.52:850][ 0]LogExit: Executing StaticShutdownAfterError
[2020.12.21-03.49.52:872][ 0]LogCore: Warning: Unable to statfs('/home/shiy/Documents/Unreal Projects/teststreaming/Saved/Crashes/crashinfo-teststreaming-pid-23211-080AEB74C09E4E1CB14A961FE22D3E93'): errno=2 (No such file or directory)
Malloc Size=69295 LargeMemoryPoolOffset=290191
Engine crash handling finished; re-raising signal 11 for the default handler. Good bye.
Segmentation fault (core dumped)
I'm on Ubuntu 18 with a 1080 Ti and the 450 driver. Not sure if you have seen something similar?
@shi-yan can you please post the complete log output so I can better diagnose the cause of the segfault?
Thank you.
I debugged further; the issue seems to be here:
```cpp
EOpenGLCurrentContext PlatformOpenGLCurrentContext( FPlatformOpenGLDevice* Device )
{
    SDL_HGLContext hGLContext = Linux_GetCurrentContext();

    if (LIKELY(hGLContext == Device->RenderingContext.hGLContext)) // most common case
    {
        return CONTEXT_Rendering;
    }
    else if (hGLContext == Device->SharedContext.hGLContext)
    {
        return CONTEXT_Shared;
    }
    else if (hGLContext)
    {
        return CONTEXT_Other;
    }

    printf("SDL_Init failed: %s\n", SDL_GetError());
    return CONTEXT_Invalid;
}
```
A call to SDL_GL_GetCurrentContext returned zero (hGLContext), and therefore the function above returned CONTEXT_Invalid. This is strange to me, because the SDL context seems to have been created properly (judging from the log).
The UE4 editor runs fine for me; it's just the packaged game that has the problem.
According to SDL's docs, SDL_GetError should return more information about the failure, but it didn't.
The full log is attached.
This happened when initializing the movie player. I guess I need to call SDL_GL_MakeCurrent somewhere?
Very odd. Could you please try forcing offscreen rendering by prefixing your command with `DISPLAY=''` and let me know if you still see an error:

```shell
# This assumes the command to run the packaged project is "./teststreaming.sh"
DISPLAY='' ./teststreaming.sh
```
Hi @adamrehn, thank you so much. I think I know what's wrong now. I'm very new to Unreal: I ran the binary under Binaries/Linux directly, and after seeing your message I realized there is a shell script. I tried it, and it runs fine!
Thank you very much!
I'm very sorry to bother you with another question. I know your commit enabled the Linux build target for WebRTCProxy, but I couldn't find the built binary; I only found the exe version. Did it cross-build a Windows version, or is that the Linux binary with the wrong extension?
I'm not familiar with UE4's build system, so I'm not sure how to build for the Linux target.
I figured it out; I just needed to run `make WebRTCProxy`.
Awesome, I'm glad you were able to figure it out!
Thank you very much for the help. Yes, yesterday I was able to stream the third-person shooter template game.
I noticed that with the default settings it's easy to get video artifacts. My resolution is 2560x1440 and it streams at 60fps with a bitrate of 65535. I changed it to offscreen rendering at 1024x768 and hardcoded 30fps in WebRTCProxy, and the quality has improved. I wanted to increase the bitrate too, but I noticed that in the code the bitrate is transferred to UE4 as a 16-bit integer, so 65535 is already the maximum.
I also don't understand the purpose of WebRTCProxy. I'd think the video stream could come directly out of UE4, so I don't know why a separate process is needed. I guess I will have to check its code.
I'm really looking forward to the new release with Vulkan! Thank you.
> I changed it to offscreen rendering with 1024x768 and hardcoded 30fps in WebRTCProxy
Note that it's not necessary to hard-code any of the parameters related to Pixel Streaming, since these can be configured dynamically. In newer versions of Pixel Streaming you can simply use command-line parameters, whilst in the 4.23 version you'll need to use console commands inside your project to set the available configuration values. I'm not necessarily recommending this over hard-coding if hard-coding is simpler for you (particularly for the 4.23 version), but I wanted that information to be visible to anyone else who reads this issue in the future.
> I also don't understand the purpose of WebRTCProxy, I think the video stream can come out directly from UE4, I don't know why we need a separate process. I guess I will have to check its code.
You are absolutely correct, and in Unreal Engine 4.24 the WebRTCProxy code was actually merged into the Pixel Streaming plugin itself, eliminating the external process. This was one of the numerous architectural changes that Epic Games made to Pixel Streaming which made it impossible to simply port our Linux support over from 4.23, resulting in the long process of starting over for the upcoming 4.25 version.
On Linux, there is a bug with Lightmass lightmap baking, so I had to upgrade to 4.26.
I looked at the 4.26 code and, indeed, it's quite different. There is now a video encoder module shared between Pixel Streaming and game recording. I want to spend some time looking at how to implement that part.
NvEncoder doesn't seem to support VkImage directly; I guess I will need to convert a VkImage to a CUDA array or something.
@shi-yan you're correct that converting Vulkan device memory to a CUarray is necessary in order to feed the frames to NVENC. This is precisely the approach my colleague @ImmortalEmperor took when implementing the 4.25 version of Pixel Streaming for Linux: https://github.com/ImmortalEmperor/UnrealEngine/tree/4.25-pixelstreaming
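For anyone landing here later, the general shape of that Vulkan-to-CUDA path can be outlined as follows. This is pseudocode only (no error handling, and the exact structures vary by driver and SDK version); it assumes the Vulkan memory backing the image was allocated as exportable via VK_KHR_external_memory:

```
// 1. Allocate the VkImage's memory with VkExportMemoryAllocateInfo
//    (handleTypes = VK_EXTERNAL_MEMORY_HANDLE_TYPE_OPAQUE_FD_BIT).
// 2. Export a POSIX file descriptor for that memory:
//        vkGetMemoryFdKHR(device, &fdInfo, &fd);
// 3. Import the fd into CUDA as external memory:
//        cudaExternalMemoryHandleDesc desc = {};
//        desc.type      = cudaExternalMemoryHandleTypeOpaqueFd;
//        desc.handle.fd = fd;
//        desc.size      = allocationSize;
//        cudaImportExternalMemory(&extMem, &desc);
// 4. Map the external memory as a mipmapped array and grab level 0:
//        cudaExternalMemoryGetMappedMipmappedArray(&mipArray, extMem, &arrayDesc);
//        cudaGetMipmappedArrayLevel(&cuArray, mipArray, 0);
// 5. Register the resulting CUDA array with NVENC as an input resource
//    (NV_ENC_INPUT_RESOURCE_TYPE_CUDAARRAY) and encode from it.
```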
Thank you for this awesome library!
Can I run this on an Amazon EC2 instance without a monitor attached?
I want to try some RTX rendering, but my local Linux machine doesn't have a decent GPU.