Closed arazadaz closed 1 year ago
I'm wondering if there's a workaround/patch that can be made for this. It's not a huge problem, but it is a limitation.
Ideally you would be able to just import a linear dmabuf on any GPU, but this doesn't seem to work on NVIDIA?
If, for example, the game is rendering on the dgpu (sink provider) output, you would need to launch OBS on the same GPU to capture it. However, this would mean that screen/window capture (PipeWire) would instead be black, as it follows a similar limitation: OBS needs to be launched on the primary (sink source) GPU in order for screen/window capture (PipeWire) to work.
Sure, but it shouldn't be much of an issue. Unless you want to show both screen AND game capture in one scene.
It would be ideal if we could capture the game with OBS running on the primary (sink source) GPU regardless of which GPU the game is running on. That would let us use screen/window capture (PipeWire) and game capture together at all times. There aren't many scenarios I can think of where that's needed, but it would give us the option.
Sure, but I have no multi-GPU machine (let alone any NVIDIA hardware). Maybe you could make an extra GL context in OBS for the GPU that the game is running on, import the game buffer there, and then copy it to the OBS GPU. But for that, again, you would need to export a dmabuf and import it on the other GPU, and this doesn't seem to work? The last option would be to do a CPU copy, but that would most likely kill performance.
Outside of that, it would also save a headache for the handful of people wondering why game capture isn't working and just shows a black screen, since by default OBS launches on the primary (sink source) GPU while their game is likely on the dgpu (sink provider). It might be worth noting this on the main page.
There's a FAQ for this = run on the same gpu in multi-gpu setups.
> There's a FAQ for this = run on the same gpu in multi-gpu setups.
Ah. I must have been blind then. I somehow glossed over it as I saw the other troubleshooting notes, just not that one.
> Sure, but it shouldn't be much of an issue. Unless you want to show both screen AND game capture in one scene.
I was thinking more of people who overlay a window on top of their game, for whatever reason using window capture.
> Maybe you could make an extra GL context in OBS for the GPU that the game is running, import the game buffer there and then copy it to the OBS gpu. But for this, again, you would need to export dmabuf and import it on the other gpu - and this doesn't seem to work? / Ideally you would be able to just import linear dmabuf on any GPU
How could I do this?
> Sure, but I have no multi-gpu machine (let alone any nvidia hardware).
I can provide you with all the information you need since I do have a multi-gpu laptop. If it seems reasonable.
I want to say it's halfway there already if you consider the anomaly I pointed out in the original post. I added it in an edit & you replied super fast, so you might have missed it.
> I can provide you with all the information you need since I do have a multi-gpu laptop. If it seems reasonable.
I have some ideas to try, so it would be great if you are willing to test them.
Please try this
```diff
diff --git a/src/vklayer.c b/src/vklayer.c
index 53a9488..cbc5f71 100644
--- a/src/vklayer.c
+++ b/src/vklayer.c
@@ -754,6 +754,9 @@ static inline bool vk_shtex_init_vulkan_tex(struct vk_data *data,
 	bool allocated = false;
 
 	uint32_t mem_req_bits = VK_MEMORY_PROPERTY_DEVICE_LOCAL_BIT;
+	if (!same_device) {
+		mem_req_bits = VK_MEMORY_PROPERTY_HOST_VISIBLE_BIT;
+	}
 	if (map_host) {
 		mem_req_bits = VK_MEMORY_PROPERTY_HOST_VISIBLE_BIT | VK_MEMORY_PROPERTY_HOST_COHERENT_BIT | VK_MEMORY_PROPERTY_HOST_CACHED_BIT;
 	}
```
or this
```diff
diff --git a/src/vklayer.c b/src/vklayer.c
index 53a9488..49c87a4 100644
--- a/src/vklayer.c
+++ b/src/vklayer.c
@@ -754,6 +754,9 @@ static inline bool vk_shtex_init_vulkan_tex(struct vk_data *data,
 	bool allocated = false;
 
 	uint32_t mem_req_bits = VK_MEMORY_PROPERTY_DEVICE_LOCAL_BIT;
+	if (!same_device) {
+		mem_req_bits = VK_MEMORY_PROPERTY_HOST_VISIBLE_BIT | VK_MEMORY_PROPERTY_HOST_COHERENT_BIT;
+	}
 	if (map_host) {
 		mem_req_bits = VK_MEMORY_PROPERTY_HOST_VISIBLE_BIT | VK_MEMORY_PROPERTY_HOST_COHERENT_BIT | VK_MEMORY_PROPERTY_HOST_CACHED_BIT;
 	}
```
on top of master and post OBS output.
I tried the first one & it worked, so I skipped trying the second one.
It gives a new error, but it still works:
error: Cannot create EGLImage: Arguments are inconsistent (for example, a valid context requires buffers not supplied by a valid surface).
warning: [linux-vkcapture] Asking client to create texture no modifiers
Here's the full obs output:

```
debug: Found portal inhibitor
debug: Attempted path: share/obs/obs-studio/locale/en-US.ini
debug: Attempted path: /usr/share/obs/obs-studio/locale/en-US.ini
debug: Attempted path: share/obs/obs-studio/locale.ini
debug: Attempted path: /usr/share/obs/obs-studio/locale.ini
debug: Attempted path: share/obs/obs-studio/themes/Yami.qss
debug: Attempted path: /usr/share/obs/obs-studio/themes/Yami.qss
info: Platform: Wayland
info: CPU Name: Intel(R) Core(TM) i7-9750H CPU @ 2.60GHz
info: CPU Speed: 4120.398MHz
info: Physical Cores: 6, Logical Cores: 12
info: Physical Memory: 31825MB Total, 28499MB Free
info: Kernel Version: Linux 6.2.8-arch1-1
info: Distribution: EndeavourOS Unknown
info: Session Type: wayland
info: Qt Version: 6.4.3 (runtime), 6.4.2 (compiled)
info: Portable mode: false
qt.core.qmetaobject.connectslotsbyname: QMetaObject::connectSlotsByName: No matching signal for on_tbar_position_valueChanged(int)
info: OBS 29.0.2-2 (linux)
info: ---------------------------------
info: ---------------------------------
info: audio settings reset:
	samples per sec: 48000
	speakers: 2
	max buffering: 960 milliseconds
	buffering type: dynamically increasing
info: ---------------------------------
info: Initializing OpenGL...
info: Using EGL/Wayland
info: Initialized EGL 1.5
info: Loading up OpenGL on adapter Intel Mesa Intel(R) UHD Graphics 630 (CFL GT2)
info: OpenGL loaded successfully, version 4.6 (Core Profile) Mesa 23.0.0, shading language 4.60
info: ---------------------------------
info: video settings reset:
	base resolution: 2560x1440
	output resolution: 1920x1080
	downscale filter: Bicubic
	fps: 30/1
	format: NV12
	YUV mode: Rec. 709/Partial
info: NV12 texture support not available
info: P010 texture support not available
info: Audio monitoring device:
	name: Default
	id: default
info: ---------------------------------
warning: Failed to load 'en-US' text for module: 'decklink-captions.so'
warning: Failed to load 'en-US' text for module: 'decklink-output-ui.so'
libDeckLinkAPI.so: cannot open shared object file: No such file or directory
warning: A DeckLink iterator could not be created. The DeckLink drivers may not be installed
warning: Failed to initialize module 'decklink.so'
error: os_dlopen(/usr//lib/obs-plugins/frontend-tools.so->/usr//lib/obs-plugins/frontend-tools.so): libluajit-5.1.so.2: cannot open shared object file: No such file or directory
error: os_dlopen(/usr//lib/obs-plugins/frontend-tools.so->/usr//lib/obs-plugins/frontend-tools.so): libluajit-5.1.so.2: cannot open shared object file: No such file or directory
warning: Module '/usr//lib/obs-plugins/frontend-tools.so' not loaded
info: [pipewire] Available captures:
info: [pipewire] - Desktop capture
info: [pipewire] - Window capture
warning: v4l2loopback not installed, virtual camera disabled
info: [linux-vkcapture] plugin loaded successfully (version 1.3.0)
info: NVENC supported
error: VAAPI: Failed to initialize display in vaapi_device_h264_supported
error: VAAPI: Failed to initialize display in vaapi_device_h264_supported
info: FFmpeg VAAPI H264 encoding not supported
info: ---------------------------------
info: Loaded Modules:
info: 	text-freetype2.so
info: 	rtmp-services.so
info: 	obs-x264.so
info: 	obs-vst.so
info: 	obs-transitions.so
info: 	obs-outputs.so
info: 	obs-libfdk.so
info: 	obs-filters.so
info: 	obs-ffmpeg.so
info: 	linux-vkcapture.so
info: 	linux-v4l2.so
info: 	linux-pulseaudio.so
info: 	linux-pipewire.so
info: 	linux-jack.so
info: 	linux-capture.so
info: 	linux-alsa.so
info: 	image-source.so
info: 	decklink-output-ui.so
info: 	decklink-captions.so
info: ---------------------------------
info: ==== Startup complete ===============================================
info: All scene data cleared
info: ------------------------------------------------
info: pulse-input: Server name: 'PulseAudio (on PipeWire 0.3.67) 15.0.0'
info: pulse-input: Audio format: s32le, 48000 Hz, 2 channels
info: pulse-input: Started recording from 'alsa_output.pci-0000_00_1f.3.analog-stereo.monitor' (default)
info: [Loaded global audio device]: 'Desktop Audio'
info: pulse-input: Server name: 'PulseAudio (on PipeWire 0.3.67) 15.0.0'
info: pulse-input: Audio format: s16le, 48000 Hz, 1 channels
info: pulse-input: Started recording from 'alsa_input.usb-BLUE_MICROPHONE_Blue_Snowball_797_2018_11_06_69254-00.mono-fallback'
info: [Loaded global audio device]: 'Mic/Aux'
info: - filter: 'Noise Gate' (noise_gate_filter)
error: ext_screencopy_manager_v1 not available
info: Switched to scene 'Scene'
info: ------------------------------------------------
info: Loaded scenes:
info: - scene 'Scene':
info: - source: 'Game Capture' (vkcapture-source)
info: ------------------------------------------------
info: adding 42 milliseconds of audio buffering, total audio buffering is now 42 milliseconds (source: Desktop Audio)
info: [linux-vkcapture] Creating texture from dmabuf 2560x1440 modifier:216172782120099860
info: [linux-vkcapture] [0] fd:47 stride:10240 offset:0
error: Cannot create EGLImage: Arguments are inconsistent (for example, a valid context requires buffers not supplied by a valid surface).
warning: [linux-vkcapture] Asking client to create texture no modifiers
info: [linux-vkcapture] Creating texture from dmabuf 2560x1440 modifier:72057594037927935
info: [linux-vkcapture] [0] fd:48 stride:10240 offset:0
```
Thanks, so it works correctly now.
In PRIME setups, where the system uses two graphics cards, an igpu (sink source) and a dgpu (sink provider), vkcapture will only capture the game if OBS is launched on the GPU the game is running on; otherwise it only shows a black screen.
I want to also say that in my experience, this behavior isn't always consistent. While trying to capture Diablo 4, which was running on my dgpu, it did actually show in OBS even when OBS ran on my igpu, but OBS would crash with an out-of-memory error when I tried to record/stream from the igpu. I think it tried allocating memory from the igpu, which would explain that, but the point is that this anomaly suggests a proper workaround/patch might be possible. Other games that run on the dgpu only show a black screen in my testing, though.