HaveAGitGat / Tdarr

Tdarr - Distributed transcode automation using FFmpeg/HandBrake + Audio/Video library analytics + video health checking (Windows, macOS, Linux & Docker)

Intel 11th gen iHD driver not working? (Docker) #452

Closed Boosh1 closed 2 years ago

Boosh1 commented 3 years ago

EDIT - Workaround solution found in the comments below.

Just want to have this highlighted for anyone else who stumbles upon this while it's still open.

Describe the bug I've got an Intel i5-11400 in my Unraid system (UHD Graphics 730) and I am running Tdarr as a Docker. I've tried to enable hardware encoding with a Tdarr Node but have had no luck so far. By the looks of it, the current drivers inside the docker image don't support it, and I've been unable to manually update them to anything usable.

I've looked at issue #375 and tried the suggestion there, but that did not help (everything claims to be up to date if I run "apt-get update" inside the container). I have also checked Discord, but while there seemed to be acknowledgement of issues with 10th/11th gen, I've not seen an issue entered here for 11th gen specifically.

I use other docker containers which CAN use HW transcoding fine (i.e. Jellyfin and Plex servers), so I do not believe the issue is with my setup, and I believe I am correctly passing the device through to Tdarr.

For some extra info on the docker setup, I've added these extra variables to try and force use of the iHD driver.

Variables:
LIBVA_DRIVERS_PATH: /usr/lib/x86_64-linux-gnu/dri
LIBVA_DRIVER_NAME: iHD

Device:
/dev/dri:/dev/dri
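For reference, passing these through a plain docker run (rather than Unraid's template UI) would look roughly like this; a sketch, with the image name and any other node settings as assumptions:

docker run -d \
  -e LIBVA_DRIVERS_PATH=/usr/lib/x86_64-linux-gnu/dri \
  -e LIBVA_DRIVER_NAME=iHD \
  --device /dev/dri:/dev/dri \
  haveagitgat/tdarr_node
# (server address, node name, and media mounts omitted for brevity)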

Expected behavior Tdarr should use and support up-to-date Intel drivers so that newer processors work.

Screenshots: vainfo output (attached).

Please provide the following information:
- Worker error: Tdarr_Node_WorkerError.txt
- Tdarr Docker vainfo error: Tdarr_Node_VainfoError.txt
- For comparison, a Jellyfin Docker where HW transcoding works and vainfo correctly grabs the driver: Jellyfin_VainfoSuccess.txt

Additional context (edit): Just to keep this part up to date. Following advice below, I tried adding a different repo to the docker to try and get a new driver. Unfortunately, that still doesn't work. The main thing I've noticed during all this is that the VA-API version in the docker never changes despite installing new drivers and updating libva2, vainfo, etc. I use linuxserver/jellyfin (where HW transcoding works for me), and in their container the VA-API version is 1.11.0. However, no matter what I do or what driver I install in Tdarr, the VA-API version never changes from 1.8.0. I suspect this is why vainfo can't identify the driver, but I don't have enough experience with Docker to know.

EvilTactician commented 3 years ago

Hey Boosh1.

For what it's worth, I used to be a backer of Tdarr and was quite excited about the project, but I was using 10th gen Intel and never got it to work, despite some attempts on Discord to help out and get things sorted.

If you do find a solution, I'd love to hear it, as I'm really eager to get Tdarr operational. But I have no interest in either running a dedicated GPU or using our gaming PCs as nodes, as the whole point of this is meant to be a solution which runs in the background with minimal fuss.

FredHaa commented 3 years ago

Have you tried installing intel-media-va-driver-non-free in the container? It should go in the tdarr_node container, I guess. I recently fixed HW transcoding in my Jellyfin instance by installing this.

I use a 10th generation processor.

Boosh1 commented 3 years ago

Yup, pretty sure I tried that already. Just tried again, and this is the result:

root@21f42c01039e:/# apt-get install intel-media-va-driver-non-free
Reading package lists... Done
Building dependency tree       
Reading state information... Done
The following packages will be REMOVED:
  intel-media-va-driver
The following NEW packages will be installed:
  intel-media-va-driver-non-free
0 upgraded, 1 newly installed, 1 to remove and 6 not upgraded.
Need to get 5084 kB of archives.
After this operation, 27.7 MB of additional disk space will be used.
Do you want to continue? [Y/n] y
Get:1 http://archive.ubuntu.com/ubuntu focal/multiverse amd64 intel-media-va-driver-non-free amd64 20.1.1+ds1-1build1 [5084 kB]
Fetched 5084 kB in 1s (6875 kB/s)                         
dpkg: intel-media-va-driver:amd64: dependency problems, but removing anyway as you requested:
 va-driver-all:amd64 depends on intel-media-va-driver | intel-media-va-driver-non-free; however:
  Package intel-media-va-driver:amd64 is to be removed.
  Package intel-media-va-driver-non-free is not installed.

(Reading database ... 17863 files and directories currently installed.)
Removing intel-media-va-driver:amd64 (20.1.1+dfsg1-1) ...
Selecting previously unselected package intel-media-va-driver-non-free:amd64.
(Reading database ... 17859 files and directories currently installed.)
Preparing to unpack .../intel-media-va-driver-non-free_20.1.1+ds1-1build1_amd64.deb ...
Unpacking intel-media-va-driver-non-free:amd64 (20.1.1+ds1-1build1) ...
Setting up intel-media-va-driver-non-free:amd64 (20.1.1+ds1-1build1) ...
root@21f42c01039e:/# vainfo
error: XDG_RUNTIME_DIR not set in the environment.
error: can't connect to X server!
libva info: VA-API version 1.8.0
libva info: User environment variable requested driver 'iHD'
libva info: Trying to open /usr/lib/x86_64-linux-gnu/dri/iHD_drv_video.so
libva info: Found init function __vaDriverInit_1_7
libva error: /usr/lib/x86_64-linux-gnu/dri/iHD_drv_video.so init failed
libva info: va_openDriver() returns 1
vaInitialize failed with error code 1 (operation failed),exit
root@21f42c01039e:/# 

And after running a VAAPI plugin inside Tdarr:

Metadata:
BPS-eng : 99
DURATION-eng : 00:20:56.823000000
NUMBER_OF_FRAMES-eng: 415
NUMBER_OF_BYTES-eng: 15647
_STATISTICS_WRITING_APP-eng: mkvmerge v55.0.0 ('Waiting For Space') 64-bit
_STATISTICS_WRITING_DATE_UTC-eng: 2021-03-22 17:16:49
_STATISTICS_TAGS-eng: BPS DURATION NUMBER_OF_FRAMES NUMBER_OF_BYTES
[AVHWDeviceContext @ 0x558a7eedb300] libva: /usr/lib/x86_64-linux-gnu/dri/iHD_drv_video.so init failed
[AVHWDeviceContext @ 0x558a7eedb300] Failed to initialise VAAPI connection: 1 (operation failed).
Device creation failed: -5.
[h264 @ 0x558a7ee24f00] No device available for decoder: device type vaapi needed for codec h264.
Stream mapping:
Stream #0:0 -> #0:0 (h264 (native) -> hevc (hevc_vaapi))
Stream #0:1 -> #0:1 (copy)
Stream #0:2 -> #0:2 (copy)
Device setup failed for decoder on input stream #0:0 : Input/output error

I don't think this method gets the drivers needed for 11th gen. Possibly an issue with the Ubuntu version inside the docker, since the other docker containers I have seem to be able to update to a higher VA-API version as well.

FredHaa commented 3 years ago

Weird, my container didn't have any intel-media-va-driver installed, neither free nor non-free. I could just install it without hiccups and then use a VAAPI plugin. Not QSV though, as the ffmpeg binary in the container is not compiled to support it.

Boosh1 commented 3 years ago

Can't recall if my Tdarr_Node had any driver to start with, to be honest, and I'm not super familiar with Docker's inner workings; I think I mainly just tried installing different Intel drivers to see if anything would work. On that point, it would be nice if the docker came with a generic driver on install, plus a variable you could set which would set it up with the right driver depending on CPU generation etc. Getting the included ffmpeg compiled with QSV support would be great as well, but that's a separate matter; I just want to get VAAPI working to start with.

AndreaPro commented 3 years ago

I have the same issue on an Intel 10th gen processor; I am using QSV with success on a Jellyfin docker container. I think it is an issue related to the package version of the Intel drivers: on Jellyfin it's 21.xx, while here, even with a manual update, it's 20.xx.

FredHaa commented 3 years ago

The official Jellyfin container is based on Debian, which paradoxically has a newer version of the VA drivers in its repo than Ubuntu. If you have a look at the linuxserver-jellyfin dockerfile, which is based on Ubuntu 20.04 just like Tdarr, you can see that they add the external Intel repo:

curl -s https://repositories.intel.com/graphics/intel-graphics.key | apt-key add -

echo 'deb [arch=amd64] https://repositories.intel.com/graphics/ubuntu focal main' > /etc/apt/sources.list.d/intel-graphics.list

After this you can apt update, and the 21.xx version of the Intel drivers should be available.

Boosh1 commented 3 years ago

Gave that a shot to see what would happen. That does seem to install a more up-to-date driver for me; it claims I've installed "21.2.1+i571~u20.04". However, vainfo still can't run it, though I believe vainfo is not on the right version anyway. The install claims it's fully up to date, but it's on a clearly different version compared to Jellyfin, which is odd.

So this might not be the solution for 11th gen, but might work for 10th gen?

Running this as my update method, btw:

curl -s https://repositories.intel.com/graphics/intel-graphics.key | apt-key add - && \
echo 'deb [arch=amd64] https://repositories.intel.com/graphics/ubuntu focal main' > /etc/apt/sources.list.d/intel-graphics.list && \
apt-get update && \
apt-get install -y --no-install-recommends \
intel-media-va-driver-non-free \
mesa-va-drivers && \
vainfo

Then this is what it reports back:

root@c1a324bf4fe6:/# apt-get update && \
>  apt-get install -y --no-install-recommends \
> intel-media-va-driver-non-free \
> vainfo \
> mesa-va-drivers && \
> vainfo
Hit:1 http://ppa.launchpad.net/kisak/kisak-mesa/ubuntu focal InRelease
Hit:2 https://deb.nodesource.com/node_14.x focal InRelease                                                                 
Hit:3 http://archive.ubuntu.com/ubuntu focal InRelease                                                                     
Hit:4 http://archive.ubuntu.com/ubuntu focal-updates InRelease      
Hit:5 http://archive.ubuntu.com/ubuntu focal-backports InRelease    
Hit:6 http://security.ubuntu.com/ubuntu focal-security InRelease    
Hit:7 https://repositories.intel.com/graphics/ubuntu focal InRelease
Reading package lists... Done
Reading package lists... Done
Building dependency tree       
Reading state information... Done
intel-media-va-driver-non-free is already the newest version (21.2.1+i571~u20.04).
vainfo is already the newest version (2.11.1+i571~u20.04).
mesa-va-drivers is already the newest version (21.1.2~kisak1~f).
0 upgraded, 0 newly installed, 0 to remove and 98 not upgraded.
error: XDG_RUNTIME_DIR not set in the environment.
error: can't connect to X server!
libva info: VA-API version 1.8.0
libva info: User environment variable requested driver 'iHD'
libva info: Trying to open /usr/lib/x86_64-linux-gnu/dri/iHD_drv_video.so
libva error: /usr/lib/x86_64-linux-gnu/dri/iHD_drv_video.so has no function __vaDriverInit_1_0
libva info: va_openDriver() returns -1
vaInitialize failed with error code -1 (unknown libva error),exit

AndreaPro commented 3 years ago

> Gave that a shot to see what would happen. That does seem to install a more up-to-date driver for me [...] So this might not be the solution for 11th gen, but might work for 10th gen?

Even in my case vainfo gives the same error. Curiously, if I run apt-get update and apt-get install intel-media-va-driver-non-free without adding the Jellyfin repo, vainfo doesn't give an error and recognizes the iHD driver. This is on an Intel 10th gen i3-10100.

Boosh1 commented 3 years ago

Thought it was worth just double-checking that on my end. So, to clarify, it looks like I get slightly different vainfo errors depending on the driver I try.

Standard driver install via the method above: no extra repos added, just running apt-get update and apt-get install intel-media-va-driver-non-free.

libva info: VA-API version 1.8.0
libva info: User environment variable requested driver 'iHD'
libva info: Trying to open /usr/lib/x86_64-linux-gnu/dri/iHD_drv_video.so
libva info: Found init function __vaDriverInit_1_7
libva error: /usr/lib/x86_64-linux-gnu/dri/iHD_drv_video.so init failed
libva info: va_openDriver() returns 1
vaInitialize failed with error code 1 (operation failed),exit

My understanding of the above is that it finds an init function but then fails, I assume because 11th gen is too new for this driver.

Then, with the more up-to-date driver from adding that repo:

libva info: VA-API version 1.8.0
libva info: User environment variable requested driver 'iHD'
libva info: Trying to open /usr/lib/x86_64-linux-gnu/dri/iHD_drv_video.so
libva error: /usr/lib/x86_64-linux-gnu/dri/iHD_drv_video.so has no function __vaDriverInit_1_0
libva info: va_openDriver() returns -1
vaInitialize failed with error code -1 (unknown libva error),exit

This one, though, seems to completely fail to find anything to init; possibly the newer driver needs a newer vainfo?

Edit - On thinking again, it might not be vainfo which is out of date; rather, libva claims the installed VA-API version is just 1.8.0, but we know there is a newer version inside linuxserver/jellyfin (1.11.0). Do they compile their own version, perhaps? Or maybe the driver we're grabbing is missing something.
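For anyone wanting to double-check the same thing, the packaging side can be inspected with standard apt/dpkg tools inside the container; a sketch, nothing Tdarr-specific:

apt-cache policy libva2              # installed vs. candidate version of the libva runtime
dpkg -L libva2 | grep 'libva.so'     # where the packaged libva.so actually lives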

DJScias commented 3 years ago

Hello there,

I'm running an Intel NUC with a 10th gen i7-10710U, which also uses the iHD driver. Having the same errors as you, I figured I'd add the environment variable LIBVA_DRIVER_NAME:iHD as well.

Unfortunately, even with that, vainfo was giving me trouble, so I went inside the container (I use Portainer, but just look for docker exec). I ran the following command:

ls -al /usr/lib/x86_64-linux-gnu/dri/

To my surprise, iHD_drv_video.so was missing from that list INSIDE the docker container, even though my Linux server had it sitting right there when I checked with the same command outside of the container.

So I went and ran the following INSIDE the docker container:

sudo apt install vainfo intel-media-va-driver-non-free -y
ls -al /usr/lib/x86_64-linux-gnu/dri/

And voila, it now showed up as well (the attached image is from the container console), and GPU encoding stopped failing with VAAPI device errors.

I hope the above turns out to be what you're also missing, OP, and that it fixes it. If not, I'm sorry; I suppose I just got lucky that this was the fix for my variant of this issue.

Boosh1 commented 3 years ago

Thanks for the info. I realise now I've never actually verified the driver file was there, but unfortunately it was, so still no luck on my end.

So to confirm: yes, you're correct that by default the docker container does not contain the iHD driver. If you need the iHD driver, it seems you have to manually install it right now via the non-free driver package (intel-media-va-driver-non-free).

As in previous posts, I have installed this driver before, but I'll confirm now that the driver is present when I run the command above (see below). Unfortunately, vainfo still gives me an error, and I'm not knowledgeable enough with Docker or Linux to diagnose why that's the case. Again, the main thing I can note on my end is that despite being fully up to date, vainfo reports back VA-API 1.8, whereas a container with a functioning iHD driver like linuxserver/jellyfin reports back VA-API 1.11. Why that's the case, though, I have no clue.

What I get with the "ls -al /usr/lib/x86_64-linux-gnu/dri/" command:

root@e375c82107c4:/# ls -al /usr/lib/x86_64-linux-gnu/dri/
total 374896
drwxr-xr-x  1 root root      580 Aug 13 12:24 .
drwxr-xr-x  1 root root    18234 Aug 13 12:24 ..
-rw-r--r--  5 root root 14775976 Dec 16  2020 i915_dri.so
-rw-r--r--  5 root root 14775976 Dec 16  2020 i965_dri.so
-rw-r--r--  1 root root  8098968 Feb  6  2020 i965_drv_video.so
-rw-r--r--  1 root root 36742760 Jul 20 17:16 iHD_drv_video.so <<<<<<<<<<< Just to Highlight it here
-rw-r--r-- 10 root root 22760256 Dec 16  2020 iris_dri.so
-rw-r--r-- 10 root root 22760256 Dec 16  2020 kms_swrast_dri.so
-rw-r--r-- 10 root root 22760256 Dec 16  2020 nouveau_dri.so
-rw-r--r--  3 root root 12510464 Aug 10 02:23 nouveau_drv_video.so
-rw-r--r--  5 root root 14775976 Dec 16  2020 nouveau_vieux_dri.so
-rw-r--r--  5 root root 14775976 Dec 16  2020 r200_dri.so
-rw-r--r-- 10 root root 22760256 Dec 16  2020 r300_dri.so
-rw-r--r-- 10 root root 22760256 Dec 16  2020 r600_dri.so
-rw-r--r--  3 root root 12510464 Aug 10 02:23 r600_drv_video.so
-rw-r--r--  5 root root 14775976 Dec 16  2020 radeon_dri.so
-rw-r--r-- 10 root root 22760256 Dec 16  2020 radeonsi_dri.so
-rw-r--r--  3 root root 12510464 Aug 10 02:23 radeonsi_drv_video.so
-rw-r--r-- 10 root root 22760256 Dec 16  2020 swrast_dri.so
-rw-r--r-- 10 root root 22760256 Dec 16  2020 virtio_gpu_dri.so
-rw-r--r-- 10 root root 22760256 Dec 16  2020 vmwgfx_dri.so
-rw-r--r-- 10 root root 22760256 Dec 16  2020 zink_dri.so

marine1988 commented 3 years ago

Hi there, I'm facing the same issue with 10th gen Intel. Please help!

YannickG1 commented 3 years ago

Same issue running Tdarr on Unraid with an Intel 11600. Plex transcodes just fine on the iGPU, so that's not the issue here. I tried every suggestion in the comments above, but nothing appears to make it work.

devlinandrew commented 3 years ago

I was stuck on this forever, and this thread helped me resolve it. I have an i7-10700K. What worked for me (running in a Docker on Unraid, so Linux-based):

  1. Make sure in the Node Options for your QSV tdarr node you have the hardware encoding type set to "vaapi" and NOT "qsv".
  2. Run the above command in the docker container (apt install vainfo intel-media-va-driver-non-free); see the sketch below.

This solved my issues getting Tdarr GPU encoding working on my 10th gen Intel CPU in a docker on Unraid.
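For anyone unsure how to run step 2, a sketch of doing it from the host (the container name tdarr_node is an assumption; substitute whatever yours is called):

docker exec -it tdarr_node /bin/bash
apt-get update
apt-get install -y vainfo intel-media-va-driver-non-free
vainfo   # should list supported profiles rather than erroring, if the driver suits your CPU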

renedis commented 3 years ago

Thanks guys.

Installing with "apt install vainfo intel-media-va-driver-non-free" in the tdarr-node docker image did the trick for me on my i9-10900.

devlinandrew commented 3 years ago

@HaveAGitGat can we please have these drivers added to the container?

TomStarren commented 3 years ago

I tried the steps mentioned above (i5-11400), but still cannot use the integrated GPU to transcode.

error: XDG_RUNTIME_DIR not set in the environment.
error: can't connect to X server!
libva info: VA-API version 1.8.0
libva info: Trying to open /usr/lib/x86_64-linux-gnu/dri//iHD_drv_video.so
libva info: Found init function __vaDriverInit_1_7
libva error: /usr/lib/x86_64-linux-gnu/dri//iHD_drv_video.so init failed
libva info: va_openDriver() returns 1
libva info: Trying to open /usr/lib/x86_64-linux-gnu/dri//i965_drv_video.so
libva info: Found init function __vaDriverInit_1_6
libva error: /usr/lib/x86_64-linux-gnu/dri//i965_drv_video.so init failed
libva info: va_openDriver() returns -1
vaInitialize failed with error code -1 (unknown libva error),exit

When running the command to install the drivers, I get the following output:

Reading package lists... Done
Building dependency tree       
Reading state information... Done
vainfo is already the newest version (2.6.0+ds1-1).
The following NEW packages will be installed:
  intel-media-va-driver-non-free libigdgmm11
0 upgraded, 2 newly installed, 0 to remove and 111 not upgraded.
Need to get 5196 kB of archives.
After this operation, 35.6 MB of additional disk space will be used.
Get:1 http://archive.ubuntu.com/ubuntu focal/universe amd64 libigdgmm11 amd64 20.1.1+ds1-1 [111 kB]
Get:2 http://archive.ubuntu.com/ubuntu focal/multiverse amd64 intel-media-va-driver-non-free amd64 20.1.1+ds1-1build1 [5084 kB]
Fetched 5196 kB in 0s (29.8 MB/s)                       
Selecting previously unselected package libigdgmm11:amd64.
(Reading database ... 17858 files and directories currently installed.)
Preparing to unpack .../libigdgmm11_20.1.1+ds1-1_amd64.deb ...
Unpacking libigdgmm11:amd64 (20.1.1+ds1-1) ...
Selecting previously unselected package intel-media-va-driver-non-free:amd64.
Preparing to unpack .../intel-media-va-driver-non-free_20.1.1+ds1-1build1_amd64.deb ...
Unpacking intel-media-va-driver-non-free:amd64 (20.1.1+ds1-1build1) ...
Setting up libigdgmm11:amd64 (20.1.1+ds1-1) ...
Setting up intel-media-va-driver-non-free:amd64 (20.1.1+ds1-1build1) ...
Processing triggers for libc-bin (2.31-0ubuntu9.1) ...
/sbin/ldconfig.real: /lib/x86_64-linux-gnu/libz.so.1 is not a symbolic link
/sbin/ldconfig.real: /lib/x86_64-linux-gnu/libxml2.so.2 is not a symbolic link
/sbin/ldconfig.real: /lib/x86_64-linux-gnu/libxcb.so.1 is not a symbolic link
/sbin/ldconfig.real: /lib/x86_64-linux-gnu/libxcb-xfixes.so.0 is not a symbolic link
/sbin/ldconfig.real: /lib/x86_64-linux-gnu/libxcb-shm.so.0 is not a symbolic link
/sbin/ldconfig.real: /lib/x86_64-linux-gnu/libwayland-client.so.0 is not a symbolic link
/sbin/ldconfig.real: /lib/x86_64-linux-gnu/libsystemd.so.0 is not a symbolic link
/sbin/ldconfig.real: /lib/x86_64-linux-gnu/libstdc++.so.6 is not a symbolic link
/sbin/ldconfig.real: /lib/x86_64-linux-gnu/libpng16.so.16 is not a symbolic link
/sbin/ldconfig.real: /lib/x86_64-linux-gnu/liblzma.so.5 is not a symbolic link
/sbin/ldconfig.real: /lib/x86_64-linux-gnu/libjpeg.so.8 is not a symbolic link
/sbin/ldconfig.real: /lib/x86_64-linux-gnu/libicuuc.so.66 is not a symbolic link
/sbin/ldconfig.real: /lib/x86_64-linux-gnu/libicudata.so.66 is not a symbolic link
/sbin/ldconfig.real: /lib/x86_64-linux-gnu/libgomp.so.1 is not a symbolic link
/sbin/ldconfig.real: /lib/x86_64-linux-gnu/libglib-2.0.so.0 is not a symbolic link
/sbin/ldconfig.real: /lib/x86_64-linux-gnu/libexpat.so.1 is not a symbolic link
/sbin/ldconfig.real: /lib/x86_64-linux-gnu/libdbus-1.so.3 is not a symbolic link
/sbin/ldconfig.real: /lib/x86_64-linux-gnu/libbsd.so.0 is not a symbolic link
/sbin/ldconfig.real: /lib/x86_64-linux-gnu/libXxf86vm.so.1 is not a symbolic link
/sbin/ldconfig.real: /lib/x86_64-linux-gnu/libXfixes.so.3 is not a symbolic link
/sbin/ldconfig.real: /lib/x86_64-linux-gnu/libXext.so.6 is not a symbolic link
/sbin/ldconfig.real: /lib/x86_64-linux-gnu/libXdmcp.so.6 is not a symbolic link
/sbin/ldconfig.real: /lib/x86_64-linux-gnu/libXau.so.6 is not a symbolic link
/sbin/ldconfig.real: /lib/x86_64-linux-gnu/libX11.so.6 is not a symbolic link
/sbin/ldconfig.real: /lib/x86_64-linux-gnu/libFLAC.so.8 is not a symbolic link
/sbin/ldconfig.real: /lib/x86_64-linux-gnu/libpcre.so.3 is not a symbolic link
/sbin/ldconfig.real: /lib/x86_64-linux-gnu/liblz4.so.1 is not a symbolic link
/sbin/ldconfig.real: /lib/x86_64-linux-gnu/libgpg-error.so.0 is not a symbolic link
/sbin/ldconfig.real: /lib/x86_64-linux-gnu/libgcrypt.so.20 is not a symbolic link
/sbin/ldconfig.real: /lib/x86_64-linux-gnu/libffi.so.7 is not a symbolic link
/sbin/ldconfig.real: /lib/x86_64-linux-gnu/libbz2.so.1.0 is not a symbolic link

devlinandrew commented 3 years ago

I get the “not a symbolic link” errors as well, but the driver install works.

After you run the driver install, do you still get the libva error: /usr/lib/x86_64-linux-gnu/dri//i965_drv_video.so init failed? The driver install should resolve this error.

The X server error is related to the GUI, not the transcoding; it's something to do with the docker installation.

TomStarren commented 3 years ago

@devlinandrew Yes, see output below

# apt-get install vainfo intel-media-va-driver-non-free
Reading package lists... Done
Building dependency tree       
Reading state information... Done
vainfo is already the newest version (2.6.0+ds1-1).
intel-media-va-driver-non-free is already the newest version (20.1.1+ds1-1build1).
0 upgraded, 0 newly installed, 0 to remove and 111 not upgraded.
# vainfo
error: XDG_RUNTIME_DIR not set in the environment.
error: can't connect to X server!
libva info: VA-API version 1.8.0
libva info: Trying to open /usr/lib/x86_64-linux-gnu/dri//iHD_drv_video.so
libva info: Found init function __vaDriverInit_1_7
libva error: /usr/lib/x86_64-linux-gnu/dri//iHD_drv_video.so init failed
libva info: va_openDriver() returns 1
libva info: Trying to open /usr/lib/x86_64-linux-gnu/dri//i965_drv_video.so
libva info: Found init function __vaDriverInit_1_6
libva error: /usr/lib/x86_64-linux-gnu/dri//i965_drv_video.so init failed
libva info: va_openDriver() returns -1
vaInitialize failed with error code -1 (unknown libva error),exit
# 

devlinandrew commented 3 years ago

Did you confirm inside the container that the files are there after installing this driver? I.e., checking in /usr/lib/x86_64-linux-gnu/dri for the .so files?

One more thing to check if you are on Unraid (as I am): are the permissions set to 777 on your /dev/dri device? The renderD128 line item in there?

TomStarren commented 3 years ago

@devlinandrew I'm running Tdarr on Unraid, and Plex and Unmanic are running great on the GPU, so the permissions on /dev/dri should be OK.

What do I need to check for the files? I'm opening the console by clicking on the tdarr_node on the Docker tab and then Console. Could you share your Unraid docker settings? Maybe I'm missing something there.

devlinandrew commented 3 years ago

I had the same experience as you: Plex transcoding worked fine on QSV, but not Tdarr. I am on a 10th gen vs. your 11th gen CPU though, so there's potentially a big difference there (UHD 630 vs. 730 GPU).

In the tdarr_node console, run this “ls” command. Here is my output. Notice that the .so files are there:

ls -la /usr/lib/x86_64-linux-gnu/dri/

Output:

total 372860
drwxr-xr-x  1 root root      580 Oct  8 05:36 .
drwxr-xr-x  1 root root    18220 Oct  8 05:36 ..
-rw-r--r--  5 root root 14775976 Dec 16  2020 i915_dri.so
-rw-r--r--  5 root root 14775976 Dec 16  2020 i965_dri.so
-rw-r--r--  1 root root  8098968 Feb  6  2020 i965_drv_video.so
-rw-r--r--  1 root root 35102696 Apr 21  2020 iHD_drv_video.so
-rw-r--r-- 10 root root 22760256 Dec 16  2020 iris_dri.so
-rw-r--r-- 10 root root 22760256 Dec 16  2020 kms_swrast_dri.so
-rw-r--r-- 10 root root 22760256 Dec 16  2020 nouveau_dri.so
-rw-r--r--  3 root root 12361904 Jul 28 20:43 nouveau_drv_video.so
-rw-r--r--  5 root root 14775976 Dec 16  2020 nouveau_vieux_dri.so
-rw-r--r--  5 root root 14775976 Dec 16  2020 r200_dri.so
-rw-r--r-- 10 root root 22760256 Dec 16  2020 r300_dri.so
-rw-r--r-- 10 root root 22760256 Dec 16  2020 r600_dri.so
-rw-r--r--  3 root root 12361904 Jul 28 20:43 r600_drv_video.so
-rw-r--r--  5 root root 14775976 Dec 16  2020 radeon_dri.so
-rw-r--r-- 10 root root 22760256 Dec 16  2020 radeonsi_dri.so
-rw-r--r--  3 root root 12361904 Jul 28 20:43 radeonsi_drv_video.so
-rw-r--r-- 10 root root 22760256 Dec 16  2020 swrast_dri.so
-rw-r--r-- 10 root root 22760256 Dec 16  2020 virtio_gpu_dri.so
-rw-r--r-- 10 root root 22760256 Dec 16  2020 vmwgfx_dri.so
-rw-r--r-- 10 root root 22760256 Dec 16  2020 zink_dri.so

Also, in your Unraid console, run this command to check the permissions on your QSV device:

ls -la /dev/dri/

Output:

total 0
drwxrwxrwx  3 root root       100 Sep 28 08:08 ./
drwxr-xr-x 16 root root      4240 Sep 28 08:15 ../
drwxrwxrwx  2 root root        80 Sep 28 08:08 by-path/
crw-rw----  1 root video 226,   0 Oct  6 11:51 card0
crwxrwxrwx  1 root video 226, 128 Sep 28 08:08 renderD128

Note that my renderD128 device has permissions crwxrwxrwx. Does yours say the same? If not, run this command in the Unraid console: chmod 777 /dev/dri/renderD128 and then check the permissions again with the ls -la /dev/dri/ command to see if the permissions match mine.

Once done, restart the tdarr_node docker and try again.
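One caveat worth noting: permissions on /dev/dri are reset when the device nodes are recreated at boot. If that bites you on Unraid, a common workaround (an assumption about your setup, not something Tdarr itself requires) is to reapply the chmod at boot, e.g. from the go file:

# append to /boot/config/go (Unraid's boot script; path assumes a stock Unraid install)
chmod 777 /dev/dri/renderD128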

Edit: fixed directory listing rendering.

TomStarren commented 3 years ago

Output folder

ls -la /usr/lib/x86_64-linux-gnu/dri/
total 372860
drwxr-xr-x  1 root root      580 Oct 11 15:23 .
drwxr-xr-x  1 root root    18220 Oct 11 15:23 ..
-rw-r--r--  5 root root 14775976 Dec 16  2020 i915_dri.so
-rw-r--r--  5 root root 14775976 Dec 16  2020 i965_dri.so
-rw-r--r--  1 root root  8098968 Feb  6  2020 i965_drv_video.so
-rw-r--r--  1 root root 35102696 Apr 21  2020 iHD_drv_video.so
-rw-r--r-- 10 root root 22760256 Dec 16  2020 iris_dri.so
-rw-r--r-- 10 root root 22760256 Dec 16  2020 kms_swrast_dri.so
-rw-r--r-- 10 root root 22760256 Dec 16  2020 nouveau_dri.so
-rw-r--r--  3 root root 12361904 Jul 29 02:43 nouveau_drv_video.so
-rw-r--r--  5 root root 14775976 Dec 16  2020 nouveau_vieux_dri.so
-rw-r--r--  5 root root 14775976 Dec 16  2020 r200_dri.so
-rw-r--r-- 10 root root 22760256 Dec 16  2020 r300_dri.so
-rw-r--r-- 10 root root 22760256 Dec 16  2020 r600_dri.so
-rw-r--r--  3 root root 12361904 Jul 29 02:43 r600_drv_video.so
-rw-r--r--  5 root root 14775976 Dec 16  2020 radeon_dri.so
-rw-r--r-- 10 root root 22760256 Dec 16  2020 radeonsi_dri.so
-rw-r--r--  3 root root 12361904 Jul 29 02:43 radeonsi_drv_video.so
-rw-r--r-- 10 root root 22760256 Dec 16  2020 swrast_dri.so
-rw-r--r-- 10 root root 22760256 Dec 16  2020 virtio_gpu_dri.so
-rw-r--r-- 10 root root 22760256 Dec 16  2020 vmwgfx_dri.so
-rw-r--r-- 10 root root 22760256 Dec 16  2020 zink_dri.so

Output renderD128 permissions

ls -la /dev/dri/
total 0
drwxrwxrwx  3 root root       100 Sep  8 18:09 ./
drwxr-xr-x 15 root root      3400 Oct  7 17:21 ../
drwxrwxrwx  2 root root        80 Sep  8 18:09 by-path/
crwxrwxrwx  1 root video 226,   0 Sep  8 18:09 card0
crwxrwxrwx  1 root video 226, 128 Sep  8 18:09 renderD128

devlinandrew commented 3 years ago

Here is a screenshot of my node docker config. Note the privileged setting and the extra parameters.

TomStarren commented 3 years ago

@devlinandrew I have the same settings but still no luck transcoding. What settings do you have in the Tdarr plugins etc.?

Output of the failed transcode:

NodeID: QuicSync Node
Command:
ffmpeg -hwaccel vaapi -hwaccel_device /dev/dri/renderD128 -hwaccel_output_format vaapi -i /mnt/movies/_input/<Name of the serie>/Season 01/<Name of the serie>.S01E03.HDTV-720p.Sonarr.mkv -map 0 -map -0:d -c:v hevc_vaapi -b:v 3436k -minrate 2405k -maxrate 4466k -bufsize 5155k -map -v:1 -map -v:2 -map -v:3 -map -v:4 -c:a copy -c:s copy -max_muxing_queue_size 4096 /temp/<Name of the serie>.S01E03.HDTV-720p.Sonarr-TdarrCacheFile-Zx5z6-FiA.mkv
Last 200 lines of CLI log:
ffmpeg version 4.2 Copyright (c) 2000-2019 the FFmpeg developers
built with gcc 9 (Ubuntu 9.3.0-10ubuntu2)
configuration: --extra-cflags=-I/usr/local/cuda/include --extra-ldflags=-L/usr/local/cuda/lib64 --nvccflags='-gencode arch=compute_52,code=sm_52 -O2' --disable-debug --disable-doc --disable-ffplay --disable-static --enable-cuda-nvcc --enable-cuda-sdk --enable-cuvid --enable-ffprobe --enable-gpl --enable-libaom --enable-libass --enable-libfdk_aac --enable-libfreetype --enable-libkvazaar --enable-libmp3lame --enable-libnpp --enable-libopencore-amrnb --enable-libopenjpeg --enable-libopus --enable-libtheora --enable-libv4l2 --enable-libvidstab --enable-libvmaf --enable-libvorbis --enable-libvpx --enable-libx264 --enable-libx265 --enable-libxml2 --enable-libxvid --enable-nonfree --enable-nvdec --enable-nvenc --enable-openssl --enable-shared --enable-small --enable-stripping --enable-vaapi --enable-vdpau --enable-version3
libavutil 56. 31.100 / 56. 31.100
libavcodec 58. 54.100 / 58. 54.100
libavformat 58. 29.100 / 58. 29.100
libavdevice 58. 8.100 / 58. 8.100
libavfilter 7. 57.100 / 7. 57.100
libswscale 5. 5.100 / 5. 5.100
libswresample 3. 5.100 / 3. 5.100
libpostproc 55. 5.100 / 55. 5.100
Input #0, matroska,webm, from '/mnt/movies/_input/<Name of the serie>/Season 01/<Name of the serie>.S01E03.HDTV-720p.Sonarr.mkv':
Metadata:
title : <Name of the serie>.S01E03.1080p.WEB-DL.DD5.1.H.264-RARBG
encoder : libebml v1.3.1 + libmatroska v1.4.2
creation_time : 2015-10-07T12:14:19.000000Z
Duration: 00:42:01.69, start: 0.000000, bitrate: 5405 kb/s
Stream #0:0(eng): Video: h264, yuv420p(tv, bt709, progressive), 1916x1076 [SAR 1:1 DAR 479:269], 23.98 fps, 23.98 tbr, 1k tbn, 2k tbc (default)
Metadata:
title : <Name of the serie>.S01E03.1080p.WEB-DL.DD5.1.H.264-RARBG
Stream #0:1(eng): Audio: ac3, 48000 Hz, 5.1(side), fltp, 384 kb/s (default)
Metadata:
title : <Name of the serie>.S01E03.1080p.WEB-DL.DD5.1.H.264-RARBG
Stream #0:2(eng): Subtitle: subrip
Stream #0:3: Video: mjpeg, yuvj444p(pc, bt470bg/unknown/unknown), 120x176, 90k tbr, 90k tbn, 90k tbc (attached pic)
Metadata:
filename : small_cover.jpg
mimetype : image/jpeg
Stream #0:4: Video: mjpeg, yuvj444p(pc, bt470bg/unknown/unknown), 213x120, 90k tbr, 90k tbn, 90k tbc (attached pic)
Metadata:
filename : small_cover_land.jpg
mimetype : image/jpeg
Stream #0:5: Video: mjpeg, yuvj444p(pc, bt470bg/unknown/unknown), 600x882, 90k tbr, 90k tbn, 90k tbc (attached pic)
Metadata:
filename : cover.jpg
mimetype : image/jpeg
Stream #0:6: Video: mjpeg, yuvj444p(pc, bt470bg/unknown/unknown), 1067x600, 90k tbr, 90k tbn, 90k tbc (attached pic)
Metadata:
filename : cover_land.jpg
mimetype : image/jpeg
[AVHWDeviceContext @ 0x563c6a8a7100] libva: /usr/lib/x86_64-linux-gnu/dri//iHD_drv_video.so init failed
[AVHWDeviceContext @ 0x563c6a8a7100] libva: /usr/lib/x86_64-linux-gnu/dri//i965_drv_video.so init failed
[AVHWDeviceContext @ 0x563c6a8a7100] Failed to initialise VAAPI connection: -1 (unknown libva error).
Device creation failed: -5.
[h264 @ 0x563c6a868e80] No device available for decoder: device type vaapi needed for codec h264.
Stream mapping:
Stream #0:0 -> #0:0 (h264 (native) -> hevc (hevc_vaapi))
Stream #0:1 -> #0:1 (copy)
Stream #0:2 -> #0:2 (copy)
Device setup failed for decoder on input stream #0:0 : Input/output error

Altycoder commented 3 years ago

@devlinandrew

when you say:

  1. Make sure in the Node Options for your QSV tdarr node you have the hardware encoding type set to "vaapi" and NOT "qsv".

Is this in the Tdarr GUI or in the docker command? I can't see in the Tdarr docs where or how this would be specified.

devlinandrew commented 3 years ago

> Is this in the Tdarr GUI or in the docker command? I can't see in the Tdarr docs where or how this would be specified.

In the Tdarr GUI. Take these steps:

  1. Click on the main Tdarr tab
  2. Expand the Nodes section
  3. Click on the node you want to edit
  4. Click on the "Options" button above the Transcode/Health Check worker counts
  5. The first available dropdown is "Specify the hardware encoding type for 'GPU' workers..." >> change this to "vaapi"

"QSV" is used in Windows, but "VAAPI" is used in Linux.

devlinandrew commented 3 years ago

> I have the same settings but still no luck transcoding. What settings do you have in the Tdarr plugins etc.?

Sorry, I don't think I can take you any further, unfortunately. The error you are getting isn't plugin-related; it's about the node's ability to connect to the GPU. I would want to know if ANYONE has successfully enabled QuickSync GPU encoding on an 11th gen CPU in Tdarr.

My plugin base was "JB - QSV(vaapi), H265, AAC, MKV, bitrate optimized". I made some modifications to it once I got the GPU working, but not before.

Altycoder commented 3 years ago

@devlinandrew many thanks - where is this stuff even documented? It's not obvious at all

Altycoder commented 3 years ago

Well, I have an i3-8100, and I've finally just got it transcoding with the help of @devlinandrew, but it's doing it in software, not hardware, so after 2-3 hours of faffing around I'm giving up for the weekend.

I really appreciate open source software, but this has to be the most poorly documented docker tool I've come across.

Getting Quick Sync working in Plex Media Server was a doddle comparatively.

devlinandrew commented 3 years ago

> Well, I have an i3-8100, and I've finally just got it transcoding with the help of @devlinandrew, but it's doing it in software, not hardware, so after 2-3 hours of faffing around I'm giving up for the weekend.

Do you get an error when you turn on GPU transcode workers and turn off CPU transcode workers?

As for documentation, it is what it is...it's a free product! I think it's amazing that it exists and does what it does. It just takes some work to figure out.

Altycoder commented 3 years ago

I didn't have any CPU workers enabled, just the one GPU worker, and it was showing 400% CPU on my server and "HEVC software" when I clicked on it. I've run out of time this week, so I may pick it up again on Monday. There are no errors in my log relating to transcoding, either for the server or the node.

All the other docker containers I use (18 of them) have much better docs, so it's a shame about Tdarr.

devlinandrew commented 3 years ago

> I didn't have any CPU workers enabled, just the one GPU worker, and it was showing 400% CPU on my server and "HEVC software" when I clicked on it. [...]

Interesting, it may be a plugin question at this point. I know that some plugins will change to software encoding if certain codecs are detected. Did it use CPU for all types of files you gave it? Are you using a QSV/VAAPI-enabled plugin?

> All the other docker containers I use (18 of them) have much better docs, so it's a shame about Tdarr.

To each their own. It comes off like you aren't appreciative of the work that @HaveAGitGat has done for Tdarr and I don't think that's how you feel.

marine1988 commented 3 years ago

I have a 10th generation CPU and can't use HW decoding on the Tdarr docker.

HaveAGitGat commented 3 years ago

> Well, I have an i3-8100, and I've finally just got it transcoding with the help of @devlinandrew, but it's doing it in software, not hardware [...]
>
> I really appreciate open source software, but this has to be the most poorly documented docker tool I've come across.
>
> Getting Quick Sync working in Plex Media Server was a doddle comparatively.

Hi, shame you’ve had some trouble.

Using QSV on Windows is pretty straightforward. You just add a QSV plugin or ffmpeg/HandBrake transcode arguments and launch a GPU worker, and that's all. You don't need to change the Node Options hardware encoder option from the default 'Any'.

The above is simple enough that there hasn’t really been a need for documentation for those using it.

On Linux it's the same if you have the drivers sorted. I suppose it hasn't been too much of a priority because a) not many people use it in the first place, and b) those who are familiar with VAAPI just add the drivers themselves (like above) and get going, which is all that's needed in most cases, but as you've seen it can be a bit of a pain sometimes.

I don’t have any QSV/VAAPI capable hardware at the moment so can’t test anything I add in but could add the suggestion above.

On the one hand I have people complaining there's too much stuff in the container, and on the other hand some complain that there's not enough stuff in it, so I can't win 😅. You can't really compare Tdarr to Plex: Plex has 100+ employees, whereas Tdarr has just 1 guy 😁.

But yeah, if someone wants to put aside some time to test VAAPI driver additions to the container, then let me know.
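For anyone testing, a rough sketch of what such an addition could look like as a derived image (untested; the base image tag is an assumption, and the package comes from Ubuntu multiverse as in the earlier comments):

# Dockerfile sketch: extend the node image with the Intel non-free VAAPI driver
FROM haveagitgat/tdarr_node:latest
RUN apt-get update && \
    apt-get install -y --no-install-recommends vainfo intel-media-va-driver-non-free && \
    rm -rf /var/lib/apt/lists/*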

devlinandrew commented 3 years ago

> But yeah, if someone wants to put aside some time to test VAAPI driver additions to the container, then let me know.

I'm working from Docker on Unraid and would be happy to test the driver changes that I suggested above.

devlinandrew commented 3 years ago

> I have a 10th generation CPU and can't use HW decoding on the Tdarr docker.

Can you share more info? Are you on Linux, Unraid, Windows, MacOS? What error are you getting?

EvilTactician commented 3 years ago

> I suppose it hasn't been too much of a priority because a) not many people use it in the first place, and b) those who are familiar with VAAPI just add the drivers themselves (like above) and get going, which is all that's needed in most cases, but as you've seen it can be a bit of a pain sometimes.
>
> I don't have any QSV/VAAPI capable hardware at the moment so can't test anything I add in but could add the suggestion above.

Isn't this also somewhat self-fulfilling? Whilst it is difficult to get working (or doesn't work in the majority of cases), not many people are going to use it.

However, on unRaid especially, a lot of users use Intel because it offers the easiest hw transcoding at the lowest cost - without needing a discrete GPU. Being able to relatively easily get Tdarr up and running would open it up to a much wider audience - which longer-term can only be a good thing for the project overall.

I was an early backer for quite some time as I really believed in this project, but never actually transcoded a single file with Tdarr as I went straight in with Intel 10th gen. I suspect the majority of Tdarr users use Plex or equivalent in some fashion, so it's not going to exactly be an uncommon set-up to see people only using an Intel CPU / iGPU. However the 'this doesn't work' style posts on here, Reddit and elsewhere obviously will turn many of those users away from Tdarr.

It would also be interesting to evaluate how many people use multiple nodes vs. people using just a single one and re-evaluating whether or not a "plug & play" single docker container with both server + node in it would be appropriate. If a huge part of the userbase uses Tdarr like that, it would eliminate a lot of set-up niggles.

Similarly, a 'default' plug-in set-up for a couple of different 'common' set-ups in a drop-down which users could select would also be helpful in getting the majority up and running quickly and with less support required from your end. (Users with just an intel CPU, users with a discrete GPU, etc.)

Altycoder commented 3 years ago

I'm using Tdarr on Docker/Linux, and adding the driver didn't work; even that step isn't documented anywhere. My hardware is an i3-8100, which works with Plex just fine.

I would urge the developer to stop developing Tdarr until the documentation is drastically improved; otherwise Tdarr is likely to become more difficult and frustrating to use as more and more features are added.

devlinandrew commented 3 years ago

> I would urge the developer to stop developing Tdarr until the documentation is drastically improved

Friendly reminder that this app is free, and there is nothing forcing you to use it over any other tool to optimize your video library. It’s aimed towards a technical audience that can make driver changes and config updates to match their system needs.

If you need a more user friendly, point and click tool to convert your library, I’m sure there are some paid options to meet your needs. Don’t shit on this developer’s work because you don’t have the time or skills to make it work for you.

Boosh1 commented 3 years ago

With all the activity here, I thought it was worth having another go at fiddling with the docker to try and get the right drivers for this. Unfortunately, still no luck, but I'm listing a general overview of what I've done here in case it helps. Just to be clear, I don't recommend anyone follow these steps; I'm just briefly mentioning what I did.

I edited the apt sources file inside the docker to adjust the sources used, i.e. changed /etc/apt/sources.list.

Basically, I just replaced everything there with what's used in the linuxserver/jellyfin docker:

deb http://archive.ubuntu.com/ubuntu/ focal main restricted
deb-src http://archive.ubuntu.com/ubuntu/ focal main restricted
deb http://archive.ubuntu.com/ubuntu/ focal-updates main restricted
deb-src http://archive.ubuntu.com/ubuntu/ focal-updates main restricted
deb http://archive.ubuntu.com/ubuntu/ focal universe multiverse
deb-src http://archive.ubuntu.com/ubuntu/ focal universe multiverse
deb http://archive.ubuntu.com/ubuntu/ focal-updates universe multiverse
deb-src http://archive.ubuntu.com/ubuntu/ focal-updates universe multiverse
deb http://archive.ubuntu.com/ubuntu/ focal-security main restricted
deb-src http://archive.ubuntu.com/ubuntu/ focal-security main restricted
deb http://archive.ubuntu.com/ubuntu/ focal-security universe multiverse
deb-src http://archive.ubuntu.com/ubuntu/ focal-security universe multiverse

deb [trusted=yes] https://repositories.intel.com/graphics/ubuntu focal main
deb [trusted=yes] https://repo.jellyfin.org/ubuntu focal main

These last two are set up with [trusted=yes] just for ease; again, I don't suggest you set things up like this.

With this setup, I installed intel-media-va-driver-non-free, libva2, and vainfo, as well as upgrading other packages.
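(For reference, that install step is just standard apt usage; a sketch, assuming the sources above are already in place:)

apt-get update
apt-get install -y --no-install-recommends intel-media-va-driver-non-free libva2 vainfo
apt-get upgrade -y   # the "upgrading other packages" part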

intel-media-va-driver-non-free (21.3.5+ds1-0ubuntu1~20.04)
vainfo (2.13.0+ds1-0ubuntu1~20.04) 
libva2 (2.13.0-1~20.04)

Comparatively, the Jellyfin versions are:

intel-media-va-driver-non-free (21.3.3+i620~u20.04)
vainfo (2.12.0+i620~u20.04)
libva (2.12.0+i620~u20.04)

So these are very much up to date. However, running vainfo gives the same error as always, and it still seems to be on an old version of VA-API. I'm convinced the fact that it's stating version 1.8.0 is the reason it can't recognise the driver.

error: XDG_RUNTIME_DIR not set in the environment.
error: can't connect to X server!
libva info: VA-API version 1.8.0
libva info: User environment variable requested driver 'iHD'
libva info: Trying to open /usr/lib/x86_64-linux-gnu/dri/iHD_drv_video.so
libva error: /usr/lib/x86_64-linux-gnu/dri/iHD_drv_video.so has no function __vaDriverInit_1_0
libva info: va_openDriver() returns -1
vaInitialize failed with error code -1 (unknown libva error),exit

Again, Jellyfin comparison

error: XDG_RUNTIME_DIR not set in the environment.
error: can't connect to X server!
libva info: VA-API version 1.12.0
libva info: Trying to open /usr/lib/x86_64-linux-gnu/dri/iHD_drv_video.so
libva info: Found init function __vaDriverInit_1_12
libva info: va_openDriver() returns 0
vainfo: VA-API version: 1.12 (libva 2.12.0)
vainfo: Driver version: Intel iHD driver for Intel(R) Gen Graphics - 21.3.3 (6fdf88c)
vainfo: Supported profile and entrypoints
      VAProfileNone                   : VAEntrypointVideoProc
      VAProfileNone                   : VAEntrypointStats
      VAProfileMPEG2Simple            : VAEntrypointVLD
      VAProfileMPEG2Simple            : VAEntrypointEncSlice
      VAProfileMPEG2Main              : VAEntrypointVLD
      VAProfileMPEG2Main              : VAEntrypointEncSlice
      VAProfileH264Main               : VAEntrypointVLD
      VAProfileH264Main               : VAEntrypointEncSlice
      VAProfileH264Main               : VAEntrypointFEI
      VAProfileH264Main               : VAEntrypointEncSliceLP
      VAProfileH264High               : VAEntrypointVLD
      VAProfileH264High               : VAEntrypointEncSlice
      VAProfileH264High               : VAEntrypointFEI
      VAProfileH264High               : VAEntrypointEncSliceLP
      VAProfileVC1Simple              : VAEntrypointVLD
      VAProfileVC1Main                : VAEntrypointVLD
      VAProfileVC1Advanced            : VAEntrypointVLD
      VAProfileJPEGBaseline           : VAEntrypointVLD
      VAProfileJPEGBaseline           : VAEntrypointEncPicture
      VAProfileH264ConstrainedBaseline: VAEntrypointVLD
      VAProfileH264ConstrainedBaseline: VAEntrypointEncSlice
      VAProfileH264ConstrainedBaseline: VAEntrypointFEI
      VAProfileH264ConstrainedBaseline: VAEntrypointEncSliceLP
      VAProfileHEVCMain               : VAEntrypointVLD
      VAProfileHEVCMain               : VAEntrypointEncSlice
      VAProfileHEVCMain               : VAEntrypointFEI
      VAProfileHEVCMain               : VAEntrypointEncSliceLP
      VAProfileHEVCMain10             : VAEntrypointVLD
      VAProfileHEVCMain10             : VAEntrypointEncSlice
      VAProfileHEVCMain10             : VAEntrypointEncSliceLP
      VAProfileVP9Profile0            : VAEntrypointVLD
      VAProfileVP9Profile0            : VAEntrypointEncSliceLP
      VAProfileVP9Profile1            : VAEntrypointVLD
      VAProfileVP9Profile1            : VAEntrypointEncSliceLP
      VAProfileVP9Profile2            : VAEntrypointVLD
      VAProfileVP9Profile2            : VAEntrypointEncSliceLP
      VAProfileVP9Profile3            : VAEntrypointVLD
      VAProfileVP9Profile3            : VAEntrypointEncSliceLP
      VAProfileHEVCMain12             : VAEntrypointVLD
      VAProfileHEVCMain12             : VAEntrypointEncSlice
      VAProfileHEVCMain422_10         : VAEntrypointVLD
      VAProfileHEVCMain422_10         : VAEntrypointEncSlice
      VAProfileHEVCMain422_12         : VAEntrypointVLD
      VAProfileHEVCMain422_12         : VAEntrypointEncSlice
      VAProfileHEVCMain444            : VAEntrypointVLD
      VAProfileHEVCMain444            : VAEntrypointEncSliceLP
      VAProfileHEVCMain444_10         : VAEntrypointVLD
      VAProfileHEVCMain444_10         : VAEntrypointEncSliceLP
      VAProfileHEVCMain444_12         : VAEntrypointVLD
      VAProfileHEVCSccMain            : VAEntrypointVLD
      VAProfileHEVCSccMain            : VAEntrypointEncSliceLP
      VAProfileHEVCSccMain10          : VAEntrypointVLD
      VAProfileHEVCSccMain10          : VAEntrypointEncSliceLP
      VAProfileHEVCSccMain444         : VAEntrypointVLD
      VAProfileHEVCSccMain444         : VAEntrypointEncSliceLP
      VAProfileAV1Profile0            : VAEntrypointVLD
      VAProfileHEVCSccMain444_10      : VAEntrypointVLD
      VAProfileHEVCSccMain444_10      : VAEntrypointEncSliceLP

Unfortunately again, my docker knowledge is poor and I don't know any further way to diagnose this.
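(With hindsight, one further diagnostic that would have helped here is checking which copy of libva actually gets loaded at runtime; this is essentially what the LD_LIBRARY_PATH fix further down corrects. A sketch using standard tools, nothing Tdarr-specific:)

ldd "$(which vainfo)" | grep libva       # which libva.so the binary actually resolves to
find / -name 'libva.so*' 2>/dev/null     # every copy of libva present in the container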

I do also agree with others here that it'd be worth getting this working. While many technical people use Tdarr, the easier and more plug-and-play it is, the better, really. I've been able to sort out plenty of issues in other software, but having to dig into a docker container and manually change things doesn't seem like the right way to approach this stuff.

HaveAGitGat commented 3 years ago

Can someone please try the following and check if QSV is working (credit nickp85):


"I restored the image to try and refine what is necessary and it actually seems to be pretty easy... ran these and it worked.

sudo apt-get install ccache flex bison cmake g++ git patch zlib1g-dev autoconf xutils-dev libtool pkg-config libpciaccess-dev libz-dev clinfo

CMake will automatically detect the platform you're on and enable the platform-specific hooks needed for a working build.

Create a library config file for the iMSDK:

sudo nano /etc/ld.so.conf.d/imsdk.conf Content: /opt/intel/mediasdk/lib /opt/intel/mediasdk/plugins

Then run: sudo ldconfig -vvvv


Thanks :)
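For anyone who'd rather not open an editor inside the container, the same config file can be written non-interactively; a sketch using tee, with the paths exactly as in the quoted steps:

tee /etc/ld.so.conf.d/imsdk.conf <<'EOF'
/opt/intel/mediasdk/lib
/opt/intel/mediasdk/plugins
EOF
ldconfig   # refresh the linker cache after adding the paths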

Boosh1 commented 3 years ago

@HaveAGitGat - I feel like I might be missing something here. I had a go on my end with a fully restored container and had no luck, I'm afraid. vainfo still errors even if I try updating the driver again.

Perhaps someone else can have a go and see if they have any luck?

Everything installed fine by the looks of it; nano wasn't present, so I had to install that, and then I entered the lines mentioned. I then got these errors running the final command. I can also list out everything else it reported back if you'd like.

/sbin/ldconfig.real: Can't stat /opt/intel/mediasdk/lib: No such file or directory
/sbin/ldconfig.real: Can't stat /opt/intel/mediasdk/plugins: No such file or directory
/sbin/ldconfig.real: Can't stat /usr/local/lib/x86_64-linux-gnu: No such file or directory
/sbin/ldconfig.real: Path `/usr/lib/x86_64-linux-gnu' given more than once
/sbin/ldconfig.real: Path `/lib/x86_64-linux-gnu' given more than once
/sbin/ldconfig.real: Path `/usr/lib/x86_64-linux-gnu' given more than once
/sbin/ldconfig.real: Path `/usr/lib' given more than once

Boosh1 commented 3 years ago

Edit

I confirmed QSV would work with 11th gen here originally. The main culprit seems to have been needing this variable to correct a path:

export LD_LIBRARY_PATH="/usr/lib/x86_64-linux-gnu:${APP_DIR}"

This was fortunately found in a comment on #253.

Now changing this so it is more like a guide.

So I got 11th gen working with QSV with the below steps:

Set the Container to be Privileged

You'll most likely need to ensure the container is running with "Privileged" enabled. In Unraid this is a toggle switch on the container settings page. If you're not using Unraid then I'm unsure if you need this.

Docker variables

First we need to set variables for the docker.

LD_LIBRARY_PATH: /usr/lib/x86_64-linux-gnu
LIBVA_DRIVERS_PATH: /usr/lib/x86_64-linux-gnu/dri
ffmpegPath: /usr/lib/jellyfin-ffmpeg/ffmpeg

Also, if for whatever reason you have issues with LD_LIBRARY_PATH, try setting it to "LD_LIBRARY_PATH: /usr/lib/x86_64-linux-gnu:${APP_DIR}" instead. It should work without the :${APP_DIR} part, but it's hard to verify without input from others.

If you don't want to set the ffmpeg path as a variable, then edit the config file and change it to this: "ffmpegPath": "/usr/lib/jellyfin-ffmpeg/ffmpeg"

If you don't want to use jellyfin-ffmpeg, then just ignore anything concerning it and install your own. Just make sure to point Tdarr at it and that it supports QSV!
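If you're running plain Docker rather than Unraid's template UI, the settings above translate to roughly the following; a sketch, with the image name, container name, and any mounts as assumptions specific to your setup:

docker run -d \
  --name tdarr_node \
  --privileged \
  --device /dev/dri:/dev/dri \
  -e LD_LIBRARY_PATH=/usr/lib/x86_64-linux-gnu \
  -e LIBVA_DRIVERS_PATH=/usr/lib/x86_64-linux-gnu/dri \
  -e ffmpegPath=/usr/lib/jellyfin-ffmpeg/ffmpeg \
  haveagitgat/tdarr_node
# (server address, node name, and media mounts omitted; add whatever your node already uses)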

Install drivers

Finally, we need to install drivers and ffmpeg. Run the below inside the container.

curl -s https://repositories.intel.com/graphics/intel-graphics.key | apt-key add - && \
echo 'deb [arch=amd64] https://repositories.intel.com/graphics/ubuntu focal main' > /etc/apt/sources.list.d/intel-graphics.list && \
apt-get update && \
apt-get install -y --no-install-recommends \
intel-media-va-driver-non-free \
mesa-va-drivers && \
wget https://repo.jellyfin.org/releases/server/ubuntu/versions/jellyfin-ffmpeg/4.3.2-1/jellyfin-ffmpeg_4.3.2-1-focal_amd64.deb && \
apt install -y \
./jellyfin-ffmpeg_4.3.2-1-focal_amd64.deb && \
vainfo

That's it. Now vainfo should report back something like this, with no errors:

error: XDG_RUNTIME_DIR not set in the environment.
error: can't connect to X server!
libva info: VA-API version 1.12.0
libva info: User environment variable requested driver 'iHD'
libva info: Trying to open /usr/lib/x86_64-linux-gnu/dri/iHD_drv_video.so
libva info: Found init function __vaDriverInit_1_12
libva info: va_openDriver() returns 0
vainfo: VA-API version: 1.12 (libva 2.6.0)
vainfo: Driver version: Intel iHD driver for Intel(R) Gen Graphics - 21.3.3 (6fdf88c)
vainfo: Supported profile and entrypoints
      VAProfileNone                   : VAEntrypointVideoProc
      VAProfileNone                   : VAEntrypointStats
      VAProfileMPEG2Simple            : VAEntrypointVLD
      VAProfileMPEG2Simple            : VAEntrypointEncSlice
      VAProfileMPEG2Main              : VAEntrypointVLD
      VAProfileMPEG2Main              : VAEntrypointEncSlice
      VAProfileH264Main               : VAEntrypointVLD
      VAProfileH264Main               : VAEntrypointEncSlice
      VAProfileH264Main               : VAEntrypointFEI
      VAProfileH264Main               : VAEntrypointEncSliceLP
      VAProfileH264High               : VAEntrypointVLD
      VAProfileH264High               : VAEntrypointEncSlice
      VAProfileH264High               : VAEntrypointFEI
      VAProfileH264High               : VAEntrypointEncSliceLP
      VAProfileVC1Simple              : VAEntrypointVLD
      VAProfileVC1Main                : VAEntrypointVLD
      VAProfileVC1Advanced            : VAEntrypointVLD
      VAProfileJPEGBaseline           : VAEntrypointVLD
      VAProfileJPEGBaseline           : VAEntrypointEncPicture
      VAProfileH264ConstrainedBaseline: VAEntrypointVLD
      VAProfileH264ConstrainedBaseline: VAEntrypointEncSlice
      VAProfileH264ConstrainedBaseline: VAEntrypointFEI
      VAProfileH264ConstrainedBaseline: VAEntrypointEncSliceLP
      VAProfileHEVCMain               : VAEntrypointVLD
      VAProfileHEVCMain               : VAEntrypointEncSlice
      VAProfileHEVCMain               : VAEntrypointFEI
      VAProfileHEVCMain               : VAEntrypointEncSliceLP
      VAProfileHEVCMain10             : VAEntrypointVLD
      VAProfileHEVCMain10             : VAEntrypointEncSlice
      VAProfileHEVCMain10             : VAEntrypointEncSliceLP
      VAProfileVP9Profile0            : VAEntrypointVLD
      VAProfileVP9Profile0            : VAEntrypointEncSliceLP
      VAProfileVP9Profile1            : VAEntrypointVLD
      VAProfileVP9Profile1            : VAEntrypointEncSliceLP
      VAProfileVP9Profile2            : VAEntrypointVLD
      VAProfileVP9Profile2            : VAEntrypointEncSliceLP
      VAProfileVP9Profile3            : VAEntrypointVLD
      VAProfileVP9Profile3            : VAEntrypointEncSliceLP
      VAProfileHEVCMain12             : VAEntrypointVLD
      VAProfileHEVCMain12             : VAEntrypointEncSlice
      VAProfileHEVCMain422_10         : VAEntrypointVLD
      VAProfileHEVCMain422_10         : VAEntrypointEncSlice
      VAProfileHEVCMain422_12         : VAEntrypointVLD
      VAProfileHEVCMain422_12         : VAEntrypointEncSlice
      VAProfileHEVCMain444            : VAEntrypointVLD
      VAProfileHEVCMain444            : VAEntrypointEncSliceLP
      VAProfileHEVCMain444_10         : VAEntrypointVLD
      VAProfileHEVCMain444_10         : VAEntrypointEncSliceLP
      VAProfileHEVCMain444_12         : VAEntrypointVLD
      VAProfileHEVCSccMain            : VAEntrypointVLD
      VAProfileHEVCSccMain            : VAEntrypointEncSliceLP
      VAProfileHEVCSccMain10          : VAEntrypointVLD
      VAProfileHEVCSccMain10          : VAEntrypointEncSliceLP
      VAProfileHEVCSccMain444         : VAEntrypointVLD
      VAProfileHEVCSccMain444         : VAEntrypointEncSliceLP
      VAProfileAV1Profile0            : VAEntrypointVLD
      VAProfileHEVCSccMain444_10      : VAEntrypointVLD
      VAProfileHEVCSccMain444_10      : VAEntrypointEncSliceLP

Plugins

So to get going with Tdarr & QSV, you'll also need a plugin. I've attached the one I've put together. It's a heavily modified Migz plugin, so it works with QSV and does a bunch of stuff for my needs. It's an "as is" plugin, so I can't support bug fixes or suggestions, but you are welcome to take it and modify it as you need. Tdarr_Plugin_Boosh_FFMPEG_QSV.zip

Boosh1 commented 3 years ago

OK, a bad piece of news. The files I've tested so far have all come out as garbage. So the drivers are working and the plugins are working with VAAPI, but the result is currently a mess and unusable. Unsure if that's a driver, plugin, or ffmpeg problem though...

Boosh1 commented 3 years ago

Another update: VAAPI just seems to produce garbage. On the other hand, QSV works fine as long as you have an ffmpeg build which supports it. I installed Jellyfin's ffmpeg and modified a plugin to use pure QSV, and it appears to be working so far.

I've updated my post above with the commands needed to install jellyfin ffmpeg.

The ffmpeg cmd is something like this (in a Tdarr plugin preset, the comma separates the arguments placed before -i from those placed after):

-hwaccel qsv -c:v h264_qsv,-map 0 -c:v hevc_qsv -b:v 2048k -minrate 1433k -maxrate 2662k -bufsize 2639k -look_ahead 1 -look_ahead_depth 100 -c:a copy -c:s copy -max_muxing_queue_size 9999
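Expanded into a standalone invocation, that preset corresponds to roughly the following (a sketch; file names are placeholders, and the binary path assumes the jellyfin-ffmpeg install from the guide above):

# h264 -> hevc via QSV, decode and encode both on the iGPU
/usr/lib/jellyfin-ffmpeg/ffmpeg \
  -hwaccel qsv -c:v h264_qsv \
  -i input.mkv \
  -map 0 -c:v hevc_qsv \
  -b:v 2048k -minrate 1433k -maxrate 2662k -bufsize 2639k \
  -look_ahead 1 -look_ahead_depth 100 \
  -c:a copy -c:s copy -max_muxing_queue_size 9999 \
  output.mkv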

ppops commented 3 years ago

I've got an 11600K, and I'm getting the same results as you @Boosh1: VAAPI produces garbage, but QSV works great with Jellyfin's ffmpeg. Cheers!

HaveAGitGat commented 3 years ago

Thanks @Boosh1, will have a look at that later today (Sunday) and make some changes to the acceptance containers:

tdarr_acc
tdarr_node_acc

TomStarren commented 3 years ago

@Boosh1 / @ppops, can you share which plugin you used and what you changed in it to make it work?