VirtualGL / virtualgl

Main VirtualGL repository
https://VirtualGL.org

Able to run tests, but vglrun freezes #179

Closed: rkoyama1623-2021 closed this issue 3 years ago

rkoyama1623-2021 commented 3 years ago

Hi, I'm trying to use VirtualGL with Ubuntu 18.04 on an Intel NUC (NUC8i7HVKVA, link), whose GPU is a Radeon RX Vega M GH. I was once able to install VirtualGL on it successfully, but after a clean install I cannot get it working again. Please give me your advice.

Problem

I cannot run glxinfo or glxgears through vglrun. There is no error message; the commands simply appear to freeze.

$ vglrun +v /usr/bin/glxinfo
[VGL] Shared memory segment ID for vglconfig: 32825
[VGL] VirtualGL v2.6.95 64-bit (Build 20211022)
[VGL] NOTICE: Replacing dlopen("libX11.so.6") with dlopen("libvglfaker.so")
$ vglrun +v /usr/bin/glxgears 
[VGL] Shared memory segment ID for vglconfig: 32828
[VGL] VirtualGL v2.6.95 64-bit (Build 20211022)
[VGL] NOTICE: Replacing dlopen("libX11.so.6") with dlopen("libvglfaker.so")

I can run the glxinfo binary bundled with VirtualGL.

$ /opt/VirtualGL/bin/glxinfo |grep -i vendor
server glx vendor string: AMD
client glx vendor string: AMD
OpenGL vendor string: Advanced Micro Devices, Inc.

I can run glxspheres64.

$ /opt/VirtualGL/bin/glxspheres64  # the spheres also render on screen
Polygons in scene: 62464 (61 spheres * 1024 polys/spheres)
GLX FB config ID of window: 0xb2 (8/8/8/8)
Visual ID of window: 0x126
Context is Direct
OpenGL Renderer: AMD Radeon Graphics

I can run glxinfo without vglrun.

$ glxinfo | grep -i vendor
server glx vendor string: AMD
client glx vendor string: AMD
OpenGL vendor string: Advanced Micro Devices, Inc.

VirtualGL's glreadtest also runs successfully.

$ /opt/VirtualGL/bin/glreadtest

GLreadtest v2.6.95 (Build 20211022)

/opt/VirtualGL/bin/glreadtest -h for advanced usage.
FB Config = 0xd5
Rendering to Pbuffer (size = 701 x 701 pixels)
Using 1-byte row alignment

>>>>>>>>>>  PIXEL FORMAT:  RGB  <<<<<<<<<<
glDrawPixels():   757.3 Mpixels/sec
glReadPixels():   790.4 Mpixels/sec (min = 359.2, max = 877.4, sdev = 65.62)
glReadPixels() accounted for 99.99% of total readback time

>>>>>>>>>>  PIXEL FORMAT:  RGBA  <<<<<<<<<<
glDrawPixels():   621.7 Mpixels/sec
glReadPixels():   643.7 Mpixels/sec (min = 257.0, max = 714.2, sdev = 51.18)
glReadPixels() accounted for 99.99% of total readback time

>>>>>>>>>>  PIXEL FORMAT:  BGR  <<<<<<<<<<
glDrawPixels():   983.7 Mpixels/sec
glReadPixels():   781.1 Mpixels/sec (min = 347.6, max = 875.9, sdev = 80.31)
glReadPixels() accounted for 99.99% of total readback time

>>>>>>>>>>  PIXEL FORMAT:  BGRA  <<<<<<<<<<
glDrawPixels():   635.9 Mpixels/sec
glReadPixels():   632.4 Mpixels/sec (min = 282.3, max = 709.0, sdev = 62.29)
glReadPixels() accounted for 99.99% of total readback time

>>>>>>>>>>  PIXEL FORMAT:  ABGR  <<<<<<<<<<
glDrawPixels():   629.5 Mpixels/sec
glReadPixels():   601.7 Mpixels/sec (min = 285.2, max = 711.2, sdev = 62.07)
glReadPixels() accounted for 99.99% of total readback time

>>>>>>>>>>  PIXEL FORMAT:  ARGB  <<<<<<<<<<
glDrawPixels():   628.9 Mpixels/sec
glReadPixels():   630.5 Mpixels/sec (min = 309.2, max = 703.0, sdev = 54.88)
glReadPixels() accounted for 99.99% of total readback time

>>>>>>>>>>  PIXEL FORMAT:  RED  <<<<<<<<<<
glDrawPixels():   2556 Mpixels/sec
glReadPixels():   1385 Mpixels/sec (min = 801.7, max = 1584, sdev = 129.4)
glReadPixels() accounted for 99.98% of total readback time

How I installed VirtualGL

Download amdgpu-pro-21.20-1274019-ubuntu-18.04.tar.xz from here.

./amdgpu-install --opencl=legacy,rocr --pro
sudo apt install lightdm # Select lightdm for display-manager
# reboot
sudo dpkg -i virtualgl_2.6.95_amd64.deb
sudo /opt/VirtualGL/bin/vglserver_config 
# select 1 -> n -> n -> y -> x

Thank you.

dcommander commented 3 years ago

That's because you are logged into the 3D X server. VirtualGL does basically two things:

  1. VirtualGL redirects OpenGL rendering away from a "2D X server" (an X proxy such as VNC that lacks GPU-accelerated OpenGL capabilities, or an X server that is remote from the point of view of the 3D application) to a "3D X server" that has GPU-accelerated OpenGL and is local from the point of view of the 3D application. (NOTE: VirtualGL 3.0 doesn't actually need a 3D X server. It can redirect OpenGL rendering to a DRI device instead.)
  2. At designated times (such as when the 3D application calls glXSwapBuffers()), VirtualGL reads back the OpenGL-rendered frames from GPU memory and transports them to the 2D X server.
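The intended usage implied by the two steps above can be sketched as follows. This is a hypothetical session, not from the issue: it assumes an X proxy (e.g. a VNC session) is already running as display :2 on the same machine that has the GPU, and that vglserver_config has been run on the 3D X server.

```shell
# Inside the X proxy session, which acts as the "2D X server".
# vglrun redirects the application's OpenGL rendering to the 3D X server
# (:0.0 by default) and reads each frame back for display in the proxy.
export DISPLAY=:2
vglrun +v /opt/VirtualGL/bin/glxspheres64
```

Running the same command while logged in at the GPU-attached X server itself is the scenario described as freezing in this issue.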

Thus, if you are running a 3D application on a GPU-accelerated X server that is local from the point of view of the 3D application, then VirtualGL serves no useful purpose. If you need to do that for testing purposes, then you'll need to set VGL_DISPLAY=:1.0 or pass -d :1.0 to vglrun, assuming your operating system is relatively modern.

The default value of VGL_DISPLAY (:0.0) is appropriate for using VirtualGL when the 3D X server is sitting at the login prompt. (That is the purpose of vglserver_config: to grant a particular set of users access to the 3D X server while it is at the login prompt.) However, when you log in, GDM freezes Display :0.0, and only Display :1.0 can be used, and only by the user who is logged in.
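For the testing scenario described above, the workaround might look like this. The display number :1.0 is an assumption that matches the explanation; the active display on a given system may differ.

```shell
# Point VirtualGL at the logged-in user's X display instead of the
# default :0.0 (which is frozen once a user logs in at the console).
vglrun -d :1.0 /usr/bin/glxgears

# Equivalent form using the environment variable:
VGL_DISPLAY=:1.0 vglrun /usr/bin/glxgears
```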

rkoyama1623-2021 commented 3 years ago

Thank you for the detailed explanation, and apologies for not studying the documentation more closely. I have a better understanding now.