Witko / nvidia-xrun

Utility to run a separate X with discrete nvidia graphics with full performance
GNU General Public License v2.0

Investigate functionality without login to separate tty #4

Open Witko opened 9 years ago

Witko commented 9 years ago

Logging in to a separate tty is a bit troublesome. Investigate possibilities to spawn a new X directly from the active session.

Fincer commented 8 years ago

Yeah, I find this script a bit troublesome as well. Or at least, bring support for Xephyr - a nested X server session. In its current state, the nvidia-xrun script is not very practical for daily usage because it requires a separate tty which is totally separated from your current X session. However, there is great potential in nvidia-xrun, and the performance boost compared to a pure Bumblebee solution is tremendous.

For basic info about Xephyr, see:

https://en.wikipedia.org/wiki/Xephyr
https://wiki.archlinux.org/index.php/Xephyr

Witko commented 8 years ago

Thanks mate! I will definitely have a look into Xephyr. But from a brief look I see that hardware acceleration is only in some forked version. Also, it might not be possible to use a different video card than the "hosting" X server. But I will definitely have a deeper look. Thanks!

Fincer commented 8 years ago

No probs!

Hmm... unfortunate to hear that. So it might be a bit challenging to get nvidia-xrun to work that way. At least, I hope you give Xephyr a shot and can figure this issue out sooner or later. It would significantly boost the usability of your script for daily usage.

The lack of hardware acceleration support in the original Xephyr is something that shouldn't be there. Seriously. It can be a major obstacle to getting rid of the separate tty session that nvidia-xrun (and Optimus setups overall) currently require.

However...

Not sure if this is any help for you but anyway... I've used Xephyr + optirun with the following script:

DISPLAY=:0 Xephyr +extension GLX -br -ac -screen 1280x960 :1 &
export DISPLAY=:1
optirun $HOME/.nwn/nwn

The script launches a new Xephyr window (display :1) with 1280x960 resolution and points the Neverwinter Nights executable (nwn) at that window using the optirun command.

Fincer commented 8 years ago

Alright, I have a crude workaround for this issue. I'm using KDE (Qt4). Simply switching to another TTY session and executing

nvidia-xrun startkde

gives me a secondary KDE session (desktop) which uses the discrete Nvidia card for any program I run there. If I don't need the Nvidia card anymore, I simply log out of that KDE session and switch back to the primary TTY session, which uses the Intel card instead.

If you use a different desktop environment, use the corresponding command to launch it in your secondary TTY session:

XFCE: nvidia-xrun startxfce4
LXDE: nvidia-xrun startlxde
Gnome: nvidia-xrun gnome-session

etc.

This is still not a great or user-friendly solution but, at least, it's better than nothing at all.


And if anyone reads this post, stay tuned! Valuable information ahead.

The nvidia-xrun script gives a significant boost to any program using the discrete Nvidia card on Optimus laptops.

For example, glxgears performs as follows:

with optirun/primusrun:

6798 frames in 5.0 seconds = 1359.468 FPS
6740 frames in 5.0 seconds = 1347.828 FPS
6885 frames in 5.0 seconds = 1376.830 FPS
6841 frames in 5.0 seconds = 1368.175 FPS
6813 frames in 5.0 seconds = 1362.424 FPS
6887 frames in 5.0 seconds = 1377.235 FPS
6814 frames in 5.0 seconds = 1362.665 FPS

and with nvidia-xrun:

71415 frames in 5.0 seconds = 14282.940 FPS
77207 frames in 5.0 seconds = 15441.392 FPS
77668 frames in 5.0 seconds = 15533.493 FPS
77224 frames in 5.0 seconds = 15444.704 FPS
76825 frames in 5.0 seconds = 15364.848 FPS
75690 frames in 5.0 seconds = 15137.942 FPS
76683 frames in 5.0 seconds = 15336.475 FPS

As can be seen, the performance increase with the nvidia-xrun script is roughly 1015% on average, i.e. about 11 times the optirun/primusrun frame rate.
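For the record, the ~11x figure checks out; here is a quick shell sketch recomputing the averages from the glxgears numbers posted above:

```shell
# Average the FPS values quoted above and compute the speedup ratio.
avg() { awk '{ s += $1 } END { printf "%.0f", s / NR }'; }

OPTIRUN=$(printf '%s\n' 1359.468 1347.828 1376.830 1368.175 1362.424 1377.235 1362.665 | avg)
XRUN=$(printf '%s\n' 14282.940 15441.392 15533.493 15444.704 15364.848 15137.942 15336.475 | avg)
SPEEDUP=$(awk -v a="$XRUN" -v b="$OPTIRUN" 'BEGIN { printf "%.0f", a / b }')

echo "optirun avg:     $OPTIRUN FPS"   # ~1365
echo "nvidia-xrun avg: $XRUN FPS"      # ~15220
echo "speedup:         ${SPEEDUP}x"    # ~11x
```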

I've also tested your script with Wine + Warhammer 40,000: Dawn of War II, and the performance difference is... like from another planet. With primus/optirun I used the lowest settings I could set and got like 10-20 FPS and warnings about multiplayer lag. With nvidia-xrun I can play the game with high/ultra settings and still get 35-50 FPS.

I'm using an Asus N56JR laptop with an Nvidia GeForce GTX 760M card.

Just before using the script, make sure you have taken care of the laptop cooling, because heat can be a significant problem in the long run.

Thanks for the script, Witko! I hope you keep developing it because it truly has great potential.

Witko commented 8 years ago

Hi Fincer! Thanks for your enthusiasm! I had the same issue - I have a GTX 980M and was pretty excited to see games running fluently on ultra - and instead I saw worse performance than on the integrated graphics. So I had to do something to get it working, as Steam already brings many games to Linux and I don't want to switch to Windows for them. As for your workaround - I usually run openbox because I had issues running Steam directly and didn't want to spend days fixing it. This works OK (as it does for you) but it's still not 100% as convenient as I would like. For me it was good enough, but as I can see there are more of us now :) So hopefully we will find some way to make it more user-friendly. BTW, do you use Arch Linux?

Fincer commented 8 years ago

Np. Your script targets real performance issues and handles them very well. It's not perfect usability-wise, as discussed above, but I'm quite happy with its current state given the performance it offers for games and other graphics-intensive programs.

To be honest, your script is the only one I've found so far targeting the Optimus issue - excluding bumblebee/ironhide (both pretty dead now, no updates for 3-4 years) and Nvidia's official support (which is bad because, as far as I understand, it forces the user to log in/out in order to switch between cards).

I still wonder why the bumblebee devs didn't focus more on the performance issue, because it still exists and affects thousands of users, at least. Too many users out there without a real answer to the performance issue they face in daily computer usage.

To summarise, I'm not satisfied with the current official solution by Nvidia, nor with bumblebee because of its very poor performance. Your script takes good care of the performance issue and, simultaneously, eliminates the need to fully log in/out as apparently required by nvidia-prime. In other words: I can use both the Nvidia and Intel cards efficiently - not in the same session, but I can still switch between two sessions. If you can find a user-friendly solution to this usability issue (the TTY session stuff), the script would be perfect.

As for Optimus technology overall, I've read that Wayland would put an end to the use of these hacky scripts:

https://blogs.gnome.org/uraeus/2015/08/19/fedora-workstation-next-steps-wayland-and-graphics/

However, I think that's still a few years away because Wayland is still under development. And waiting a few years... nah...

As for your question: how did you know? :D Yeah, I use Arch Linux. Pretty happy with it, though setting it up is a story of its own.

Witko commented 8 years ago

Yep, you are right about the current state. optirun/primusrun is so slow because it starts another X server and then copies each frame to the primary X, as far as I know (or something similar). So it will hardly ever perform well. I'm also looking forward to Wayland, but it seems that's going to take some more time. As for Arch Linux - it's the only distro I created a package for; it's in the AUR. And it seems to me this script is hard to use without it - you need to know where to put the files, etc.

Fincer commented 8 years ago

Yeah, that's the case. The way optirun/primusrun handles drawing of graphics on the screen has, unfortunately, major performance drawbacks as we all know.

I installed this script using the AUR with success (yes, it works). I thought it was someone else who created that AUR package. Anyway, thumbs up for Arch Linux! One happy user here.

I'm pretty sure someone could make, for example, a Debian package as well. Still, as you're the developer, you know better than I do, so I'm not going to argue about that with you... Automated, easy installation would make this script more attractive to many users, however.

Witko commented 8 years ago

Hi Andennn, please create an issue for this. And please write there also whether you use the AUR package or not. Thanks!

Witko commented 8 years ago

It seems Xephyr runs on top of the underlying X in terms of graphics, so I believe it's not going to work. Another option might be Xnest, but this seems to sit atop the host X too.

Witko commented 8 years ago

I found another option:

setsid sh -c 'exec nvidia-xrun openbox-session <> /dev/tty3 >&0 2>&1'

This looks promising, but there are quite a few issues to solve:
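For anyone puzzling over that one-liner, here is an annotated breakdown. The command is only echoed here, not executed, since it assumes tty3 is a free virtual terminal and openbox is installed:

```shell
# Annotated form of the setsid one-liner:
#
#   setsid          start in a new session, detached from the current terminal
#   sh -c 'exec …'  replace the shell with nvidia-xrun so it owns the session
#   <> /dev/tty3    open tty3 read/write as stdin (X wants a controlling VT)
#   >&0 2>&1        send stdout and stderr to that same tty
CMD="setsid sh -c 'exec nvidia-xrun openbox-session <> /dev/tty3 >&0 2>&1'"
echo "$CMD"
```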

hlechner commented 7 years ago

Hey @Witko, thanks for previously looking into this issue.

I've made some tests to run it directly on an already-running X through a terminal emulator.

First, to be able to run it outside a tty, you need to change/create the file /etc/X11/Xwrapper.config, adding the following lines:

allowed_users = anybody
needs_root_rights = no

and, just to test it, you need to fake a virtual console in the nvidia-xrun file (it must run on a different console than the already-running X).

So you can change from:

LVT=`fgconsole`

to

LVT="8" 
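If you'd rather script that change than edit by hand, a sed substitution works. Shown here on a throwaway stand-in file, since the real script's path depends on your install:

```shell
# Demonstrate the LVT substitution on a copy rather than the installed
# script; /tmp/nvidia-xrun-test is just a stand-in for illustration.
printf 'LVT=`fgconsole`\n' > /tmp/nvidia-xrun-test
sed -i 's/LVT=`fgconsole`/LVT="8"/' /tmp/nvidia-xrun-test
cat /tmp/nvidia-xrun-test   # prints: LVT="8"
```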

However, xinit crashes when using the argument -config nvidia-xorg.conf (only when running outside of a tty).


log:

(==) Log file: "/var/log/Xorg.1.log", Time: Mon Dec 19 01:21:36 2016
(++) Using config file: "/etc/X11/nvidia-xorg.conf"
(==) Using system config directory "/usr/share/X11/xorg.conf.d"
Xorg: privates.c:385: dixRegisterPrivateKey: Assertion `!global_keys[type].created' failed.
xinit: giving up
xinit: unable to connect to X server: Connection refused
xinit: server error
hlechner commented 7 years ago

I have found a solution with openvt, and I've also sent you a pull request.

I hope you like it.
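For anyone following along: I haven't seen the pull request itself, so the exact invocation may differ, but the general openvt shape would be something like the following (echoed rather than run, since it needs root and a real console):

```shell
# Sketch of the openvt approach (the actual pull request may differ).
# openvt allocates the first free virtual terminal and runs the command
# there, so no manual Ctrl+Alt+F* switching is needed.
#
#   -s  switch to the new VT immediately
#   -w  wait for the command to finish, then switch back
CMD='sudo openvt -s -w -- nvidia-xrun openbox-session'
echo "$CMD"
```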

nvidiaswitch commented 2 years ago

I have made a script similar to nvidia-xrun, but without needing to change TTY. I believe my alternative fixes this problem.