ArchangeGabriel opened this issue 13 years ago
About a week ago, I accidentally succeeded in accessing a pure Nvidia desktop using hybrid-windump.
I cannot recall exactly how it happened, but I had the dual-head xorg.conf setup with gebart's windump and was screwing around with mplayer2.
The next thing I know, my mouse is trapped in the dumped Nvidia screen and can't escape. That didn't strike me as much of a problem, though, because it meant I had solved the problem of not being able to interact with the Nvidia desktop once the window was dumped.
Do note that metacity must be started from the terminal or tty in order for the mouse to work on the Intel screen.
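For example (assuming the Intel server is :0), from a tty:
DISPLAY=:0 metacity --replace &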
Further research is needed.
Okay, I've figured out how to control the Nvidia screen on demand.
Assuming that the Nvidia screen is set to match the Intel screen's native resolution in the xorg.conf, moving my mouse to the left side of the screen will bring it into the dumped Nvidia screen.
Going forward, I have a few proposals.
Also, something of note: the windump approach does expose the NV17 Video Texture output over Xv in SMPlayer.
Some big developments.
Thanks to Fabio Giovagnini's post on the mailing list, I've been able to trim the requisite xorg.conf to something far simpler and manageable: http://paste.ubuntu.com/695277/
This configuration allows windump to work with distro-packaged versions of the nvidia binary OR nouveau (no OpenGL with nvc3 but X11 works).
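In case the paste link goes away, here's a rough sketch of what such a dual-head layout looks like. This is my own reconstruction, not the linked file; the identifiers and BusIDs are examples, so check lspci for yours:
Section "ServerLayout"
    Identifier "Layout0"
    Screen 0 "IntelScreen" 0 0
    Screen 1 "NvidiaScreen" LeftOf "IntelScreen"
EndSection

Section "Device"
    Identifier "IntelDevice"
    Driver     "intel"
    BusID      "PCI:0:2:0"
EndSection

Section "Device"
    Identifier "NvidiaDevice"
    Driver     "nouveau"   # or "nvidia" for the binary driver
    BusID      "PCI:1:0:0"
EndSection

Section "Screen"
    Identifier "IntelScreen"
    Device     "IntelDevice"
EndSection

Section "Screen"
    Identifier "NvidiaScreen"
    Device     "NvidiaDevice"
EndSection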
Edit: NV17 is exposed to the Intel display in dual-head xorg configurations such as the one linked. Windump is not necessary. Also, bumblebee will load the above xorg.conf if you use it as xorg.conf.nvidia.
I was just about to try it, but I don't want to fry anything... is it safe to use? Like this: /etc/bumblebee/xorg.conf.nvidia
I should clarify that using the xorg.conf as xorg.conf.nvidia does not enable VDPAU. I was just saying that Bumblebee loads it with no problems.
So it's safe to use for enabling an external HDMI monitor? I will try...
On 09/23/2011 10:37 PM, G wrote:
So it's safe to use for enabling an external HDMI monitor? I will try...
If you have muxless Optimus, your HDMI port may or may not be physically hooked up to the Nvidia GPU directly.
If it is hooked up, you might be able to use the Nvidia card exclusively on that display, but not the LVDS.
I tried a few different ways; it didn't work. The HDMI TV monitor was recognised in the nvidia-settings panel, but as disabled. There was an option in the config to choose it as a single X screen (of which there is only one), or TwinView. I'm afraid to click the "Save settings to xconfig" option as I'm not sure which xconfig it will change. It would be handy to have two monitors going, or even just a big one.
On 09/23/2011 11:35 PM, G wrote:
I tried a few different ways; it didn't work. The HDMI TV monitor was recognised in the nvidia-settings panel, but as disabled. There was an option in the config to choose it as a single X screen (of which there is only one), or TwinView. I'm afraid to click the "Save settings to xconfig" option as I'm not sure which xconfig it will change. It would be handy to have two monitors going, or even just a big one.
I think you're getting a bit ahead of yourself and beyond the scope of this bug entry.
There are still gl_conf assignments I need to work out before even attempting windump on a production machine (which I do because I'm a masochist). I only get this to work because I shift the assignment back and forth by dpkg'ing libgl1-mesa-glx and nvidia-current (assuming Ubuntu and its derivatives).
Besides, I don't know what hardware you are running or what you are trying to do. However, as a rule of thumb, an unmodified file generated by nvidia-xconfig is not going to be useful or usable.
Please keep in mind that the frame of reference for this entry assumes a muxless DSM-only notebook with no MXMX/MXDS calls. It is also assumed that the HDMI ports are wired to the Intel chip and that we are only seeking Nvidia output on the LVDS. This is all of course to keep things simple and eliminate variables.
Screenshot: http://i.imgur.com/AxQEV.jpg mplayer2 output: http://pastebin.com/6D6m99D0 Screencast: http://www.youtube.com/watch?v=pxziIAPFIFY
What is that? It's VDPAU through Bumblebee+Windump!
Now, there are still a few problems that need to be addressed. The main one is the presence of two cursors that don't always align, plus no keyboard support. The second is that Bumblebee doesn't have a comprehensive list of modelines (1080p especially) like Ironhide's xorg.conf.nvidia does, so the dumped window may be limited to 1024x768 or 1360x768. Even if the dumped window matches your LVDS resolution, there are still issues of window overlap, window priority, and Compiz being a bit fickle. gnome-panel and gnome-shell also seem to interfere with fullscreening the dumped window. There are also minor cosmetic issues like theming, but whatever.
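For reference, a 1080p mode can be added to xorg.conf.nvidia with a standard CVT modeline. This is just a sketch: the "DiscreteMonitor" identifier is a placeholder, and the Monitor section has to be referenced from the matching Screen section.
Section "Monitor"
    Identifier "DiscreteMonitor"
    # Standard CVT timing, generated with: cvt 1920 1080 60
    Modeline "1920x1080_60.00"  173.00  1920 2048 2248 2576  1080 1083 1088 1120 -hsync +vsync
EndSection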
Want to try? Don't worry! This shouldn't hose your system like previous approaches.
NB: This also works with Ironhide. Also, unlike the previous post, dumping is needed to expose NV17.
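For the curious, the general shape of such a VDPAU run looks something like this (a sketch, not the exact commands used above; video.mkv is a placeholder, and forcing the H.264 VDPAU codec with -vc is just one way to do it):
# Start the Bumblebee X server, then decode on the Nvidia card via VDPAU
optirun glxinfo
DISPLAY=:8 mplayer2 -vo vdpau -vc ffh264vdpau video.mkv &
# ...then dump the resulting mplayer2 window to the Intel display with hybrid-windump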
Video of progress that highlights the dumping of single windows using the latest version of hybrid-windump http://www.youtube.com/watch?v=WVBMLdeRoUU
I think we can start thinking about using hybrid-windump as an alternative backend for Bumblebee. It seems closer to the hardware than the network approach. Thanks for the progress on this one, @LLStarks. We'll try to catch up.
To that end, I've forked harp1n's windump to make rendering through Bumblebee possible again. The latest commits broke that functionality.
https://github.com/LLStarks/hybrid-windump
I'm finding Bumbledump to be far more fickle than Windump by itself, and a lot of work needs to be done to get the xorg.conf(.nvidia) to correctly handle mouse/keyboard events and offer a wide array of resolutions. The modelines Ironhide uses should suffice for the latter. There's also a question of whether the two-screen behavior is appropriate; I see little need to have the Nvidia screen to the left or right of the Intel screen.
At any rate, here's a wiki page featuring demos of Windump by itself and through Bumblebee. https://github.com/Bumblebee-Project/Bumblebee/wiki/Bumbledump
About modelines: we should not use those from Ironhide; however, we can get them entirely from the intel driver. Read this: https://github.com/Bumblebee-Project/Bumblebee/issues/67#issuecomment-1917728
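For instance (a sketch, assuming the default log path), the panel modelines the intel driver detected can be pulled straight out of the X log and copied into the Monitor section of xorg.conf.nvidia:
# The intel driver logs the panel's EDID modelines at startup
grep -i modeline /var/log/Xorg.0.log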
I'm trying to find an external monitor to play with; I may have found one.
I have a few little heads-ups on this one:
I tried @LLStarks's hybrid-windump, and I have really great news: it improves performance, I mean A LOT! Even though the current windump is really unusable for any practical purpose, it will serve as a proof of concept.
Running on Nouveau, all the tests are great.
$ optirun glxspheres
Polygons in scene: 62464
Visual ID of window: 0x21
Context is Direct
OpenGL Renderer: Gallium 0.4 on NVA8
30.888309 frames/sec - 27.538780 Mpixels/sec
Not good, huh? Well, that's because VirtualGL "renders" twice: once on the server side (where we want to offload the graphics) and once as 2D rendering on the client side, which unfortunately loads the main CPU core and is awful at that. But now look what happens when using windump:
DISPLAY=:8 glxspheres
Polygons in scene: 62464
Visual ID of window: 0x11c
Context is Direct
OpenGL Renderer: Gallium 0.4 on NVA8
147.114528 frames/sec - 164.179813 Mpixels/sec
Here the window is not dumped yet. When I dump it:
137.518458 frames/sec - 153.470599 Mpixels/sec
So, minimal overhead on the main core, and the rendering is pretty good. And this is under Gallium3D!
hybrid-windump as it is now is unusable, but it will get better, I think :)
In @LLStarks we trust
I will take a look at all these interesting things here, but I'm very busy at the moment.
@LLStarks, just to note something: to be able to use the keyboard on the dumped window, you need to comment out the option AutoAddDevices or set it to true. That will add all the control devices, including keyboard and mouse. The option is set to false so that the X server starts faster.
...
Section "ServerLayout"
Identifier "Layout0"
# Option "AutoAddDevices" "false"
EndSection
...
On 09/30/2011 01:01 PM, Joaquín Ignacio Aramendía wrote:
@LLStarks, just to note something: to be able to use the keyboard on the dumped window, you need to comment out the option AutoAddDevices or set it to true. That will add all the control devices, including keyboard and mouse. The option is set to false so that the X server starts faster.
...
Section "ServerLayout"
    Identifier "Layout0"
    # Option "AutoAddDevices" "false"
EndSection
...
Yup. Keyboard now works.
I haven't been able to get the mouse working for the past few days though.
Has anyone been able to get glxgears, glxspheres, or mplayer gl output to work?
@LLStarks: I did. The problems I found:
- Couldn't get a single-window dump
- The dump is fixed to 1024x768, with no background or compositor (you can run them, but in truth they only theme the cursor and lower performance)
- When the window is closed, the app still runs on the second server (this is the major problem)
On the bright side: it improves performance by 4 to 5 times
On 10/02/2011 05:03 PM, Joaquín Ignacio Aramendía wrote:
@LLStarks: I did. The problems I found: ... On the bright side: it improves performance by 4 to 5 times
Single-window dump can be done as follows. Until we add larger resolutions to Bumblebee, the dumped window cannot be larger than 1024x768.
- Start the X server: optirun glxinfo
- Start the program on :8: DISPLAY=:8 glxgears
- Find the window ID (hex): DISPLAY=:8 xwininfo -root -children
- Dump the window: ./windump -w 1 -i 0x200002 :8 :0
In order to streamline this, we'd need a method to dump every parent window created on the second X server EXCEPT the root if it has not already been dumped. Yet to the best of my knowledge, we cannot know the window ID before we launch the application we want to run on the second X server.
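As a stopgap, something like the following could approximate that. It's a rough, untested sketch: it polls the child list once a second, skips the root window, and assumes ./windump is the hybrid-windump binary and that :8 is already up.
#!/bin/sh
# Dump every new top-level window on :8 to :0, skipping the root window.
ROOT=$(DISPLAY=:8 xwininfo -root | sed -n 's/.*Window id: \(0x[0-9a-f]*\).*/\1/p')
SEEN=" $ROOT 0x0 "
while :; do
    for ID in $(DISPLAY=:8 xwininfo -root -children | grep -o '0x[0-9a-f]*'); do
        case "$SEEN" in
            *" $ID "*) ;;                       # root, or already dumped
            *) SEEN="$SEEN$ID "
               ./windump -w 1 -i "$ID" :8 :0 &  # dump in the background
               ;;
        esac
    done
    sleep 1
done
Note that it only ever learns about windows and never forgets them, so it doesn't solve the closed-window problem mentioned above.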
LLStarks found an interesting and somewhat complete solution in xpra. The performance is the same as windump, but it has the one problem that the transport is via jpeg. The frames are rendered faster than they are displayed, and that causes some graphics corruption/flickering. For those who want to test this, see the sketch below.
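Roughly (a reconstruction from the commands mentioned in this thread, not the exact original steps):
# Bring up the Bumblebee X server, then let xpra take over :8 in place
optirun glxinfo
xpra start :8 --use-display
# Run the application against :8...
DISPLAY=:8 glxgears &
# ...and attach to the session from the main display
xpra attach :8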
The trunk svn also has png and rgb24 modes, but they don't seem to help so far.
Also, I think it might be more appropriate to do xpra upgrade :8 instead of start.
Despite the unusable jpeg problem, I found a way to set the virtual screen the Nouveau driver uses as a monitor to render to: adding a minimal Screen section matching the resolution of the LVDS improves performance a little (at least in the numbers):
Section "Screen"
Identifier "screen1"
DefaultDepth 24
SubSection "Display"
Depth 24
Virtual 1366 768
EndSubSection
EndSection
A thought occurred.
It's already very easy to screw up gl_conf assignments by installing/removing/reinstalling the mesa GLX, Nvidia GLX, or Bumblebee packages. If we replace VirtualGL as a backend, we'd need to ensure that GL works on both X servers; otherwise we'd hit "Error: couldn't get an RGB, Double-buffered visual" messages on the Nvidia one.
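A quick sanity check for both servers (just a sketch) would be:
# Both should report direct rendering and a sane renderer string;
# if the :8 one errors out, the gl_conf alternative points at the wrong libGL.
glxinfo | grep -E "direct rendering|OpenGL renderer"
DISPLAY=:8 glxinfo | grep -E "direct rendering|OpenGL renderer"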
I think we should test this stuff in a less problematic environment (i.e. not the nvidia driver). Nouveau performs well enough, at least to make things work.
Btw, to ensure hardware acceleration, use "xpra start :8 --use-display".
I'm a bit curious as to how VirtualGL handles cursors and resizing.
The two primary pitfalls of using Xpra and Windump are that neither of the above are handled properly.
How do you create a nested X server that can handle input (mouse AND touchpad) yet not create a second cursor?
Also, I really like Windump's -i flag. Not knowing the window hex ID until you run the dumped app isn't much of a problem, though it would be more ideal to refer to said app by its window name or by the PID of its root window. What I would like to implement, if I ever have time, is a way for Windump to account for changes in window size or hex ID as programs like mplayer2 go in and out of fullscreen.
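In the meantime, something like xdotool can already resolve a window ID by name on the second server (a sketch; assumes xdotool is installed, and "mplayer" is just an example pattern):
# xdotool prints decimal window IDs, so convert to hex for windump's -i flag
ID=$(DISPLAY=:8 xdotool search --name mplayer | head -n 1)
./windump -w 1 -i "$(printf '0x%x' "$ID")" :8 :0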
As for Xpra, it simply doesn't have enough memory bandwidth, even with the new mmap transport. Using it as I did was a novel idea, but it is too VNC-like and suffers from the same principles.
Btw, a proof-of-concept status update. Most of this is just me playing around with no care for implementation.
One important thing to note is that I didn't need to break Intel 3D to achieve any of it. All I did was use the Windump/Xpra approach and the requisite library paths.
Hi.
I was wondering if there have been any advances in the support of VDPAU through Windump and its integration with Bumblebee. I browsed the web looking at the different walkthroughs you detailed, but none of them seems up to date, and so far I have failed to decode video with VDPAU via Windump. Could you recap what there is to know on the matter as of now?
There's no point right now.
Prime and drvscreen are almost ready to land upstream.
Is there any progress on that?
This is a feature request.
The goal we are trying to reach is a --vdpau option for optirun.
The most promising approach for now looks to be hybrid-windump.
Old discussion is available here : https://github.com/MrMEEE/bumblebee-Old-and-abandoned/issues/531.
I will make a summary of it here as soon as possible.