z0rc / debumblebee

DEPRECATED. Don't use it anymore — Optimus graphics support for Debian through VirtualGL
http://suwako.nomanga.net/
GNU General Public License v3.0
26 stars, 2 forks

Error when using optirun with wine #48

Closed zephyr91 closed 12 years ago

zephyr91 commented 12 years ago

Hi z0rc,

I'm getting this error when I try to open anything in wine with optirun.

XIO: fatal IO error 11 (Resource temporarily unavailable) on X server ":8.0" after 187 requests (186 known processed) with 0 events remaining.

My wine is the latest unstable version, and my debumblebee is the latest one too.

How can I fix it?

My optirun works fine with applications that don't involve wine. At least I guess it works.

Thanks for your great work with debumblebee.

z0rc commented 12 years ago

Hi,

This looks interesting. Does a simple optirun glxgears work? What is in your /var/log/Xorg.8.log? Also, could you try running wine like optirun -v wine ./game.exe and provide the output?

z0rc commented 12 years ago

Also could you clarify, are you using wine-unstable from main Debian repository or from http://dev.carbon-project.org/debian/wine-unstable/?

zephyr91 commented 12 years ago

Yes, optirun glxgears works fine. I analyzed the Xorg log and found that when optirun loads the nvidia module, the module doesn't recognize my screen size, even though the option is in xorg.conf.nvidia.

The output of the command is here: http://pastebin.com/QJ3ZbcUJ

The Xorg.8.log is here: http://pastebin.com/tTs2Wx2H

Also, I have to tell you: when I tried to run the command above, wine asked me if I wanted to install wine-gecko, and I answered no.

Yes, I'm using the latest unstable wine ($ wine --version reports wine-1.3.34). I downloaded the latest source code and compiled it myself, but a few packages were missing and I couldn't figure out which ones the configure script wanted.

These are the packages:

    configure: gstreamer-0.10 base plugins 32-bit development files not found, gstreamer support disabled
    configure: OSS sound system found but too old (OSSv4 needed), OSS won't be supported.
    configure: libgsm 32-bit development files not found, gsm 06.10 codec won't be supported.

The thing is, the game looks like it is going to start, but then it crashes. I hope you can help me.

zephyr91 commented 12 years ago

Oops, wrong click.

So those are the outputs you need, I guess. If there's anything else I need to provide, just say so, OK?

Thanks z0rc

z0rc commented 12 years ago

What game is it? Maybe I have it and can test. The unrecognized screen size could be the cause, but I'm really not sure. You can check this by running any other fullscreen application. It's also worth trying to launch the game in windowed mode. Can you launch this game without optirun? And do any other games work with optirun wine? Because this looks like a crash inside Xlib, which is used by wine itself.

The configure options look normal; the only thing I can recommend is building wine with gcc-4.5.

One last thing: you can try experimenting with the optirun -c switch; maybe another compression method will help.

z0rc commented 12 years ago

Another guess is to set the option ON_DEMAND=no. Maybe you are just using some launcher, which starts another process with the actual game and exits immediately afterwards. So optirun thinks the program has finished and stops the Nvidia X server as well.
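For reference, a minimal sketch of that setting, assuming debumblebee keeps its options in a shell-style configuration file (the exact file location is an assumption; check your own install):

```
# debumblebee option (file location is an assumption; check your install)
# no  = keep the Nvidia X server running permanently
# yes = start it on demand for each optirun invocation
ON_DEMAND=no
```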

zephyr91 commented 12 years ago

The game is Warcraft 3: The Frozen Throne. I can launch it without optirun, but I need to use -opengl for it to run smoothly.

No other game works with optirun, only the ones that don't need wine, like HoN. My wine was compiled with gcc-4.6.

It doesn't work in windowed mode either. I tried more than one game (StarCraft 2, Warcraft 3, Warcraft 3: The Frozen Throne; I'm also trying to install Diablo 2, but it doesn't work either). I'm running the programs directly from the terminal. The ON_DEMAND option is only needed when I'm on battery, right?

I'll wait for your fix, but I'm going to try other things, like installing the remaining packages missing from wine's full configuration.

zephyr91 commented 12 years ago

z0rc,

Can you clarify for me why optirun doesn't use these options in xorg?

Option      "AddARGBGLXVisuals" "True"
Option      "RenderAccel"   "True"
Option      "AllowGLXWithComposite" "True"
Option      "backingstore"  "True"
Option      "TripleBuffer"  "True"
Option      "DisableGLXRootClipping"    "True"

Section "Extensions"
    Option      "Composite" "Enable"
EndSection

z0rc commented 12 years ago

(I tried StarCraft 2, Warcraft 3, Warcraft 3: The Frozen Throne; I'm also trying to install Diablo 2, but it doesn't work either.)

Aha, I saw something similar with Diablo 2; that's what my second guess was about. You are starting the launcher, not the game binary. The launcher uses a strange mechanism which prevents tracking of child processes, so the X server shuts down before the game binary has launched, leading to the error you see. In the case of Diablo, there are two binaries in its folder: "Diablo II.exe" is the launcher, while "Game.exe" is the actual game binary. Starting the launcher leads to your problem, while Game.exe just works.

So this is mainly a problem with Blizzard games and their love of launchers.

The option ON_DEMAND=no also helps. The difference is that the Nvidia X server will run constantly, drawing some power and giving you less time on battery. Also, the power save option only works when ON_DEMAND=yes.

Can you clarify for me why optirun doesn't use these options in xorg?

Have you actually read the Nvidia documentation about these options? Most of them are set automatically, two are deprecated, and one doesn't even exist. Anyway, feel free to "tune" your xorg.conf.nvidia as you want. I'm going to stay with the defaults, as this configuration should work everywhere.

zephyr91 commented 12 years ago

I see. What about the auto-detection of the screen size? Have you made any progress?

I tested optirun with Heroes of Newerth and Doom 3. HoN started OK, but I didn't really test Doom 3 because I don't have the CD key.

I'll try again to install those games and see if they work. I just need to get wine working properly.

About the options I mentioned: "Composite" doesn't work, I guess, because when I tested it the X server froze on a black screen and I couldn't even switch to a tty.

Thanks for your work.

z0rc commented 12 years ago

What about the auto-detection of the screen size? Have you made any progress?

Ask Nvidia. debumblebee doesn't autodetect anything. It just provides a list of common screen resolutions to the nvidia module, but in your case the module doesn't accept them as suitable. What is your screen size? If it's already listed in xorg.conf.nvidia, then there is nothing I can help with.

Lekensteyn commented 12 years ago

Does it also crash without optirun? I've been trying to get COD2 to work here, but it crashes with an XIO (lost connection) error after going fullscreen.

z0rc commented 12 years ago

@Lekensteyn as I understand this error, the X server stopped while the application was still running.

Lekensteyn commented 12 years ago

@z0rc That was not the case for me; if I kill the .exe program, the GUI suddenly works again (although the resolution is messed up).

zephyr91 commented 12 years ago

Yes z0rc, it is listed there: 1366x768 (16:9). But is there a way to force xorg to use that resolution? Is there anything you can do to help me, z0rc?

z0rc commented 12 years ago

Run optirun nvidia-settings -c :8 and check which screen resolutions are available there.

zephyr91 commented 12 years ago

O.O It only recognizes 640x480 and lower... that is strange...

edit: But I saw in the advanced options that the resolution is nvidia-auto-select, and as you saw in the log, the auto-select doesn't recognize my resolution.

Lekensteyn commented 12 years ago

@zephyr91 You can put extra ModeLines in the /etc/X11/xorg.conf.nvidia file. Example data: https://github.com/Bumblebee-Project/Bumblebee/issues/67#issuecomment-1917728

zephyr91 commented 12 years ago

I didn't quite understand this, @Lekensteyn. There they say to switch UseEDID from false to true, but what should I do with the information given by xrandr --verbose?

I get this from there:

    1366x768 (0x43)   69.3MHz +HSync -VSync *current +preferred
          h: width  1366 start 1414 end 1446 total 1456 skew    0 clock   47.6KHz
          v: height  768 start  771 end  777 total  793           clock   60.0Hz

I think this is what you were telling me to configure in the Modeline, but how do I know in which order to put the numbers in xorg.conf?

edit: Even after adding the modeline given by the command above, it won't work.

This is the modeline I formed from the command; can you tell me if something is wrong?

"1366x768" 69.3 1366 1414 1446 1456 768 771 777 793 +hsync -vsync
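For what it's worth, the mapping from the xrandr --verbose numbers to Modeline fields can be sketched like this (make_modeline is a hypothetical helper for illustration only, not part of any real tool; the values are the ones quoted above):

```shell
#!/bin/sh
# Sketch: a Modeline is the mode name, the pixel clock in MHz,
# four horizontal timings (width, sync start, sync end, total),
# four vertical timings (height, sync start, sync end, total),
# and the sync polarity flags, in that order.
# make_modeline is a hypothetical helper, shown only to illustrate the order.
make_modeline() {
  printf 'Modeline "%s" %s %s %s %s %s %s %s %s %s %s %s\n' "$@"
}

# Values taken from the xrandr --verbose output quoted earlier:
make_modeline 1366x768 69.30 1366 1414 1446 1456 768 771 777 793 +hsync -vsync
```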

Lekensteyn commented 12 years ago

@zephyr91 The link showed an example of where to put the directives. Use cvt to calculate the data. For example, for 1366x768 with a refresh rate of 60 Hz:

$ cvt 1366 768 60
# 1368x768 59.88 Hz (CVT) hsync: 47.79 kHz; pclk: 85.25 MHz
Modeline "1368x768_60.00"   85.25  1368 1440 1576 1784  768 771 781 798 -hsync +vsync

zephyr91 commented 12 years ago

Well @Lekensteyn, my output is the same as yours, and it did not change anything... I don't understand this.

edit: I put the modeline in xorg.conf and nothing changed.

Lekensteyn commented 12 years ago

@zephyr91 Have you restarted the daemon after the change?

zephyr91 commented 12 years ago

No @Lekensteyn, because of the ON_DEMAND option it only starts when I run optirun. Do I need to restart it?

Lekensteyn commented 12 years ago

Nope, that won't help. What does your xorg.conf.nvidia look like?

zephyr91 commented 12 years ago

My xorg.conf.nvidia is now the debumblebee default, but when I was trying to fix the modeline it looked like this:

http://pastebin.com/32nCHCyi

But I'm not sure if it's OK. It certainly didn't work.

edit: Well, I created the modeline through the xrandr --verbose command, and it still recognizes only the 640x480 mode.

This is what my xorg.conf looks like now:

http://pastebin.com/raw.php?i=r12N5vAN

Maybe something is wrong with VGL?

If it helps, here is the optirun -v output:

    david@ZEUS:~$ optirun -v nvidia-settings -c :8
    [VGL] NOTICE: Replacing dlopen("/lib/x86_64-linux-gnu/libdl.so.2") with dlopen("libdlfaker.so")
    [VGL] Shared memory segment ID for vglconfig: 4161556
    [VGL] VirtualGL v2.3 64-bit (Build 20111202)
    [VGL] Opening local display :8
    [VGL] NOTICE: Replacing dlopen("libGL.so.1") with dlopen("librrfaker.so")
    [VGL] NOTICE: Replacing dlopen("libGL.so.1") with dlopen("librrfaker.so")

Lekensteyn commented 12 years ago

You need to add the names of the modelines to your Modes directive as well.
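In other words, the mode has to appear in two places in xorg.conf.nvidia: a Modeline in the Monitor section defines it, and the Modes line in the Screen section's Display subsection selects it by name. A minimal sketch (the identifiers and everything besides the Modeline/Modes lines are illustrative assumptions):

```
Section "Monitor"
    Identifier "Monitor0"
    # Custom mode, e.g. generated by: cvt 1366 768 60
    Modeline "1368x768_60.00"  85.25  1368 1440 1576 1784  768 771 781 798 -hsync +vsync
EndSection

Section "Screen"
    Identifier "Screen0"
    Monitor    "Monitor0"
    SubSection "Display"
        # The mode name must be repeated here, or the server will not use it
        Modes "1368x768_60.00"
    EndSubSection
EndSection
```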

zephyr91 commented 12 years ago

But the names are the "words" between quotes, right?


Lekensteyn commented 12 years ago

Oh, I missed the comment characters. If this does not work, I'm clueless. Does xrandr -display :8 -q show all the modes you set in the xorg file? Any errors in /var/log/Xorg.8.log?

z0rc commented 12 years ago

Those modelines are irrelevant to the reported problem, and a solution has already been provided. Closing.