MathewWi / desmumewii

Automatically exported from code.google.com/p/desmumewii
GNU General Public License v3.0

Remove OpenGL #8

Closed GoogleCodeExporter closed 9 years ago

GoogleCodeExporter commented 9 years ago
Just a note to self: remove OpenGL from the project; it's not needed!

Original issue reported on code.google.com by iamsca...@gmail.com on 27 Mar 2010 at 5:11

GoogleCodeExporter commented 9 years ago
... How so? I'm under the impression that OpenGL is the main 3D render function for DeSmuME. So, unless you're planning a port over to GX or something like that, OpenGL support is kind of needed... Correct me if I'm wrong.

Original comment by kazuhiro...@gmail.com on 27 Mar 2010 at 6:33

GoogleCodeExporter commented 9 years ago
I mean... Even the 2d gfx are drawn in OpenGL. >_>

Original comment by kazuhiro...@gmail.com on 27 Mar 2010 at 6:35

GoogleCodeExporter commented 9 years ago
OpenGL is only there for hardware rendering on PC. Whether you use SDL or GX makes no difference (SDL is a wrapper around GX anyway). Remember, this is an emulator: the end result of the emulation ends up in a buffer, "GPU_screen", and all that's needed then is to blit this buffer to the screen.

Original comment by iamsca...@gmail.com on 27 Mar 2010 at 6:37
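A minimal sketch of the blit described above, assuming SDL 1.2 and the 5,5,5 u16 layout mentioned later in this thread; the channel masks and the 256x384 stacked-screen size are assumptions, not project code:

#include <SDL/SDL.h>
#include <stdint.h>

/* The DeSmuME core exposes the emulated screens in this buffer:
   two 256x192 screens stacked, 2 bytes per pixel. */
extern uint8_t GPU_screen[256 * 384 * 2];

void draw_frame(SDL_Surface *screen)
{
    /* Wrap GPU_screen in a surface; masks assume R in the low bits
       (swap Rmask and Bmask if the colors come out reversed). */
    SDL_Surface *src = SDL_CreateRGBSurfaceFrom(
        GPU_screen, 256, 384, 15, 256 * 2,
        0x001F, 0x03E0, 0x7C00, 0);
    SDL_BlitSurface(src, NULL, screen, NULL); /* convert and copy */
    SDL_FreeSurface(src);
    SDL_Flip(screen);                         /* present the frame */
}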

GoogleCodeExporter commented 9 years ago
Ahh, gotcha. Sorry, I assumed otherwise. I'm not exactly a hardcore programmer like you guys on this team. BTW, great progress, you've just been dishin' out updates, Scanff. Keep it up. ^_~

Original comment by kazuhiro...@gmail.com on 27 Mar 2010 at 6:48

GoogleCodeExporter commented 9 years ago
I would still recommend you guys try to use GX for rendering. SDL allocates TWO 32-bit screen-sized buffers: one for your data, and one for converting it to the weird 4x4 tiled pixel format the Wii uses. That adds up to around 2.5 MB for the screen alone, and when you're emulating games that can use up to 512 MB of data, you need all the space you can get.

For example: if you allocate four 256x192 textures (two per screen, one each for data and conversion), you use up 786,432 bytes, while SDL's two 640x480 buffers use up 2,457,600 bytes, over three times as much. This can also make it easier to do fancy layering for the screens (like a seamless transition to the bottom screen when viewing fullscreen) without having to do all the blending in software, not to mention hardware scaling. Although I'm kind of getting ahead of the current status of the emulator.

Original comment by baby.lueshi@gmail.com on 28 Mar 2010 at 5:43
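For reference, the arithmetic above checks out; a throwaway snippet, not project code:

#include <stdio.h>

int main(void)
{
    int gx  = 4 * 256 * 192 * 4; /* four DS-sized 32-bit textures   */
    int sdl = 2 * 640 * 480 * 4; /* SDL's two 32-bit screen buffers */
    printf("GX:  %d bytes\n", gx);                /* 786,432           */
    printf("SDL: %d bytes (%.2fx)\n",
           sdl, (double)sdl / gx);                /* 2,457,600 (3.13x) */
    return 0;
}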

GoogleCodeExporter commented 9 years ago
Yep, I know. I agree that in the long term we shouldn't use SDL; the sound system is also using SDL. Converting everything to GX right now just adds more bugs.

Original comment by iamsca...@gmail.com on 28 Mar 2010 at 6:28

GoogleCodeExporter commented 9 years ago
Here's a sort-of-working patch for ditching SDL for video and using straight-up GX. It works, except for a couple of bugs:

* Colors are mixed up because I can't find any solid clues on how the screen buffer formats its pixels. Somebody more knowledgeable about DeSmuME than me should be able to fix this easily.

* The console flickers horribly, most likely due to using two framebuffers. This patch turns it off by default. Once again, if somebody knows how to fix this, feel free.

* Because SDL stores a lot of its data in the stack, this patch alone isn't really freeing up memory. However, this kind of puts us halfway there, since all we really need SDL for now is sound.

Original comment by baby.lueshi@gmail.com on 29 Mar 2010 at 7:31
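On the two-framebuffer point: the usual libogc double-buffer flip looks roughly like the sketch below. This is just the standard pattern for reference, not the actual patch:

#include <gccore.h>

static void *xfb[2];
static int cur = 0;
static GXRModeObj *rmode;

void video_init(void)
{
    VIDEO_Init();
    rmode  = VIDEO_GetPreferredMode(NULL);
    xfb[0] = MEM_K0_TO_K1(SYS_AllocateFramebuffer(rmode));
    xfb[1] = MEM_K0_TO_K1(SYS_AllocateFramebuffer(rmode));
    VIDEO_Configure(rmode);
    VIDEO_SetNextFramebuffer(xfb[0]);
    VIDEO_SetBlack(FALSE);
    VIDEO_Flush();
    VIDEO_WaitVSync();
}

void flip(void)
{
    cur ^= 1;                           /* swap buffers          */
    VIDEO_SetNextFramebuffer(xfb[cur]);
    VIDEO_Flush();
    VIDEO_WaitVSync();                  /* sync to avoid tearing */
}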

GoogleCodeExporter commented 9 years ago
Your patch is awesome. It does way more good than bad (and I'm now working on fixing the console, colors, etc.). I've committed it and added you as a committer to the project.

Original comment by castleva...@yahoo.com on 29 Mar 2010 at 2:04

GoogleCodeExporter commented 9 years ago
The colors are stored as 5,5,5 in the u16s of GPU_screen. If you look at the masks in the SDL code, it will give you some clues on how to do this. There's also the screen-capture code (bmp/png), which will help you out. I'll take a look later too.

Original comment by iamsca...@gmail.com on 29 Mar 2010 at 4:30
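A minimal sketch of the 5,5,5 unpack being described, assuming R sits in the low 5 bits of each u16 (swap the r and b extractions if the channels come out reversed):

#include <stdint.h>

/* Expand one 15-bit GPU_screen pixel to 8-bit channels. */
static inline void unpack555(uint16_t px,
                             uint8_t *r, uint8_t *g, uint8_t *b)
{
    *r = (px & 0x1F) << 3;          /* bits 0-4   (assumed red)  */
    *g = ((px >> 5) & 0x1F) << 3;   /* bits 5-9   (green)        */
    *b = ((px >> 10) & 0x1F) << 3;  /* bits 10-14 (assumed blue) */
}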

GoogleCodeExporter commented 9 years ago
Like this: {0 RRRRR GGGGG BBBBB}? I've been trying all day to make a function for converting the pixels, but failed. (I've tried both {0 RRRRR GGGGG BBBBB} and {RRRRR GGGGG BBBBB 0}.) I don't know if it's the framebuffer pixel format that fools me or if I just screwed up the function.

Do you know the framebuffer pixel format, BTW? If you don't, here's what my research has led to:

Format:
M III RRRR GGGG BBBB

where I seems to be the color intensity. The strange M bit seems to mess everything up when it's set (which is why I named it M). Is it a flag to set the pixel format or what? I'm totally lost here.

Original comment by profetylen@gmail.com on 29 Mar 2010 at 9:35
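For what it's worth, the layout described above matches GX's RGB5A3 texture format (GX_TF_RGB5A3): the top bit selects between an opaque 1 RRRRR GGGGG BBBBB encoding and a translucent 0 AAA RRRR GGGG BBBB one, so the "M" bit is a format flag and "III" is alpha. A sketch of packing into it, under that assumption:

#include <stdint.h>

/* Pack 8-bit channels into RGB5A3: top bit set means opaque 5-5-5,
   top bit clear means 3-bit alpha plus 4-4-4 color. */
static inline uint16_t pack_rgb5a3(uint8_t r, uint8_t g,
                                   uint8_t b, uint8_t a)
{
    if (a >= 0xE0) /* close enough to opaque: use 5-5-5 */
        return 0x8000 | ((r >> 3) << 10) | ((g >> 3) << 5) | (b >> 3);
    return ((a >> 5) << 12) | ((r >> 4) << 8)
         | ((g >> 4) << 4)  | (b >> 4);
}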

GoogleCodeExporter commented 9 years ago
Since we've done this now, I'm going to close this issue.

Original comment by castleva...@yahoo.com on 4 Apr 2010 at 7:21