8 bits also show problems (see #2, although some of it is my fault). 16 bits seem to work fine though.
The game Tantric works on seems to use 16-bit depth (at least in the upstream source) :-)
Original comment by simon.ka...@gmail.com
on 29 Apr 2009 at 4:06
To add a little info to this -
I think the issue is in WII_FlipHWSurface() and the calculation:
int rowpitch = (this->hidden->pitch >> 4) * 3;
I think this is specific to 16-bit surfaces. I know that GX stores the image in 4x4 tiles for optimization. We'd need to find an equation that supports 8-, 24- and 32-bit.
Original comment by iamsca...@gmail.com
on 29 Apr 2009 at 7:54
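For context on the tile math discussed above: GX textures are stored as 4x4 texel tiles, so the byte size of one tile row depends on the texel size. A minimal sketch of that arithmetic (the helper name and layout are illustrative, not taken from the actual port):

```c
#include <stddef.h>

/* Bytes occupied by one horizontal row of 4x4 tiles in a GX texture.
 * bytes_per_texel: 2 for GX_TF_RGB565, 1 for GX_TF_CI8, and so on.
 * One tile row covers 4 screen rows. (Illustrative helper, not the
 * SDL-Wii source.) */
static size_t tile_row_bytes(int width, int bytes_per_texel)
{
    int tiles_across = (width + 3) / 4;   /* round up to whole tiles */
    return (size_t)tiles_across * 4 * 4 * bytes_per_texel;
}
```

For a 640-wide RGB565 texture this gives 160 tiles * 32 bytes = 5120 bytes per tile row, which is where a format-dependent pitch formula would have to generalize.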
It's more complex than that, but you're looking in the right spot. I've just committed partly working (i.e., broken, but in a different way!) 8-bit support. Zooming is wrong and the bottom half of the window is missing, but it seems to handle my palettes quite OK now [1].
With 16-bit being the "native" mode, 32-bit should be possible to implement in a similar way to the 8-bit support. For 8-bit, I look up pixel values from a palette, while the 32-bit path would simply "compress" the 32-bit pixel value into 16-bit RGB565.
There's still stuff I haven't quite understood though :-)
[1] The color blue shows beautifully!
Original comment by simon.ka...@gmail.com
on 29 Apr 2009 at 8:01
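The 32-bit-to-RGB565 "compression" mentioned above is just a bit-packing step that drops the low bits of each channel; a minimal sketch (function name is illustrative):

```c
#include <stdint.h>

/* Pack 8-bit-per-channel RGB into 16-bit RGB565: 5 bits red,
 * 6 bits green, 5 bits blue, discarding the low bits. */
static uint16_t rgb888_to_rgb565(uint8_t r, uint8_t g, uint8_t b)
{
    return (uint16_t)(((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3));
}
```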
OK, I've been looking over this issue and wanted to add support for 24- and 32-bit. The way I see it, there are two ways to do this. Please add more ideas if you have them.
First would be to change the GX texture to GX_TF_RGBA8 for RGB and RGBA surfaces. I'm not sure how this would perform, or how it would handle mixing bit depths within your program.
Second would be to convert RGB/A to 16-bit YCbCr. This would be a big performance hit if you use 24/32-bit surfaces, as they would need to be converted on each blit.
Also, is there any reason why 16-bit seems to be the native depth of SDL? I noticed this even in the old port. Is the Wii's bit depth only 16-bit?
Original comment by iamsca...@gmail.com
on 30 Apr 2009 at 4:43
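To illustrate the per-pixel cost of the second option, here is a rough integer BT.601 RGB-to-Y'CbCr conversion (a generic sketch of the kind of math involved; the port's actual conversion, if implemented, may differ):

```c
#include <stdint.h>

/* Integer approximation of BT.601 full-range RGB -> limited-range
 * Y'CbCr. Three multiply-accumulate chains per pixel: this is the
 * per-blit cost mentioned above for 24/32-bit surfaces. */
static void rgb_to_ycbcr(uint8_t r, uint8_t g, uint8_t b,
                         uint8_t *y, uint8_t *cb, uint8_t *cr)
{
    *y  = (uint8_t)(( 66 * r + 129 * g +  25 * b + 128) / 256 + 16);
    *cb = (uint8_t)((-38 * r -  74 * g + 112 * b + 128) / 256 + 128);
    *cr = (uint8_t)((112 * r -  94 * g -  18 * b + 128) / 256 + 128);
}
```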
8 and 24 work ok. 32 is not written.
Original comment by dborth@gmail.com
on 14 May 2009 at 5:20
Yes, it still needs finishing. I'll be looking at it some more. I'm not too happy with the current methodology; you can only load 640x480 surfaces. On the other hand, adding a memory allocation for each blit would slow things down.
Original comment by iamsca...@gmail.com
on 14 May 2009 at 5:29
I agree. Keep in mind we could utilize scaling (GX or VI) for smaller/larger surface sizes. But this still needs to be written...
Original comment by dborth@gmail.com
on 14 May 2009 at 5:32
Some comments from a user's view:
I have been using 8-bit mode, and this produced a black screen when compiling with the 4-23-2009 version. Changing to 16-bit displayed the images far less clearly than before, and the game ran slower overall.
Will the current SDL version work with the upcoming devkitPro release? Will 8-bit mode be as clear and fast as before then?
Original comment by dos...@gmail.com
on 27 Jun 2009 at 1:00
I don't believe there have been any changes to 8-bit mode since I hacked up a version that actually works. It's clumsy and non-optimal, and the palette is not quite what it should be, but it seems to be at least as fast as the old non-GX one (based on my tests in Frodo).
For best performance, you should be using 16-bit depth. I'm surprised that 16-bit mode would be slower - from what I've seen, it's a lot faster than the old non-GX implementation. Make sure you don't have surfaces in 8 bits which need to be converted on each blit; that could make things slower.
Not much of an answer, I know :-)
Original comment by simon.ka...@gmail.com
on 28 Jun 2009 at 7:27
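The per-blit conversion cost warned about here is essentially a palette lookup for every pixel. A minimal sketch of expanding one row of an 8-bit paletted surface into RGB565 texels (names are illustrative, not from the port's source):

```c
#include <stdint.h>
#include <stddef.h>

/* Expand one row of 8-bit palette indices into RGB565 texels.
 * `palette` maps index -> RGB565 value. Doing this on every blit,
 * rather than once up front, is the slowdown to avoid.
 * (Illustrative, not the port's actual code.) */
static void expand_row_ci8(const uint8_t *src, uint16_t *dst,
                           size_t width, const uint16_t palette[256])
{
    for (size_t x = 0; x < width; x++)
        dst[x] = palette[src[x]];
}
```

Converting the surface to the display's 16-bit format once, ahead of time, removes this loop from the blit path entirely.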
He is referring to the 4-23-2009 version, which I believe is the pre-GX SDL. Yes, a new SDL will be released immediately following the devkitPro release.
Original comment by dborth@gmail.com
on 28 Jun 2009 at 7:31
Sorry for not posting earlier. The libSDL.a I have used in the past carries the date 7-13-2008. There, the speed varies with the depth; 8-bit is the fastest by far.
I have now ported a new game to SDL. I have to use 16-bit depth with the current SDL lib (for me this is 4-23-2009), which results in lines that are not rendered clearly. I hope the new version will improve this.
Original comment by dos...@gmail.com
on 12 Jul 2009 at 2:55
Use the SVN version; I think you'll find 16-bit is the fastest. 8- and 24-bit basically create a 16-bit texture and then process it. If you're using SDL from 7-13-2008, then it's not using GX for rendering.
Original comment by iamsca...@gmail.com
on 12 Jul 2009 at 6:19
Original comment by dborth@gmail.com
on 23 Jul 2009 at 6:27
Original comment by dborth@gmail.com
on 27 Jul 2009 at 7:10
OK, this is reasonably fixed; however, using 24/32 bpp is highly discouraged for performance reasons. Typically games will also support 16 bpp (recommended).
Original comment by dborth@gmail.com
on 4 Aug 2009 at 7:20
Original comment by dborth@gmail.com
on 4 Aug 2009 at 8:52
Original issue reported on code.google.com by iamsca...@gmail.com on 29 Apr 2009 at 6:33