Original comment by cast...@gmail.com
on 4 Feb 2008 at 7:59
In order to support that, the following things need to be done:
- Add ColorTransform_YCoCg.
- Add support for BC3 compression with AlphaMode_None.
The user would then specify the following options:
inputOptions.setAlphaMode(AlphaMode_None);
inputOptions.setColorTransform(ColorTransform_YCoCg);
compressionOptions.setFormat(Format_DXT5);
compressionOptions.setColorWeights(1, 1, 0, 1);
Just like with BC1n, this mode could also be optimized so that the DXT1 compressor only fits the first two color components.
Alternatively we could simply add a new format:
compressionOptions.setFormat(Format_DXT5_YCoCg);
Original comment by cast...@gmail.com
on 1 May 2008 at 9:48
We could also name the color transform 'ColorTransform_CoCgSY' to make clear
that the
blue component encodes the scale, and that the alpha encodes the luminance.
Original comment by cast...@gmail.com
on 1 May 2008 at 10:03
I've checked in an implementation of ColorTransform_YCoCg provided by Jim
Tilander.
Original comment by cast...@gmail.com
on 6 May 2008 at 11:20
Original comment by cast...@gmail.com
on 11 Oct 2008 at 7:29
I use the algorithm described at http://developer.nvidia.com/object/real-time-ycocg-dxt-compression.html. In many cases it works perfectly, but in some cases it also has color bleeding artifacts. Could anybody give me some advice?
Original comment by duge0...@gmail.com
on 16 Jan 2010 at 8:50
Attachments:
duge0413, this is not the place for general questions about our paper. A better
idea is to mail us directly, or use the developer mailing list. In any case, to
answer your question:
DXT is in general a lossy compression algorithm; that is, artifacts are to be expected, and they generally present themselves as block compression artifacts. This is most noticeable if you use a real-time compressor such as the one proposed in our paper. YCoCg-DXT5 reduces these artifacts significantly compared to DXT1, but does not remove them entirely. Color bleeding usually occurs when the chroma cannot be approximated accurately with a line; this is probably the case in the subpixel-antialiased fonts that your screenshot shows.
Original comment by cast...@gmail.com
on 14 Jun 2010 at 11:27
I see this was checked in about 2 years ago... is it still planned to be released with version 2.1?
Original comment by pope...@gmail.com
on 23 Jul 2010 at 5:31
That's still the plan. Sorry, it was a mistake to make so many breaking changes in 2.1; that has delayed things far too long, and I've had too little time to devote to NVTT to make progress at a faster rate. 2.1 should hopefully be released sometime this summer.
Original comment by cast...@gmail.com
on 26 Jul 2010 at 8:19
Hi, Ignacio.
How close are you to releasing 2.1? I'm about to implement this feature in our texture pipeline, which is already using NVTT 2.0.6. So if this is gonna be released anytime soon, I'd just wait for it.
Original comment by pope...@gmail.com
on 29 Oct 2010 at 1:03
I've actually been making some good progress toward the 2.1 release recently. In the end I'm changing a lot more things than originally planned, which is why this has taken so long.
I think I'll have an alpha release pretty soon, probably sometime next week.
Original comment by cast...@gmail.com
on 29 Oct 2010 at 1:43
Oh, that's very cool.. it's a good thing that I checked with you. My lead, Ian, will be very happy, too. (You worked with him before.) :D
The nvtt tool makes my job really easy, and I really appreciate what you are doing for this tool even after you left nvidia :)
Original comment by pope...@gmail.com
on 29 Oct 2010 at 2:08
Hmm... I've worked with a few Ians, but I suppose you are at Relic and are talking about Ian Thomson, right?
We use NVTT ourselves in The Witness, so I actually have more time and
motivation to keep it in good shape now than I used to have at nvidia. We have
been using 2.1 internally for a while, so it's usable, but only because I know
what's implemented and what's missing, and I can fix stuff easily as I need it.
Original comment by cast...@gmail.com
on 1 Nov 2010 at 7:38
That's right.. Ian T. I'm glad you actually know the proper spelling of his last name.. :-) (People often put a 'p' between the 'm' and the 's'.) It's also good that 2.1 is already somewhat tested code.
Actually, one question about this YCoCg transformation: does it convert sRGB images to linear before encoding as YCoCg?
Original comment by pope...@gmail.com
on 1 Nov 2010 at 8:21
That was a common problem with all the color transforms: it wasn't clear whether they should be done in linear or sRGB space, before or after mipmapping, and the order of multiple color transforms wasn't obvious either.
Several people suggested adding an imperative API, and that has become the major feature in 2.1. So, instead of providing all the image data and manipulation options upfront through the InputOptions API in a declarative way, there's a new TexImage class that represents an array of images or faces that you can manipulate directly, and that allows you to implement things in an imperative way. The resulting client code is a bit more verbose, but much more explicit, and internally the code is cleaner and doesn't have to make assumptions about what you want. Also, the old InputOptions API is now implemented on top of TexImage, and the resulting code is simpler and easier to maintain.
Original comment by cast...@gmail.com
on 1 Nov 2010 at 11:45
Oh, I may write it right, but as he can attest, I always pronounced it wrong.
:-)
Original comment by cast...@gmail.com
on 1 Nov 2010 at 11:46
And to answer your last question: YCoCg is a linear transform, so you can theoretically do the sRGB transform before or after. If you are using YCoCg you cannot use the sRGB sampler state/texture type, because Y is stored in the alpha channel and that's not automatically transformed, but using sRGB space is still a good idea. A simple, cheap approximation is to only square the color component. Or you could use a combination: sRGB for the chrominance using the texture hardware, and pow(2) for the luminance in the shader.
Exposing all these variations in the declarative API would be a nightmare, but with the imperative API it's trivial.
Original comment by cast...@gmail.com
on 1 Nov 2010 at 11:55
Yeah, that was why I asked. I kinda wanted to have YCoCg in linear space (if it doesn't lose quality with dark pixels) so that I can avoid doing manual gamma correction in the shader. But now I think linear YCoCg would suffer from quality loss, so I guess I have no choice but to do manual gamma correction. Doing pow(gamma) for the luminance only in the shader sounds like a good approach, since power is a scalar op on consoles, I believe.
Thanks. I'm really looking forward to version 2.1 now :)
Original comment by pope...@gmail.com
on 2 Nov 2010 at 12:06
OK, I've recently checked in the changes I was talking about. There are still a
few minor issues, but I think that it should mostly work.
For the 2.1 beta release I'd still like to polish things a bit more and do a
lot more testing. Note that the YCoCg stuff is completely untested.
I'll be integrating this version into our code base tomorrow; chances are I'll catch a few bugs in the process. I'll probably launch the beta sometime after that. My wife is having a baby any time now, so I'll soon be taking a few days off.
Original comment by cast...@gmail.com
on 5 Nov 2010 at 7:19
That is excellent & amazing news (the having-a-baby part). Big gratz. I was under the impression that you had a kid already (I might have read it on your blog a long time ago... or am I misled?). If s/he's your 2nd child, make sure you still pay enough attention to your first child (more than to nvidia texture tools :) ) so that he doesn't feel like he's left behind!
Hope everything is well with your beautiful and almost-newborn baby!
Original comment by pope...@gmail.com
on 5 Nov 2010 at 2:32
YCoCg has been supported for a while now. One could write a much better
compressor for YCoCg images, but for the time being, the current implementation
is good enough.
Original comment by cast...@gmail.com
on 27 Sep 2011 at 6:45
Original issue reported on code.google.com by
cast...@gmail.com
on 1 Nov 2007 at 5:47