Closed RunDevelopment closed 2 years ago
And I can't run benchmarks. It errors at some point and doesn't output anything. Maybe this has something to do with your recent project changes @nickbabcock? This is the error:
Yeah, I'm not sure when the benchmarks broke; they were already failing when I tried running them many commits ago. This shouldn't happen again, as I've added a benchmark run to CI.
Can you rebase off master and see if the benchmarks are fixed?
Sorry for all the dust I kicked up with the project changes. I wanted to get them in so we can feel confident about any performance results.
Very nice, they work now. Thank you! I'll post the results shortly.
Alright, the results are in and it's looking pretty good. As my initial run in #88 suggested, there is no noticeable difference in performance, which is good.
DXT5 seems to have gotten faster, but that's probably just a background process making the Before run slower. The other benchmarked decoders also got faster in the After run, so it's unlikely that my code changes increased performance.
Benchmarks:
BenchmarkDotNet=v0.13.1, OS=Windows 10.0.19044.1865 (21H2)
Intel Core i7-8700K CPU 3.70GHz (Coffee Lake), 1 CPU, 12 logical and 6 physical cores
.NET SDK=6.0.400
[Host] : .NET 6.0.8 (6.0.822.36306), X64 RyuJIT
DefaultJob : .NET 6.0.8 (6.0.822.36306), X64 RyuJIT
In case you want to run more DDS benchmarks in the future, you can target them better with:
dotnet run -c Release --project .\src\Pfim.Benchmarks -- --filter "*.DdsBenchmark.Pfim"
I ran the benchmarks too and didn't notice any deviations with this change.
You can really see the rounding errors in the tests. They were a lot more common than I expected. Interestingly, the largest difference is 3. I only expected them to be off by one, so this was quite surprising.
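To illustrate where differences like this can come from (a sketch in Python rather than the project's C#, and not Pfim's actual code): the DXT 2/3–1/3 blend computed with truncating integer division diverges from the same blend computed in floating point and rounded.

```python
# Sketch: why integer-truncated DXT interpolation drifts from rounded
# floating-point math. Illustrative only, not Pfim's actual code.

def lerp_int(c0: int, c1: int) -> int:
    """2/3-1/3 blend of two 8-bit channels with truncating integer division."""
    return (2 * c0 + c1) // 3

def lerp_float(c0: int, c1: int) -> int:
    """The same blend computed in floating point, then rounded."""
    return round((2 * c0 + c1) / 3)

# Largest disagreement across every pair of 8-bit channel values.
max_diff = max(
    abs(lerp_int(a, b) - lerp_float(a, b))
    for a in range(256)
    for b in range(256)
)
print(max_diff)  # 1
```

Truncation in a single blend only accounts for an off-by-one; presumably a larger observed error like 3 comes from truncation compounding across multiple steps (endpoint expansion plus interpolation), though that depends on the exact integer code path.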
Wow yeah, it's a bummer this large of a correctness issue has gone unnoticed for so long. I'm unsure what the upper bound of the error is, but anything above zero is unacceptable 😄
I ran the PR through some conversion tests and everything looked good:
dotnet run -c Release --project .\src\Pfim.Skia -- .\tests\Pfim.Tests\data\bc2-simple-srgb.dds
And I spot-checked the generated PNG against the DDS image open in Paint.NET.
I did try to find a way to compare Pfim's output with a reference implementation, but I haven't solved that problem yet. I was hoping that ImageMagick's compare would work, but it only supports BC1 images (afaik), and it didn't work even on the BC1 image I tested.
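Absent a working reference comparison tool, one fallback is to diff the raw decoded buffers directly and report the largest per-channel difference, which is the same kind of bound discussed in this thread. A minimal sketch (a hypothetical helper, not part of Pfim or ImageMagick):

```python
# Sketch: quantify per-channel differences between two decoded RGBA
# buffers (e.g. Pfim's output vs. a reference decoder's output).
# Hypothetical helper, not part of Pfim.

def max_channel_diff(a: bytes, b: bytes) -> int:
    """Largest absolute per-channel difference between two equal-length buffers."""
    if len(a) != len(b):
        raise ValueError("buffers must be the same length")
    return max((abs(x - y) for x, y in zip(a, b)), default=0)

# Two tiny 2x1 RGBA buffers that differ by at most 3 in one channel.
reference = bytes([255, 128, 0, 255, 10, 20, 30, 255])
decoded = bytes([255, 125, 0, 255, 10, 20, 30, 255])
print(max_channel_diff(reference, decoded))  # 3
```

A check like this still needs a trusted decoder to produce the reference buffer, which is the unsolved part, but it makes "off by at most N" claims easy to verify once one exists.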
This PR looks good. I'll mull it over a little longer before merging and then releasing a new version.
This PR fixes #88. I applied my floating point solution (with a nicer interface) to DXT1, DXT3, and DXT5.
You can really see the rounding errors in the tests. They were a lot more common than I expected. Interestingly, the largest difference is 3. I only expected them to be off by one, so this was quite surprising. I checked that all changed colors are decoded correctly by comparing them with the color values produced by Paint.NET (which uses DirectX to read and write DDS files).
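For readers unfamiliar with the approach, a floating-point DXT color path might look roughly like this (a Python sketch under my own assumptions; the names `decode_565` and `palette_dxt1` are illustrative and not Pfim's API): expand the RGB565 endpoints to floats, interpolate, and round only once at the end.

```python
# Sketch of a float-based DXT color path: expand RGB565 endpoints to
# floating-point channels, interpolate, and round once at the end.
# Names are illustrative, not Pfim's API.

def decode_565(c: int) -> tuple[float, float, float]:
    """Expand a packed RGB565 value to floating-point 0..255 channels."""
    r = ((c >> 11) & 0x1F) * (255.0 / 31.0)
    g = ((c >> 5) & 0x3F) * (255.0 / 63.0)
    b = (c & 0x1F) * (255.0 / 31.0)
    return (r, g, b)

def palette_dxt1(c0: int, c1: int) -> list[tuple[int, ...]]:
    """The four palette colors for a DXT1 block (opaque, c0 > c1 case)."""
    e0, e1 = decode_565(c0), decode_565(c1)

    def blend(w: float) -> tuple[int, ...]:
        # Round exactly once, after all floating-point arithmetic.
        return tuple(round(w * a + (1 - w) * b) for a, b in zip(e0, e1))

    return [blend(1.0), blend(0.0), blend(2.0 / 3.0), blend(1.0 / 3.0)]

# Pure red (0xF800) and pure blue (0x001F) endpoints.
print(palette_dxt1(0xF800, 0x001F))
```

Rounding once at the end is what keeps each channel within half a unit of the exact value, instead of letting truncation errors accumulate across the expansion and interpolation steps.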
As for the hashes, I simply copied the actual values. There's probably some cool command that updates these inline snapshots, so it would be nice if that were documented in the README.
I haven't run benchmarks yet; given my previous experience, they likely won't show much. I'll do them shortly.