**Closed** — adrhill closed this pull request 2 years ago.
Merging #63 (0bac976) into master (21996b9) will increase coverage by 0.46%. The diff coverage is 100.00%.
```diff
@@            Coverage Diff             @@
##           master      #63      +/-   ##
==========================================
+ Coverage   96.81%   97.27%   +0.46%
==========================================
  Files          14       14
  Lines         251      257       +6
==========================================
+ Hits          243      250       +7
+ Misses          8        7       -1
```
| Impacted Files | Coverage Δ | |
|---|---|---|
| src/error_diffusion.jl | 100.00% <100.00%> (ø) | |
| src/utils.jl | 92.30% <100.00%> (+20.87%) | :arrow_up: |
Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Comparison to master:

| Color scheme | master | original | this PR |
|---|---|---|---|
| `:flag_us` | (image) | (image) | (image) |
| `:PuOr_6` | (image) | (image) | (image) |
| `:websafe` | (image) | (image) | (image) |
| `:RdBu_10` | (image) | (image) | (image) |
| `:jet` | (image) | (image) | (image) |
| `:rainbow` | (image) | (image) | (image) |
| `:flag_it` | (image) | (image) | (image) |
Rerunning benchmarks.
I'm not too sure what to make of the comparison. To me, XYZ looks "more colorful", maybe that's due to higher contrast.
Not sure what's going on with the benchmarks. At least the relevant one is faster.
@johnnychen94 any opinion on how to proceed with this one?
> I'm not too sure what to make of the comparison. To me, XYZ looks "more colorful", maybe that's due to higher contrast.

Agree on this.
About the performance, we might need to figure out why it's slower for other unrelated cases. Let me run a local benchmark and see if it's more stable.
Seems to be benchmark noise introduced by GitHub Actions (which is why we need an exclusive benchmark machine...)
Sorry for stalling this over the holidays; I guess I'm just not sure what to make of the qualitative differences in the outputs. Is there some sort of metric for image similarity we could use? Alternatively, we could also just make the "diffusion color space" a parameter.
> I guess I'm just not sure what to make of the qualitative differences in the outputs. Is there some sort of metric for image similarity we could use? Alternatively, we could also just make the "diffusion color space" a parameter.
Most people use PSNR and SSIM in the image restoration field, and both are implemented in https://github.com/JuliaImages/ImageQualityIndexes.jl. Compared to PSNR, SSIM gives results that track "visual" similarity more closely, as it is designed around a model of human visual perception. There's also a multi-scale version of SSIM implemented there. I'm not sure if this satisfies your need, because dithering is quite another thing.
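For real use, ImageQualityIndexes.jl exposes these as `assess_psnr(img, ref)` and `assess_ssim(img, ref)`. To show what PSNR actually measures, here is a minimal, dependency-free sketch of it on grayscale images stored as plain `Float64` arrays in `[0, 1]` (the `psnr` helper name is mine, not the package's):

```julia
# PSNR from first principles: peak signal power over mean squared error,
# on a log scale. Higher PSNR = more similar images.
function psnr(img, ref; peak=1.0)
    mse = sum(abs2, img .- ref) / length(ref)
    return 10 * log10(peak^2 / mse)
end

ref = [0.0 0.5; 0.5 1.0]
img = [0.1 0.5; 0.4 1.0]   # slightly perturbed copy of ref
println(psnr(img, ref))
```

Note that PSNR is purely pixel-wise, which is exactly why it may say little about dithered output: a dithered image can be perceptually close to the original while every single pixel differs.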
Thanks, I'll take a look at it!
I made the color space a kwarg of all error diffusion methods, defaulting to `XYZ`.

I also noticed a small issue where `clamp01` would be applied to `XYZ` colorants. As far as I can tell, the upper limits of `XYZ` are `(0.95047, 1.0, 1.08883)`. I replaced `clamp01` with a new function `clamp_limits` that dispatches on the color type.
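The dispatch idea can be sketched as follows. This is a hypothetical illustration, not the PR's code: `RGBish` and `XYZish` are placeholder structs standing in for the Colors.jl `RGB` and `XYZ` types the real `clamp_limits` would dispatch on.

```julia
# Placeholder color types (the PR would dispatch on Colors.jl types instead).
struct RGBish; r; g; b; end
struct XYZish; x; y; z; end

# RGB channels all live in [0, 1], so clamp01 semantics are correct here.
clamp_limits(c::RGBish) = RGBish(clamp(c.r, 0, 1), clamp(c.g, 0, 1), clamp(c.b, 0, 1))

# XYZ has per-channel upper limits (D65 white point), so clamping to 1
# would wrongly truncate the X and Z channels.
clamp_limits(c::XYZish) = XYZish(
    clamp(c.x, 0, 0.95047),
    clamp(c.y, 0, 1.0),
    clamp(c.z, 0, 1.08883),
)

c = clamp_limits(XYZish(1.2, 0.5, 1.2))
```

The point of dispatching on the color type is that callers never need to know which color space they are in; the correct limits follow from the type.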
Closes #59