Closed: Poker-sang closed this 10 months ago
How do I start the workflow?
Crikey you’re fast!!
Workflows need permission to run from the repository owner. I’ll run it now.
I tried to play the encoded output image in my local image viewer and it plays fine. But I'm not sure how to write a good unit test to verify it.
I would look at our Gif unit tests. We test using individual frames plus do metadata tests to ensure consistency.
@JimBobSquarePants To avoid creating bitWriter objects too early, I changed some methods to static. As a result, the scratchBuffer field no longer has any references. Should we remove this field?
If it's not used yeah.
Sorry, I used ReSharper's "Specify created type" refactoring; forgetting it leads to SA1117.
We need to throw some test images at this now.
I would like to see some testing involving alpha blending, as I think we've previously implemented it incorrectly. If it works anything like PNG, the blend should take place per scanline, where a temp buffer is used to hold the initial decode before blending it with the frame row. If you look at the latest changes in #2511 you can see how it is implemented there.
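As a rough illustration of the per-scanline approach, here is a Python sketch of straight-alpha "over" compositing for one row. The names are hypothetical; ImageSharp's actual implementation is C# and operates on its own pixel buffer types.

```python
# Hypothetical sketch, not ImageSharp code: blend one freshly decoded
# scanline over the corresponding row of the current frame using
# straight-alpha 'over' compositing. Pixels are (r, g, b, a) tuples
# with components in 0..255.

def blend_scanline(decoded, frame_row):
    out = []
    for (sr, sg, sb, sa), (dr, dg, db, da) in zip(decoded, frame_row):
        sa_f, da_f = sa / 255.0, da / 255.0
        # Resulting alpha of source-over-destination.
        oa_f = sa_f + da_f * (1.0 - sa_f)
        if oa_f == 0.0:
            out.append((0, 0, 0, 0))
            continue
        mix = lambda s, d: round((s * sa_f + d * da_f * (1.0 - sa_f)) / oa_f)
        out.append((mix(sr, dr), mix(sg, dg), mix(sb, db), round(oa_f * 255)))
    return out
```

Working one scanline at a time also means the temporary buffer only ever needs to hold a single row.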
Do you mean that I should modify the WebP decoder and encoder according to APNG? Especially to address issues that arise with Blending and Disposing. Then select some new test images?
There's a good sample for alpha blending tests at https://bugs.chromium.org/p/chromium/issues/detail?id=364679 (landscape.webp linked at the top)
> Do you mean that I should modify the WebP decoder and encoder according to APNG? Especially to address issues that arise with Blending and Disposing. Then select some new test images?
Yes.
Blending needs to work per scanline within the frame region against the current frame.
We always build a frame using the previous frame data. We might then clear that data before populating the frame with the decoded pixel data. If that newly decoded data represents transparent pixels and you blend it with the previous frame, you lose that transparency. What we should do instead is write the scanline to a temporary buffer and blend that buffer with the current frame. One of the test PNG images alerted me to that issue.
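To make that failure mode concrete, here is a toy Python comparison (hypothetical names, not ImageSharp code) of writing decoded pixels straight into the frame versus blending a temporary buffer over it:

```python
# Toy comparison: 'frame_row' starts out holding the pixels carried
# over from the previous frame.

def over(src, dst):
    """Straight-alpha 'over' composite of two RGBA pixels (0..255)."""
    sr, sg, sb, sa = src
    dr, dg, db, da = dst
    sa_f, da_f = sa / 255.0, da / 255.0
    oa_f = sa_f + da_f * (1.0 - sa_f)
    if oa_f == 0.0:
        return (0, 0, 0, 0)
    mix = lambda s, d: round((s * sa_f + d * da_f * (1.0 - sa_f)) / oa_f)
    return (mix(sr, dr), mix(sg, dg), mix(sb, db), round(oa_f * 255))

def decode_row_in_place(frame_row, decoded_row):
    # Buggy approach: decoded pixels overwrite the frame row directly,
    # so a semi-transparent pixel discards the backdrop it should have
    # been composited against.
    for x, px in enumerate(decoded_row):
        frame_row[x] = px

def decode_row_via_temp_buffer(frame_row, decoded_row):
    # Correct approach: the scanline is decoded into a temporary buffer
    # (decoded_row here) and then blended over the frame row.
    for x, px in enumerate(decoded_row):
        frame_row[x] = over(px, frame_row[x])
```

With an opaque blue backdrop and a half-transparent red pixel, the in-place version yields a half-transparent red pixel (the blue backdrop is lost), while the temp-buffer version yields the expected opaque red/blue mix.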
I've also noticed we're not capturing the disposal or blending data in the WebP frame metadata; we need that for encoding.
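For reference, the WebP container format stores both values per frame in the final byte of the ANMF chunk header (disposal method in bit 0, blending method in bit 1). A minimal Python sketch of reading them, with a made-up helper name:

```python
# Sketch of reading the per-frame flags from the last byte of a WebP
# ANMF chunk header; the function name is invented for illustration.

def parse_anmf_flags(flags_byte):
    # Bit 1 (B): 0 = alpha-blend with the canvas, 1 = do not blend.
    # Bit 0 (D): 0 = do not dispose, 1 = dispose to background color.
    blend = "no-blend" if flags_byte & 0b10 else "alpha-blend"
    dispose = "background" if flags_byte & 0b01 else "none"
    return blend, dispose
```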
@JimBobSquarePants I tried to modify the blend operation according to https://github.com/SixLabors/ImageSharp/pull/2511/commits/66f444d200985720a9ff3e27a7a52d4722836b43, but now the decoder seems to have a problem it didn't have before. I'm not sure if I misunderstood what you meant earlier; sorry if so.
Before:
After:
@Poker-sang I'll try to have a look tonight.
OK. Alpha blending works as expected on the provided test image when using the correct approach.
@JimBobSquarePants Wow it's merged. Thank you for your help and quick response!
Maybe we can close #1802 as well.
Animated webp encoder closed #2521