rayrobdod opened 1 year ago
On the one hand, I want the program to be relatively strict in order to avoid corrupting something that it does not understand; on the other hand, "be liberal in what you accept".
If only the output is checked against the artificial chunk size limit, not the input, then the test for a large uncompressed file hits the same code paths as a zip bomb, meaning I wouldn't have to do both.
Granted, I don't think CI would be happy whether the program tried to process a large input file or a zip bomb.
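A minimal sketch of what output-side-only enforcement could look like. The names here (`MAX_CHUNK_LEN`, `write_chunk`) are hypothetical, not taken from the actual codebase, and the CRC is omitted; the point is only that oversized chunks are accepted on read and rejected at write time:

```rust
use std::io::{self, Write};

/// Hypothetical artificial chunk-size limit (the issue mentions 2MB).
const MAX_CHUNK_LEN: usize = 2 * 1024 * 1024;

/// Write one PNG-style chunk (length, type, data), enforcing the limit
/// only on output. Oversized chunks can still be read in; they only
/// fail here, when the program tries to write them back out.
fn write_chunk<W: Write>(w: &mut W, typ: [u8; 4], data: &[u8]) -> io::Result<()> {
    if data.len() > MAX_CHUNK_LEN {
        return Err(io::Error::new(
            io::ErrorKind::InvalidInput,
            format!("chunk {:?} exceeds {} bytes", typ, MAX_CHUNK_LEN),
        ));
    }
    w.write_all(&(data.len() as u32).to_be_bytes())?;
    w.write_all(&typ)?;
    w.write_all(data)?;
    // CRC omitted in this sketch.
    Ok(())
}

fn main() {
    let mut out = Vec::new();
    // A small chunk passes; an oversized one is rejected at write time.
    assert!(write_chunk(&mut out, *b"tEXt", b"comment").is_ok());
    assert!(write_chunk(&mut out, *b"IDAT", &vec![0u8; MAX_CHUNK_LEN + 1]).is_err());
    println!("small chunk written: {} bytes", out.len());
}
```

With this arrangement, a single integration test whose input contains one over-limit chunk exercises the same rejection path whether that chunk is a genuinely large uncompressed payload or a zip bomb's output.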
-- the PNG spec (Second Edition)
`IDAT` (or `fdAT` - but that would require renumbering all the APNG chunks, which is the same problem as merging adjacent `fdAT`s) can be split; the `Chunk` struct could hold data that is illegal to write - which can already happen through the `ConcatinateIdats` step, or theoretically since `Chunk` doesn't currently encapsulate its `typ` field. Being idiomatic would probably require forcing `Chunk` to always be a valid-to-write chunk (have separate png-legal and not-necessarily-legal `Chunk` types? seems overengineered). And this would require adding a `SplitIdats` step.

`zTXt`, `iCCP`, or `iTXt` would be unrecoverable: those chunks can't be split with their meaning preserved, and an algorithm to inflate as much as possible without going over 2MB is not worth the time or effort.

Also, might want to be able to integration-test this without having to commit a 2MB file into the source repository. Which would mean having to let the derive crate generate files in addition to listing existing paths.
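For the splittable case, a `SplitIdats`-style step could be as simple as re-chunking the payload, since the PNG spec requires decoders to treat consecutive `IDAT` chunks as one concatenated compressed datastream. A sketch under that assumption (`MAX_CHUNK_LEN` and `split_idat` are hypothetical names, and the limit is shrunk for the demo):

```rust
/// Hypothetical limit; the issue mentions 2MB. Tiny here for the demo.
const MAX_CHUNK_LEN: usize = 4;

/// Split one oversized IDAT payload into several payloads that each
/// respect the limit. Meaning is preserved because consecutive IDAT
/// chunks are read back as a single concatenated datastream.
fn split_idat(data: &[u8]) -> Vec<Vec<u8>> {
    data.chunks(MAX_CHUNK_LEN).map(<[u8]>::to_vec).collect()
}

fn main() {
    let payload = b"abcdefghij";
    let parts = split_idat(payload);
    assert_eq!(parts.len(), 3); // 4 + 4 + 2 bytes
    assert!(parts.iter().all(|p| p.len() <= MAX_CHUNK_LEN));
    // Concatenating the pieces recovers the original datastream.
    assert_eq!(parts.concat(), payload.to_vec());
    println!("split into {} chunks", parts.len());
}
```

This trick is exactly what does not carry over to `fdAT` (each piece would need its own sequence number, forcing a renumber of every later APNG chunk) or to `zTXt`/`iCCP`/`iTXt`, whose payloads are a single self-contained compressed field rather than a concatenatable stream.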