mozilla / mp4parse-rust

Parser for ISO Base Media Format aka video/mp4 written in Rust.
Mozilla Public License 2.0

Badly formed files can cause `create_sample_table()` to OOM. #395

Open Zaggy1024 opened 1 year ago

Zaggy1024 commented 1 year ago

See Gecko Bug 1661368 and more recently Bug 1814791.

There is currently no validation that a file whose `stsc` box declares a large sample count actually references offsets that fall within the file, which allows `create_sample_table()` to allocate an arbitrarily large vector and OOM.

We could simply set a configurable hard limit on the number of samples it can allocate, but I'm not sure that's the best solution.
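A minimal sketch of what such a cap could look like, applied before the vector is allocated. The names (`checked_sample_capacity`, `TableError`, the default limit) are illustrative assumptions, not mp4parse's actual API:

```rust
// Hypothetical hard cap on sample-table size, checked before allocating.
// The limit value here is an arbitrary placeholder; a real implementation
// would make it configurable by the caller.
const DEFAULT_MAX_SAMPLES: usize = 1 << 20;

#[derive(Debug, PartialEq)]
enum TableError {
    TooManySamples { requested: usize, limit: usize },
}

/// Return the requested capacity if it is within the limit, else an error,
/// so the caller can refuse to build the table instead of OOMing.
fn checked_sample_capacity(sample_count: usize, limit: usize) -> Result<usize, TableError> {
    if sample_count > limit {
        return Err(TableError::TooManySamples {
            requested: sample_count,
            limit,
        });
    }
    Ok(sample_count)
}
```

The drawback, as noted above, is that any fixed limit is a guess: it either rejects unusual-but-valid files or still allows allocations larger than the file could possibly justify.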

We could also fail early when the relevant sample boxes have mismatched sample counts, but I'm not sure whether any files that currently parse break that rule, and as tnikkel pointed out, the fuzzer may still find a way to produce matching sample counts that cause an OOM. It might still be worth implementing if failing that way makes sense.
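For illustration, a cross-box consistency check might look like the following. The struct and field names are hypothetical; mp4parse's real track structures differ, and this only covers the simplest invariants:

```rust
// Hypothetical per-track counts gathered from the sample description boxes.
struct SampleBoxCounts {
    stsz_count: u32,  // number of entries in the sample size (stsz) box
    stts_count: u32,  // total samples described by the time-to-sample (stts) box
    stco_chunks: u32, // number of chunk offsets in the stco/co64 box
}

/// A minimal cross-check: stsz and stts must describe the same number of
/// samples, and if any samples exist there must be at least one chunk to
/// hold them. A parser could reject the track before allocating anything.
fn counts_consistent(c: &SampleBoxCounts) -> bool {
    c.stsz_count == c.stts_count && (c.stsz_count == 0 || c.stco_chunks > 0)
}
```

As the comment above says, a fuzzer can trivially satisfy this by emitting matching counts, so it is a sanity check rather than an OOM defense on its own.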

I think the ideal solution is instead to make `create_sample_table()` aware of how much of the file's data is currently available to the caller, and to use the `stsz` box to determine how many samples the vector needs to hold to reach EOF. In cases where the full size is not known beforehand, API users would have to call `get_indices` multiple times and be aware that their pointers will be invalidated, but that doesn't seem like a difficult problem.
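The core of that idea could be sketched as below: walk the `stsz` sample sizes, summing until the running total would pass the bytes actually available, and size the table for only the samples that fit. The function name is hypothetical:

```rust
/// Count how many leading samples from the stsz size list fit entirely
/// within `available_bytes`. The sample table would then be allocated
/// with exactly this many entries, bounding the allocation by real data.
fn samples_within_available(sizes: &[u64], available_bytes: u64) -> usize {
    let mut consumed: u64 = 0;
    for (i, &size) in sizes.iter().enumerate() {
        match consumed.checked_add(size) {
            // This sample still fits within the available data.
            Some(total) if total <= available_bytes => consumed = total,
            // Overflow, or the next sample would run past the available
            // data: stop here and report how many samples fit.
            _ => return i,
        }
    }
    sizes.len()
}
```

This deliberately uses `checked_add` so that absurd sizes in a malformed `stsz` box saturate into "stop here" rather than wrapping around and appearing to fit.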

What I wonder is whether the available data is something that should be accessible from other parts of mp4parse as well, for example to allow the box parsing to indicate that it hit the end of the available data but can continue parsing when more data is available.

Feel free to let me know if this idea doesn't make sense; we can use this issue to brainstorm more solutions if necessary.

Zaggy1024 commented 1 year ago

Looks like #394 is semi-related, and the limit it uses could also apply to `create_sample_table()`. However, since we're not actually reading from the offsets we create in the table, that limit won't cause an early exit when an invalid offset is reached; it would only make the file-size-based limit easier to implement.

I'm throwing up a proof of concept for that file size check now in a WIP pull request.

Zaggy1024 commented 1 year ago

Perhaps a less complicated way to prevent this issue is for the API to provide the table in batches, which seems like it may be a better general solution. However, it would require more Gecko changes to get that working, since the MP4 demuxer currently just assumes that the entire index is immediately available.
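A batched API might look roughly like this sketch, which computes offsets lazily for one batch at a time instead of materializing the whole table. `Indice` and `indices_batch` are illustrative stand-ins, not the crate's real types:

```rust
/// A single sample index entry: where the sample starts and how big it is.
struct Indice {
    start_offset: u64,
    size: u64,
}

/// Build just one batch of index entries from the stsz sizes, starting at
/// `batch_start` and containing at most `batch_len` entries. The caller is
/// assumed to keep `batch_start` within bounds. Each batch allocates only
/// `batch_len` entries, so a malformed sample count can no longer force a
/// single enormous allocation.
fn indices_batch(all_sizes: &[u64], batch_start: usize, batch_len: usize) -> Vec<Indice> {
    // Offset of the first sample in this batch: sum of all earlier sizes.
    let mut offset: u64 = all_sizes[..batch_start].iter().sum();
    all_sizes[batch_start..]
        .iter()
        .take(batch_len)
        .map(|&size| {
            let ind = Indice {
                start_offset: offset,
                size,
            };
            offset += size;
            ind
        })
        .collect()
}
```

The trade-off matches the comment above: memory is bounded per call, but the consumer (e.g. Gecko's MP4 demuxer) has to be rewritten to pull batches rather than assume one complete table.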

kinetiknz commented 1 year ago

Sorry for the delay! Thanks for filing the issue and providing the WIP fix. My fix in #394 was a bit of a band-aid, as this similar issue arising in a different piece of code demonstrates. I've got an idea for a better approach to #394 that would also handle the case here, along with other cases where an invalid file could trivially trigger an OOM. I'll post an updated patch on #394 in a day or so and would appreciate any feedback.

Zaggy1024 commented 1 year ago

Sounds good, I'll follow that PR to see when you update it.

tysmith commented 1 year ago

@kinetiknz Have you had a chance to revisit #394? This will help with fuzzing performance.

kinetiknz commented 1 year ago

Sorry for the delay. I'm still working on this but have been held up by other priorities; I hope to finish it quite soon.