The API has been a constant PITA and it's super buggy.
Instead, we should use `AVAssetReader` and `readerOutput.copyNextSampleBuffer()`.
I think it would be best to wrap up all the logic into an interface similar to `generateCGImagesAsynchronously`, but instead of accepting time points, it would accept FPS.
Some inspiration here: https://github.com/twostraws/ControlRoom/blob/659c5d5f3ab982ea54c9346aec9b700c147112b5/ControlRoom/Extensions/AVAssetToGIF.swift
This issue requires advanced Swift knowledge.
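A rough sketch of what the wrapper could look like (the function name `generateImages`, its parameters, and the frame-skipping logic are my assumptions, not taken from the linked code):

```swift
import AVFoundation

/// Hypothetical sketch: iterate a video's frames with AVAssetReader and
/// deliver roughly `fps` pixel buffers per second of video to the handler.
func generateImages(
	for asset: AVAsset,
	fps: Double,
	handler: (CVPixelBuffer, CMTime) -> Void
) throws {
	guard let track = asset.tracks(withMediaType: .video).first else {
		return
	}

	let reader = try AVAssetReader(asset: asset)
	let output = AVAssetReaderTrackOutput(
		track: track,
		outputSettings: [
			kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA
		]
	)
	reader.add(output)
	reader.startReading()

	let frameInterval = 1 / fps
	var nextCaptureTime = 0.0

	// copyNextSampleBuffer() returns nil when the track is exhausted.
	while let sampleBuffer = output.copyNextSampleBuffer() {
		let time = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)

		// Skip decoded frames until we reach the next capture point,
		// effectively downsampling the video to the requested FPS.
		guard time.seconds >= nextCaptureTime else {
			continue
		}
		nextCaptureTime += frameInterval

		if let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) {
			// Convert the pixel buffer to a CGImage here and hand it off.
			handler(pixelBuffer, time)
		}
	}
}
```

Error handling (checking `reader.status` after the loop) and cancellation are left out for brevity, but the real implementation would need both to match `generateCGImagesAsynchronously`'s behavior.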
You can use this to get the CGImage from the pixel buffer:
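The original snippet was lost here; a minimal version using VideoToolbox's `VTCreateCGImageFromCVPixelBuffer` (one option among several, e.g. going through `CIContext.createCGImage` would also work):

```swift
import VideoToolbox

extension CVPixelBuffer {
	/// Converts the pixel buffer to a CGImage, or nil if conversion fails.
	var cgImage: CGImage? {
		var image: CGImage?
		VTCreateCGImageFromCVPixelBuffer(self, options: nil, imageOut: &image)
		return image
	}
}
```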