djromero opened this issue 12 years ago
I don't know that the way to do this would be to have another class. Instead, we just need a flag on the GPUImageMovie class that makes it obey the timestamps of the video data it's playing back.
Someone else has to have done this for an AVAssetReader, so I just need to find a working implementation to slot into this in response to setting such a flag.
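For reference, a minimal sketch of the kind of pacing such a flag could enable, assuming a GPUImageMovie-style AVAssetReader loop. `reader`, `readerVideoOutput`, and the `processSampleBuffer:` hook are hypothetical names for illustration, not existing GPUImage API:

```objc
// Hypothetical sketch: pace an AVAssetReader read loop so frames are shown at
// their presentation timestamps instead of as fast as they can be decoded.
CFAbsoluteTime startActualTime = 0.0;
CMTime startSampleTime = kCMTimeInvalid;

while (reader.status == AVAssetReaderStatusReading)
{
    CMSampleBufferRef sampleBuffer = [readerVideoOutput copyNextSampleBuffer];
    if (sampleBuffer == NULL) break;

    CMTime sampleTime = CMSampleBufferGetOutputPresentationTimeStamp(sampleBuffer);
    if (CMTIME_IS_INVALID(startSampleTime))
    {
        startSampleTime = sampleTime;
        startActualTime = CFAbsoluteTimeGetCurrent();
    }

    // How far into the movie this frame belongs, vs. how much wall-clock
    // time has actually elapsed since playback started.
    NSTimeInterval mediaElapsed = CMTimeGetSeconds(CMTimeSubtract(sampleTime, startSampleTime));
    NSTimeInterval actualElapsed = CFAbsoluteTimeGetCurrent() - startActualTime;

    if (mediaElapsed > actualElapsed)
    {
        usleep((useconds_t)((mediaElapsed - actualElapsed) * 1000000.0));
    }

    [self processSampleBuffer:sampleBuffer]; // hypothetical per-frame filtering hook
    CMSampleBufferRelease(sampleBuffer);
}
```

This only handles video timing; audio would still need its own playback path.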
I haven't found a solution yet. I've tried some workarounds without much success.
But, just in case someone else has this same need, the upcoming 10.8 has news on this front (under NDA as usual). If iOS 6 adds the same changes it'll be perfectly doable.
I would also like to do that. Can someone point me to where to start? I couldn't find any "AVPlayer implementation" using AVAssetReader. Thanks
What about using -[AVAssetTrack nominalFrameRate] to get the FPS and delaying the processing of each frame?
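A rough sketch of that fixed-delay idea, assuming `asset` is the AVURLAsset being read. Note this ignores per-sample timestamps entirely, so it drifts on variable-frame-rate footage:

```objc
// Derive a per-frame interval from the track's nominal frame rate and sleep
// between frames. Simpler than honoring each sample's timestamp, but less accurate.
AVAssetTrack *videoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] firstObject];
float fps = videoTrack.nominalFrameRate; // e.g. ~29.97 for typical camera footage
NSTimeInterval frameDuration = (fps > 0.0f) ? (1.0 / fps) : (1.0 / 30.0);

// ...then, after processing each frame in the read loop:
usleep((useconds_t)(frameDuration * 1000000.0));
```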
Does iOS 6 make this easier to implement?
It'd be phenomenally useful if this could be used to edit video streamed through AVPlayer via Apple's HTTP Live Streaming protocol. As many of you know, Apple has pretty strict 5 MB limits on the amount of progressive-download network video iOS can access on the cellular network, so being able to use GPUImage on streaming sources is going to be very important. Thoughts?
Any updates on this? Has anyone made any progress on "realtime" GPUImage-processed video playback?
@zakdances check my comment on Issue #458 we added audio playing support to GPUImageMovie
I have done this with some effort. In my case, as a filter is applied to an image or video, it plays without any loading delay and also starts writing the output video. I want to contribute it to GPUImage.
You can see my implementation: https://drive.google.com/open?id=1yyD_AaE6lIGpb_waOhUIFG6jIzuHJqFj
If anyone is interested in building GPUImageAVPlayer, contact me: sagarkoyani4u@gmail.com
Any plan to implement GPUImageAVPlayer?
This would be a new kind of output that plays a movie file with one or more filters applied, without requiring the whole movie to be processed first, just a few seconds of buffering. The movie should play as it would in AVPlayer (speed, audio, seek operations, pause), but showing filtered video frames.
This can be done now in two steps: filter the movie to a new file and then play it. But that's not very practical if the movie is longer than a few seconds. In my tests AVPlayer doesn't work very well if the file being played is still being modified, even if you delay the playback.
SimpleVideoFileFilter is able to show the transformed video frames, but it "plays" too fast and doesn't have any audio. I looked into GPUImageView and GPUImageOpenGLESContext and it's not obvious how to add some kind of time synchronization to the mix.
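For what it's worth, since iOS 6 one possible route is to let AVPlayer keep ownership of all timing (rate, audio, seeking, pause) and pull already-synchronized pixel buffers from an AVPlayerItemVideoOutput, e.g. in a CADisplayLink callback, then feed them into the filter chain. A sketch, where `movieURL` is assumed and the texture-upload step is left as a comment:

```objc
// Attach an AVPlayerItemVideoOutput so AVPlayer handles clocking and audio,
// while we receive BGRA pixel buffers timed to the display refresh.
NSDictionary *attributes = @{ (id)kCVPixelBufferPixelFormatTypeKey :
                                  @(kCVPixelFormatType_32BGRA) };
AVPlayerItemVideoOutput *videoOutput =
    [[AVPlayerItemVideoOutput alloc] initWithPixelBufferAttributes:attributes];

AVPlayerItem *item = [AVPlayerItem playerItemWithURL:movieURL];
[item addOutput:videoOutput];
AVPlayer *player = [AVPlayer playerWithPlayerItem:item];
[player play];

// Later, in a CADisplayLink callback:
CMTime itemTime = [videoOutput itemTimeForHostTime:CACurrentMediaTime()];
if ([videoOutput hasNewPixelBufferForItemTime:itemTime])
{
    CVPixelBufferRef pixelBuffer =
        [videoOutput copyPixelBufferForItemTime:itemTime itemTimeForDisplay:NULL];
    // ...upload pixelBuffer as an OpenGL ES texture and run it through the filters...
    CVBufferRelease(pixelBuffer);
}
```

This sidesteps the AVAssetReader timing problem entirely, at the cost of requiring iOS 6.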
I'm willing to collaborate with anyone that knows where to start.