Open BradLarson opened 12 years ago
Actually I've seen this in prerecorded as well... Don't know if that helps or hurts, but +1
+1. I ran into this problem too. I rewrote some methods to fit my needs and found that handling the buffer inside `if ([GPUImageOpenGLESContext supportsFastTextureUpload])` produces this black-frame problem, but if we use the method below instead, there are no black frames.
So I wonder if there is a mistake in the OpenGL ES handling or somewhere else. I'm not familiar with OpenGL, so please help me check it out.
Thank you.
+1 here. Also, the first time you record you get a few black frames; the second time you get some garbage frames at the beginning (like "leftover" frames from the previous recording session).
I believe this is now fixed. Try with the latest code in the repository.
I had to add a check to make sure that frames didn't start being recorded before the writer was ready for them. Also, I added a glFlush() before the frame reading to make sure that all processing to a texture is done before accessing it, which should prevent some of the display corruption people were seeing in the first few frames.
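As a rough illustration of those two changes, the frame callback might be guarded like this (a sketch only; the exact placement within `GPUImageMovieWriter` and the ivar names are assumptions, not the actual commit):

```objc
// Sketch: assumes GPUImageMovieWriter's assetWriter ivar and frame callback.
- (void)newFrameReadyAtTime:(CMTime)frameTime atIndex:(NSInteger)textureIndex
{
    // 1. Drop incoming frames until the writer is actually ready for data,
    //    so nothing is appended before startWriting has taken effect.
    if (assetWriter.status != AVAssetWriterStatusWriting)
    {
        return;
    }

    // 2. Make sure all pending GL rendering into the source texture has been
    //    submitted before reading it back, to avoid corrupted early frames.
    glFlush();

    // ... read the rendered texture into a CVPixelBufferRef and append it ...
}
```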
I've tried it; the black frames are gone, but the first recording shows the first frame frozen for a few milliseconds. That problem seems to disappear when I hit record again. (iPhone 4S, 960x540 preset.)
Moving the bit of code below to right before `if (![assetWriterPixelBufferInput appendPixelBuffer:pixel_buffer withPresentationTime:frameTime])` seems to almost solve this issue, but I'm still getting a green first frame. (Only noticeable in a video editor.)
```objc
if (CMTIME_IS_INVALID(startTime))
{
    if (videoInputReadyCallback == NULL)
    {
        [assetWriter startWriting];
    }
    [assetWriter startSessionAtSourceTime:frameTime];
    startTime = frameTime;
}
```
I think there are too many `[assetWriter startWriting]` calls here! You'd better wrap them in `if (assetWriter.status == 0) { ... }` — the asset writer no longer supports writing after `finishWriting`. And I found a new problem in the latest code:
https://github.com/BradLarson/GPUImage/issues/77
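The status guard suggested above might look like this (a sketch; `AVAssetWriterStatusUnknown` is the raw value 0 the comment refers to):

```objc
// Only call -startWriting while the writer is still in its initial state.
// After -finishWriting, an AVAssetWriter instance cannot be reused.
if (assetWriter.status == AVAssetWriterStatusUnknown) // status == 0
{
    [assetWriter startWriting];
    [assetWriter startSessionAtSourceTime:frameTime];
    startTime = frameTime;
}
```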
What's more, I can't save my video anymore after the update, and I don't know why. It does go through `if (![assetWriterPixelBufferInput appendPixelBuffer:pixel_buffer withPresentationTime:writeTime])`.
OK, the black frames are gone and I can save my video too. But why is there an error in issue #77?
What's more, when I enable audio, there are small pauses every second.
OK, the small pauses while recording were resolved by setting `_writer.encodingLiveVideo = NO;`. That won't cause other problems, will it?
Shouldn't the glFlush() you added to fix this be a glFinish()? glFlush() doesn't wait, but glFinish() does. P.S. GPUImage is awesome, thanks.
Good point on the glFlush(). I've changed that over. I may still use glFlush(), with a staggered glFinish(), based on a technique I discussed with the engineers at WWDC, but for now I'll just use the glFinish().
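For reference, the distinction being discussed: glFlush() only submits queued commands to the GPU and returns immediately, while glFinish() blocks until every previously issued command has completed. A minimal sketch of the read-back path (assuming a bound framebuffer and an allocated `pixelData` buffer; not the actual GPUImage code, which uses texture caches on supported devices):

```objc
// glFlush():  pushes pending GL commands toward the GPU but returns at once,
//             so the render target may still be mid-write when we read it.
// glFinish(): blocks until all previously issued commands have completed,
//             guaranteeing the rendered frame is fully written.
glFinish();

// Only now is it safe to read the rendered frame back for encoding.
glReadPixels(0, 0, width, height, GL_RGBA, GL_UNSIGNED_BYTE, pixelData);
```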
I wonder if this issue is related: https://github.com/BradLarson/GPUImage/issues/345 - it does seem to only be affecting the first few frames.
Has anyone solved this problem yet? I noticed that with the fileType kUTTypeMPEG4 everything works fine, but the problem then is that video and audio frames get skipped. It also works correctly when you switch off every audioEncoderTarget.
I just worked out a fix. I found there is a dispatch_sync problem on iOS 7 (iPhone 4S). Look at this code:
```objc
if (CMTIME_IS_INVALID(startTime))
{
    if (videoInputReadyCallback == NULL)
    {
        [assetWriter startWriting];
    }
    [assetWriter startSessionAtSourceTime:frameTime];
    startTime = frameTime;
}
```
It appears in both:

`- (void)newFrameReadyAtTime:(CMTime)frameTime atIndex:(NSInteger)textureIndex;`

and

`- (void)processAudioBuffer:(CMSampleBufferRef)audioBuffer;`
Sometimes the audio buffer arrives first, while the video buffer is not ready yet, so the movie file is black for those frames.
So I just removed the dispatch_sync code from `processAudioBuffer`, so that `[assetWriter startWriting]` is only called after `newFrameReady`.
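One way to express that ordering constraint is to have the audio path simply wait until the first video frame has opened the writer session (a sketch only; ivar and method names are assumed from GPUImage, and this is not the exact change described above):

```objc
- (void)processAudioBuffer:(CMSampleBufferRef)audioBuffer
{
    // Never let audio be the thing that starts the writer session.
    // Wait until the first video frame (newFrameReadyAtTime:) has set
    // startTime; until then, drop the audio samples.
    if (CMTIME_IS_INVALID(startTime))
    {
        return;
    }

    // ... append audioBuffer to assetWriterAudioInput as usual ...
}
```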
Yeah, we are experiencing black frames at the end of the video as well. The resulting video track is shorter than the reported video container duration.
##### GPUImage ######
```
Maxims-MacBook-Air:test maximveksler$ MP4Box -info capture.mov
* Movie Info *
    Timescale 44100 - Duration 00:00:03.552
    Fragmented File no - 2 track(s)
    File Brand qt - version 0
    Created: GMT Wed Mar 19 20:16:16 2014
File has no MPEG4 IOD/OD

Track # 1 Info - TrackID 1 - TimeScale 600 - Duration 00:00:03.201
Media Info: Language "Undetermined" - Type "vide:avc1" - 96 samples
Visual Track layout: x=0 y=0 width=1280 height=720
MPEG-4 Config: Visual Stream - ObjectTypeIndication 0x21
AVC/H264 Video - Visual Size 1280 x 720
    AVC Info: 1 SPS - 1 PPS - Profile Main @ Level 3.1
    NAL Unit length bits: 32
Self-synchronized

Track # 2 Info - TrackID 2 - TimeScale 44100 - Duration 00:00:03.622
Media Info: Language "Undetermined" - Type "soun:mp4a" - 156 samples
MPEG-4 Config: Audio Stream - ObjectTypeIndication 0x40
MPEG-4 Audio MPEG-4 Audio AAC LC - 1 Channel(s) - SampleRate 44100
Synchronized on stream 1
```
@colinhebe your code is very useful to me, thanks!
I'm not sure what's causing this, but there's a brief period at the beginning of all recorded videos from the live camera feed where it's black. This doesn't happen with prerecorded videos that are fed through filters and then encoded.
It could still be an initial timestamp issue, but the initial timestamp should be getting set to that of the first frame of video that comes in.