barbayrak opened 4 years ago
Do you have an audioEncodingTarget configured on the Camera? I encountered this issue as well when trying to enable the audio track (by copying over bits from GPUImage2), but the black pixels went away when I removed the audio logic. Reference: https://github.com/BradLarson/GPUImage/issues/1255
@luoser I also re-added some of the audioEncodingTarget logic from GPUImage2 so that my video would have audio. I'm sometimes getting black squares in the top portion of my video. Which logic did you end up removing?
The reference you posted seems to suggest removing the first and last frames from the video, which isn't necessary in my case, since only parts of the video have these black squares.
@eliot1019 You are correct: that reference discusses dropped frames rather than the black-pixel issue we are experiencing; I had followed some comments from there to see if they would help. I actually haven't been able to consistently resolve this issue, so I am still working on it. Unfortunately, falling back to GPUImage2 doesn't seem to be a viable option, since OpenGL is deprecated and my project targets iOS 13+ :/
In `MovieOutput.swift`, change this:

```swift
func renderIntoPixelBuffer(_ pixelBuffer:CVPixelBuffer, texture:Texture) {
    guard let pixelBufferBytes = CVPixelBufferGetBaseAddress(pixelBuffer) else {
        print("Could not get buffer bytes")
        return
    }

    let bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer)
    let outputTexture:Texture
    if (Int(round(self.size.width)) != texture.texture.width) && (Int(round(self.size.height)) != texture.texture.height) {
        let commandBuffer = sharedMetalRenderingDevice.commandQueue.makeCommandBuffer()

        outputTexture = Texture(device:sharedMetalRenderingDevice.device, orientation: .portrait, width: Int(round(self.size.width)), height: Int(round(self.size.height)), timingStyle: texture.timingStyle)
        commandBuffer?.renderQuad(pipelineState: renderPipelineState, inputTextures: [0:texture], outputTexture: outputTexture)
        commandBuffer?.commit()
        commandBuffer?.waitUntilCompleted()
    } else {
        outputTexture = texture
    }

    let region = MTLRegionMake2D(0, 0, outputTexture.texture.width, outputTexture.texture.height)
    outputTexture.texture.getBytes(pixelBufferBytes, bytesPerRow: bytesPerRow, from: region, mipmapLevel: 0)
}
```
to this:

```swift
func renderIntoPixelBuffer(_ pixelBuffer:CVPixelBuffer, texture:Texture) {
    guard let pixelBufferBytes = CVPixelBufferGetBaseAddress(pixelBuffer) else {
        print("Could not get buffer bytes")
        return
    }

    let bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer)
    let outputTexture:Texture
    let commandBuffer = sharedMetalRenderingDevice.commandQueue.makeCommandBuffer()

    outputTexture = Texture(device:sharedMetalRenderingDevice.device,
                            orientation: .portrait,
                            width: Int(round(self.size.width)),
                            height: Int(round(self.size.height)),
                            timingStyle: texture.timingStyle)
    commandBuffer?.renderQuad(pipelineState: renderPipelineState, inputTextures: [0:texture], outputTexture: outputTexture)
    commandBuffer?.commit()
    commandBuffer?.waitUntilCompleted()

    let region = MTLRegionMake2D(0, 0, outputTexture.texture.width, outputTexture.texture.height)
    outputTexture.texture.getBytes(pixelBufferBytes, bytesPerRow: bytesPerRow, from: region, mipmapLevel: 0)
}
```
This fixes it. The copy into the pixel buffer must wait until the GPU has finished rendering; otherwise `getBytes` reads from a texture that hasn't been fully written yet, which shows up as black regions.
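The race the fix above removes can be sketched without any Metal at all. This toy GCD example (an illustration, not GPUImage code) shows how reading an output buffer before asynchronous "rendering" finishes yields the buffer's initial contents, i.e. black pixels, and how an explicit wait (playing the role of `waitUntilCompleted()`) makes the read safe:

```swift
import Dispatch

// Destination buffer starts zeroed -- "black pixels".
var pixels = [UInt8](repeating: 0, count: 4)
let renderQueue = DispatchQueue(label: "render")
let rendered = DispatchSemaphore(value: 0)

renderQueue.async {
    // Simulated GPU write (stands in for renderQuad + commit).
    pixels = [255, 255, 255, 255]
    rendered.signal()
}

// Without this wait, `pixels` could still be all zeros when read,
// just as getBytes can see an unfinished texture when
// waitUntilCompleted() is skipped.
rendered.wait()
print(pixels.allSatisfy { $0 == 255 })
```

Running this prints `true`; delete the `rendered.wait()` line and the read races the write, sometimes observing the all-zero buffer.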
The actual video is green. I removed the green color using ChromaKeyBlend, but it gives a black background. I want to remove that black background because I want to play the video in AR. How is this possible?
Hi there,
I want to migrate from GPUImage2 to GPUImage3, but when I run a test that simply captures from the camera and writes to a file, I strangely get black pixels at random places in my output. I am sharing my test view controller and one frame of the resulting video (the black pixels appear especially at the top of the recorded video). I also tested this code on an iPhone X (iOS 13.5) and an iPhone 8 (iOS 12.1); the iPhone X shows far fewer black pixels than the iPhone 8.
When I run the same code on GPUImage2, everything is fine.
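For anyone debugging similar banding artifacts: one thing worth ruling out (an assumption on my part, not something confirmed in this thread) is row padding in the `CVPixelBuffer`. Core Video typically pads each row to an alignment boundary, so `bytesPerRow` can be larger than `width * 4`; the `renderIntoPixelBuffer` code above handles this correctly by asking the buffer for its real stride, but any copy that assumes `width * 4` will shift every row and produce garbage bands. This standalone sketch shows the arithmetic (the 64-byte alignment is a common value on iOS, but the exact padding is up to Core Video):

```swift
// Computes a row stride padded up to the next multiple of `alignment`,
// mimicking how Core Video may pad CVPixelBuffer rows.
func alignedBytesPerRow(width: Int, bytesPerPixel: Int = 4, alignment: Int = 64) -> Int {
    let unpadded = width * bytesPerPixel
    return ((unpadded + alignment - 1) / alignment) * alignment
}

let width = 1080                          // e.g. a portrait 1080x1920 frame
print(width * 4)                          // unpadded bytes per row: 4320
print(alignedBytesPerRow(width: width))   // padded to 64 bytes: 4352
```

The 32-byte difference per row is exactly the kind of mismatch that shows up as diagonal or banded corruption when the wrong stride is used.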