BradLarson / GPUImage

An open source iOS framework for GPU-based image and video processing
http://www.sunsetlakesoftware.com/2012/02/12/introducing-gpuimage-framework
BSD 3-Clause "New" or "Revised" License

GPUImageUIElement – update outside of frameProcessingCompletionBlock #2211

Open gershengoren opened 8 years ago

gershengoren commented 8 years ago

Hello,

I have a GPUImageBrightnessFilter and a GPUImageAlphaBlendFilter with an attached GPUImageUIElement. The UIElement is initialized with a UIView that contains a UILabel. I'm trying to update the text of this label frequently and ran into the following strange behavior:

When I update the text inside frameProcessingCompletionBlock, everything works fine:

   filter.frameProcessingCompletionBlock = { [weak self] filter, time in
      if let strongSelf = self {
        strongSelf.filter.useNextFrameForImageCapture()

        // Update all labels
        strongSelf.overlayTimeLabel.text = strongSelf.currentRoundTime
        strongSelf.uiElementInput.updateUsingCurrentTime()
      }
      GPUImageContext.sharedFramebufferCache().purgeAllUnassignedFramebuffers()
    }

Here filter is an instance of GPUImageBrightnessFilter. But when I try to change overlayTimeLabel.text outside of this block, no changes are displayed.

  func updateLabel() {
    overlayTimeLabel.text = currentRoundTime
  }

The updateLabel function is called once per second.
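For context, the timer that drives updateLabel is not shown in the post; a hypothetical driver (the updateTimer property name and the closure-based Timer API are my assumptions, not from the original code) might look like this:

    // Hypothetical: a main-thread timer firing once per second (not part of the
    // original post). It only changes the UILabel; nothing in this snippet asks
    // the GPUImageUIElement to re-render the overlay view.
    updateTimer = Timer.scheduledTimer(withTimeInterval: 1.0, repeats: true) { [weak self] _ in
      self?.updateLabel()
    }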

My full setup:

  func setupFilters() {
    filterView = self.view as! GPUImageView
    filterView.fillMode = kGPUImageFillModePreserveAspectRatioAndFill

    filter = GPUImageBrightnessFilter()
    blendFilter = GPUImageAlphaBlendFilter()
    blendFilter.mix = 1.0

    uiElementInput = GPUImageUIElement(view: createOverlayView())

    filter.addTarget(blendFilter)
    uiElementInput.addTarget(blendFilter)
    blendFilter.addTarget(filterView)

    filter.frameProcessingCompletionBlock = { [weak self] filter, time in
      if let strongSelf = self {
        strongSelf.filter.useNextFrameForImageCapture()

        // Update all labels
        strongSelf.overlayTimeLabel.text = strongSelf.currentRoundTime

        strongSelf.uiElementInput.updateUsingCurrentTime()
      }
      GPUImageContext.sharedFramebufferCache().purgeAllUnassignedFramebuffers()
    }
  }

  func prepareCamera() {
    if camera != nil {
      camera!.removeAllTargets()
      camera = nil
    }
    camera = GPUImageVideoCamera(sessionPreset: AVCaptureSessionPreset1280x720, cameraPosition: AVCaptureDevicePosition.Back)
    camera!.outputImageOrientation = UIInterfaceOrientation.LandscapeRight
    camera!.horizontallyMirrorFrontFacingCamera = false
    camera!.horizontallyMirrorRearFacingCamera = false

    if let output = camera!.captureSession.outputs.last as? AVCaptureVideoDataOutput {
      let settings = output.videoSettings as NSDictionary
      videoWidth = settings.objectForKey("Width") as! Int
      videoHeight = settings.objectForKey("Height") as! Int
    }
    let videoSize = CGSize(width: videoWidth, height: videoHeight)

    filter.forceProcessingAtSizeRespectingAspectRatio(videoSize)
    blendFilter.forceProcessingAtSizeRespectingAspectRatio(videoSize)
    uiElementInput.forceProcessingAtSizeRespectingAspectRatio(videoSize)

    camera!.addTarget(filter)
    camera!.startCameraCapture()
  }

I'm wondering whether this is the correct behavior.

gershengoren commented 8 years ago

When I call updateLabel from inside frameProcessingCompletionBlock, everything works fine and all labels are updated. But when this function is called from outside the block, nothing happens.

gershengoren commented 8 years ago

One more thing – if I try to update uiElementInput only when required (new data is available):

   filter.frameProcessingCompletionBlock = { [weak self] filter, time in
      if let strongSelf = self {
        strongSelf.filter.useNextFrameForImageCapture()

        // Update all labels
        if (strongSelf.needUpdate) {
            strongSelf.update()
            strongSelf.uiElementInput.updateUsingCurrentTime()
        }
      }
      GPUImageContext.sharedFramebufferCache().purgeAllUnassignedFramebuffers()
    }

I always get the following error:

Assertion failure in -[GPUImageFramebuffer unlock], PATH/Pods/GPUImage/framework/Source/GPUImageFramebuffer.m:269
Terminating app due to uncaught exception 'NSInternalInconsistencyException', reason: 'Tried to overrelease a framebuffer, did you forget to call -useNextFrameForImageCapture before using -imageFromCurrentFramebuffer?'
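A workaround that stays consistent with the behaviour reported in this thread (per-frame updates work, conditional updates crash) is to keep the updateUsingCurrentTime() call unconditional and gate only the view mutation on needUpdate. This is an untested sketch reusing the names from the snippets above, with an added reset of the flag:

    filter.frameProcessingCompletionBlock = { [weak self] _, _ in
      guard let strongSelf = self else { return }
      strongSelf.filter.useNextFrameForImageCapture()

      // Mutate the overlay view only when new data is available...
      if strongSelf.needUpdate {
        strongSelf.update()
        strongSelf.needUpdate = false   // reset added here; not in the original snippet
      }
      // ...but push a fresh overlay frame on every camera frame, so the
      // GPUImageAlphaBlendFilter always receives both of its inputs.
      strongSelf.uiElementInput.updateUsingCurrentTime()

      GPUImageContext.sharedFramebufferCache().purgeAllUnassignedFramebuffers()
    }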
linjiansheng commented 8 years ago

I hit the same problem when updating uiElementInput only when required.

linjiansheng commented 8 years ago

@gershengoren I solved it.

The reason is:

    // TODO: This may not work
    outputFramebuffer = [[GPUImageContext sharedFramebufferCache] fetchFramebufferForSize:layerPixelSize textureOptions:self.outputTextureOptions onlyTexture:YES];
    [outputFramebuffer disableReferenceCounting]; // Add this line: in GPUImageTwoInputFilter.m, updatedMovieFrameOppositeStillImage becomes YES for this frame time, but the second framebuffer is never locked

    glBindTexture(GL_TEXTURE_2D, [outputFramebuffer texture]);
    // no need to use self.outputTextureOptions here, we always need these texture options
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, (int)layerPixelSize.width, (int)layerPixelSize.height, 0, GL_BGRA, GL_UNSIGNED_BYTE, imageData);

    free(imageData);
    for (id<GPUImageInput> currentTarget in targets)
    {
        if (currentTarget != self.targetToIgnoreForUpdates)
        {
            NSInteger indexOfObject = [targets indexOfObject:currentTarget];
            NSInteger textureIndexOfTarget = [[targetTextureIndices objectAtIndex:indexOfObject] integerValue];

            [currentTarget setInputSize:layerPixelSize atIndex:textureIndexOfTarget];
            [currentTarget setInputFramebuffer:outputFramebuffer atIndex:textureIndexOfTarget]; // Add this line, because outputFramebuffer was updated above
            [currentTarget newFrameReadyAtTime:frameTime atIndex:textureIndexOfTarget];
        }
    } 
Assertion failure in -[GPUImageFramebuffer unlock], PATH/Pods/GPUImage/framework/Source/GPUImageFramebuffer.m:269
Terminating app due to uncaught exception 'NSInternalInconsistencyException', reason: 'Tried to overrelease a framebuffer, did you forget to call -useNextFrameForImageCapture before using -imageFromCurrentFramebuffer?'
llinardos commented 8 years ago

Thanks @linjiansheng, it works fine. But I don't understand why; can you explain it?

kevin-zqw commented 8 years ago

@linjiansheng Thanks so much, that worked. Could you please explain it for us?

AAWayne commented 7 years ago

very clever man

wlinshicong commented 6 years ago

@linjiansheng Why does it need to be changed this way? Is this a bug in the source code?

CocoaML commented 6 years ago

@linjiansheng Yes! It worked for me, but I don't know why. Can you give us some explanation? Thanks.

hezhk3 commented 5 years ago

In GPUImageMovie.m, the outputFramebuffer is unlocked before newFrameReadyAtTime:atIndex: is sent to the targets:

for (id<GPUImageInput> currentTarget in targets)
{
    NSInteger indexOfObject = [targets indexOfObject:currentTarget];
    NSInteger targetTextureIndex = [[targetTextureIndices objectAtIndex:indexOfObject] integerValue];
    [currentTarget setInputSize:CGSizeMake(bufferWidth, bufferHeight) atIndex:targetTextureIndex];
    [currentTarget setInputFramebuffer:outputFramebuffer atIndex:targetTextureIndex];
}

[outputFramebuffer unlock];

for (id<GPUImageInput> currentTarget in targets)
{
    NSInteger indexOfObject = [targets indexOfObject:currentTarget];
    NSInteger targetTextureIndex = [[targetTextureIndices objectAtIndex:indexOfObject] integerValue];
    [currentTarget newFrameReadyAtTime:currentSampleTime atIndex:targetTextureIndex];
}

So just replace the code in GPUImageUIElement.m with this.

chenyingchao commented 4 years ago

[outputFramebuffer disableReferenceCounting];

For a long video that uses GPUImageUIElement, if reference counting is disabled like this, won't the memory keep growing?

chenyingchao commented 4 years ago

@all

mml237 commented 3 years ago

(Quoting @linjiansheng's fix from earlier in this thread.)

Thanks! I use almost the same code as you, but I have two questions:

  1. I call filter?.imageFromCurrentFramebuffer() right after filter?.useNextFrameForImageCapture() inside the block to detect faces, then call uiElement.update() to update the overlay position, but I still get the over-release crash.
  2. In the meantime, memory keeps rising; it seems something can't be released.

Could you please help me? Thanks a lot!


func getFilter() -> GPUImageFilterGroup {
    let emoji = UIImage(named: "emoji")
    let bounds = UIScreen.main.bounds

    let temp = UIView(frame: bounds)
    temp.contentScaleFactor = UIScreen.main.scale

    let imageView = UIImageView(image: emoji)
    imageView.backgroundColor = .red
    imageView.frame = CGRect(x: 100, y: 200, width: 50, height: 50)
    imageView.contentMode = .scaleAspectFit
    temp.addSubview(imageView)

    let filterGroup = GPUImageFilterGroup()

    let uiElement = GPUImageUIElement(view: temp)
    let blendFilter = GPUImageTwoInputFilter(fragmentShaderFrom: loadShader(name: "AlphaBlend_Normal", extensionName: "frag")!)

    let filter = GPUImageFilter()
    filter.useNextFrameForImageCapture()
    let uiFilter = GPUImageFilter()

    uiElement?.addTarget(uiFilter)

    filter.addTarget(blendFilter)
    uiFilter.addTarget(blendFilter)

    filterGroup.addFilter(filter)
    filterGroup.addFilter(uiFilter)
    filterGroup.addFilter(blendFilter)

    filterGroup.initialFilters = [filter]
    filterGroup.terminalFilter = blendFilter

    filter.frameProcessingCompletionBlock = { filter, cmtime in
        defer {
            GPUImageContext.sharedFramebufferCache()?.purgeAllUnassignedFramebuffers()
        }
        filter?.useNextFrameForImageCapture()

        guard let image = filter?.imageFromCurrentFramebuffer(), let cgImage = image.cgImage else {
            return
        }

        DispatchQueue.main.async {
            guard let rect = FaceDetector.detectFaces(inputImage: cgImage).first else {
                return
            }
            let realRect = CGRect(x: bounds.width * rect.origin.x, y: bounds.height * rect.origin.y, width: bounds.width * rect.size.width, height: bounds.height * rect.size.height)

            imageView.frame = realRect
            uiElement?.updateUsingCurrentTime()
        }
    }

    return filterGroup
}

I tried adding an autoreleasepool, but it doesn't work:


            defer {
                GPUImageContext.sharedFramebufferCache()?.purgeAllUnassignedFramebuffers()
            }
            filter?.useNextFrameForImageCapture()
            autoreleasepool {
                guard let image = filter?.imageFromCurrentFramebuffer(), let cgImage = image.cgImage else {
                    return
                }
                guard let rect = FaceDetector.detectFaces(inputImage: cgImage).first else {
                    return
                }
                let realRect = CGRect(x: bounds.width * rect.origin.x, y: bounds.height * rect.origin.y, width: bounds.width * rect.size.width, height: bounds.height * rect.size.height)

                imageView.frame = realRect
                uiElement?.updateUsingCurrentTime()
            }
        }
mml237 commented 3 years ago

(Quoting @gershengoren's original post from the top of this issue.)

Hello, could you please tell me what "strongSelf.overlayTimeLabel" is? Is it a UILabel? Why is it a property? I update my UILabel's frame in the completion block but crash because I'm updating UI on a non-UI queue. Thanks so much.
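For later readers with the same question: in the original post, overlayTimeLabel is a UILabel placed inside the off-screen view that is handed to GPUImageUIElement, and it is kept as a property so frameProcessingCompletionBlock can change its text. The createOverlayView() helper is never shown in the thread; a purely hypothetical sketch of it, with made-up frames and fonts, could look like this:

    // Hypothetical reconstruction of createOverlayView() (the real implementation
    // was never posted). It builds the off-screen view that GPUImageUIElement
    // renders and keeps the label as a property so the completion block can reach it.
    func createOverlayView() -> UIView {
      let overlay = UIView(frame: CGRect(x: 0, y: 0, width: 1280, height: 720))
      overlay.backgroundColor = .clear

      overlayTimeLabel = UILabel(frame: CGRect(x: 40, y: 40, width: 400, height: 80))
      overlayTimeLabel.font = UIFont.boldSystemFont(ofSize: 48)
      overlayTimeLabel.textColor = .white
      overlay.addSubview(overlayTimeLabel)

      return overlay
    }

Note that frameProcessingCompletionBlock is invoked on GPUImage's video processing queue rather than on the main queue, which is presumably why touching on-screen views from inside it trips the main-thread checks you are seeing.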