BradLarson / GPUImage2

GPUImage 2 is a BSD-licensed Swift framework for GPU-accelerated video and image processing.

Swipe between filters works but sometimes produces bad frames in RenderView #213

Open · doudouperrin opened this issue 6 years ago

doudouperrin commented 6 years ago

Hello there,

Using GPUImage2 in my iOS app, I am facing an issue I can't resolve on my own. I implemented a swipe system to switch between filters, and overall everything works fine. The issue shows up in two situations, both when rendering the video in the RenderView:

1) Sometimes when I apply the default filter just after calling startCapture
2) Sometimes when I switch from one filter to another

So I suspect the problem is in applyVideoFilter() (condensed just below, full code further down).
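In short, each swap boils down to this (a condensed sketch of my applyVideoFilter() below, with MissEtikate as the example filter):

```swift
// Condensed version of the swap (see the full applyVideoFilter() below).
// The camera keeps capturing while the old pipeline is torn down
// and the new one is attached.
camera.removeAllTargets()
currentVideoEffect?.removeAllTargets()
(currentVideoEffect as? BlendOperationGroup)?.release()

currentVideoEffect = MissEtikateFilter()   // default filter, as in the video
camera.addTarget(currentVideoEffect!)
currentVideoEffect!.addTarget(cameraView)
```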

It is quite hard to describe: some filters become weird, giving me a broken preview. For example:

Here is a video of what happens, with the MissEtikate filter as the default filter: https://drive.google.com/open?id=1cBbj5VsUZ_8ab_Nba-bULQlezKGp4CMR

Note that:

1) The issue never appears if I don't use a filter (camera --> renderView directly)
2) It happens on all the iOS devices I tested (iPhone 5S, iPad Air, iPad 2, iPad 4)
3) It seems to happen only on the default filter (the first one I apply, MissEtikate in the video)
4) It appears more often after having run a first recording (which ends on another screen before going back to the Camera screen), or after reversing the camera
5) camera.runBenchmark() keeps printing normally, with normal values
6) I am up to date with the repo and I applied the fix from #135 (Front Camera Issue in iOS)

Can someone (Brad?) help me with this?

Please have a look at my code structure. CameraViewController (simplified to only the Camera-related parts):

```swift
import UIKit
import Foundation
import AVFoundation
import GPUImage

class CameraViewController: UIViewController, CameraDelegate {

var cameraView: RenderView!

// CameraManager
var camera:Camera!
var movieOutput:MovieOutput? = nil

// GPUImage filters management
var videoEffectsDict: [Int: VideoEffectEnum] = [:]
var currentVideoEffect: ImageProcessingOperation?
// Default filter is MissEtikate
var currentVideoEffectType: VideoEffectEnum! = .missEtikate

// Video effect gesture recognizer properties
var videoEffectPanRecognizer: UIPanGestureRecognizer!
var startSwipeVideoEffectLocation: CGPoint?
var hasSwitchVideoEffect: Bool = false
var videoEffectSlide = false

override func viewDidLoad() {
    super.viewDidLoad()

    // configure CameraView :
    let screenSize = UIScreen.main.bounds.size
    cameraView = RenderView()
    cameraView.frame = CGRect(x: 0, y: 0, width: screenSize.width, height: screenSize.height)

    // Exposure tap listener
    let exposureTapListener = UITapGestureRecognizer(target: self, action: #selector(CameraViewController.focusAndExposeTap(_:)))
    cameraView.addGestureRecognizer(exposureTapListener)

    self.view.insertSubview(cameraView, at: 0)

    cameraView.fillMode = .preserveAspectRatioAndFill
    self.cameraView.backgroundColor = UIColor.clear
    configureCamera(location: PhysicalCameraLocation.frontFacing)
    configureVideoFilters()

    // Manage Camera session when app goes BG or becomes active
    NotificationCenter.default.addObserver(self, selector: #selector(CameraViewController.appMovedToBackground), name: Notification.Name.UIApplicationWillResignActive, object: nil)
    NotificationCenter.default.addObserver(forName: Notification.Name.UIApplicationDidBecomeActive, object: nil, queue: OperationQueue.main, using: {_ in
        if !self.camera.captureSession.isRunning {
            self.camera.startCapture()
            self.applyVideoFilter(self.currentVideoEffectType)
        }
    })

    // Register Battles updated event
    if !Cizoo.isCizooLive {
        SwiftEventBus.onMainThread(self, name:"BattlesUpdatedEvent") { _ in
            self.updateBattlesButton()
        }
    }
}

override func viewWillAppear(_ animated: Bool) {
    super.viewWillAppear(animated)

    camera.startCapture()
    applyVideoFilter(currentVideoEffectType)
    videoEffectPanRecognizer.isEnabled = true
}

override func viewDidAppear(_ animated: Bool) {
    super.viewDidAppear(animated)
    // Nothing related to GPUImage in the rest of this method

    if !Cizoo.isCizooLive {
        let isPushPermShown = UserDefaults.standard.object(forKey: "Push_Permission_Ask") as! Bool?
        if isPushPermShown != nil && !isPushPermShown! {
            if !Cizoo.isCizooLive {
                if (self.actionManagerService.times(.cizooValidated) + 0) != 0 && ((self.actionManagerService.times(.cizooValidated) + 0) % 4) == 0 && !self.actionManagerService.isDone(.presentRate) && !self.actionManagerService.isDone(.rateApp){
                    self.actionManagerService.done(.presentRate)
                    self.rate()
                }

                // Manage the display of invitation popup (for appStore version only)
                if !invitePopinAlreadyShown {
                    manageInvitationPopupToUnlockEffects()
                }

                let lock = UserDefaults.standard.object(forKey: "Lock_For_New_Effect") as! String?
                if  lock != "HadInvitedAndShowedEffectPopRemind" {
                    self.showConfirmationOrRemindInvitationPopup()
                }
            }
        } else {
            UserDefaults.standard.set(false, forKey: "Push_Permission_Ask")
        }
    }

}

override func viewDidDisappear(_ animated: Bool) {
    super.viewDidDisappear(animated)
    // Release GPUImage camera
    releaseCamera()
}

// Handle going to background -> stop recording
func appMovedToBackground() {

    releaseCamera()
    if isRecording == true {
        self.cancel(self.cancelButton)
    }
}

// Manage Reverse Camera action
@IBAction func switchCamera(_ sender: AnyObject) {

    camera.stopCapture()
    camera.removeAllTargets()

    if camera.location == .frontFacing {
        configureCamera(location: PhysicalCameraLocation.backFacing)
    } else {
        configureCamera(location: PhysicalCameraLocation.frontFacing)
    }
    applyVideoFilter(currentVideoEffectType!)

    camera.startCapture()

}

// Init the Camera session
fileprivate func configureCamera(location: PhysicalCameraLocation) {
    do {
        camera = try Camera(sessionPreset:AVCaptureSessionPresetHigh, location: location)
        camera.audioEncodingTarget = nil    // no audio
        camera.delegate = self
    } catch {
        print("Can not initialize Camera.")
    }
}

// Release camera session (when going to another screen, app going to background)
func releaseCamera() {
    // Dealloc GPUImage camera
    camera.stopCapture()
    camera.removeAllTargets()
    if currentVideoEffect != nil {
        currentVideoEffect!.removeAllTargets()
        if let cve = currentVideoEffect as? BlendOperationGroup {
            cve.release()
        }
    }
}

// Manage the focus and exposure when touching the render view
@IBAction func focusAndExposeTap(_ sender: UITapGestureRecognizer) {

    let point = sender.location(in: sender.view)

    // Convert coordinates for GPUImage
    var pointOfInterest = UtilsService.convertToPointOfInterestFromViewCoordinates(viewCoordinates: point, frame: cameraView.bounds, orientation: UIDeviceOrientation.portrait, fillMode: .preserveAspectRatioAndFill, mirrored: true)
    pointOfInterest.y = 1.0 - pointOfInterest.y

    guard let currentDevice = camera.location.device() else {
        return
    }

    // Manage Focus
    if currentDevice.isFocusPointOfInterestSupported && currentDevice.isFocusModeSupported(AVCaptureFocusMode.autoFocus) {

        do {
         try currentDevice.lockForConfiguration()
            currentDevice.focusPointOfInterest = pointOfInterest
            currentDevice.focusMode = AVCaptureFocusMode.autoFocus
            currentDevice.unlockForConfiguration()

        } catch {
            return
        }
    }

    // Manage Exposure
    if currentDevice.isExposurePointOfInterestSupported && currentDevice.isExposureModeSupported(AVCaptureExposureMode.autoExpose) {

        do {
            try currentDevice.lockForConfiguration()
            currentDevice.exposurePointOfInterest = pointOfInterest
            currentDevice.exposureMode = AVCaptureExposureMode.autoExpose
            currentDevice.unlockForConfiguration()

        } catch {
            return
        }
    }
}

// Init the video filter dictionary. Defines each filter's position inside the "Swiper"
fileprivate func configureVideoFilters() {

    /*** Add effects to the dictionary ***/
    videoEffectsDict.removeAll()

    // Ordered list of effects available in the swiper
    let orderedEffects: [VideoEffectEnum] = [
        .missEtikate, .glitchBlend, .vhs, .vhsBW, .none,
        .sobelEdge, .sepia, .grayScale, .redFalseColor, .pinkFalseColor,
        .sketch, .sparkle3Blend, .discoBlend, .starsBlend
    ]
    for (index, effect) in orderedEffects.enumerated() {
        videoEffectsDict[index] = effect
    }

    // Gesture recognizer configuration for video filter
    videoEffectPanRecognizer = UIPanGestureRecognizer(target: self, action: #selector(CameraViewController.onVideoFilterChanged))
    videoEffectPanRecognizer?.isEnabled = false
    swipeForEffectView.isHidden = true
    self.view.addGestureRecognizer(videoEffectPanRecognizer!)

}

// Apply selected videoEffect on camera and define GPUImage pipeline
fileprivate func applyVideoFilter(_ videoEffect: VideoEffectEnum) {

    // Clear all before applying the selected filter
    camera.removeAllTargets()
    if let currentVideoEffect = currentVideoEffect {
        currentVideoEffect.removeAllTargets()
        if let currentVideoEffectBlend = currentVideoEffect as? BlendOperationGroup {
            currentVideoEffectBlend.release()
        }
    }

    switch videoEffect {

    case .missEtikate:
        currentVideoEffect = MissEtikateFilter()
        break
    case .softElegance:
        currentVideoEffect = SoftElegance()
        break
    case .sepia:
        currentVideoEffect = SepiaToneFilter()
        break
    case .grayScale:
        currentVideoEffect = SaturationAdjustment()
        (currentVideoEffect as! SaturationAdjustment).saturation = 0
        break
    case .redFalseColor:
        currentVideoEffect = FalseColor()
        (currentVideoEffect as! FalseColor).firstColor = Color(red: 0.0, green: 0.0, blue: 0.5, alpha: 1.0)
        (currentVideoEffect as! FalseColor).secondColor = Color(red: 1.0, green: 0.0, blue: 0.0, alpha: 1.0)
        break
    case .yellowFalseColor:
        currentVideoEffect = FalseColor()
        (currentVideoEffect as! FalseColor).firstColor = Color(red: 0, green: 0.22, blue: 0.56, alpha: 1.0)
        (currentVideoEffect as! FalseColor).secondColor = Color(red: 0.97, green: 0.92, blue: 0.1, alpha: 1.0)
        break
    case .pinkFalseColor:
        currentVideoEffect = FalseColor()
        (currentVideoEffect as! FalseColor).firstColor = Color(red: 0, green: 0.22, blue: 0.56, alpha: 1.0)
        (currentVideoEffect as! FalseColor).secondColor = Color(red: 1.0, green: 0.42, blue: 0.59, alpha: 1.0)
        break
    case .sobelEdge:
        currentVideoEffect = SobelEdgeDetection()
        (currentVideoEffect as! SobelEdgeDetection).edgeStrength = 0.7
        break
    case .sketch:
        currentVideoEffect = SketchFilter()
        break
    case .vhs:
        currentVideoEffect = VHSFilter()
        break
    case .vhsBW:
        currentVideoEffect = VHSBWFilter()
        break
    case .discoBlend:
        currentVideoEffect = DiscoBlendFilter()
        break
    case .grungeBlend:
        currentVideoEffect = GrungeBlendFilter()
        break
    case .purpleBlend:
        currentVideoEffect = PurpleBlendFilter()
        break
    case .sparkle1Blend:
        currentVideoEffect = Sparkle1BlendFilter()
        break
    case .sparkle2Blend:
        currentVideoEffect = Sparkle2BlendFilter()
        break
    case .sparkle3Blend:
        currentVideoEffect = Sparkle3BlendFilter()
        break
    case .stars2Blend:
        currentVideoEffect = Stars2BlendFilter()
        break
    case .glitchBlend:
        currentVideoEffect = GlitchBlendFilter()
        break
    case .imageChromakeyBlend:
        currentVideoEffect = ImageChromaKey()
        break
    case .starsBlend:
        currentVideoEffect = StarsBlendFilter()
        break
    case .none:
        break
    }

    currentVideoEffectType = videoEffect

    // If no filter, directly send the camera to the render view
    if videoEffect == VideoEffectEnum.none {
        camera --> cameraView
    } else {
        camera.addTarget(currentVideoEffect!)
        currentVideoEffect!.addTarget(cameraView)
    }
}

// Manage Swipe between video filters 
func onVideoFilterChanged(_ sender: UIPanGestureRecognizer) {

        if (sender.state == UIGestureRecognizerState.began) {
            self.videoEffectSlide = true
            startSwipeVideoEffectLocation = sender.location(in: self.view)
        } else if (sender.state == UIGestureRecognizerState.changed) {
            if(!hasSwitchVideoEffect) {
                let newLocation: CGPoint = sender.location(in: self.view)
                let xChange = newLocation.x - startSwipeVideoEffectLocation!.x
                if abs(xChange) > 40 {

                    // Define the video effect to use based on position in videoEffectsDict
                    let nbVideoFilter = videoEffectsDict.count
                    var newVideoFilterIndex = 0
                    if let currentVideoFilterIndex = videoEffectsDict.keysForValue(value: currentVideoEffectType).first {

                        if xChange > 0 {
                            // Apply previous video effect
                            newVideoFilterIndex = currentVideoFilterIndex - 1
                            if(newVideoFilterIndex < 0) {
                                newVideoFilterIndex = nbVideoFilter - 1
                            }
                        } else {
                            // Apply next video effect
                            newVideoFilterIndex = currentVideoFilterIndex + 1
                            if newVideoFilterIndex > nbVideoFilter - 1 {
                                newVideoFilterIndex = 0
                            }
                        }

                        applyVideoFilter(videoEffectsDict[newVideoFilterIndex]!)
                        hasSwitchVideoEffect = true
                    }
                }
            }

        } else if (sender.state == UIGestureRecognizerState.ended) {
            hasSwitchVideoEffect = false
        }
}

func didCaptureBuffer(_ sampleBuffer: CMSampleBuffer) {
    // Nothing to do here
}

}
```
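VideoEffectEnum and the keysForValue(value:) helper are not pasted above; simplified, they look roughly like this (not the exact originals, just enough to read the controller):

```swift
// Simplified versions of VideoEffectEnum and the Dictionary helper used above
// (reconstructed for readability; not the exact originals).
enum VideoEffectEnum {
    case missEtikate, softElegance, sepia, grayScale
    case redFalseColor, yellowFalseColor, pinkFalseColor
    case sobelEdge, sketch, vhs, vhsBW
    case discoBlend, grungeBlend, purpleBlend
    case sparkle1Blend, sparkle2Blend, sparkle3Blend, stars2Blend
    case glitchBlend, imageChromakeyBlend, starsBlend
    case none
}

extension Dictionary where Value: Equatable {
    // Returns every key whose value matches the given value
    func keysForValue(value: Value) -> [Key] {
        return filter { $0.value == value }.map { $0.key }
    }
}
```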

BlendOperationGroup (simplifies asset management for blend filters):

```swift
import GPUImage

public class BlendOperationGroup: OperationGroup {

var blendAsset: ImageSource?
var timer: Timer?

func processBlendAsset() {
    guard let blendAsset = blendAsset else {
        return
    }

    if let blendImage = blendAsset as? PictureInput {
        blendImage.processImage()
        return
    }

    if let blendMovie = blendAsset as? MovieInput {
        blendMovie.start()
    }
}

func release() {

    if timer != nil {
        timer!.invalidate()
        timer = nil
    }

    guard let blendAsset = blendAsset else {
        return
    }

    if let blendImage = blendAsset as? PictureInput {
        blendImage.removeAllTargets()
        return
    }

    if let blendMovie = blendAsset as? MovieInput {
        blendMovie.removeAllTargets()
        blendMovie.cancel()
    }

}

}
```
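The concrete blend filters (GlitchBlendFilter, StarsBlendFilter, etc.) are not included here; simplified, a subclass looks roughly like this (the overlay asset name is a placeholder and AlphaBlend just stands in for the actual blend operation):

```swift
// Simplified sketch of a concrete blend filter built on BlendOperationGroup
// (placeholder asset name; AlphaBlend stands in for the real blend operation).
public class ExampleBlendFilter: BlendOperationGroup {
    let blend = AlphaBlend()

    public override init() {
        super.init()

        // Still image used as the second input of the blend
        let overlay = PictureInput(imageName: "overlay.png")
        blendAsset = overlay

        configureGroup { input, output in
            // Camera frames feed the first blend input, the overlay the second
            input --> self.blend
            overlay --> self.blend --> output
        }

        // Push the still image once so the blend has both inputs
        processBlendAsset()
    }
}
```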

Questions to work around:

Please do not hesitate to ask if you need more info. Thanks.