rFlex / SCRecorder

iOS camera engine with Vine-like tap to record, animatable filters, slow motion, segments editing
Apache License 2.0

Orientation changes when a filter is applied to a video on iPhone 6s #351

Closed amrit42087 closed 7 years ago

amrit42087 commented 7 years ago

I tried saving a video after applying filters to it. The orientation of the output file changes. However, when I save the video with no filter, it is saved in the correct orientation. This issue occurs on an iPhone 6s but works correctly on older iPhone models.

NOTE: The video is only rotated when filters are applied and the device is an iPhone 6 or higher.

This is the code for saving the output file:

    self.view.lock()

    let currentFilter = filterView.filter?.copy()

    player.pause()

    let videoAsset = AVAsset(url: videoUrl!)

    let documentsDirectory = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)[0]
    let filteredVideoUrl = documentsDirectory.appendingPathComponent("filteredVideo.mp4")

    let exportSession = SCAssetExportSession(asset: videoAsset)
    exportSession.videoConfiguration.filter = currentFilter as! SCFilter?
    exportSession.videoConfiguration.preset = SCPresetHighestQuality
    exportSession.audioConfiguration.preset = SCPresetHighestQuality
    exportSession.videoConfiguration.maxFrameRate = 35
    exportSession.outputUrl = filteredVideoUrl
    exportSession.outputFileType = AVFileTypeMPEG4 // matches the .mp4 extension
    exportSession.delegate = self
    exportSession.contextType = SCContextType.auto

    let time = CACurrentMediaTime()

    exportSession.exportAsynchronously(completionHandler: {() -> Void in

        if exportSession.cancelled {
            self.view.unlock()
            print("Export was cancelled")
            return
        }

        print("Completed compression in \(CACurrentMediaTime() - time)")

        if let error = exportSession.error {
            self.view.unlock()
            UIAlertView(title: "Failed to save", message: error.localizedDescription, delegate: nil, cancelButtonTitle: "OK").show()
            return
        }

        AudioVideoEditor().saveToLibrary(url: exportSession.outputUrl!) { (success, error) in
            self.view.unlock()
            if success {
                self.showAlert(title: "Success", message: "Successfully saved to camera roll")
            } else if let error = error {
                self.showAlert(title: "Error", message: error.localizedDescription)
            }
        }
    })

Here are the videos saved with and without filters applied: Without Filter: https://vid.me/tilP

with Filter https://vid.me/hOdZ

Also, the video is zoomed in a lot when filters are applied. I tried changing the video configuration's scaling mode to AVVideoScalingModeResizeAspect; that fixes the zoom, but the orientation still changes.
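For anyone debugging this, a quick way to see what is going on is to inspect the source track's orientation metadata: iPhone footage is usually stored landscape, with a `preferredTransform` that rotates it for display. A rough sketch (the helper name `logOrientation` is made up for illustration):

```swift
import AVFoundation

// Hypothetical helper: print a video track's storage size vs. its display size.
// iPhone video is often stored landscape (e.g. 1920x1080) with a
// preferredTransform that rotates it to portrait at playback time.
func logOrientation(of url: URL) {
    let asset = AVAsset(url: url)
    guard let track = asset.tracks(withMediaType: AVMediaTypeVideo).first else { return }
    let natural = track.naturalSize
    let transformed = natural.applying(track.preferredTransform)
    let displaySize = CGSize(width: abs(transformed.width), height: abs(transformed.height))
    print("stored: \(natural), displayed: \(displaySize)")
}
```

If the stored and displayed sizes differ, the file relies on the transform for its orientation, which is exactly the case where the filtered export goes wrong.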

justjoeyuk commented 7 years ago

Same issue. A response would be appreciated.

Highdry03 commented 7 years ago

Sorry if I'm not 100% intelligible; English is not my first language.

I had the same problem, investigated the issue, and found something in this method of the SCFilter (VideoComposition) category:

- (AVMutableVideoComposition *)videoCompositionWithAsset:(AVAsset *)asset {
    if ([[AVVideoComposition class] respondsToSelector:@selector(videoCompositionWithAsset:applyingCIFiltersWithHandler:)]) {
        CIContext *context = [CIContext contextWithOptions:@{kCIContextWorkingColorSpace : [NSNull null], kCIContextOutputColorSpace : [NSNull null]}];
        return [AVMutableVideoComposition videoCompositionWithAsset:asset applyingCIFiltersWithHandler:^(AVAsynchronousCIImageFilteringRequest * _Nonnull request) {

            CIImage *img2 = request.sourceImage;
            CIImage *image = [self imageByProcessingImage:img2 atTime:CMTimeGetSeconds(request.compositionTime)];

            [request finishWithImage:image context:context];
        }];

    }
    return nil;
}

This videoComposition is returned when a filter is active. I set some breakpoints in +[AVMutableVideoComposition videoCompositionWithAsset:applyingCIFiltersWithHandler:] and noticed that the frame images it delivers already have the preferredTransform applied. This is also mentioned in Apple's documentation of the method.

Now, going back to the SCAssetExportSession class, we see that in the method - (void)_setupVideoUsingTracks:(NSArray *)videoTracks, the track's preferredTransform is applied to the video input. So when creating a video composition with an SCFilter, we get images with the transform already applied, and then the input applies it again.
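In other words, on the filter path the frames coming out of the composition are already rotated, so the writer input must not rotate them again; it only needs its dimensions set to the transformed size. A minimal Swift sketch of that size calculation, assuming `track` is the asset's video AVAssetTrack:

```swift
import AVFoundation

// Run the stored (natural) size through the preferredTransform to get the
// display size the writer input should be configured with. `track` is
// assumed to be the asset's video AVAssetTrack.
let natural = track.naturalSize                      // e.g. 1920x1080, stored landscape
let transformed = natural.applying(track.preferredTransform)
let renderSize = CGSize(width: abs(transformed.width), height: abs(transformed.height))
// For a 90-degree rotation transform this gives 1080x1920, i.e. portrait.
```

The abs() calls are needed because applying a rotation transform to a CGSize can produce negative components.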

I changed - (void)_setupVideoUsingTracks:(NSArray *)videoTracks and it seems to work for me. This is the changed method:

- (void)_setupVideoUsingTracks:(NSArray *)videoTracks {
    _inputBufferSize = CGSizeZero;
    if (videoTracks.count > 0 && self.videoConfiguration.enabled && !self.videoConfiguration.shouldIgnore) {
        AVAssetTrack *videoTrack = [videoTracks objectAtIndex:0];

        // Input
        NSDictionary *videoSettings = [_videoConfiguration createAssetWriterOptionsWithVideoSize:videoTrack.naturalSize];

        // Output
        AVVideoComposition *videoComposition = self.videoConfiguration.composition;

        // Apply the transform to the video input only when no filter is selected
        if (!(videoComposition == nil && self.videoConfiguration.filter != nil && self.translatesFilterIntoComposition)) {

             _videoInput = [self addWriter:AVMediaTypeVideo withSettings:videoSettings];

            if (_videoConfiguration.keepInputAffineTransform) {
                _videoInput.transform = videoTrack.preferredTransform;
            } else {
                _videoInput.transform = _videoConfiguration.affineTransform;
            }
        } else {

            // If a filter is selected, don't apply the transform, but create the video
            // settings with the right size, i.e. the naturalSize with the preferredTransform applied
            CGSize naturalSizeFirst = videoTrack.naturalSize;

            CGSize temp = CGSizeApplyAffineTransform(naturalSizeFirst, videoTrack.preferredTransform);
            CGSize size = CGSizeMake(fabs(temp.width), fabs(temp.height));

            NSDictionary *videoSettings = [_videoConfiguration createAssetWriterOptionsWithVideoSize:size];
            _videoInput = [self addWriter:AVMediaTypeVideo withSettings:videoSettings];
        }

        if (videoComposition == nil) {
            _inputBufferSize = videoTrack.naturalSize;
        } else {
            _inputBufferSize = videoComposition.renderSize;
        }

        CGSize outputBufferSize = _inputBufferSize;
        if (!CGSizeEqualToSize(self.videoConfiguration.bufferSize, CGSizeZero)) {
            outputBufferSize = self.videoConfiguration.bufferSize;
        }

        _outputBufferSize = outputBufferSize;
        _outputBufferDiffersFromInput = !CGSizeEqualToSize(_inputBufferSize, outputBufferSize);

        _filter = [self _generateRenderingFilterForVideoSize:outputBufferSize];

        if (videoComposition == nil && _filter != nil && self.translatesFilterIntoComposition) {
            videoComposition = [_filter videoCompositionWithAsset:_inputAsset];
            if (videoComposition != nil) {
                _filter = nil;
            }
        }

        NSDictionary *settings = nil;
        if (_filter != nil || self.videoConfiguration.overlay != nil) {
            settings = @{
                         (id)kCVPixelBufferPixelFormatTypeKey     : [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA],
                         (id)kCVPixelBufferIOSurfacePropertiesKey : [NSDictionary dictionary]
                         };
        } else {
            settings = @{
                         (id)kCVPixelBufferPixelFormatTypeKey     : [NSNumber numberWithUnsignedInt:kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange],
                         (id)kCVPixelBufferIOSurfacePropertiesKey : [NSDictionary dictionary]
                         };
        }

        AVAssetReaderOutput *reader = nil;
        if (videoComposition == nil) {
            reader = [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:videoTrack outputSettings:settings];
        } else {
            AVAssetReaderVideoCompositionOutput *videoCompositionOutput = [AVAssetReaderVideoCompositionOutput assetReaderVideoCompositionOutputWithVideoTracks:videoTracks videoSettings:settings];
            videoCompositionOutput.videoComposition = videoComposition;
            reader = videoCompositionOutput;
        }
        reader.alwaysCopiesSampleData = NO;

        if ([_reader canAddOutput:reader]) {
            [_reader addOutput:reader];
            _videoOutput = reader;
        } else {
            NSLog(@"Unable to add video reader output");
        }

        [self _setupPixelBufferAdaptorIfNeeded:_filter != nil || self.videoConfiguration.overlay != nil];
        [self _setupContextIfNeeded];
    } else {
        _videoOutput = nil;
    }
}

Hope it helps.

amrit42087 commented 7 years ago

@Highdry03: It worked, thanks a lot. I think the maintainers should take this solution into account.