BradLarson / GPUImage

An open source iOS framework for GPU-based image and video processing
http://www.sunsetlakesoftware.com/2012/02/12/introducing-gpuimage-framework
BSD 3-Clause "New" or "Revised" License

How to optimize filter chain #836

Closed oliverwolfson closed 11 years ago

oliverwolfson commented 11 years ago

I have the results I want, represented by the code below, but I assume I am not running a very tight show in terms of optimization (or even correct setup). In the interest of helping me understand how these filters should be set up, would anyone be so kind as to point out the errors in my code below?

-(void)initFilterChain
{

    hueFilter = [[GPUImageHueFilter alloc] init];
    contrastFilter = [[GPUImageContrastFilter alloc] init];
    multiplyBlendFilter = [[GPUImageMultiplyBlendFilter alloc] init];
    overlayFilter = [[GPUImageOverlayBlendFilter alloc] init];

    inputPicture = [[GPUImagePicture alloc] initWithImage:inputImage smoothlyScaleOutput:NO];
    picture1 = [[GPUImagePicture alloc] initWithImage:[UIImage imageNamed:@"image1.jpg"] smoothlyScaleOutput:NO];

}

- (IBAction)processFilter:(id)sender {

    [self initFilterChain];

    UIImageView *processedImage = [[UIImageView alloc] init];
    [processedImage setFrame:CGRectMake(20, 20, 280, 400)];
    [self.view addSubview:processedImage];

    [contrastFilter setContrast:slider.value*10];

    [picture1 addTarget:hueFilter];
    [hueFilter addTarget:contrastFilter];
    [picture1 processImage];

    processedImage.image = [contrastFilter imageFromCurrentlyProcessedOutput];

    UIImage *filteredImage2 = processedImage.image;
    GPUImagePicture *filteredPicture = [[GPUImagePicture alloc] initWithImage:filteredImage2 smoothlyScaleOutput:NO];

    [filteredPicture processImage];
    [inputPicture processImage];
    [filteredPicture addTarget:overlayFilter];
    [inputPicture addTarget:overlayFilter];
    processedImage.image = [overlayFilter imageFromCurrentlyProcessedOutput];

    NSLog(@"processedImage.image size %f %f", processedImage.image.size.width, processedImage.image.size.height);  
}
BradLarson commented 11 years ago

There are a few things I see here. First, you shouldn't recreate the filters or blended image for every new image you pass in. Create them and the picture1 instance once, and just swap out the input image: remove the old input picture's targets, create a GPUImagePicture with the new image, and add the appropriate filter as a target of it. Then you just need to call -processImage to re-run the filter chain.
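A minimal sketch of that setup-once / swap-later pattern (assuming the filters and pictures are instance variables declared elsewhere; the method names here are illustrative, not part of GPUImage):

```objectivec
// Done once, e.g. in viewDidLoad: build the filters and the static source.
- (void)setupFilterChain
{
    hueFilter = [[GPUImageHueFilter alloc] init];
    contrastFilter = [[GPUImageContrastFilter alloc] init];

    picture1 = [[GPUImagePicture alloc] initWithImage:[UIImage imageNamed:@"image1.jpg"]
                                  smoothlyScaleOutput:NO];
    [picture1 addTarget:hueFilter];
    [hueFilter addTarget:contrastFilter];
}

// Done per new input image: swap the source, then re-run the existing chain.
- (void)processNewImage:(UIImage *)newImage
{
    // Detach the old source before replacing it.
    [inputPicture removeAllTargets];

    inputPicture = [[GPUImagePicture alloc] initWithImage:newImage
                                      smoothlyScaleOutput:NO];
    [inputPicture addTarget:overlayFilter];

    [inputPicture processImage];
}
```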

Second, don't use a UIImageView as your preview. Use a GPUImageView and target the last filter in the chain to that. UIImageView is expensive in two ways: first, the extraction of a UIImage from a filter requires a costly trip through Core Graphics, and second, the re-rendering of that in a UIImageView then causes the image to be extracted for display to the screen. With a GPUImageView, the content already on the GPU is kept there for display. Also, there's no need to create a new instance of that view every time the filter runs, so just create it once and keep it attached to the filter chain.
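A sketch of that substitution, assuming the GPUImageView is created once (e.g. in viewDidLoad) and left attached to the end of the chain:

```objectivec
// Created once; replaces the UIImageView from the original code.
GPUImageView *previewView = [[GPUImageView alloc] initWithFrame:CGRectMake(20, 20, 280, 400)];
[self.view addSubview:previewView];

// The view is just another target at the end of the filter chain.
[contrastFilter addTarget:previewView];

// Re-running the source updates the view; the rendered frame never
// leaves the GPU, so no UIImage extraction is needed for display.
[picture1 processImage];
```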

oliverwolfson commented 11 years ago

Thanks. I can't figure out exactly how to chain the filters together without creating the UIImages between. Is there a resource for learning this? Feeling pretty dense.

BradLarson commented 11 years ago

It's pretty easy: you just use -addTarget: to connect one filter to the next, with your GPUImageView as the last target in the chain. There are examples of this in the Readme, as well as the sample applications. In particular, the SimpleImageFilter example shows how to load an image, filter it, and present it onscreen.
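Applied to the chain from the first post, the wiring might look like this: no intermediate UIImage is needed, because the blend filter takes the filtered branch and the untouched picture as its two inputs (previewView is a GPUImageView created elsewhere; the order of the addTarget: calls to the blend determines which input lands in which slot):

```objectivec
// Build the whole chain once.
picture1 = [[GPUImagePicture alloc] initWithImage:[UIImage imageNamed:@"image1.jpg"]
                              smoothlyScaleOutput:NO];
inputPicture = [[GPUImagePicture alloc] initWithImage:inputImage
                                  smoothlyScaleOutput:NO];

// Filtered branch: picture1 -> hue -> contrast.
[picture1 addTarget:hueFilter];
[hueFilter addTarget:contrastFilter];

// The two-input blend: filtered branch plus the untouched input picture.
[contrastFilter addTarget:overlayFilter];
[inputPicture addTarget:overlayFilter];

// A GPUImageView as the final target keeps everything on the GPU.
[overlayFilter addTarget:previewView];

// Render: process both sources feeding the blend.
[picture1 processImage];
[inputPicture processImage];
```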