Open rorz opened 6 years ago
Found the source of the issue: the `RenderView` class needs to implement re-scaled backing bounds, as outlined in the OpenGL spec here. It turns out that setting `wantsBestResolutionOpenGLSurface` alone isn't enough.

I will look at getting a PR going to implement this, but if you're in the same position as me, you can fix it easily by modifying the `RenderView` source to the following:
```swift
let backingBounds = self.convertToBacking(self.bounds) // <-- magic line
let viewSize = GLSize(width: GLint(round(backingBounds.size.width)),
                      height: GLint(round(backingBounds.size.height)))
```
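For context, here's a sketch of where that change could sit inside `RenderView` (an `NSOpenGLView` subclass). Only `convertToBacking(_:)` and `reshape()` are guaranteed AppKit API; the surrounding structure and the downstream use of `viewSize` are assumptions about the GPUImage2 source, not verbatim from it (`GLSize` is GPUImage2's own width/height struct).

```swift
import AppKit

// Hypothetical sketch of the fix inside GPUImage2's RenderView.
class RenderView: NSOpenGLView {
    override func reshape() {
        super.reshape()
        // Convert the view's bounds from points to backing-store
        // pixels, so the GL viewport matches the Retina framebuffer.
        let backingBounds = self.convertToBacking(self.bounds) // <-- magic line
        let viewSize = GLSize(width: GLint(round(backingBounds.size.width)),
                              height: GLint(round(backingBounds.size.height)))
        // ... pass viewSize on to the renderer (e.g. via glViewport) ...
        _ = viewSize
    }
}
```

Without the conversion, `self.bounds` is in points, so on a 2x display the computed viewport is half the size of the actual backing framebuffer.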
Hi there
I'm really enjoying writing custom filters and retro-fitting them into the 'SimpleImageFilter' example project to get going with GPUImage2 on macOS.
However, one thing irking me early on was the lack of retina / HiDPI output from the `RenderView`. According to `RenderView`'s parent class, `NSOpenGLView`, you can enable this directly by setting `wantsBestResolutionOpenGLSurface` to `true`. I've since assumed that's the same as checking this box:

When I run the application, the `pictureInput --> (filter x, y, z) --> renderView` pipeline renders perfectly to the `RenderView` output, at 2x resolution. Unfortunately, subsequent calls to `processImage`, such as after changing filter inputs, cause the image to shrink down to occupy the bottom-left quarter of the view.

In fact, if I take all the filters away and reduce the pipeline down to just `pictureInput --> renderView`, the first `processImage` call loads it up on screen fine, while the second (and all subsequent) calls shrink it down again. This is best illustrated by running the 'SimpleImageFilter' example, as vanilla as possible, with only the 'best resolution' box checked in IB.
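(For anyone following along in code rather than IB: I believe the checkbox corresponds to the `wantsBestResolutionOpenGLSurface` property on the view, set before rendering starts. The initializer and frame values below are placeholders, not the exact ones from the example project.)

```swift
import AppKit

// Programmatic equivalent (as I understand it) of the IB
// "best resolution" checkbox: opt the view into a Retina-scaled
// backing surface. Frame values here are arbitrary.
let renderView = RenderView(frame: NSRect(x: 0, y: 0, width: 480, height: 320))
renderView.wantsBestResolutionOpenGLSurface = true
```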
Initial pass:
Subsequent passes:
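The minimal failing case described above can be sketched like this; `PictureInput(imageName:)` and the `-->` operator follow the GPUImage2 README's picture-to-view pattern, and the image name is a placeholder:

```swift
import GPUImage

// Minimal reproduction sketch: a picture rendered straight to the view.
// `renderView` is assumed to be the RenderView already in the window.
let pictureInput = PictureInput(imageName: "sample.jpg")
pictureInput --> renderView
pictureInput.processImage()  // first call: fills the view at 2x

// Later, e.g. after a UI change:
pictureInput.processImage()  // now the image shrinks to the bottom-left quarter
```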
I've so far tried a number of things, including:

- `processImage`
The fact that this fails on the example project suggests either that I'm doing something wrong, or that the fix isn't a simple one-liner.
FYI, this is on Xcode 9 / 10 and macOS 10.13. Thanks!