tmm1 opened 4 years ago
🤦 I forgot to set up the MGLContext first!
I'm trying to use MGLKView and MGLContext to swap in for their EAGL* counterparts now, and ran into some missing methods:

MGLContext is missing:
- API property (https://developer.apple.com/documentation/opengles/eaglcontext/1624885-api?language=objc)

MGLKView is missing:
- bindDrawable (https://developer.apple.com/documentation/glkit/glkview/1615593-binddrawable?language=objc)
- drawableHeight
- drawableWidth (https://developer.apple.com/documentation/glkit/glkview/1615591-drawablewidth)
- initWithFrame:context: (https://developer.apple.com/documentation/glkit/glkview/1615609-initwithframe)
- setEnableSetNeedsDisplay:
After adapting to the API differences above, I was able to run my app and everything renders as expected. Very nicely done!
One of the reasons I'm looking into MetalANGLE is because Apple's GLES is stuck at 3.0 and doesn't support EXT_texture_norm16
(https://www.khronos.org/registry/OpenGL/extensions/EXT/EXT_texture_norm16.txt). Do you think this could be implemented on top of Metal?
Thanks for the issue report,
Yes, glGetString doesn't work if there is no current context.
For your glGetProcAddress implementation you can just call the EGL API's eglGetProcAddress instead of the CFBundleGetFunctionPointerForName boilerplate code above. eglGetProcAddress will be only 1 line of code.
The missing EAGL-equivalent APIs seem to be simple to implement. I will add them later. So you managed to make your project work without these APIs? That's great to know.
It's possible to implement EXT_texture_norm16. However, this extension requires GLES 3.1, which I don't think MetalANGLE can achieve anytime soon. So if this extension were implemented, it would only be partially supported. Nevertheless, it might be enough for your use cases if you don't need any GLES 3.1 functionalities.
eglGetProcAddress will be only 1 line of code.
Which MGLKit header can I use for this? Before, I only had @import GLKit.
Nevertheless, it might be enough for your use cases if you don't need any GLES 3.1 functionalities.
Yes, it would be enough for me. Can you point me to where in the code I could get started trying to add these new formats?
For eglGetProcAddress you can use #include <MetalANGLE/EGL/egl.h>. The MGLKit.h header doesn't include the EGL/GL headers by default, so they should be included manually. Sorry, I forgot to mention this.
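Putting those two points together, a minimal sketch of the proc-lookup callback (reusing the void *ctx / const char *name signature from the mpv example elsewhere in this thread) might be:

```objc
#include <MetalANGLE/EGL/egl.h>

// One-line proc lookup via EGL, replacing the CFBundle boilerplate.
static void *get_proc_address(void *ctx, const char *name)
{
    return (void *)eglGetProcAddress(name);
}
```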
I will try to implement the EXT_texture_norm16 extension (excluding the GLES 3.1 requirements for now). This format extension seems to not be difficult to implement.
Great, I will look forward to it. If you are busy, then I can try to implement it also. But in that case I need some pointers to get familiar with where MGL defines formats and how/where I can start adding the 16-bit versions.
How do you intend to use the 16-bit formats? Use them for OpenGL textures or the iOS view layer/default framebuffer? The extension above is only for OpenGL textures. So if the intention is to use them for textures, you only need to modify OpenGL code (particularly some glTexImage/glTexStorage function calls' parameters). However, if you intend to use these formats for creating the default framebuffer, then some new MGL enums would need to be created. Not to mention the OpenGL extension above is not needed in that case.
I will try to implement the EXT_texture_norm16 extension (excluding the GLES 3.1 requirements for now). This format extension seems to not be difficult to implement.
Great, I will look forward to it. If you are busy then I can try to implement it also. But in that case I need some pointers to get familiar with where MGL defines formats and how/where I can start adding the 16bit versions.
Oops, I missed your comment. If you want to implement it, you can take a look at this commit: https://chromium.googlesource.com/angle/angle/+/25ab4510787f247ca364a052f7b3389ed7311d7a.
Looks like ANGLE already implemented this extension; the only thing to do is making sure all required 16-bit formats are supported in the metal back-end so that ANGLE can enable this extension in the front-end (ANGLE will enable it even if the context API is 3.0).
Most of the 16-bit formats are already supported. The only ones missing are R16G16B16_UNORM and R16G16B16_SNORM. These formats are not natively supported by metal, so we need to convert them to a four-component format in a similar way to this: https://github.com/kakashidinho/metalangle/blob/b3b8f451ba00f94a4c75089a63d316f74f31dc8d/src/libANGLE/renderer/metal/mtl_format_map.json#L99
This json file is used to generate the metal format conversion code. Every time it is modified, the script scripts/run_code_generation.py needs to be run again to re-generate the appropriate code.
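For reference, the regeneration step is just the following command from the repository root (the reporter later notes he needed python2 and depot_tools set up for it):

```shell
# Re-generate the metal format conversion code after editing mtl_format_map.json
python2 scripts/run_code_generation.py
```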
It would be great if this extension implementation could be tested in your project.
Thank you for those pointers!
How do you intend to use the 16-bit formats? Use them for OpenGL textures or the iOS view layer/default framebuffer?
I am using libmpv to render video. It uses OpenGL textures to upload the video frame planes into, and then shaders to render into frame buffer for playback.
For HDR videos, each color channel is 10-bit, so it's not possible to render the colors correctly with only 8-bit textures for processing.
For HDR videos, each color channel is 10-bit, so it's not possible to render the colors correctly with only 8-bit textures for processing.
Actually, GLES3 has a 10-bit RGB format (alpha is 2 bits), GL_RGB10_A2; maybe it can be used in this case?
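For illustration, allocating such a texture needs no extension in GLES 3.0; the format/type combination below is the standard one for GL_RGB10_A2 (width/height/pixels are placeholders):

```objc
// Core GLES 3.0: 10-bit RGB + 2-bit alpha texture, no EXT_texture_norm16 needed
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB10_A2, width, height, 0,
             GL_RGBA, GL_UNSIGNED_INT_2_10_10_10_REV, pixels);
```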
I tried your suggestion of adding the missing formats to the json file, and it works just as expected! (It took me longer to set up python2, depot_tools, etc. so I could run the codegen script :)
Right away I see the extension is being advertised, and libmpv uses it:
[libmpv_render] v: Loaded extension GL_EXT_texture_norm16.
I cannot believe how easy it was. Really appreciate you guiding me through this process, and doing the hard work of figuring out what needed to be changed and where.
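As a sketch of what the extension enables (enums taken from the EXT_texture_norm16 spec; width/height/planeData are placeholders), a 16-bit single-channel plane upload looks like:

```objc
// With GL_EXT_texture_norm16 advertised, 16-bit normalized formats become valid
glTexImage2D(GL_TEXTURE_2D, 0, GL_R16_EXT, width, height, 0,
             GL_RED, GL_UNSIGNED_SHORT, planeData);
```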
What is the best way for me to contribute these changes? Can I send a PR here, or do I need to send a CL upstream? I have signed the Google CLA already.
Since now the correct texture format is being used, next I need to find a way to set the colorspace on the underlying metal layer so that the colors are shown correctly on the display. For example on https://developer.apple.com/documentation/metal/drawable_objects/displaying_hdr_content_in_a_metal_layer/using_color_spaces_to_display_hdr_content?language=objc:
const CFStringRef name = kCGColorSpaceITUR_2020_PQ_EOTF;
CGColorSpaceRef colorspace = CGColorSpaceCreateWithName(name);
metalLayer.colorspace = colorspace;
CGColorSpaceRelease(colorspace);
Is there any way for me to reach in and get the underlying CAMetalLayer* to modify it? Or would it be better for me to expose a new extension like EGL_EXT_gl_colorspace_bt2020_pq and use that? (In this case, I would also need to make sure that the metalLayer.pixelFormat is set to MTLPixelFormatRGBA16Float.)
Another improvement I would like to figure out: hardware VideoToolbox decoding interop. Currently, with GLKit, I can take a CVImageBuffer that comes from the VideoToolbox hardware decoder and pass it to CVOpenGLESTextureCacheCreateTextureFromImage to convert it into a GLES texture (CVOpenGLESTextureRef). But obviously if I'm using MGLKit, that function will no longer work.
Instead, there is another function, CVMetalTextureCacheCreateTextureFromImage, available to convert the image into a metal texture. But if I use that one, I need to figure out some way to take the CVMetalTextureRef and pass it to MetalANGLE directly.
I would really appreciate any thoughts you have on approaches and implementation here. Thanks again!
Or would it be better for me to expose a new extension like EGL_EXT_gl_colorspace_bt2020_pq and use that?
I saw this in generateExtensions:
But still I don't see the EGL_KHR_gl_colorspace extension string.
It seems that to go down this route we would add another glColorspaceXXX and set it to true, to expose the extensions listed in https://www.khronos.org/registry/EGL/extensions/EXT/EGL_EXT_gl_colorspace_bt2020_linear.txt
But https://www.khronos.org/registry/EGL/extensions/KHR/EGL_KHR_gl_colorspace.txt talks about eglCreateWindowSurface, and I think that is not used with GLKit? So is it better to add more public APIs into MGLKView instead for controlling color space?
I need to figure out some way to take the CVMetalTextureRef and pass it to MetalANGLE directly.
Another idea would be to get an IOSurface from the CVPixelBuffer using CVPixelBufferGetIOSurface.
Then maybe that can be passed in using the existing EGL_ANGLE_iosurface_client_buffer?
To convert the IOSurface to a metal texture, perhaps https://developer.apple.com/documentation/metal/mtldevice/1433378-newtexturewithdescriptor can be used. I'm not really sure what CVMetalTextureCache does so it may be more complicated than that.
I see also that you have a TODO for external image support:
Is this something that could be used to sample from an external metal texture reference?
talks about eglCreateWindowSurface, and I think that is not used with GLKit? So is it better to add more public APIs into MGLKView instead for controlling color space?
It looks like MGLKit is indeed responsible for eglCreateWindowSurface:
And that's how the color space is passed in currently:
So it seems the best solution may be to add more supported MGLDrawableColorFormat types? And we would still need to implement and use EGL_EXT_gl_colorspace_bt2020_pq in between MGLKit and ANGLE.
You can create a PR on this repo. Upstream repo's metal code is very out of date. Their metal implementation has low priority, hence the merging process is slow.
For the BT.2020 color space, I don't think metal supports it natively. EGL_KHR_gl_colorspace is currently only used to create sRGB colorspace surfaces, though I need to confirm. You may also take a look at metal's supported formats here: https://developer.apple.com/metal/Metal-Feature-Set-Tables.pdf
For IOSurface interop, the gles3-dev branch already has an implementation via the EGL_ANGLE_iosurface_client_buffer extension. See a usage example here: https://github.com/kakashidinho/metalangle/blob/785cf0e1859f46e46e3b1af6a0512bf5899e3b58/src/tests/egl_tests/EGLIOSurfaceClientBufferTest.cpp#L152
Basically, you would create an EGL PBuffer from the IOSurface, then bind the PBuffer to an OpenGL texture. The texture can then be used by OpenGL code.
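A rough sketch of that flow, assuming the attribute tokens used in the linked ANGLE test (double-check them against the actual EGL_ANGLE_iosurface_client_buffer headers; the width/height/format values below are placeholders for a single 8-bit plane):

```objc
// Wrap an IOSurface plane in an EGL PBuffer, then bind it to a GL texture
IOSurfaceRef surface = CVPixelBufferGetIOSurface(pixelBuffer);
const EGLint attribs[] = {
    EGL_WIDTH,                         width,
    EGL_HEIGHT,                        height,
    EGL_IOSURFACE_PLANE_ANGLE,         0,                // plane index
    EGL_TEXTURE_TARGET,                EGL_TEXTURE_2D,
    EGL_TEXTURE_INTERNAL_FORMAT_ANGLE, GL_RED,
    EGL_TEXTURE_FORMAT,                EGL_TEXTURE_RGBA,
    EGL_TEXTURE_TYPE_ANGLE,            GL_UNSIGNED_BYTE,
    EGL_NONE,
};
EGLSurface pbuffer = eglCreatePbufferFromClientBuffer(
    display, EGL_IOSURFACE_ANGLE, surface, config, attribs);
glBindTexture(GL_TEXTURE_2D, texture);
eglBindTexImage(display, pbuffer, EGL_BACK_BUFFER);  // texture now samples the plane
```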
Update: Seems like CAMetalLayer has a colorSpace property that can be changed to BT.2020 as you mentioned. Yes, this can be implemented by passing appropriate parameters to MGLLayer and propagating them to the metal backend's SurfaceMtl.
Update v2: I just realized that IOSurface was disabled on tvOS in a recent commit: https://github.com/kakashidinho/metalangle/commit/1964ec02f475dc210300313d543f2cd8f96de9bb.
This was because someone reported that IOSurface is a private API before tvOS 13.0. Since I support tvOS 11.0+ by default, the easiest way was to just disable it. If you want to use IOSurface, perhaps special Xcode targets angle_metal_tvos_13 & MetalANGLE_tvos_13 without the ANGLE_DISABLE_IOSURFACE macro need to be created.
Thank you! I had just started to figure out that DisplayMtl::createPbufferFromClientBuffer needed an implementation, but you are way ahead of me in 9fc9dddd83f047b6b314d498f12726893a1918d1.
I think I understand all the pieces required for my goal, so I will work on putting it together this week and will be sending you some PRs.
FYI, today I built MetalANGLE in the Debug configuration and tried it with my app while debugging some PRs, and I discovered some ANGLE asserts are triggering.
ERR: setDefaultFramebuffer(8954): ! Assert failed in setDefaultFramebuffer (src/libANGLE/Context.cpp:8954): mCurrentDrawSurface == nullptr
It seems related to my use of GLContext sharegroups across threads. I noticed MGLContext is using TLS, so maybe that is having a bad interaction with my threaded usage. I plan to investigate further later and make a repro or a fix PR.
Update v2: I just realized that IOSurface was disabled in tvOS in recent commit 1964ec0.
This was due to someone reported that IOSurface is private API in pre tvOS 13.0. Since I support tvOS 11.0+ by default, the easiest way is just disable it. If you want to use IOSurface, perhaps special Xcode targets angle_metal_tvos_13 & MetalANGLE_tvos_13 without ANGLE_DISABLE_IOSURFACE macro need to be created.
I have just received some more requests recently about supporting importing external textures into MetalANGLE. Besides the IOSurface route, another possible solution, as you suggested, is creating a new extension similar to EGL_ANGLE_d3d_texture_client_buffer to import external Metal textures into MetalANGLE. This solution doesn't require IOSurface.framework, so the private-API issue on older devices could be avoided.
If this new extension were to be implemented, another new extension would need to be implemented as well, i.e. something similar to EGL_ANGLE_device_d3d, in order to query the metal device used by MetalANGLE so that the external textures could be created from the same device.
I'm trying to implement a mechanism for importing external textures into MetalANGLE. However, when importing a texture I need to know its format. Do you have a list of formats that you are currently using with CVOpenGLESTextureCacheCreateTextureFromImage?
Do you use YUV format? Cuz it is not supported in metal backend yet.
I have been playing with CVMetalTextureCacheCreateTextureFromImage this week, and most YUV CVPixelBuffers (such as kCVPixelFormatType_420YpCbCr8BiPlanarFullRange) can be mapped into MTLPixelFormatR8Unorm or similar (RG8, R16, RG16) formats depending on the plane.
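For example, mapping the interleaved CbCr plane (plane 1) of a biplanar buffer to an RG8 Metal texture follows the documented CoreVideo signature (metalTextureCache and pixelBuffer are placeholders for an existing cache and frame):

```objc
// Map plane 1 of a biplanar YUV CVPixelBuffer to an RG8 Metal texture
CVMetalTextureRef cvTexture = NULL;
const size_t plane = 1;
CVMetalTextureCacheCreateTextureFromImage(
    kCFAllocatorDefault, metalTextureCache, pixelBuffer, NULL,
    MTLPixelFormatRG8Unorm,
    CVPixelBufferGetWidthOfPlane(pixelBuffer, plane),
    CVPixelBufferGetHeightOfPlane(pixelBuffer, plane),
    plane, &cvTexture);
id<MTLTexture> mtlTexture = CVMetalTextureGetTexture(cvTexture);
```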
For the GLES variant, the common formats are documented:
CVOpenGLESTextureCacheCreateTextureFromImage(
    allocator,
    textureCache,
    pixelBuffer,
    NULL,            // textureAttributes
    GL_TEXTURE_2D,
    internal_format, // like GL_RGBA, GL_LUMINANCE, GL_RGBA8_OES, GL_RED, GL_RG
    width,
    height,
    format,          // like GL_RGBA and GL_LUMINANCE
    type,            // like GL_UNSIGNED_BYTE
    planeIndex,
    &outTexture
)
For 10-bit formats, I think the only way is using GL_RGBA16F and GL_HALF_FLOAT_OES.
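That path would look something like the following sketch (in an ES3 context the unsuffixed GL_HALF_FLOAT type is used; under ES2 it would be GL_HALF_FLOAT_OES from OES_texture_half_float, with an unsized internal format):

```objc
// 16-bit float RGBA texture as a container for 10-bit video samples (ES3)
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA16F, width, height, 0,
             GL_RGBA, GL_HALF_FLOAT, halfFloatPixels);
```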
Do you use YUV format? Cuz it is not supported in metal backend yet.
How do you mean?
For GLES variant the common formats are documented
One thing which is not documented: in GLES2 mode you can use GL_RED or GL_RG for a U or UV plane. But if you use GLES3, it starts to fail. The workaround is to use GL_LUMINANCE and GL_LUMINANCE_ALPHA
See http://stackoverflow.com/q/36213994/332798 and https://stackoverflow.com/a/8653891/332798
Regarding YpCbCr, this is helpful context: https://developer.apple.com/documentation/accelerate/conversion/understanding_ypcbcr_image_formats
For these video frames, they are backed by IOSurface with multiple data planes. The CVPixelBuffer wraps the IOSurface, and has flags which allow import/export to either Metal or GLES. This is detailed in https://developer.apple.com/documentation/metal/mixing_metal_and_opengl_rendering_in_a_view
Here is an example of po pixbuf output which shows the plane layouts:
<CVPixelBuffer 0x281eb1fe0 width=720 height=480 pixelFormat=y420 iosurface=0x282db03c0 planes=3>
<Plane 0 width=720 height=480 bytesPerRow=768>
<Plane 1 width=360 height=240 bytesPerRow=384>
<Plane 2 width=360 height=240 bytesPerRow=384>
<attributes={
Height = 480;
IOSurfaceProperties = {
};
MetalCompatibility = 1;
PixelFormatType = 2033463856;
Width = 720;
} propagatedAttachments={
} nonPropagatedAttachments={
}>
Do you use YUV format? Cuz it is not supported in metal backend yet.
How do you mean?
I meant that if you use a direct YUV 4:2:2 format in metal then it is not supported yet. For example, there is a format GL_RGB_422_APPLE supported by the old CVOpenGLESTextureCacheCreateTextureFromImage function.
For example, there is a format GL_RGB_422_APPLE supported by the old CVOpenGLESTextureCacheCreateTextureFromImage function.
Hm, somehow I never saw this in the documentation before. Maybe it's new. It sounds interesting for some use cases, but I think most applications will still prefer mapping the underlying planes directly rather than repacking/resampling.
//Mapping a yuvs buffer as a source texture (note: yuvs/f and 2vuy are unpacked and resampled -- not colorspace converted)
CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault, textureCache, pixelBuffer, NULL, GL_TEXTURE_2D, GL_RGB_422_APPLE, width, height, GL_RGB_422_APPLE, GL_UNSIGNED_SHORT_8_8_APPLE, 1, &outTexture);
Also, there are YUV pixel formats in metal, for example: https://developer.apple.com/documentation/metal/mtlpixelformat/gbgr422. I just wanted to confirm whether you ever need them.
I have added a new extension, EGL_MGL_mtl_texture_client_buffer, to import metal textures as PBuffers on the gles3-dev branch.
Maybe you could give it a try. Here is a usage example: https://github.com/kakashidinho/metalangle/blob/b5b41eecf1ea8ae4e416e429ccbd5991d50c71e2/src/tests/egl_tests/EGLTextureClientBufferTest.mm#L163
This is just one of the ways a metal texture can be imported. There are other ways, such as implementing a new target type for EGL_KHR_image_base, so that an EGLImageKHR can be created from an existing metal texture. I am planning to implement this extension some time in the future. But the PBuffer extension might be enough for your usage for now.
This is the extension specification's draft https://github.com/kakashidinho/metalangle/blob/b5b41eecf1ea8ae4e416e429ccbd5991d50c71e2/extensions/EGL_MGL_texture_client_buffer.txt
Note: I have only implemented EGL_MGL_mtl_texture_client_buffer for metal textures. The EGL_MGL_gl_texture_client_buffer variant is not implemented yet in the GL back-end.
@tmm1 Thanks for mentioning this wonderful framework. I'm trying to adopt MetalANGLE + libmpv as well, since OpenGL is prohibited/removed from macCatalyst.
But I get an empty screen; it seems I'm missing some MetalANGLE setup.
demo: https://github.com/qiudaomao/MPVColorIssue/blob/master/MPVColorIssue/MPVViewController.m
mpv shows error:
[libmpv_render/videotoolbox] error: need a current EAGLContext set
Replace GLKView with MGLKView.
#import <MetalANGLE/MGLKit.h>
#import <MetalANGLE/MGLContext.h>
#import <MetalANGLE/MGLKView.h>
#import <MetalANGLE/GLES2/gl2.h>
static void *get_proc_address(void *ctx, const char *name)
{
CFStringRef symbolName = CFStringCreateWithCString(kCFAllocatorDefault, name, kCFStringEncodingASCII);
void *addr = CFBundleGetFunctionPointerForName(CFBundleGetBundleWithIdentifier(CFSTR("com.google.OpenGLES")), symbolName);
CFRelease(symbolName);
NSLog(@"get_proc_address %s => %p", name, addr);
return addr;
}
@interface MpvClientOGLView : MGLKView
@property mpv_opengl_cb_context *mpvGL;
@end
@implementation MpvClientOGLView {
GLint defaultFBO;
}
- (void)awakeFromNib
{
[super awakeFromNib];
self.context = [[MGLContext alloc] initWithAPI:kMGLRenderingAPIOpenGLES2];
if (!self.context) {
NSLog(@"Failed to initialize OpenGLES 2.0 context");
}
[MGLContext setCurrentContext:self.context];
// Configure renderbuffers created by the view
self.drawableColorFormat = MGLDrawableColorFormatRGBA8888;
self.drawableDepthFormat = MGLDrawableDepthFormatNone;
self.drawableStencilFormat = MGLDrawableStencilFormatNone;
defaultFBO = -1;
}
- (void)fillBlack
{
glClearColor(0, 0, 0, 0);
glClear(GL_COLOR_BUFFER_BIT);
}
- (void)drawRect
{
if (defaultFBO == -1)
{
GLint i = 0;
glGetIntegerv(GL_FRAMEBUFFER_BINDING, &i);
defaultFBO = (i != 0) ? i : 1;
}
if (self.mpvGL)
{
mpv_opengl_cb_draw(self.mpvGL,
defaultFBO,
self.bounds.size.width * self.contentScaleFactor,
-self.bounds.size.height * self.contentScaleFactor);
}
}
- (void)drawRect:(CGRect)rect
{
[self drawRect];
}
@end
Just realized that I need to rewrite video/out/opengl/hwdec_ios.m and link MetalANGLE to build libmpv.
Unfortunately it cannot work with hwdec=videotoolbox yet, so you must use hwdec=videotoolbox-copy.
OpenGL is prohibited/removed from macCatalyst
I thought in Big Sur they added OpenGLES support. But maybe it only works for unmodified iOS apps and is not available to macCatalyst:
https://twitter.com/stroughtonsmith/status/1286071942118879233?s=21
OpenGL is prohibited/removed from macCatalyst
I thought in Big Sur they added OpenGLES support. But maybe it only works for unmodified iOS apps and is not available to macCatalyst:
https://twitter.com/stroughtonsmith/status/1286071942118879233?s=21
Yes, the OpenGL runtime is there on Big Sur for macCatalyst, but it is not usable at compile time for Intel/ARM macCatalyst. It's now private API.
Unfortunately it cannot work with hwdec=videotoolbox yet so you must use hwdec=videotoolbox-copy
Is it because of metal texture interop?
Unfortunately it cannot work with hwdec=videotoolbox yet so you must use hwdec=videotoolbox-copy
Is it because of metal texture interop?
There are some missing APIs that mpv's videotoolbox support uses.
Hi @kakashidinho, thanks so much for your work on this project!
I'm trying to replace openGL with MetalANGLE on a tvOS project. First I tried simply to import MetalANGLE.framework into my project. It kept throwing "image not found" errors, until I realized I need to embed/codesign the framework into the product.
Once I was able to start my app, I changed my getProcAddr to use com.google.OpenGLES.
Now I can call, for example, glGetProcAddr("glGetString") and get a valid address. I checked in the debugger and confirmed I'm getting an address from inside MetalANGLE.
Next I tried to run glGetString(GL_VERSION), but I only get back NULL. Same thing with GL_EXTENSIONS. What am I doing wrong?