avaer opened this issue 6 years ago
Not sure how useful this will be, but I have this shader for converting YUV to RGB on iOS. It took forever to find one that worked for iOS specifically -- not sure if ML is the same or different, but I figured it's worth sharing in case it does happen to be the same.
```glsl
// BT.709 limited-range YUV -> RGB. Note GLSL mat3 constructors are
// column-major, so each listed triple becomes a column; the row-vector
// multiply in getRGB() then applies the intended coefficients per channel.
const mat3 yuv2rgb = mat3(
  1.0, 0.0, 1.2802,
  1.0, -0.214821, -0.380589,
  1.0, 2.127982, 0.0
);

vec4 getRGB(vec2 uv) {
  vec4 lum = texture(tLum, uv);
  vec4 chroma = texture(tChroma, uv);
  vec3 yuv = vec3(
    1.1643 * (lum.r - 0.0625), // expand limited-range luma
    chroma.r - 0.5,            // center chroma around zero
    chroma.a - 0.5
  );
  vec3 rgb = yuv * yuv2rgb;
  return vec4(rgb, 1.0);
}
```
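For anyone who wants to sanity-check the matrix outside a GPU context, here is a minimal CPU reference in Python that mirrors the shader's math (the expanded per-channel equations are exactly what `yuv * yuv2rgb` computes given GLSL's column-major constructor). The function name is just illustrative.

```python
def yuv_to_rgb(y, u, v):
    """y, u, v are normalized texture samples in [0, 1], as in the shader."""
    # Expand limited-range luma and center the chroma, as in getRGB().
    yp = 1.1643 * (y - 0.0625)
    up = u - 0.5
    vp = v - 0.5
    # Row-vector * column-major mat3 expands to these three dot products.
    r = yp + 1.2802 * vp
    g = yp - 0.214821 * up - 0.380589 * vp
    b = yp + 2.127982 * up
    return (r, g, b)

# Video black (Y = 16/256 = 0.0625, neutral chroma) should map to RGB (0, 0, 0).
print(yuv_to_rgb(0.0625, 0.5, 0.5))
```

A neutral-chroma input (u = v = 0.5) leaves R == G == B, which is a quick way to confirm the coefficients are wired up correctly.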
Thanks, yeah, we did a similar thing for the livestreaming prototype (though with much more complicated plane management, via https://github.com/brion/yuv-canvas).
Luckily the ML platform does the conversion for us now with an external sampler2D.
The code in https://github.com/webmixedreality/exokit/pull/496 is a greatly expanded superset of this issue; we just need to whittle it down and expose it under the media API for this.
We have initial webcam support hooked up on desktop (`getUserMedia({video: true})`), but it's missing from the ML hooks. This was blocked until we understood the camera data shape coming from the API (it's an `ANativeWindowBuffer` in `YUV_420_888` format). We need to convert that to RGB and pipe the stream.