Looking-Glass / HoloPlayJS_Issues

A repository for feature requests and bug reports for the HoloPlay.js library. This repository doesn't contain source code. You can download the HoloPlay.js library at http://look.glass/threejs

performance in JS #19

Open JuanIrache opened 2 years ago

JuanIrache commented 2 years ago

Hi there,

I'm using holoplay-core (0.0.8). I create an HTML canvas, convert it to a PNG and send it to the device with HoloPlayCore.ShowMessage().

The problem is that converting a 3360 x 3360 canvas to PNG is really slow (about 300 ms), so performance is really bad. I'm getting between 3 and 4 frames per second. See this example: https://youtu.be/CJbYD2GOqBw
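For reference, this is roughly what the pipeline looks like (a simplified sketch; the quilt settings and drawing code are placeholders for my setup, and the Client/ShowMessage arguments reflect my reading of the 0.0.8 API):

```js
// Simplified sketch of the current pipeline (placeholder names and settings).
const quiltCanvas = document.createElement('canvas');
quiltCanvas.width = 3360;
quiltCanvas.height = 3360;
// ... draw all the quilt views into quiltCanvas ...

// This is the expensive step: encoding the canvas to PNG (~300 ms at this size).
function canvasToPngBytes(canvas) {
  const t0 = performance.now();
  const dataUrl = canvas.toDataURL('image/png');
  const binary = atob(dataUrl.split(',')[1]);
  const bytes = new Uint8Array(binary.length);
  for (let i = 0; i < binary.length; i++) bytes[i] = binary.charCodeAt(i);
  console.log(`PNG encode took ${performance.now() - t0} ms`);
  return bytes;
}

// Send the PNG to the device (quilt settings are placeholders for my setup).
const client = new HoloPlayCore.Client();
const quiltSettings = { vx: 8, vy: 6, vtotal: 48, aspect: 0.75 };
client.sendMessage(new HoloPlayCore.ShowMessage(quiltSettings, canvasToPngBytes(quiltCanvas)));
```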

Instead of the PNG, I tried sending the raw image data as a Uint8ClampedArray in RGBA format (from CanvasRenderingContext2D.getImageData().data), which should be way faster, but the library did not accept that.
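The raw-data attempt was essentially this (reusing the placeholder names from the sketch above):

```js
// The rejected alternative: raw RGBA pixels straight from the canvas.
// getImageData() is far cheaper than PNG encoding, but the library did not
// accept this payload in place of the PNG bytes.
const ctx = quiltCanvas.getContext('2d');
const rgba = ctx.getImageData(0, 0, quiltCanvas.width, quiltCanvas.height).data; // Uint8ClampedArray
client.sendMessage(new HoloPlayCore.ShowMessage(quiltSettings, rgba));
```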

Is there a different approach that I can use? Is there a chance support for Uint8ClampedArray in RGBA format can be implemented?

The confirmation message after sending the quilt also takes about 100 milliseconds, but I suppose that's harder to avoid (and I can start drawing the next frame while that is happening).
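To illustrate what I mean by overlapping the two, something like this is what I have in mind (reusing the placeholder names from above, and assuming sendMessage() returns a Promise, which is how I read the API):

```js
// Overlap drawing/encoding frame N+1 with waiting for frame N's confirmation.
// drawQuilt() is a stand-in for my rendering code; the Promise behaviour of
// sendMessage() is an assumption on my part.
function drawQuilt(canvas) { /* placeholder: render the next quilt here */ }

let running = true;
async function frameLoop() {
  let pending = Promise.resolve();
  while (running) {
    drawQuilt(quiltCanvas);                       // render the next quilt
    const bytes = canvasToPngBytes(quiltCanvas);  // still the ~300 ms bottleneck
    await pending;                                // wait for the previous confirmation
    pending = client.sendMessage(new HoloPlayCore.ShowMessage(quiltSettings, bytes));
  }
}
```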

If you need to test things, my project is available here (it's a wrapper of p5 and holoplay-core, so 2D layers can be drawn to 3D space): https://github.com/JuanIrache/p5-holoplay-2d

Thank you.

alxdncn commented 2 years ago

Hi! As I mentioned on your other issue, we're working on a major refactor of this library, which makes it hard for us to make the kind of incremental improvements that would address the issue you're having.

However, it may be worth looking at our Blender add-on. That tool uses a websocket to send texture data via nng and is quite optimized. You can see a lot of that logic here: https://github.com/regcs/AliceLG/tree/master/lib/pylightio/lookingglass

JuanIrache commented 2 years ago

That looks beyond my current skill set, but I'm having a look. Thanks!

I think for cases like mine and other JS devs, the optimal solution would be for the JS API to accept either a reference to an HTML canvas, or the canvas data in a form that doesn't require expensive processing (like the RGBA Uint8ClampedArray). Alternatively, if we could have access to the shader code that moves the pixels around to generate the final bitmap, we could just show that on the Looking Glass device as a maximised window (I think some of your previous libraries do that). p5.js can handle shaders, so I think this would enable real-time animation.
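To make the shader idea concrete, here's a minimal p5.js sketch of where that would plug in. The fragment shader below is a plain pass-through placeholder, not your lenticular shader, and drawQuiltViews() and the resolutions are placeholders for my setup:

```js
// Minimal p5.js (WEBGL) sketch showing where a lenticular shader would plug in.
// The fragment shader is a pass-through placeholder, NOT the HoloPlay shader.
const vert = `
precision highp float;
attribute vec3 aPosition;
attribute vec2 aTexCoord;
varying vec2 vUv;
void main() {
  vUv = aTexCoord;
  vec4 pos = vec4(aPosition, 1.0);
  pos.xy = pos.xy * 2.0 - 1.0; // map p5's normalized positions to clip space
  gl_Position = pos;
}`;

const frag = `
precision highp float;
varying vec2 vUv;
uniform sampler2D quilt;
void main() {
  // The real per-pixel view-selection logic from HoloPlay Core would go here.
  gl_FragColor = texture2D(quilt, vUv);
}`;

let lenticular, quilt;

// Placeholder for the real quilt rendering: just clear to a colour for now.
function drawQuiltViews(g) {
  g.background(30, 100, 200);
}

function setup() {
  createCanvas(1536, 2048, WEBGL);    // placeholder: the device's native resolution
  quilt = createGraphics(3360, 3360); // off-screen quilt render target
  lenticular = createShader(vert, frag);
}

function draw() {
  drawQuiltViews(quilt);              // render all views into the quilt
  shader(lenticular);
  lenticular.setUniform('quilt', quilt);
  rect(0, 0, width, height);          // full-screen quad through the shader
}
```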

In any case, looking forward to seeing what you come up with!

alxdncn commented 2 years ago

Hi! Yep, this makes sense; we'll look at this for the next iteration of the API!

The necessary shader code is accessible in HoloPlay Core, if that's helpful! You can reimplement it in your pipeline; just keep in mind that the shader code can't be used in GPL software. https://github.com/Looking-Glass/HoloPlayCoreSDK

The intention is that our next API won't require you to implement the shader yourself and will have an optimized way to send image data over a websocket. I'll take the canvas-data suggestion to the team overseeing that process!