ASLS-org / studio

ASLS Studio is an open-source, web-based, DMX lighting control software and visualizer.
https://studio.asls.timekadel.com
GNU General Public License v3.0

Only moving-head fixtures are patchable. #30

Open timekadel opened 1 year ago

timekadel commented 1 year ago

Since complex fixtures are not yet available for visualization, only moving-head fixtures are patchable. Such complex fixtures should be made patchable and set up with a placeholder 3D model, signaling to the user that visualization for that kind of equipment is still under development. Please refer to https://open-fixture-library.org/categories for a list of fixture categories that are/should be implemented.

terence1990 commented 1 month ago

@timekadel what do you think about supporting GDTF imports?

It seems to be a decent way to offset the responsibility of building out a specific Vue component for every single Fixture, and of defining how DMX values affect each Beam/Three.SpotLight inside each fixture specifically, like you've done with MovingHead.

We could take an abstracted approach where we have one Vue component, GDTFFixture, which receives the config for an individual fixture, iterates through the Geometry entries within the GDTF fixture config, and builds out the model in Three.js, looking at primitiveType (Cylinder/Sphere etc.) and positioning.
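Something along these lines, as a very rough sketch; the parsed geometry-tree shape (node.model, node.matrix, node.children) is an assumption about whatever GDTF parser we end up using, not an existing API:

```js
import * as THREE from 'three';

// Map a GDTF Model's primitiveType onto a basic Three.js geometry.
// Dimensions are assumed to be metres, as in the GDTF spec.
function primitiveFor(model) {
  switch (model.primitiveType) {
    case 'Cylinder':
      return new THREE.CylinderGeometry(model.width / 2, model.width / 2, model.height, 16);
    case 'Sphere':
      return new THREE.SphereGeometry(model.width / 2, 16, 16);
    default: // Cube and anything unhandled falls back to a box for now
      return new THREE.BoxGeometry(model.width, model.height, model.length);
  }
}

// Recursively turn a parsed GDTF geometry node into a Three.js group,
// so Pan/Tilt etc. can later rotate the relevant sub-group.
function buildGeometryNode(node, material) {
  const group = new THREE.Group();
  group.name = node.name;
  if (node.model) {
    group.add(new THREE.Mesh(primitiveFor(node.model), material));
  }
  if (node.matrix) {
    // GDTF stores a 4x4 transform per geometry; assuming the parser exposes it flat.
    group.applyMatrix4(new THREE.Matrix4().fromArray(node.matrix));
  }
  for (const child of node.children ?? []) {
    group.add(buildGeometryNode(child, material));
  }
  return group;
}
```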

Then each GDTFFixture can subscribe to DMX dumps and unpack them according to its starting channel. From there we can more or less do what BlenderDMX has done (https://github.com/open-stage/blender-dmx/blob/f8dce29bf74f100a38ff89063d0f3cfe13bc7725/fixture.py#L574-L802) and unpack the DMX config within the GDTF fixture config to work out which Geometry each DMX channel relates to, and what attribute type it is (Dimmer, Shutter1 etc.): https://gdtf.eu/gdtf/attributes/attributes/
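The unpacking step could look something like the sketch below (single-byte channels only; `mode.channels` with a 1-based offset and an attribute name per channel is an assumed shape for whatever we parse out of the GDTF DMXMode data):

```js
// Unpack one fixture's slice of a 512-slot universe into { attribute: value }.
function unpackDmx(universe /* Uint8Array(512) */, startChannel, mode) {
  const values = {};
  for (const ch of mode.channels) {
    // GDTF offsets are 1-based within the mode, DMX addresses are 1-based too.
    const address = startChannel + ch.offset - 1;
    values[ch.attribute] = universe[address - 1]; // e.g. { Dimmer: 255, Pan: 127, ... }
  }
  return values;
}

// Each attribute would then be routed to the geometry it controls,
// e.g. Pan rotates the yoke group, Dimmer drives the beam material's intensity.
```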

If it's something you're interested in, I'd be up for forking and working on a PR.

I was planning on working on this in my own project, though my plan was/is a bit different: I have QLC+ running on a machine with ArtNet as an output, and I'm planning on listening to that with node-artnet-protocol (https://github.com/jeffreykog/node-artnet-protocol), then using a ws:// connection to receive DMX state in my application. ASLS Studio sends DMX out to ArtNet with ASLS Server, which is cool; I'd love to make this work both ways too though, so ASLS Studio could act purely as a visualiser.
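For the visualiser-only direction, the bridge I have in mind is roughly the sketch below. It parses the ArtDMX header by hand just to keep the example self-contained (the real thing would lean on node-artnet-protocol), and the WebSocket port is a placeholder:

```js
import dgram from 'node:dgram';
import { WebSocketServer } from 'ws';

// Browser clients (the visualiser) subscribe here.
const wss = new WebSocketServer({ port: 9090 });

// Listen for Art-Net packets coming out of QLC+.
const udp = dgram.createSocket('udp4');

udp.on('message', (msg) => {
  // ArtDMX: "Art-Net\0" id, then OpCode 0x5000 little-endian at bytes 8-9.
  if (msg.toString('ascii', 0, 7) !== 'Art-Net' || msg.readUInt16LE(8) !== 0x5000) return;
  const universe = msg.readUInt16LE(14);   // SubUni + Net (15-bit port address)
  const length = msg.readUInt16BE(16);     // number of DMX slots in this frame
  const data = Array.from(msg.subarray(18, 18 + length));

  const frame = JSON.stringify({ universe, data });
  for (const client of wss.clients) {
    if (client.readyState === 1) client.send(frame); // 1 === OPEN
  }
});

udp.bind(6454); // default Art-Net port
```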

timekadel commented 1 month ago

Hi @terence1990,

My apologies for the late reply, I've been quite busy with work lately; ASLS is just a side project which I work on and maintain in my free time.

First of all, thank you for your interest in ASLS Studio and for taking the time to share your thoughts! I'm thrilled to have a potential contributor like you onboard!

I did not know about GDTF, and it indeed seems like an amazing feature to implement. We might need to scrap the instanced buffer geometry approach for that to work, but I always felt like the performance/workload balance wasn't really worth it anyway. I implemented it a while ago thinking the performance gain would be substantial, since a single draw call is enough to render every moving-head fixture in the scene, but this approach is very limiting, both in the definition of custom geometries and in the handling of instances in the shaders!
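For reference, the tradeoff is roughly the one below (generic Three.js instancing, not ASLS's actual implementation): every instance shares one geometry and one draw call, which is exactly why per-fixture custom geometry doesn't fit that model.

```js
import * as THREE from 'three';

// One geometry + one material shared by 64 identical fixtures: a single draw call,
// but no way to give each fixture its own GDTF-derived geometry.
const shared = new THREE.CylinderGeometry(0.1, 0.1, 0.4, 12);
const material = new THREE.MeshStandardMaterial();
const heads = new THREE.InstancedMesh(shared, material, 64);

const m = new THREE.Matrix4();
for (let i = 0; i < 64; i++) {
  m.setPosition(i % 8, 4, Math.floor(i / 8)); // hypothetical 8x8 rig layout
  heads.setMatrixAt(i, m);
}
heads.instanceMatrix.needsUpdate = true;
```

A per-fixture GDTFFixture would trade that single draw call for one Mesh/Group per fixture, built from its own geometry tree.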

Without any real volumetric lighting rendering pipeline, we won't be able to mimic gobo occlusion and colors just yet, though. Currently the beams are rendered on cylinder geometries with specifically crafted shaders which mimic very simple beams of light. I'd like to get that working too someday, but crafting an efficient WebGL volumetric lighting rendering pipeline is a HUGE job!
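For anyone following along, the "fake beam" technique is roughly the sketch below (an illustration of the approach, not the actual ASLS shaders): an open cylinder with an additive shader that fades along its length.

```js
import * as THREE from 'three';

// Open-ended, slightly conical cylinder standing in for the light beam.
const beamGeometry = new THREE.CylinderGeometry(0.05, 0.4, 5, 32, 1, true);

const beamMaterial = new THREE.ShaderMaterial({
  transparent: true,
  blending: THREE.AdditiveBlending,
  depthWrite: false,
  side: THREE.DoubleSide,
  uniforms: {
    uColor: { value: new THREE.Color(0xffffff) },
    uIntensity: { value: 1.0 }, // driven by the Dimmer channel
  },
  vertexShader: /* glsl */ `
    varying vec2 vUv;
    void main() {
      vUv = uv;
      gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
    }
  `,
  fragmentShader: /* glsl */ `
    uniform vec3 uColor;
    uniform float uIntensity;
    varying vec2 vUv;
    void main() {
      float falloff = vUv.y; // fade out towards the far (wide) end of the beam
      gl_FragColor = vec4(uColor, falloff * 0.35 * uIntensity);
    }
  `,
});

const beam = new THREE.Mesh(beamGeometry, beamMaterial);
```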

Regarding the DMX engine and channel mapping, it is currently done using the amazing Open Fixture Library. Supporting GDTF import would surely mean putting some effort into creating a custom DMX capabilities loader, which might entirely replace the existing OFL loader. I believe matching OFL and GDTF DMX fixture capabilities is doable, meaning that not much change would be required in the DMX engine's core to adapt to GDTF-style capabilities, but I might be wrong. Either way, I'm more than willing to give it a try!
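The loader I'm imagining would be a thin translation layer along these lines (a sketch only; the attribute-to-capability table is illustrative and far from exhaustive, and the parsed GDTF channel shape is assumed):

```js
// Translate GDTF attribute names into OFL-style capability types the existing
// DMX engine already understands, so its core can stay mostly untouched.
const GDTF_TO_OFL = {
  Dimmer: 'Intensity',
  Pan: 'Pan',
  Tilt: 'Tilt',
  Shutter1: 'ShutterStrobe',
  Zoom: 'Zoom',
};

function gdtfChannelToOflCapability(gdtfChannel) {
  // `gdtfChannel` is the parsed <DMXChannel>/<ChannelFunction> data (shape assumed).
  return {
    type: GDTF_TO_OFL[gdtfChannel.attribute] ?? 'Generic',
    dmxRange: [gdtfChannel.dmxFrom ?? 0, gdtfChannel.dmxTo ?? 255],
    comment: gdtfChannel.name,
  };
}
```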

I'm more than happy to get contributors working on this project, and your idea is really interesting! A fork and PR sounds like a great starting point!

Regarding connectivity, communication in web-based environments is very limited. I'm actively working on a protocol proposal, WSC (Web Show Control), a WebRTC-based show control protocol which would enable web platforms to stream show control protocols (such as ArtNet, sACN and many more) from a single webpage, no server required! I'll post about it in the organization once it's ready (hopefully very soon). ASLS Server is actually based on a first iteration of WSC and leverages WebRTC to establish communication between the browser and ArtNet nodes; reversing the communication in order to stream ArtNet to the browser through WebRTC is doable. I even implemented it quite a while ago but scrapped it at some point, and I'd be glad to bring the feature back! If you're wondering, the choice of WebRTC over WebSockets comes down to the fact that, unlike WS, WebRTC is stream oriented and matches the use case perfectly!
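To illustrate the WebRTC point (generic WebRTC data channel API, not the unpublished WSC protocol): a data channel can be made unordered with no retransmits, so stale DMX frames are simply dropped rather than queued up behind lost packets the way a TCP-backed WebSocket would queue them.

```js
// Assumes signalling (offer/answer exchange) happens elsewhere.
const pc = new RTCPeerConnection();

const dmxChannel = pc.createDataChannel('dmx', {
  ordered: false,     // frame N+1 may arrive before frame N; fine for DMX
  maxRetransmits: 0,  // never resend a lost frame, the next one supersedes it
});
dmxChannel.binaryType = 'arraybuffer';

dmxChannel.onmessage = (event) => {
  const universe = new Uint8Array(event.data); // one 512-slot frame (framing assumed)
  // ...feed the visualiser / patched fixtures
};
```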

terence1990 commented 1 month ago

Hey @timekadel,

Yep, any new GDTFFixture component would need to avoid InstancedBufferGeometry to ensure each fixture is independent. You could probably still use it for instances of the same fixtureId on the same startChannel, but that's potentially an edge case for most users. It will be interesting to see how the browser copes if we fill up a whole universe of fixtures. My plan for the Beams themselves is to use a Cylinder for every known LED in the fixture; we know the Lights in the fixture, and their position, decay and angle relative to the Model, by virtue of the GDTF fixture configuration. The real issue here, as we've already highlighted, could be performance.
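Concretely, per fixture it would be something like the sketch below (the `emitters` array with position/beamAngle per light source is an assumed shape from the GDTF parser, and the beam is assumed to point down until Pan/Tilt orientation is wired up):

```js
import * as THREE from 'three';

// Attach one SpotLight (scene illumination) plus one beam cone (visual) per emitter.
function addEmitters(fixtureGroup, emitters, beamMaterial) {
  for (const e of emitters) {
    const halfAngle = THREE.MathUtils.degToRad(e.beamAngle) / 2;

    const spot = new THREE.SpotLight(0xffffff, 1, 20, halfAngle);
    spot.position.set(e.position.x, e.position.y, e.position.z);

    const beamLength = 5; // placeholder throw distance
    const cone = new THREE.Mesh(
      new THREE.CylinderGeometry(0.02, Math.tan(halfAngle) * beamLength, beamLength, 24, 1, true),
      beamMaterial
    );
    cone.position.copy(spot.position);
    cone.position.y -= beamLength / 2; // centre the cone under the emitter

    fixtureGroup.add(spot, cone);
  }
}
```

Whether the browser keeps up with hundreds of SpotLights is exactly the performance question above; we may end up with beam meshes only and a handful of real lights.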

OFL is awesome. What I'm really pleased about, though, is that there's now a standard that translates what we have in OFL into actual Geometry we can use to build out models, and that describes how Geometry relates to DMX; that has been the missing link. GDTF has all the data OFL provides, DMX modes (8/12/50 CH etc.) and all of the config and types of faders. Because GDTF gives us both DMX and Geometry, we only need one component that renders the Geometry dynamically from the GDTF config, whereas for OFL we'd need to create our own hard-coded Geometry per fixture, like you've done with MovingHead. So I think we could continue to build out specific fixtures with OFL if we want, but also just have the option to add a GDTFFixture to the scene and upload the .gdtf file.

Good point on WebRTC, it is actually more suitable for this kind of project. For reverse ArtNet it's easy when running locally, as everything is on 127.0.0.1, but I'd love to throw up ASLS Studio as a free SaaS that users can log into and remotely receive DMX over ArtNet in the client, by running a simple UDP forwarder in a shell and subscribing from the client over WebRTC, matching on the IP address which ASLS Server detects from the forwarder. Obviously this won't work in the other direction, though; users would still need to run ASLS Server locally, not just a UDP forwarder. Maybe Electron is the answer, so all the user has to do is install it and everything runs locally.
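By "simple UDP forwarder" I mean something as dumb as the sketch below (the remote host/port are placeholders for wherever a hosted Studio's ingest endpoint would live):

```js
import dgram from 'node:dgram';

const REMOTE_HOST = 'studio.example.com'; // placeholder ingest endpoint
const REMOTE_PORT = 6454;

// Relay every Art-Net packet arriving from the local console/QLC+ upstream as-is.
const sock = dgram.createSocket('udp4');
sock.on('message', (msg) => sock.send(msg, REMOTE_PORT, REMOTE_HOST));
sock.bind(6454); // default Art-Net port on the local machine
```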

Anyway! I'm definitely gonna get started soon, probably the week after next!