worldmaking / mischmasch

https://www.alicelab.world/msvr
MIT License

Need Graham: all outlets have LEDs for visualizing amplitude values #7

Open michaelpalumbo opened 5 years ago

michaelpalumbo commented 5 years ago

Priority: a pulsing LED. This requires solving several problems on the Max side, including how we channel that data. It is not part of OT but a separate channel (it doesn't represent a change to the definition of the system). Like the stderr stream or the Max console, it isn't part of the Max patch itself, but it is still linked to it.

Possible method: output from gen~, either via an [out] or by writing into a [buffer~]?

Possible uses:

Possible implementation:

Zodsmar commented 5 years ago

Creating a blinking LED is mostly a question of how the data comes in and how it should pulse. On the JavaScript side it can be any shape (preferably something that looks like a light), with its emission or material colour changing based on the incoming values. Should be pretty simple to implement.
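A minimal sketch of that idea, assuming a Three.js-style material on the LED mesh (the mesh, geometry size, and colours here are hypothetical placeholders):

```js
// Sketch: drive an LED mesh's emissive colour from an incoming value.
// Assumes a Three.js-style scene; `ledMesh` and the 0..1 value range are placeholders.
const ledMesh = new THREE.Mesh(
  new THREE.SphereGeometry(0.01, 16, 16),
  new THREE.MeshStandardMaterial({ color: 0x222222, emissive: 0xff4400 })
);

function updateLED(value) {
  // value expected in 0..1; scale the emissive intensity to make the LED "glow"
  const v = Math.max(0, Math.min(1, value));
  ledMesh.material.emissiveIntensity = v;
}
```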

michaelpalumbo commented 5 years ago

That's great to hear. We'll need to set up this secondary communication channel between the server and the client (as it will be useful for other things as well).

What if what the client received was an update to the blink rate and the length of the blink (i.e. the pulse-width)? Pros and cons:

Pro:

Con:

I see three milestones for the LED design (a rough mapping sketch follows the list):

  1. An LED which is either fully on or fully off. This would represent square waves, gates, triggers. Its pulse-width could also be controlled.
  2. An LED which has a variable amount of 'ON'. This would represent waveforms like triangle waves, sawtooths, etc. (in other words, it would represent amplitude values between 0. and 1.).
  3. A 2-colour LED, where one colour represents amplitude between 0. and 1. (positive) and the other colour amplitude between 0. and -1. (negative). This would represent sine waves, noise (maybe), etc.
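
A rough sketch of how those three modes might map a signal value to an LED colour; the mode names, threshold, and colours are hypothetical, not part of the spec:

```js
// Sketch of the three milestone mappings from a signal value to an RGB colour.
// `value` is assumed to be in -1..1; threshold and colours are placeholders.
function ledColor(value, mode) {
  const v = Math.max(-1, Math.min(1, value)); // clamp to the expected range
  switch (mode) {
    case 'gate':     // milestone 1: fully on or fully off
      return v > 0.5 ? [1, 0, 0] : [0, 0, 0];
    case 'unipolar': // milestone 2: variable amount of 'ON' for 0..1 signals
      return [Math.max(0, v), 0, 0];
    case 'bipolar':  // milestone 3: one colour for positive, another for negative
      return v >= 0 ? [v, 0, 0] : [0, 0, -v];
  }
}
```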
grrrwaaa commented 5 years ago

I'd vote for getting the simplest case working first: the LED displays the last value it received, this value is clamped to the range -1..1, and it either displays the absolute value or shows a different colour for negative signals.

I don't think we need to send rate/pulse-width data packets just now. The socket latency on localhost should be low enough that just displaying the most recent value would work well enough for triggers and LFOs, and these are about the only signals where the representation can be fully unambiguous. An LED in VR can flicker at most at around 45 Hz (because of Nyquist), meaning it can only really show sub-audio waveforms (and special cases like white noise and DC, which happen to look the same at all frequencies).

For other kinds of signal there are a lot of things we can do, but it is a hard problem -- part of the general difficulty of representing the important features of high-resolution data (audio signals) with a low-resolution output (visuals). We could show analysis of higher frequencies (like average envelope, spectral centre, etc.) as these analyses are also low frequency. But that might get expensive and I'd rather save it for later.

Even then, the problem is hard, as you don't know in advance what kind of signal you might be receiving, and thus what kinds of features make sense. It might be a unipolar signal (in which case, envelope tracking the average value might make sense) or it might be bipolar (in which case, the average will not represent it well, but RMS average would, so long as the signal has no DC offset... and so long as representing envelope is more important than representing the sign). It might have hard edges like a square (in which case averages might wrongly soften the edges). Also, it might be oscillating between +4 and +5, in which case clamping at +/-1 makes no sense.

We dealt with some of this in working out the streamline visualisations in Gibberwocky, which is a very similar problem, and we solved it by adapting the range to the signal. We show a kind of rolling oscilloscope of the most recent N values (a hundred or so) for any signals generated. We track the average min & max value of a signal to work out the vertical range of the streamline graph, extend the graph if the signal goes beyond those boundaries, and periodically shrink it if it has stayed in a smaller range for a while. (The values themselves are sent as a single snapshot of all active signals every 33ms, and the renderer takes care of storing them in a list to make the graph.)

So I would imagine in the long run that we could do something similar for MSVR: adapt to the kind of signal received where we can, and show multiple properties (instantaneous value, RMS-smoothed envelope, spectral balance etc.) in mini-oscilloscopes.
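
A minimal sketch of that adaptive-range idea, assuming a rolling window of recent samples (the window length, shrink rate, and function names are made-up placeholders, not the Gibberwocky values):

```js
// Sketch: rolling scope buffer with an adaptive vertical range,
// loosely following the Gibberwocky streamline approach described above.
// N and the 0.01 relaxation factor are assumed placeholder values.
const N = 100;          // number of recent samples to keep
const history = [];     // rolling window of values
let rangeMin = -1, rangeMax = 1;

function pushSample(v) {
  history.push(v);
  if (history.length > N) history.shift();

  // extend the range immediately if the signal goes beyond it
  if (v < rangeMin) rangeMin = v;
  if (v > rangeMax) rangeMax = v;

  // slowly shrink the range back toward the recent min/max
  const lo = Math.min(...history), hi = Math.max(...history);
  rangeMin += (lo - rangeMin) * 0.01;
  rangeMax += (hi - rangeMax) * 0.01;
}

// map a value into 0..1 for drawing within the scope's vertical range
function normalize(v) {
  return (v - rangeMin) / (rangeMax - rangeMin || 1);
}
```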

For now, Max will need to take a periodic snapshot every 10ms or so of all visualized LEDs, and send them as an array to the renderer. We could use a buffer~ in Max to capture these snapshots out of gen~, just giving each LED a unique index in the buffer~. Then we'd send the buffer~ contents over a websocket directly to the renderer, as a separate channel of communication. (At this stage, I don't think it even needs to go through the server -- we only need to know what the unique indices are.) I believe the [ws] object in the @worldmaking/Max_Worldmaking_Package can send jitter matrices as Float32Array buffers, and we can use [jit.buffer~] to wrap a buffer~. Or I could add support to [ws] for sending a buffer~ directly.
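
On the renderer side, that separate channel could look roughly like this, assuming the snapshots arrive as a binary Float32Array in which each LED's unique index is its position in the array (the port number and updateLED() are hypothetical placeholders):

```js
// Sketch: receive periodic LED snapshots as a Float32Array over a websocket
// and dispatch each value to the LED with the matching index.
// The port and updateLED() are placeholders, not the project's actual API.
const socket = new WebSocket('ws://localhost:8080');
socket.binaryType = 'arraybuffer';

socket.onmessage = (event) => {
  const values = new Float32Array(event.data);
  // each LED was assigned a unique index into the buffer~ on the Max side
  for (let i = 0; i < values.length; i++) {
    updateLED(i, values[i]); // e.g. set the emissive intensity/colour of LED i
  }
};
```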

grrrwaaa commented 5 years ago

This is an example of the adaptive streamlines (oscilloscopes) in Gibberwocky, FYI.

michaelpalumbo commented 5 years ago

@grrrwaaa I've managed to script poke objects into [gen~ msvr_world], but the jit.buffer object creates a new matrix plane for each buffer channel. This means that a patch of, say, 5 nodes could require between 5-20 channels and as many matrix planes. You can see how this would very quickly get quite expensive on the jitter side. Not sure how to overcome this.

michaelpalumbo commented 5 years ago

@zodsmar I've merged your work on instancing and my work on the max client into a new branch called develop. The idea here is that we can work on features in their own branches, then merge them into develop for testing. Once verified working, we will periodically merge develop into master. Any questions?

From within develop you can now run the max patch to get the data from gen~ for outlet visualizations. ALSO, and this is important, there were some errors in the scene JSONs that I fixed (particularly scene_rich.json and scene_simple.json).

Zodsmar commented 5 years ago

Receiving data directly from Max to the server on port 8084 (port subject to change). Data comes in as -1...1 and is being scaled to 0...255 for RGBA values. Depending on how we do colours and emissions we might need to change that mapping, but for now this works. I get the path plus the scaled value. Just need to apply the value to the outlets. Will do this once Project Ghost is finished being restructured.
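
For reference, one way to do that scaling (a sketch only; the exact mapping in the client may differ):

```js
// Sketch: scale a -1..1 signal value to a 0..255 byte for an RGBA channel.
function toByte(value) {
  const v = Math.max(-1, Math.min(1, value)); // clamp to -1..1 first
  return Math.round((v + 1) * 127.5);         // -1 -> 0, 0 -> ~128, 1 -> 255
}
```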

michaelpalumbo commented 5 years ago

@Zodsmar here's the current spec:

{\"lfo_1__sine\":-0.7605468034744263\,\"lfo_1__phasor\":0.6140456795692444\,\"lfo_1__pulse\":1\,\"lfo_2__sine\":-0.7223623394966125\,\"lfo_2__phasor\":0.6223999857902527\,\"lfo_2__pulse\":1\,\"ffmvco_1__vco_1\":0.3347249925136566\,\"ffmvco_1__vco_2\":-0.3240717351436615\,\"ffmvco_1__master\":0.9353091716766357\,\"vca_1__output\":0\,\"comparator_1__max\":0\,\"comparator_1__min\":0\,\"outs_1__left\":0\,\"outs_1__right\":0}
michaelpalumbo commented 4 years ago

This is nearly ready: just need to change the outlet colour, since the viz data is ready.

michaelpalumbo commented 4 years ago

Nearly ready.