mmckegg / loop-drop-app

[unmaintained] MIDI looper, modular synth and sampler app built using Web Audio and Web MIDI APIs

Idea: visualizer component #92

Open mattdesl opened 9 years ago

mattdesl commented 9 years ago

This is a random idea. Since the app is geared toward performances, it would be truly cool if the audio processes could somehow hook into a visualizer using WebGL/canvas in another browser window (i.e. a second monitor or projector). Then the artist can code their own visuals with ThreeJS, stackgl or what have you.

VJing in the browser !! wut 😱
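Not sure where the right hook point would be, but a rough sketch of the second-window idea could look like this (the file name, the event shape and the `renderEvent` function are all made up for illustration):

```js
// Sketch only: open a visualizer window (drag it to the second monitor /
// projector) and push trigger events to it with postMessage. The artist's
// ThreeJS/stackgl code lives entirely in visuals.html.
var visualsWindow = window.open('visuals.html', 'visuals', 'width=1280,height=720')

function emitToVisuals (event) {
  if (visualsWindow && !visualsWindow.closed) {
    visualsWindow.postMessage(event, '*')
  }
}

// hypothetical hook: call this whenever a sound is triggered
emitToVisuals({ type: 'trigger', slot: 'kick', time: performance.now() })

// --- inside visuals.html ---
window.addEventListener('message', function (e) {
  renderEvent(e.data) // artist-defined drawing (ThreeJS, stackgl, 2d canvas...)
})
```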

mmckegg commented 9 years ago

That would be sweet! This is something @hughsk and I were playing with at Camp JS a few weeks ago. We started working on a plugin that streams out all internal loop drop data on a websocket so that it could be consumed by another app, or even another machine.
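Roughly, the broadcasting side could look like this, using the `ws` module (not the actual plugin code; the port and event shape are placeholders):

```js
// Sketch of the broadcast side: every internal event is JSON-encoded and
// pushed to any connected consumer, which could be a visualizer in another
// browser window or on another machine entirely.
var WebSocketServer = require('ws').Server
var wss = new WebSocketServer({ port: 8080 }) // placeholder port

function broadcast (event) {
  var message = JSON.stringify(event)
  wss.clients.forEach(function (client) {
    if (client.readyState === 1) { // 1 === OPEN
      client.send(message)
    }
  })
}

// hypothetical hook point inside loop drop
broadcast({ type: 'note', id: 'snare', velocity: 100, time: Date.now() })
```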

ahdinosaur commented 9 years ago

> streams out all internal loop drop data on a websocket so that it could be consumed by another app, or even another machine.

:+1:

janmonschke commented 9 years ago

Would be great to also stream the audio ;)

mmckegg commented 9 years ago

The trouble with using audio is that it introduces latency, which makes it look less spectacular. I'm really keen for some sort of non-audio output from loop drop that can be used to generate perfectly synced visuals.

I posted a video on my intro of doing something like that with the original loop drop: http://www.youtube.com/watch?v=827L7UA_0bc

That was just using the data being sent to the launchpads. But a better system would also look at the patch being used to generate the sound (pretty easy with oscillators and envelopes) or the waveform in the case of samples.

Potentially you could send the project and loop data to the visualiser, which would be running a special visual version of loop drop with audio-slot swapped out for audio-slot-visualiser.
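To make that concrete, the visual slot could keep the same trigger interface as the audio slot but drive a canvas instead of the Web Audio graph. All names here are invented for illustration, not the real audio-slot API:

```js
// Hypothetical shape of an "audio-slot-visualiser" module: same descriptor
// and trigger interface as the audio version, but it tracks a fake envelope
// level and draws it rather than producing sound.
function VisualSlot (descriptor) {
  this.descriptor = descriptor // the same patch data the audio slot gets
  this.level = 0
}

VisualSlot.prototype.triggerOn = function (at) {
  this.level = 1 // restart the "envelope"
}

VisualSlot.prototype.triggerOff = function (at) {
  this.level = 0
}

VisualSlot.prototype.draw = function (ctx) {
  this.level *= 0.9 // simple decay standing in for the real envelope
  ctx.fillStyle = '#0ff'
  ctx.fillRect(0, 200 - this.level * 200, 50, this.level * 200)
}
```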

mmckegg commented 9 years ago

Here's a loop drop plugin I wrote at Camp JS:

https://github.com/mmckegg/loop-stream

It streams out all the data sent to the launchpads on a websocket (works for qwerty too).

Originally this was going to drive the visuals, but we ran out of time, so we ended up using audio beat detection instead.
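For reference, the consumer side is just a browser WebSocket, something like the sketch below (the address and the message format here are assumptions, check loop-stream for the real protocol):

```js
// Sketch of a visualizer consuming the stream: each message is assumed to
// be an 8x8 grid of launchpad button states and gets drawn as canvas cells.
var socket = new WebSocket('ws://localhost:8080') // placeholder address
var ctx = document.querySelector('canvas').getContext('2d')

socket.onmessage = function (e) {
  var grid = JSON.parse(e.data) // assumed shape: [[0, 1, ...], ...]
  grid.forEach(function (row, y) {
    row.forEach(function (on, x) {
      ctx.fillStyle = on ? '#3f3' : '#111'
      ctx.fillRect(x * 40, y * 40, 38, 38)
    })
  })
}
```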

TimPietrusky commented 8 years ago

+1