websound / midilooper

A Web MIDI Looper prototype that aims to eventually enable collaboration between remote musicians
MIT License

Sketch out general architecture and first steps #1

Open vine77 opened 6 years ago

vine77 commented 6 years ago

Let's sketch out the very basics of a looper based on the Web MIDI API so we can get this project started!

/cc @abstractmachines @luciusbono @obensource
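As a concrete starting point, here's a minimal sketch of the Web MIDI plumbing such a looper would need. Everything here is illustrative rather than a proposed API; the parsing helper is kept pure so it can be tested outside a browser, and the `navigator.requestMIDIAccess` wiring is guarded so the file still loads elsewhere.

```javascript
// Decode the raw 3-byte messages a Web MIDI input delivers.
// Pure function: takes the Uint8Array-like `data` from a MIDIMessageEvent.
function parseMIDIMessage(data) {
  const [status, ...rest] = data;
  const command = status & 0xf0; // high nibble = message type
  const channel = status & 0x0f; // low nibble = MIDI channel
  if (command === 0x90 && rest[1] > 0) {
    return { type: 'noteon', channel, note: rest[0], velocity: rest[1] };
  }
  // Note-on with velocity 0 is conventionally treated as note-off.
  if (command === 0x80 || (command === 0x90 && rest[1] === 0)) {
    return { type: 'noteoff', channel, note: rest[0] };
  }
  return { type: 'other', channel, status };
}

// Browser-only wiring, guarded so the pure helpers work in any runtime:
if (typeof navigator !== 'undefined' && navigator.requestMIDIAccess) {
  navigator.requestMIDIAccess().then((access) => {
    for (const input of access.inputs.values()) {
      input.onmidimessage = (e) =>
        console.log(parseMIDIMessage(e.data), e.timeStamp);
    }
  });
}
```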

obensource commented 6 years ago

@vine77 awesome! I'm off for Halloween fun atm/tonight but will get some thoughts in tomorrow! 🎉

natevw commented 6 years ago

Oooh, this looks fun! Basically a shared sequencer interface?

obensource commented 6 years ago

Hi @natevw! So glad to see you chime in! 🎉🎉🎉

Something like that!

We're definitely at the "define what it is" stage! ⚡

luciusbono commented 6 years ago

Basic features:

abstractmachines commented 6 years ago

@luciusbono Good call. Adding a couple more here:

Most loop pedals I've worked with have that intuitive start/stop interface, with either no capability for editing the start/stop points, or it's buried in menus somewhere.

vine77 commented 6 years ago

@natevw, yeah I think a "shared sequencer interface" could definitely be an outcome of this work. I've even pondered a "collaborative Ableton Live in the cloud," though that's currently outside the scope of this project.

As I mentioned in the thread @obensource referenced, latency is a perennial issue in networked music performance (some cite ~30 ms as the threshold of human perception). But if you can't guarantee very low latency, a looping paradigm might offer ways to sidestep the issue, e.g. by quickly syncing multiple workspaces even when the transport/play positions aren't quite real-time.
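One way to picture that sidestep: instead of streaming a peer's events in real time, schedule their loop to start at the next shared bar boundary, so any network delay shorter than a bar is inaudible. A minimal sketch of the scheduling arithmetic, with an illustrative name (`nextBarStart`) that isn't project API:

```javascript
// Given a shared transport origin and bar length, return the timestamp
// (same clock, in ms) of the next bar boundary at or after `nowMs`.
// A remote loop scheduled here hides up to one bar of network latency.
function nextBarStart(nowMs, barLengthMs, transportStartMs = 0) {
  const elapsed = nowMs - transportStartMs;
  const barsElapsed = Math.ceil(elapsed / barLengthMs);
  return transportStartMs + barsElapsed * barLengthMs;
}

// e.g. at 120 BPM in 4/4, one bar is 2000 ms:
// a loop arriving at t=2500 would be scheduled for t=4000.
```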

@obensource, do you have any sense for the latency in milliseconds you were getting with the WebRTC version of midisocket?

vine77 commented 6 years ago

For this project, I think @abstractmachines framed a good goal:

Remote MIDI looping that 2 musicians in different locations can sync in on. The MIDI can then be piped to control synths at each musician's separate location.

I like that for a few reasons:

For v0.1, I'm assuming we'd want to limit scope to a self-contained, non-collaborative looper, then add the ability for multiple musicians to collaborate in a later version. Thoughts? There really are a lot of things we could do with a project like this, but in the spirit of agile development, starting with the simplest usable product seems beneficial. What's a good order of operations so we can start from a minimal proof of concept and work up from there?

I've been thinking of the first iteration as basically a MIDI version of the traditional guitar stompbox looper:

Future features:

I'm just brainstorming. Would love to hear more about what others think about first steps.

Questions:

natevw commented 6 years ago

@vine77 Yeah, sidestepping the latency issue seems like a key win for something like this! That's what caught my eye (besides just liking loopers in general :-)

With the right design, jitter should be a complete non-issue, and you might even have time on the order of a whole bar to get everyone in sync.

Sounds like for now the goal is a "solo instrument" though, rather than something networked?

vine77 commented 6 years ago

I think something networked would be awesome if y'all are up for it. Just trying to think about how to implement incrementally.

obensource commented 6 years ago

@vine77 @natevw definitely iterating towards a networked app is what makes this project rad to me. No question for me there. 😎

natevw commented 6 years ago

+1 for networking too, but that doesn't mean it has to be the first lines of code either.

My vote would be "React" — preact+redux+reselect compiled from ES2017 — but I'm not opposed to a "raw JS" approach either (maybe with the core of D3?), especially since the browsers with MIDI are the browsers with niceties like => and {…obj} and async/await anyway.

I think the main key is to keep the data layer rigorously separate from the DOM rendering. That will make a potential transition from any initial solo experiments to adding network features much easier too.

[Clarification: if the interface is just some control buttons, plain JS is probably the way to go. I was imagining more of a sequencer/piano-roll display, which probably isn't needed at first?]
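The "keep the data layer rigorously separate from DOM rendering" point can be sketched without any framework at all: MIDI code only dispatches actions, rendering code only subscribes. The store shape and action names below are made up for illustration, not what the project actually uses:

```javascript
// Minimal redux-style store: no framework, no DOM dependency.
function createStore(reducer, initialState) {
  let state = initialState;
  const listeners = [];
  return {
    getState: () => state,
    dispatch(action) {
      state = reducer(state, action);
      listeners.forEach((fn) => fn(state));
    },
    subscribe(fn) {
      listeners.push(fn);
      return () => listeners.splice(listeners.indexOf(fn), 1);
    },
  };
}

// Hypothetical looper state transitions:
const looperReducer = (state, action) => {
  switch (action.type) {
    case 'RECORD_TOGGLED': return { ...state, recording: !state.recording };
    case 'PLAY_TOGGLED':   return { ...state, playing: !state.playing };
    default:               return state;
  }
};
```

The MIDI side would call `store.dispatch(...)`, and whichever rendering approach wins (preact components or raw DOM updates) would just `subscribe`, which is what should make a later network layer easy to slot in.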

natevw commented 6 years ago

What's the plan here, crew? I have a synth coming this week that this would be great for — so I'd love to take on something here, but I don't want to just take over either.

If there's no objections, I'd like to start a real simple prototype of the core MIDI stuff. It might not have much interface to speak of at all at first: I might just map two "unused" keys on the input device to toggle record and play/pause?

If there are objections, I guess I'll race to the first Pull Request then :-P
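The "map two unused keys to record and play/pause" idea might look something like this. The note numbers (21 and 22, the lowest keys of an 88-key controller) are placeholders; any keys outside the playing range would do:

```javascript
// Hypothetical bindings: bottom two keys toggle transport state.
const KEY_BINDINGS = { 21: 'toggleRecord', 22: 'togglePlay' };

// Returns a transport command name for a bound note-on, otherwise null,
// so ordinary playing passes through to the looper untouched.
function transportAction(midiData) {
  const [status, note, velocity] = midiData;
  const isNoteOn = (status & 0xf0) === 0x90 && velocity > 0;
  return isNoteOn ? (KEY_BINDINGS[note] ?? null) : null;
}
```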

vine77 commented 6 years ago

Yeah, I agree we need to get some of the core MIDI stuff out of the way. @natevw, feel free to start a prototype. I think as long as we're communicating about it, anyone should feel free to move the project forward!

vine77 commented 6 years ago

Questions:

abstractmachines commented 6 years ago

Hooking MIDI OUT up to another synth isn't necessarily a requirement, but that is the part that will help a remote musician collaborate. Not married to the idea or anything. The test tone may help development move more quickly. Sorry I haven't been involved too much yet; been busy!

Oh yeah! Re "the ability to translate an incoming MIDI stream into a data structure...": my coworker just did that on a project very recently with WebMIDI. Pinging him now about it.

vine77 commented 6 years ago

Oh, awesome. Will be curious what your coworker has to say about it. Add a link if it happens to be on GitHub. I'm curious about the time-tracking strategy for incoming MIDI data. Though now that I check, it looks like MIDIMessageEvent already includes a DOMHighResTimeStamp, so maybe that part will be straightforward.
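Since `MIDIMessageEvent.timeStamp` is a `DOMHighResTimeStamp` (milliseconds on the page's monotonic clock), storing each event's offset from the moment recording started should be enough to replay a loop. A sketch of that idea, with a hypothetical recorder shape:

```javascript
// Records incoming MIDI events relative to a loop start time.
// Timestamps are whatever monotonic ms clock the events share
// (in the browser, MIDIMessageEvent.timeStamp / performance.now()).
function createRecorder() {
  let startStamp = null;
  const events = [];
  return {
    start(timeStamp) {
      startStamp = timeStamp;
      events.length = 0;
    },
    capture(data, timeStamp) {
      if (startStamp !== null) {
        events.push({ data, offsetMs: timeStamp - startStamp });
      }
    },
    // Snapshot the captured events as a fixed-length loop.
    takeLoop(loopLengthMs) {
      return { lengthMs: loopLengthMs, events: [...events] };
    },
  };
}
```

Playback would then just schedule each event at `loopStart + offsetMs` (plus `n * lengthMs` per repeat).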

And to clarify, I do think it should be a requirement that this project include MIDI OUT. I was just questioning whether we also could use a test tone (as a simple and separate component) really just to streamline development in the near term.

That makes me curious... can one browser window send MIDI OUT to another browser window's MIDI IN?

abstractmachines commented 6 years ago

@vine77 will do, getting his permission to share. I think he's currently refactoring. I'll get some tips from him too!!

I get it now :) haha, yes, a test tone would be a very useful addition.

Piping audio streams from one process to another (likely via IPC) is something Soundflower does. I think that would be available in something like MIDI Monitor or similar freeware? Or maybe something made by the Jitter or Ableton people? There are a couple of packages like this one and this one too - the last one mentions it's "duplex," which I assume means full duplex.

natevw commented 6 years ago

Just a heads up, I did start something in the "nvw-proto1" branch although it's not as far along as I intended. [For a while last night I was stuck inexplicably not getting any MIDI message events. Restarted Chrome and I think it simultaneously updated to a new version, and finally worked as expected!]

I did end up pulling in a couple tiny external frameworks:

These are both used via old-school script tags and without JSX syntax, so my prototype-in-progress is still all simple static files with no compilation needed. That said, I am using any and all fancy new ECMAScript features [async/await, classes, arrow functions, destructuring…] that Chrome has available.

natevw commented 6 years ago

Oh, and as far as architecture goes, it's already a bit of a mess (and it's not even doing anything yet ;-) but:

I'm not entirely sure the best way to bridge between the two yet. To avoid extra abstractions and wrappers, I might just make the MIDI logic aware of the store and give the UI logic access to the looper instance. Basically:

┌────────┐                                  ┌───────────┐
│  MIDI  │                                  │ on-screen │
│ Events │               HUMAN              │interaction│
└────┬▲──┘  ─ ─ ─ ─ ─ ─ ─ ─ ─ ─ ─ ─ ─ ─ ─ ─ └─────┬▲────┘
     ││                  ROBOT                    ││
     ││                                           ││
   ┌─▼┴────────┐   ┌──────────────┐   ┌───────────▼┴──┐
   │  Looper   ├───┼▶ App state   ├───┼▶ Components   │
   └─▲─────────┘   └──────────────┘   └────────────┬──┘
     │                                             │
     │                                             │
     └─────────────────────────────────────────────┘
natevw commented 6 years ago

Update: last night I cleaned up the prototype code a bit and added a super simple soft synth [®?] for preview. Was hoping to add some actual looping today and open a pull request, but that's the next step.

vine77 commented 6 years ago

That's awesome @natevw! I'll check it out this weekend.