SerenityOS / serenity

The Serenity Operating System 🐞
https://serenityos.org
BSD 2-Clause "Simplified" License

Piano: Transform Piano into a DAW/Tracker #6528

Closed kleinesfilmroellchen closed 1 year ago

kleinesfilmroellchen commented 3 years ago

Make Piano into a DAW; a.k.a.: Kleines Filmröllchen's Serenity Masterplan

The goal for Piano, as discussed with @willmcpherson2 and other people, is to transform it into a DAW and tracker program similar in mode of operation to programs like Cubase or Ableton.

We want to have multiple channels; each channel may contain note tracks or raw audio tracks (and multiple of each). Raw audio can be loaded from a file, recorded from audio input devices, or dubbed from other tracks. A similar thing goes for notes: load MIDI files, record live from the keyboard or MIDI devices (which will require MIDI drivers), and export to MIDI.
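As a rough illustration of that data model, here is a minimal sketch. All names (`Note`, `NoteClip`, `AudioClip`, `Track`, `Channel`) are hypothetical and not from the actual Piano code; it just shows channels containing multiple tracks, where each track holds either note clips or raw-audio clips:

```cpp
#include <cstdint>
#include <string>
#include <variant>
#include <vector>

// Hypothetical sketch of the channel/track model described above.
struct Note {
    uint8_t pitch;     // MIDI-style note number, 0-127
    uint64_t on_time;  // position in samples
    uint64_t off_time; // position in samples
};

struct NoteClip {
    std::vector<Note> notes; // loaded from MIDI or recorded live
};

struct AudioClip {
    std::vector<float> samples; // loaded, recorded, or dubbed; mono for simplicity
};

// A clip on a track is either a note clip or an audio clip.
using Clip = std::variant<NoteClip, AudioClip>;

struct Track {
    std::string name;
    std::vector<Clip> clips;
};

// A channel may contain multiple note and/or audio tracks.
struct Channel {
    std::vector<Track> tracks;
};
```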

To create audio from notes and to modify existing audio, there should be a processing chain consisting of any number of processors. A processor in turn takes notes or sound as input and outputs notes or sound. The most important kinds of processors, which we will port from the existing Piano functionality, are synthesizers, which create sound from notes, and effects/filters, which process audio to e.g. add delay or reverb, apply EQ, or band-pass the signal. Through this flexible system, just a handful of small processors can be combined for a huge variety of sounds, and anybody can implement their favorite synth/filter easily.

Additionally, turning Piano into a proper application will require proper project loading and saving, and the ability to copy/paste basically anything.

This is a very wide scope that will require a lot of work. I won't put a to-do list here because this is very much a draft of ideas and the scope may change as we move along. Here's a plan of sorts, though other things are mixed in.

Finished work

Ongoing work

(your PR here!)

Planned next steps

willmcpherson2 commented 3 years ago

One basic design idea is to have a Processor virtual class with some virtual methods like:

```
Audio thru(Audio in)
MIDI thru(MIDI in)
Audio thru(MIDI in)
MIDI thru(Audio in)
```

In other words, a processor object will implement a thru method with input as either Audio or MIDI and output as either Audio or MIDI. Note that Audio will probably be a sample and MIDI will probably be the current notes. Here are some examples of processor instances:

Then there would be some sort of Chain type, or just a Vector<Processor>. There would be some runtime check that a signal can actually be threaded through the thru methods. For example Ableton Live simply prevents you from inserting an incompatible processor.
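To make the runtime check concrete, here is one possible sketch. The names (`SignalType`, `Chain::try_append`, the example processors) are hypothetical, not the actual Piano/LibDSP API; the point is just that each processor declares its input and output signal type, and the chain refuses to append a processor whose input doesn't match the current end of the chain, like Ableton Live refusing an incompatible device:

```cpp
#include <memory>
#include <vector>

// Hypothetical sketch of the chain compatibility check discussed above.
enum class SignalType { Audio, Midi };

struct Processor {
    virtual ~Processor() = default;
    virtual SignalType input_type() const = 0;
    virtual SignalType output_type() const = 0;
};

struct Synthesizer : Processor { // notes in, sound out
    SignalType input_type() const override { return SignalType::Midi; }
    SignalType output_type() const override { return SignalType::Audio; }
};

struct DelayEffect : Processor { // sound in, sound out
    SignalType input_type() const override { return SignalType::Audio; }
    SignalType output_type() const override { return SignalType::Audio; }
};

struct Chain {
    std::vector<std::unique_ptr<Processor>> processors;

    // Refuses insertion if the new processor's input type doesn't match
    // the output type at the current end of the chain.
    bool try_append(std::unique_ptr<Processor> processor)
    {
        if (!processors.empty()
            && processors.back()->output_type() != processor->input_type())
            return false;
        processors.push_back(std::move(processor));
        return true;
    }
};
```

A synthesizer followed by an audio effect threads fine; appending another synthesizer after an audio output would be rejected, since its MIDI input can't consume audio.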

Processor could also have a draw method etc.

kleinesfilmroellchen commented 3 years ago

Two main things that are outside the scope of this project would help us:

nooga commented 3 years ago

My 2 cents:

A Processor should also expose some methods to modify its internal params - to twist its knobs programmatically. This will enable automation and MIDI CC integration down the line. These params should also be represented as objects wrapping a float (or whatever), because they can carry metadata such as min/max values etc. Also, there are at least two kinds of them: audio-rate params, which can vary from sample to sample, and control-rate params, which don't have to vary 44k times a second. The DSP code inside a Processor should be able to discern between those two for performance reasons.
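A minimal sketch of such a parameter object, assuming hypothetical names (`Parameter`, `ParameterRate`) rather than any actual Piano API: it wraps a float together with its range metadata and a rate hint, so automation/MIDI CC can drive it and DSP code can treat audio-rate and control-rate params differently:

```cpp
#include <algorithm>

// Hypothetical sketch of a parameter object with metadata and a rate hint.
enum class ParameterRate {
    Audio,   // may change every sample
    Control, // changes far less often than the sample rate
};

class Parameter {
public:
    Parameter(float min, float max, float initial, ParameterRate rate)
        : m_min(min), m_max(max), m_value(initial), m_rate(rate) {}

    // "Twisting the knob" programmatically clamps into the valid range.
    void set_value(float value) { m_value = std::clamp(value, m_min, m_max); }
    float value() const { return m_value; }
    ParameterRate rate() const { return m_rate; }

private:
    float m_min, m_max, m_value;
    ParameterRate m_rate;
};
```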

This system could even be moved to its own lib, e.g. LibDSP or LibSynth, because it could be useful for other apps: media players, games, the Browser's WebAudio support, etc. We'd have our own PD/Max analog.

Plus, we'd have a Track that has:

Track could be split into MIDITrack and AudioTrack, holding either MIDIClips or AudioClips. I think MIDI is enough for now.

It might be useful to have a special Track for the master (or main) bus that acts as a return bus and doesn't have any Clips. This could also be used for sends later on. This would require multiple Audio inputs on a track, or better, an Audio input that sums any number of things plugged into it.
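The summing idea reduces to something very small. A hypothetical sketch (the function name `sum_inputs` is made up for illustration): the master/return bus accepts any number of input buffers and mixes them by summation, so regular tracks and sends can all feed into one input:

```cpp
#include <cstddef>
#include <vector>

// Hypothetical sketch: an Audio input that sums any number of sources.
// Each inner vector is one plugged-in source's buffer for this block.
std::vector<float> sum_inputs(std::vector<std::vector<float>> const& inputs, size_t frame_count)
{
    std::vector<float> mix(frame_count, 0.0f);
    for (auto const& input : inputs)
        for (size_t i = 0; i < frame_count && i < input.size(); ++i)
            mix[i] += input[i];
    return mix;
}
```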

We'd also need a separate, global Transport object that does playback control and timekeeping. It would know the BPM and measure, track playheads across all the clips at all times, and provide the time to all processors and the audio output. In general, this is more complex than it may sound, so it's good to keep this logic contained from the start.
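For illustration, a minimal Transport sketch (hypothetical names, not an actual Piano class): one authoritative sample playhead owns playback state, and all other time representations (seconds, beats) are derived from it, which is what keeps the timekeeping logic contained:

```cpp
#include <cstdint>

// Hypothetical sketch of a global Transport: playback control plus
// timekeeping, with the sample playhead as the single source of truth.
class Transport {
public:
    Transport(uint32_t sample_rate, double bpm)
        : m_sample_rate(sample_rate), m_bpm(bpm) {}

    void play() { m_playing = true; }
    void stop() { m_playing = false; }
    bool is_playing() const { return m_playing; }

    // Called by the audio output for every rendered buffer.
    void advance(uint64_t frames)
    {
        if (m_playing)
            m_playhead += frames;
    }

    uint64_t playhead() const { return m_playhead; }
    double seconds() const { return double(m_playhead) / m_sample_rate; }
    // Musical position in beats (quarter notes) at the current tempo.
    double beats() const { return seconds() * m_bpm / 60.0; }

private:
    uint32_t m_sample_rate;
    double m_bpm;
    bool m_playing { false };
    uint64_t m_playhead { 0 };
};
```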

About MIDI: I think Piano can be developed without any real MIDI drivers or support from the OS, but the music model should at least resemble the MIDI spec a little bit. This will make reading and writing MIDI files easy and, if done correctly, should enable easy integration with a future MIDI subsystem, if there is one.
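As a sketch of what "resembling the MIDI spec" could mean (hypothetical names, not actual Piano code): store notes as note-on/note-off events with MIDI-range values and tick positions, so serializing to a Standard MIDI File or handing events to a future MIDI subsystem is a straightforward mapping. The status values 0x90/0x80 are the real MIDI note-on/note-off status bytes for channel 1:

```cpp
#include <cstdint>
#include <vector>

// Hypothetical MIDI-shaped event model.
struct MidiEvent {
    enum class Type : uint8_t {
        NoteOn = 0x90,  // MIDI note-on status byte (channel 1)
        NoteOff = 0x80, // MIDI note-off status byte (channel 1)
    };
    Type type;
    uint8_t note;     // 0-127, as in the MIDI spec
    uint8_t velocity; // 0-127
    uint32_t tick;    // position in MIDI ticks
};

// A held note becomes a note-on/note-off pair, ready for file export.
std::vector<MidiEvent> note_to_events(uint8_t note, uint8_t velocity, uint32_t start_tick, uint32_t length_ticks)
{
    return {
        { MidiEvent::Type::NoteOn, note, velocity, start_tick },
        { MidiEvent::Type::NoteOff, note, 0, start_tick + length_ticks },
    };
}
```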

willmcpherson2 commented 3 years ago

For completeness' sake: kling suggested a node-based FX system many years ago...

https://freenode.logbot.info/serenityos/20191206#c2938796

https://freenode.logbot.info/serenityos/20200210#c3219213

Simply converting Piano to use a proper object model where processors can be chained would be a major improvement to the application. In fact I would suggest that the processor chain should be the only task within the scope of this issue. I agree with essentially everything in the thread currently, but let's not get too designy.

Also +1 for LibDSP, that just seems very reasonable.

willmcpherson2 commented 3 years ago

WIP: https://github.com/kleinesfilmroellchen/serenity

Thiri25 commented 3 years ago

#6258

kleinesfilmroellchen commented 1 year ago

I will close this since it doesn't provide much benefit to us as a tracking issue. I will continue to transform Piano into a DAW, of course :^)