grame-cncm / faust

Functional programming language for signal processing and sound synthesis
http://faust.grame.fr

netjack console & MIDI #189

Closed grz0zrg closed 6 years ago

grz0zrg commented 6 years ago

Hello,

is it possible to use MIDI with netjack in console mode? I tried the sine synth example with faust2netjackconsole: audio works, but there are no MIDI ports. I would like to control the synthesizer from a second computer which has the MIDI devices connected.

sletz commented 6 years ago

I've added --i and --o parameters to faust2netjackconsole in https://github.com/grame-cncm/faust/commit/7cf7bf1f55407fc1dde5f4b7485c6b069bb84aa6.

So now you can do:

faust2netjackconsole foo.dsp

and:

./foo --i 2 --o 3 to set up 2 MIDI input ports and 3 MIDI output ports.

Can you test and report?

grz0zrg commented 6 years ago

Tried with the midiTester.dsp example file. I can see the I/O ports on the JACK server and connected them, but no signals are passing through.

sletz commented 6 years ago

Second try in https://github.com/grame-cncm/faust/commit/d4ebf0a19dafb8aebb1d97694f63889aab4d0df0. I removed the --i and --o parameters. Two MIDI ports are always added: the first one is used internally to transmit control values; the second one should be usable to transmit MIDI. Blind commit here (no way to try it out on my side), so you'll have to test again.

grz0zrg commented 6 years ago

Thank you, MIDI seems to work (tested with midiTester.dsp and instruments). Is there any reason the physicalModeling examples with MIDI don't take the MIDI keyboard key frequency into account? When I press a key, freq goes to 0, resulting in an audio pop.
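(Editor's note: for reference, Faust's polyphonic MIDI handling maps a key-on note number to the `freq` parameter using the standard equal-tempered tuning with A4 = 440 Hz, so a `freq` of 0 suggests the note number is not reaching the DSP at all. A quick illustrative sketch of that mapping, assuming 12-TET:)

```python
def midi_to_freq(note: int) -> float:
    """Standard MIDI note-number -> frequency mapping (12-TET, A4 = 440 Hz)."""
    return 440.0 * 2.0 ** ((note - 69) / 12.0)

print(midi_to_freq(69))  # A4 -> 440.0
print(midi_to_freq(60))  # middle C -> ~261.63
```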

sletz commented 6 years ago

No reason. Can you trace the exact MIDI messages that are sent? Can you possibly test with a simpler polyphonic example?

grz0zrg commented 6 years ago

Tried with the example on this page: http://faust.grame.fr/examples/2015/10/01/organ.html

When I connect the MIDI keyboard to the first input port and press a key, the frequency is set to zero; when I connect it to the second input port, nothing happens.

How can I trace the MIDI messages?

sletz commented 6 years ago

Third try in https://github.com/grame-cncm/faust/commit/72d07c799535ce62d06e9e9e7a3056d126698d43

Now you'll have to explicitly activate MIDI at compile time, like:

faust2netjackconsole -midi foo.dsp

and the second input port should be used.

grz0zrg commented 6 years ago

Wired the keyboard to the second input port on the organ example, and still no sound when pressing keys.

Also, I need to add the -lpthread linker option (inside the faust2netjackconsole script) each time I run faust2netjackconsole after install.

sletz commented 6 years ago

I finally reworked the whole thing:

faust2netjackconsole -h
faust2netjackconsole [-httpd] [-nvoices <num>] [-effect auto|<effect.dsp>] [-midi] [-osc] <file.dsp>
Use '-httpd' to activate HTTP control
Use '-nvoices <num>' to produce a polyphonic self-contained DSP with <num> voices, ready to be used with MIDI or OSC
Use '-effect <effect.dsp>' to produce a polyphonic DSP connected to a global output effect, ready to be used with MIDI or OSC
Use '-effect auto' to produce a polyphonic DSP connected to a global output effect defined as 'effect' in <file.dsp>, ready to be used with MIDI or OSC
Use '-midi' to activate MIDI control
Use '-osc' to activate OSC control

So you will have to add the appropriate parameters, like:

faust2netjackconsole -midi -nvoices <num> foo.dsp

I was able to test; it works here on OS X.

Note that the faust2netjackqt tool has been reworked in the same way.

grz0zrg commented 6 years ago

Thank you! Works great with the -nvoices option. I will test this a bit more, but everything seems to work at first glance.

grz0zrg commented 6 years ago

Is there any reason that -nvoices 2 does not produce any sound with the organ example? I was able to produce sound with the -nvoices 1 option but not by increasing the number of voices.

sletz commented 6 years ago

Because the "volume" slider was at 0 in the DSP code. Try this code:

import("stdfaust.lib");

midigate    = button ("gate");                              // MIDI keyon-keyoff
midifreq    = hslider("freq[unit:Hz]", 440, 20, 20000, 1);  // MIDI keyon key
midigain    = hslider("gain", 0.5, 0, 10, 0.01);            // MIDI keyon velocity

process = voice(midigate, midigain, midifreq) * hslider("volume", 0.5, 0, 1, 0.01);

// Implementation

phasor(f)   = f/ma.SR : (+,1.0:fmod) ~ _ ;
osc(f)      = phasor(f) * 6.28318530718 : sin;

timbre(freq)= osc(freq) + 0.5*osc(2.0*freq) + 0.25*osc(3.0*freq);

envelop(gate, gain) = gate * gain : smooth(0.9995)
                with { smooth(c) = * (1-c) : + ~ * (c) ; };

voice(gate, gain, freq) = envelop(gate, gain) * timbre(freq);
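(Editor's note: the `envelop` above is just the gate scaled by gain, passed through the one-pole lowpass `smooth(c)`, which computes y[n] = (1-c)*x[n] + c*y[n-1]. A small Python simulation of that recursion, with a sample rate chosen for illustration, shows the envelope ramping smoothly toward gate*gain, which is what avoids clicks on key-on:)

```python
def smooth(xs, c=0.9995):
    """One-pole lowpass: y[n] = (1 - c) * x[n] + c * y[n-1], y[-1] = 0."""
    y, out = 0.0, []
    for x in xs:
        y = (1.0 - c) * x + c * y
        out.append(y)
    return out

# Gate held high (key down) at gain 0.5 for one second at 48 kHz:
env = smooth([0.5] * 48000)
print(env[0])    # first sample: 0.5 * (1 - 0.9995), about 0.00025
print(env[-1])   # after one second, converged toward the target 0.5
```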