grame-cncm / faust

Functional programming language for signal processing and sound synthesis
http://faust.grame.fr

faust2api with juce option doesn't properly route signals #273

Open miccio-dk opened 5 years ago

miccio-dk commented 5 years ago

When integrating Faust code into a JUCE audio effect plugin (i.e. following this guide), the plugin seems to acquire its audio input and stream its audio output independently of the DAW hosting it.

In practice, the plugin only reads microphone data (the default internal input) and writes directly to the main output interface (even if the DAW is connected to, e.g., an ASIO device); the DAW's output V-meters stay at 0.

This occurs on both Windows and Linux, with both the online editor and a self-compiled faust2api tool run from the terminal.

Following this older tutorial, however, works well: audio is routed correctly between the DAW and the plugin.

Let me know if I can be of any help troubleshooting.

sletz commented 5 years ago

The new model glues the generated DSP to a juceaudio class, see https://github.com/grame-cncm/faust/blob/master-dev/architecture/faust/audio/juce-dsp.h. This juceaudio class is a subclass of AudioAppComponent; perhaps something more, or different, has to be done at this level?

miccio-dk commented 5 years ago

Thanks for the quick reply :) I think that it might need to be handled differently, at least if the final purpose is creating a plug-in with JUCE.

From the doc page:

> This class should not be inherited when creating a plug-in as the host will handle audio streams from hardware devices.

sletz commented 5 years ago

Ok, so it seems our faust2api -juce model is suited to producing a standalone application, but not a plug-in. @rmichon, we have a problem here...

rmichon commented 5 years ago

Oh hey Riccardo :)! I've been suspecting that for a while but I never actually found time to test it. Stéphane, I think we already talked about that (I just forwarded you our previous thread by e-mail). Well in any case, I think this problem should be solved...

sletz commented 5 years ago

In this case, when audio handling is actually done by the JUCE processor, the current model (where an additional JUCE audio layer is created) is not adapted. With the following commit: https://github.com/grame-cncm/faust/commit/4191cc39f90c32f24104248ba0838c2479f9c53b the DSP can be created without any audio driver (doing dsp = new DspFaust(false);). Then the processBlock method will have to be explicitly implemented (to call the DSP "compute" method) and so on...

rmichon commented 5 years ago

Mmmh I see. But then where does it get the sampling rate from?

sletz commented 5 years ago

You will have to call DSP->init in the prepareToPlay(double sampleRate, int samplesPerBlock) method. Basically, you'll have to follow what is done in FaustPlugInAudioProcessor in this file: https://github.com/grame-cncm/faust/blob/master-dev/architecture/juce/juce-plugin.cpp

rmichon commented 5 years ago

Sure, but DspFaust (generated with faust2api) doesn't have an init method, since it's not an actual Faust dsp class?

sletz commented 5 years ago

Ah OK, then I guess we could add a getDSP method to the DspFaust API, so that the internal DSP object can be accessed?

Or possibly try (again...) the other way around: find a way to "plug" the JUCE audio context into the already allocated DspFaust object with a kind of setDriver API... I'll have a look.

rmichon commented 5 years ago

OK, great!