Thank you for reporting here!
Logic Pro doesn't provide MIDI to AUfx plugins in an audio FX slot, only to AUmfx and AUinst plugins in the MIDI FX or instrument slot.
In Logic Pro you have to provide a so-called "MIDI-controlled effect", which is then used in the instrument slot, with the side chain bringing audio into it so it can be used like an audio effect while still receiving MIDI.
Thanks. So to clarify, are you saying that because there is no "Instrument" or "MIDI FX" slot on Logic's audio channels, there is no way to get this to work, because using those slot types is the only way I'll be able to control Element with MIDI? It's strange, because the only version of Element that Logic will let me load into its Audio FX slots is the "Element FX" version of the plugin, which is a "MIDI-controlled effect" plugin (it says so in the Plugin Manager window). I would have thought this version of the plugin could be controlled with a MIDI controller.
If not, is there any workaround for this? Maybe routing the audio from the Audio channel to the Instrument track which has Element on it?
The Logic Pro way to control audio insert FX is with Smart Controls. Smart Controls intercept MIDI and then communicate with audio insert FX via parameter automation; no actual MIDI is sent to the plugin. Does Element provide a way to expose the parameters of plugins hosted within it to the DAW that is hosting the Element plugin? That's the key question, because that is how you can provide MIDI control to audio FX insert plugins on an audio track.
An example of a plugin that needs MIDI control is Stutter Edit. It uses MIDI key switches to switch the stutter effect. However, if you read the docs they provide for using it in Logic Pro, you will see that you have to place the MIDI-controlled FX in the instrument plugin slot of an instrument track, and then provide audio to it via the side chain. Since it's being used as an "instrument" this way, for an audio track you essentially have to create two tracks: the audio track itself, whose output you route to a bus, and that bus then feeds the side chain of the instrument track.
When you load a plugin into the instrument slot of an instrument track, you will see at the bottom of the pop-up menu a submenu called "AU MIDI-Controlled Effects". Any AUfx plugins built with that capability should show up there. Note that Element does not show up there. So there is probably a need to set the appropriate JUCE build flags so that it appears in that submenu and can be used as a MIDI-controlled effect as I have described.
Alternatively, or additionally, you could use Smart Controls to control Element directly as an audio insert FX, provided that Element is able to expose the hosted plugins' parameters to the DAW for automation purposes, which I am not sure about. You can do that with Plogue Bidule; I am away from my computer now, so I can't look into it.
Sorry if I'm still misunderstanding, but I was under the impression that any audio plugin, MIDI instrument, software instrument or effect could be MIDI-controlled by setting up mappings in the Controller Assignment window? I've never had an issue controlling any other plugin format (or any other parameter in Logic) using Logic's Controller Assignment window. Only when a plugin has a bug have I had issues MIDI-controlling it.
The controller assignment method is another approach I didn't mention because it's more complicated, but it's similar to Smart Controls in that it intercepts MIDI and controls plugins via parameter automation.
In order to build Element AUFX with MIDI control, make sure the Plugin AU Main Type is set to kAudioUnitType_MusicEffect in Jucer.
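For reference, here is roughly what that setting ends up as in the generated plugin defines (just a sketch: the exact header, and whether the symbolic constant or the 'aumf' four-char code gets written, varies by JUCE version, and the WantsMidiInput flag is an assumption based on standard JUCE plugin builds rather than Element's actual config):

```cpp
// Generated plugin defines (e.g. JucePluginDefines.h / AppConfig.h):
#define JucePlugin_AUMainType      kAudioUnitType_MusicEffect   // four-char code 'aumf' = MIDI-controlled effect
#define JucePlugin_WantsMidiInput  1                            // so the AU also declares a MIDI input
```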
I just built ElementFX with that set, and now it shows up in Logic Pro as an AU MIDI-controlled FX and appears to work properly, in the way I described earlier for Stutter Edit and other similar MIDI-controlled FX.
It also still seems to show up as a normal, non-MIDI-controlled AUfx for regular insert FX when configured this way in Jucer. One thing that is a little annoying, though: when this AUfx is loaded into a Logic Pro audio insert slot, inside Element you still see the MIDI input and output nodes, which are not connected to anything in Logic Pro.
So really the best solution would be an AUfx version without any MIDI, using the kAudioUnitType_Effect plugin type and with no MIDI input or output defined for the plugin, and then a separate build for the MIDI-controlled FX which has the MIDI input and output as well as the kAudioUnitType_MusicEffect plugin type. That way nobody would be confused trying to use MIDI in an audio effect insert. I am not totally sure how to build those separately or what would be the best way to distinguish them, but it's something to think about for whoever decides to fix this.
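One very rough sketch of how the two builds might diverge from a single codebase (ELEMENT_MIDI_FX is a made-up compile definition, not an existing Element flag, and the overrides are shown out of context):

```cpp
// Assumed to be set to 1 only when building the MIDI-controlled FX target,
// and 0 (or left unset) for the plain insert-FX target.
#ifndef ELEMENT_MIDI_FX
 #define ELEMENT_MIDI_FX 0
#endif

// Inside Element's AudioProcessor subclass, the MIDI pins would then only
// exist in the kAudioUnitType_MusicEffect build:
//
//     bool acceptsMidi()  const override { return ELEMENT_MIDI_FX != 0; }
//     bool producesMidi() const override { return ELEMENT_MIDI_FX != 0; }
```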
I'm still a bit confused. So my usual way of creating mappings to my midi controller is to go into the Controller Assignment window, hit "Learn", move the desired parameter (in any fx plugin or instrument) then touch my controller knob to make the assignment. Are you saying it is possible to use this same method for midi controlling Element's parameters, instead of mapping directly between Element and the controller? If so this would be ok.
Smart Controls are undesirable because they can only be loaded / copied to channels via the Channel Strip Settings menu or via a Patch in the Library browser. It would mean that every time I want to use Element with my MIDI controller I'd need to load Element via the Channel Strip Settings; I couldn't just copy and paste it from one channel to the next or load it from the plugin menu.
In Logic Pro, you have MIDI coming in the normal way, and you have MIDI that gets intercepted by the "controller" framework. If you have an active controller assignment with "learned" mappings, those learned MIDI commands are intercepted by Logic Pro and translated into something else in the software; the MIDI commands never arrive at any plugin.
Typically these MIDI commands are translated into plugin parameters, but they can also be translated into key commands, etc. The point is that when you use Learn, Logic doesn't send the MIDI to the plugin; it remaps the MIDI command to some other controllable thing in Logic Pro, such as a parameter of a plugin.
When you use the Controller Assignment window, you are essentially setting up a collection of remappings from MIDI to these other things, and as long as that controller is active, those MIDI commands never make it to anything; they are intercepted, remapped, and redirected.
The controller framework is more global, for the whole project and even across projects. It's not meant to be changed often; it's meant to provide a way for MIDI controllers to directly control Logic Pro. You set it up for your hardware and typically leave it alone after that. Some people may do otherwise when using Lemur, etc., but remember that it's a global thing; changes are not saved with the project (I don't think).
Smart Controls are similar but work in a much more dynamic way, on a per-track and per-project basis. They're saved with the project and provide a quick and simple way to map MIDI commands from your controller directly to track, channel, and plugin parameters.
So if you want to use the Controller Assignment window to make it more permanent, that's all fine and good, but understand that what it does is remap MIDI commands to plugin parameters. In this case Logic Pro will not see or know what plugins are being hosted inside Element; it will only know which Element plugin parameters are available. So basically you can use the Controller Assignment window to map MIDI commands to Element parameters, and then Element itself must provide a way to map those generic parameters to the specific plugins being hosted inside it.
Thanks, yeah, that's what I've done. I just set up another "Mode" inside Logic's Controller Assignment window and set up a bunch of mappings to control Element's Macros, which in turn control the plugin parameters inside Element.
I've just been testing the Waves StudioRack plugin wrapper. I'm not sure what format this plugin is, but it lets me map Logic assignments directly to the hosted plugin parameters (EQs, compressors, etc.) without having to use the wrapper's macros as the go-between. Is there any way to create a version of Element that functions like this?
Should be possible to do, I just haven't had the time to investigate and fix this yet @JamieJ1 . Is the StudioRack on an audio or instrument track?
No problem. Yes, you can place StudioRack (the newly released version of it) as an insert on an audio track, then open an EQ etc. inside of it, hit "Learn" in the Assignment window and map the EQ's parameters directly. Unfortunately it's for Waves plugins only. Just so I know, do you have any idea of a timeframe for implementing this new version?
Shooting to have a release in the next two weeks. So only Waves plugins loaded inside StudioRack can be directly mapped?
I’ve only tested Waves plugins as that’s all I can load inside it unfortunately
Gotcha. I was planning on working on some Graph Editor stuff later this evening. I'll do some debugging in Logic as well.
Basically what I think is being asked is for Element to dynamically keep track of all parameters of all plugins hosted within it, and then automatically expose them to the host of Element under the same names, dynamically updating the names of those exposed parameters.
That seems to be what the waves product is doing.
But personally I think that would be a little overwhelming. I'd rather have a mechanism in Element where you configure which hosted plugin parameters you want exposed up a level, and expose only those, but yes, under the names given by the plugin, not some generic names.
I see. You can already do this with the performance parameters feature. Element exposes parameters which are aliases to loaded plugin parameters (kind of like Apple's AU sampler plugin).
Names aren't dynamic though. I'll have to do some research, but I don't think any of the plugin APIs (VST/AU) allow dynamic creation and removal of parameters.
I have to check to see, but I believe Vienna Ensemble Pro does it somehow. Actually let me verify that. In the case of VePro, there can be thousands and thousands of plugin parameters inside a VePro instance...so they provide a mapping interface to decide how and what to expose from within as automatable parameters to the host. They would have to be able to name them or it would become literally impossible to keep track of what is what. But I will verify that. As to how they do it, I have no idea... They aren't using JUCE either.
Hey thanks! I could be wrong about the fixed parameter thing then. Admittedly most of my knowledge of the APIs comes from JUCE :). JUCE 6 has been out for a while now, will take a look there and see if dynamic params is possible.
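For anyone researching this later, here's roughly what an alias/proxy parameter could look like in JUCE. The class and names are made up for illustration; this is only a sketch of the idea, not Element's actual performance-parameter code, and whether a given host actually re-reads parameter names after a plugin calls updateHostDisplay() is very much host-dependent:

```cpp
#include <juce_audio_processors/juce_audio_processors.h>

// A parameter exposed to the host that forwards to whichever hosted-plugin
// parameter it is currently mapped to, reporting that parameter's name.
class ProxyParameter : public juce::AudioProcessorParameter
{
public:
    void setTarget (juce::AudioProcessorParameter* newTarget)
    {
        target = newTarget;
        // The owning AudioProcessor would then call updateHostDisplay()
        // (JUCE 6.1+ also has a ChangeDetails overload to flag that parameter
        // info changed) and hope the host refreshes the visible name.
    }

    float getValue() const override                  { return target != nullptr ? target->getValue() : 0.0f; }
    void  setValue (float newValue) override         { if (target != nullptr) target->setValue (newValue); }
    float getDefaultValue() const override           { return target != nullptr ? target->getDefaultValue() : 0.0f; }
    juce::String getName (int maxLen) const override { return target != nullptr ? target->getName (maxLen) : "Unassigned"; }
    juce::String getLabel() const override           { return target != nullptr ? target->getLabel() : juce::String(); }
    float getValueForText (const juce::String& text) const override
    {
        return target != nullptr ? target->getValueForText (text) : 0.0f;
    }

private:
    juce::AudioProcessorParameter* target = nullptr;
};
```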
I just did a quick test with VePro. Yes it is somehow able to dynamically update the name of the parameter as seen by the host.
Hi guys. Do you have any idea of a release date for this?
Has there been any update on this yet? Have been waiting with bated breath
@JamieJ1 - No release date per se. Been chipping away at the 0.45.x issues as much as I can.
Can anyone update on the status of this? I first posted about it in July
Actually working on this one right now, believe it or not.
Not sure this one can be fixed. I checked all the configs for Element FX. It defines itself as an AudioUnit MusicEffect, which according to the Apple docs is "An effect unit that can respond to MIDI control messages, typically through a mapping of MIDI messages to parameters of the audio unit's DSP algorithm".
So you'd think it would work, but I can't get Logic to send MIDI to the plugin with any MIDI preference I've found. Are you able to get MIDI to any other "MusicEffect"? Waldorf D-Pole doesn't work either, so I'm thinking this is a Logic problem. Not 100% sure on that though.
Going to remove this from 0.46, because I don't think a fix is coming soon: I've burned several hours on it just today with no progress.
Using Midi here is not the right approach I don't think. Aren't you really talking about exposing Plugin parameters?
Midi controlled fx are usually handled in logic as an instrument plugin that uses side chain for the input audio. I don't believe audio fx inserted in the fx section of a channel can receive any midi at all from logicpro, but I could be wrong, don't think so though
Thanks guys. Are you saying Element in its current format can't receive MIDI messages? Just to reiterate, the Waves StudioRack plugin can receive MIDI to enable you to control the plugins inside it with hardware. I'm not sure whether this is a different plugin format though, and therein lies the problem. If so, would there be any possibility of creating an AU version of Element?
Would it be of any value for me to start a thread on the LogicProHelp community about this issue? If so, what questions would I need to ask exactly?
As far as I know, exposing another plugin's parameters isn't possible (other than through performance parameters).
@JamieJ1 - Element can indeed receive MIDI. Element FX has it enabled by being a MusicEffect AudioUnit, but Logic still doesn't deliver MIDI to it. Not sure what Waves is doing differently, but it must be possible. There are zero docs about it from Apple for us little guys.
Element/ElementFX are both audio units already. A thread in Logic's forum could be helpful. Definitely wouldn't hurt to ask in there.
To repeat what I said above, I will try to explain again...
LogicPro does not send midi to AUfx plugin slots. Midi can only be directed to the instrument plugin slot of an instrument channel.
Logic Pro will let you put AUinst plugins into the instrument slot, or you can put AU MIDI-controlled FX into the instrument slot.
When you put an actual AU midi controlled FX into the instrument slot of an instrument channel, then it will receive midi and the audio can be directed to it from the side chain menu.
This is not straightforward, I agree, but this is how LogicPro works for now.
You can find various tutorials on this topic if you research some well known MIDI controlled FX, such as NI's "The Mouth", StutterEdit, etc.
ElementFX is available as a MIDI-controlled FX, and you can load it into the instrument slot of Logic Pro and use the side chain. Try it. That is how Logic Pro works.
If you make a post on the logic forum about this, I will respond better with graphical images and stuff to explain it.
Thanks @steveschow for the clarification. I wasn't aware I needed to use the sidechain when inserted in the Instrument slot. Next time I'm working on the Mac I'll try this and verify it's working (or not). Much appreciated!
BTW @steveschow : https://github.com/kushview/Element/issues/324
Thanks. But using instrument slots is not an option for my workflow. The aim is to use Element on every channel as a custom channel strip for mixing. Just to add, you can use MIDI with Blue Cat Audio PatchWork on an audio channel's insert slot as normal. Does this have any bearing on our discussion? https://www.youtube.com/watch?v=T-l0S2Vela4
I've had a look through the manual and you seem to be able to use midi with DDMF's Metaplugin also.
I'll start a thread in Logicprohelp and post here.
Jamie you are conflating two different things. It's not a question of whether AUFX supports midi input. It does. Logicpro itself does not send midi to its own aufx slots. So even if you have a Midi controlled aufx plugin such as element or stutter edit, for example, logicpro does not send midi directly to aufx plugin slots in the channel strip.
In Logic Pro, when you go to insert a plugin in the instrument plugin slot you will see several submenus, one of which is for MIDI-controlled AUfx. ElementFX shows up there; I tested it last night and it works as I described.
I'm sorry to bear the bad news, but if you want to use MIDI directly in the FX section of a channel strip then Logic Pro may not be the DAW for you at this time. In Logic Pro, MIDI is fed to the channel strip, through each MIDI FX and finally to the instrument plugin, and terminates there. This is how Logic Pro is currently architected. At some point Apple worked around that by providing the ability to load a MIDI-controlled AUfx into the instrument slot and use the side chain for the audio input.
It is what it is.
In order to use midi controlled fx on every audio channel you will have to create a companion inst channel to use for that. That's it.
The truth is most people don't use MIDI for audio FX, and most plugins don't use it either. Nearly all audio FX plugins use parameter automation, which Logic Pro DOES route directly to the FX section. It's only a few things like Stutter Edit that specifically need MIDI key switches to drive how they work as an audio FX.
PS: I don't have time to test this morning, but the only other possibility would be to make sure ElementFX can provide IAC MIDI inputs even when used as a non-MIDI-controlled AUfx. Then perhaps you could get MIDI in regardless of Logic Pro's limitations.
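To sketch that last idea (purely illustrative, assuming Element's plugin doesn't already do this; the class and device name below are made up): on macOS a JUCE plugin can open its own virtual MIDI input, similar in spirit to listening on an IAC bus, so MIDI could reach ElementFX even when Logic routes none to the insert slot.

```cpp
#include <juce_audio_devices/juce_audio_devices.h>

// Receives MIDI from a virtual input owned by the plugin itself, bypassing
// the host's (lack of) MIDI routing to Audio FX insert slots.
struct DirectMidiListener : public juce::MidiInputCallback
{
    void handleIncomingMidiMessage (juce::MidiInput*, const juce::MidiMessage& message) override
    {
        // Forward 'message' into Element's internal MIDI graph here.
        juce::ignoreUnused (message);
    }
};

// Somewhere during plugin initialisation (macOS/Linux only; createNewDevice
// returns nullptr on platforms that don't support virtual MIDI devices):
//
//     listener     = std::make_unique<DirectMidiListener>();
//     virtualInput = juce::MidiInput::createNewDevice ("Element FX Direct In", listener.get());
//     if (virtualInput != nullptr)
//         virtualInput->start();
```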
Hey Guys,
I've been loosely following this conversation. I finally had a chance to look into it. It appears it may be possible to do what you're asking with a bit of tweaking; it has to use Logic Smart Controls.
Smart Controls are the easy Logic Pro way to assign MIDI controllers to plugin parameters. Note that some plugin parameters have higher resolution than MIDI, so while you gain the ability to control plugin parameters with a MIDI controller, you can program higher-resolution parameter automation by not using Smart Controls. But anyway, that is definitely the Logic Pro way to quickly hook up MIDI controller knobs to plugin parameters.
I got this response from DDMF regarding Metaplugin;
Metaplugin uses the MIDI functionality of your host. It exposes its parameters to the host like any other plugin, and in Metaplugin you then need to use the parameter map in the upper left area of the UI to map a certain Metaplugin parameter index to a certain parameter of a plugin inside Metaplugin.
Thanks for getting that info. Metaplugin is doing exactly the same thing Element does with its Performance Parameters. Going to go ahead and close this issue.
I have managed to get my midi controller to work with and control plugins inside the Element stand-alone Mac app fine. I have tried to set this up with the plugin in Logic to control plugins inside Element with my midi controller, and can’t get this to work.
To reproduce:
- Select / highlight a mixer channel so Logic knows which channel to route MIDI to
- Load the Element FX plugin on an insert of a Logic 10.5 mixer channel
- Load a plugin or number of plugins into Element
- Hit the "Learn" button inside Element as instructed, but no MIDI is received by Element