mikeoliphant / neural-amp-modeler-lv2

Neural Amp Modeler LV2 plugin implementation
GNU General Public License v3.0
229 stars, 28 forks

Add file handling for Mod #20

Closed micahvdm closed 1 year ago

micahvdm commented 1 year ago

This will also require an update to mod-ui and browsepy, but I'll fold it into the updates I'm doing to bring modep to 1.13.

mikeoliphant commented 1 year ago

@falkTX Any thoughts?

falkTX commented 1 year ago

Please change the label to "Neural Model"; otherwise it looks fine to me. But we need to make sure it actually works on MOD units first, no? Have there been any performance tweaks or optimizations to the core engine? Last I checked, I couldn't run it even at a 256 buffer size.

PS: similar ticket in https://github.com/moddevices/mod-live-usb/pull/18

mikeoliphant commented 1 year ago

but we need to make sure it actually works on MOD units first, no?

I think the primary motivation here is to use it under mod-ui on more powerful devices like RPi4.

Have you tried running any "feather" (lightest CPU) NAM models on the dwarf? It might just be able to handle them:

https://tonehunt.org/?tags=feather-mdl

falkTX commented 1 year ago

but we need to make sure it actually works on MOD units first, no?

I think the primary motivation here is to use it under mod-ui on more powerful devices like RPi4.

Sure, and being open source means you can do it without even asking. I can add it to mod-ui so it fetches the related files, but I'm not keen on adding this to browsepy (the file-browser side) if MOD units won't be able to load the models, because then I would need to maintain a patch that hides them away.

I can try the feather models, sure; I'll try to do that later today.

Have you perhaps tested and found the ideal compiler flags for the best performance on a Pi4 or any other units? Since these boards tend to be quite similar on the CPU side, optimizations done for one hardware type likely benefit the others.

mikeoliphant commented 1 year ago

Have you perhaps tested and found the ideal compiler flags for the best performance on a Pi4 and any other units?

I played around with the Cortex/NEON compiler options a bit on the RPi4. As I recall, they didn't seem to make much difference - but I wasn't super thorough in testing. Upgrading to a 64-bit OS instead made a huge difference.
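For reference, the sort of Cortex/NEON options being discussed might look like the following. This is a sketch of common starting points for the Pi4's Cortex-A72 core, not flags taken from or verified against this plugin's build system:

```shell
# 64-bit OS (aarch64): NEON is part of the baseline ISA,
# so only tuning flags are needed.
CXXFLAGS="-O3 -mcpu=cortex-a72"

# 32-bit OS (armhf): NEON and the hard-float ABI must be
# selected explicitly.
CXXFLAGS="-O3 -mcpu=cortex-a72 -mfpu=neon-fp-armv8 -mfloat-abi=hard"
```

As mikeoliphant notes, flags like these often matter less than the 32-bit vs 64-bit OS choice, since aarch64 gives the compiler a friendlier target for auto-vectorization.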

micahvdm commented 1 year ago

I have changed the label to "Neural Model" as requested above. I am testing this on an RPi now with my updated mod-ui and browsepy.

falkTX commented 1 year ago

I upgraded to a 64bit OS instead, and that made a huge difference.

This is likely due to NAM using double as the type for a lot of its DSP. While float vs double is comparable in speed on SSE/x64, the same is not true for NEON. So being able to change that type might be one way to get a bit of a performance gain. Internal filters (like the biquad stuff) should be kept in double, but regular audio buffers and other more trivial things should be converted to float as much as possible.
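A minimal sketch of the split falkTX is describing - float for the buffer-heavy hot path, double only where filter state accumulates error. All names here are illustrative stand-ins, not the plugin's actual code:

```cpp
#include <cassert>
#include <cmath>
#include <cstddef>
#include <vector>

// Stand-in for the neural model: runs in float, the cheaper
// type on NEON (4 lanes per 128-bit register vs 2 for double).
static void model_process(const float* in, float* out, std::size_t n) {
    for (std::size_t i = 0; i < n; ++i)
        out[i] = std::tanh(in[i]);  // placeholder nonlinearity
}

// One-pole filter kept in double: recursive filter state is
// where the wider type actually pays for itself.
struct OnePole {
    double a = 0.99, z = 0.0;
    double process(double x) { z = x + a * z; return z * (1.0 - a); }
};

// The host hands us double buffers; narrow to float once per
// block so the hot path runs entirely in float.
void process_block(const double* in, double* out, std::size_t n,
                   OnePole& filt) {
    std::vector<float> fin(n), fout(n);
    for (std::size_t i = 0; i < n; ++i)
        fin[i] = static_cast<float>(in[i]);     // narrow once per block
    model_process(fin.data(), fout.data(), n);  // hot path in float
    for (std::size_t i = 0; i < n; ++i)         // filtering stays double
        out[i] = filt.process(static_cast<double>(fout[i]));
}
```

The per-block conversion cost is linear and trivially vectorizable, so it is cheap compared with running the model itself in double.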

falkTX commented 1 year ago

Did some tests with the feather models. They actually load now, but sit at the very limit of CPU at a 256 buffer size. MOD Dwarf units can't run them at 128, but the Duo X can. So I suspect the Raspberry Pi4 should be able to do it too, especially if you don't run much else on the unit and have RT priorities set up for audio.

Added the filetype in mod-ui side, so the plugin settings dialog is able to list the NAM files.

[screenshot: plugin settings dialog listing NAM model files]

The plugin category should be changed to "Simulator", though; that fits better considering its main target.

mikeoliphant commented 1 year ago

This is likely due to NAM using a lot of double as type for DSP.

NAM actually uses float for the CPU-intensive stuff. The interface is double because that is what the official NAM plugin uses (which, as you point out, is good for the EQ section of the plugin). Making it optionally accept float input is on my "todo" list, but it isn't really a hot-path issue.

64-bit makes float operations a lot faster as well, if they can be vectorized by the compiler - I think the Eigen matrix library (which does the heavy lifting under the hood) is well set up for that.
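The vectorization point comes down to register width: a 128-bit NEON register holds four floats but only two doubles, so a loop the compiler can auto-vectorize does roughly twice the work per instruction in float. A dot product like the one below (illustrative, not from the plugin or Eigen) is the classic shape such loops take:

```cpp
#include <cassert>
#include <cstddef>
#include <vector>

// A reduction loop like this is what auto-vectorization targets.
// In float, NEON packs 4 lanes per 128-bit register; in double,
// only 2 - so the float version can retire twice the multiplies
// per instruction. Assumes a and b have the same length.
float dot_f(const std::vector<float>& a, const std::vector<float>& b) {
    float acc = 0.0f;
    for (std::size_t i = 0; i < a.size(); ++i)
        acc += a[i] * b[i];
    return acc;
}
```

Eigen's internals are built from kernels of this shape, which is why keeping the model weights and buffers in float matters so much on ARM.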

mikeoliphant commented 1 year ago

So I suspect the Raspberry Pi4 should be able to do it too,

Yes - I'm running it on a Raspberry Pi4, and it can even handle "standard" NAM models with enough CPU left for a cabinet IR and some effects.

falkTX commented 1 year ago

So I suspect the Raspberry Pi4 should be able to do it too,

Yes - I'm running it on a Raspberry Pi4, and it can even handle "standard" NAM models with enough CPU left for a cabinet IR and some effects.

but at what buffer size and latency?

128 frames at 48kHz gives around 2.6ms of DSP time per block; latency will depend on the audio setup. I suspect a Pi4 can run feather models at even 64 frames, but not much else. At 128 frames it could likely handle a good-enough FX chain for most purposes.
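The arithmetic behind those numbers is just the block period: frames divided by sample rate. A small helper makes the budget at each setting explicit (128 frames at 48kHz gives about 2.67ms, 96 gives 2.0ms, 64 gives about 1.33ms):

```cpp
#include <cassert>
#include <cmath>

// Time budget per audio block, in milliseconds: all DSP for a
// block must finish within this window or the stream drops out.
double block_period_ms(int frames, double sample_rate_hz) {
    return frames / sample_rate_hz * 1000.0;
}
```

This is the hard real-time deadline the plugin's processing must beat every block; the actual round-trip latency heard by the player is larger, since it also includes driver and converter buffering.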

mikeoliphant commented 1 year ago

but at what buffer size and latency?

I'm able to just barely run full NAM models using 64 samples at 48k - that's the lowest my USB audio interface will open at in Linux.

In practice, I usually run a 96-sample buffer as it gives me a bit more headroom.

I'm very sensitive to latency, and this setup feels good to me. I used it as my live band practice rig for quite a while.