sdatkinson / NeuralAmpModelerCore

Core DSP library for NAM plugins
MIT License

TPU and GPU inference support #123

Open novikov-alexander opened 1 month ago

novikov-alexander commented 1 month ago

Are there any plans to support inference on hardware accelerators such as GPUs or TPUs?

For example, ONNX Runtime or TensorRT could be used to run the model and free up CPU resources.