ARM-software / armnn

Arm NN ML Software. The code here is a read-only mirror of https://review.mlplatform.org/admin/repos/ml/armnn
https://developer.arm.com/products/processors/machine-learning/arm-nn
MIT License

Documentation - Distributed CNN inference #570

Closed — nullbyte91 closed this issue 2 years ago

nullbyte91 commented 3 years ago

Hi Team,

Is it possible to distribute the CNN inference engine to both CPU and Mali GPU using ARMNN?

Regards, Jegathesan S

Colm-in-Arm commented 3 years ago

Hi Jegathesan S,

Yes, it is possible. By default you can specify an ordered list of preferred backends. This is done when calling armnn::Optimize, passing in backendPreferences.

During the optimize phase, each backend is asked, layer by layer, whether it can support that layer. The first backend that reports support will then execute that layer.

For example, if you know the device you are using has a particularly high-spec GPU, you could specify the backends as [GpuAcc, CpuAcc]. All layers supported on the GPU will run there; any layer that is not will fall back to the CPU.

There is also a mechanism for more fine-grained control, but let's see if this is enough to get you started.

Colm.

MikeJKelly commented 2 years ago

Closing as I think this has been resolved. If you need more help please open another issue.