apple / coremltools

Core ML tools contain supporting tools for Core ML model conversion, editing, and validation.
https://coremltools.readme.io
BSD 3-Clause "New" or "Revised" License

Could not select device type (CPU, GPU, NPU) #1701

Closed. jaehwlee closed this issue 1 year ago.

jaehwlee commented 1 year ago

🐞Describing the bug

Hello. I've been trying to convert a PyTorch model to Core ML, and the conversion succeeds. However, I see that the Core ML model uses only the CPU, even though I configured it to use all devices (CPU, GPU, NPU). What should I do to make it use all devices?

To Reproduce
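
The original reproduction code was not posted. As a hedged sketch of the kind of conversion call described above (the network, input shape, and file name below are placeholders, not the issue author's code):

```python
import torch
import coremltools as ct

# Placeholder network; the model from the issue is not shown.
class TinyNet(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.conv = torch.nn.Conv2d(3, 8, kernel_size=3, padding=1)

    def forward(self, x):
        return torch.relu(self.conv(x))

example_input = torch.rand(1, 3, 224, 224)
traced = torch.jit.trace(TinyNet().eval(), example_input)

# compute_units=ALL only *allows* every compute unit (CPU, GPU, Neural Engine);
# it does not force the model onto any particular one.
mlmodel = ct.convert(
    traced,
    inputs=[ct.TensorType(name="input", shape=example_input.shape)],
    convert_to="mlprogram",
    compute_units=ct.ComputeUnit.ALL,
)
mlmodel.save("TinyNet.mlpackage")
```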



System environment:
 - coremltools version: 6.0
 - OS (e.g. macOS version or Linux type): macOS
 - Any other relevant version information (e.g. PyTorch or TensorFlow version): torch 1.12.1

TobyRoseman commented 1 year ago

Looks like you are setting the compute_units parameter of ct.convert to ct.ComputeUnit.ALL, which is about all you can do here. Note: compute_units dictates which compute units the model is allowed to run on. If you use ct.ComputeUnit.ALL, that doesn't mean all of the compute units will be used; it just means all compute units can be used.
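
As a hedged illustration of that distinction (the model path is a placeholder), the same setting can also be changed when loading an already converted model:

```python
import coremltools as ct

# ALL permits CPU, GPU, and the Neural Engine, but Core ML still decides at
# runtime which of the permitted units each part of the model actually uses.
model_any = ct.models.MLModel("TinyNet.mlpackage",
                              compute_units=ct.ComputeUnit.ALL)

# CPU_ONLY genuinely restricts execution to the CPU.
model_cpu = ct.models.MLModel("TinyNet.mlpackage",
                              compute_units=ct.ComputeUnit.CPU_ONLY)
```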

The Core ML Framework decides which compute units are actually used. This depends on many factors, such as your hardware type, OS version, layer sizes, and the general network architecture. A post was recently released about how to get your model to run on the Apple Neural Engine.

Since this is a question about the Core ML Framework, not the coremltools Python package, I'm going to close this issue. If you would like to submit questions or feedback about the Core ML Framework, please do so using the Feedback Assistant.