deepjavalibrary / djl

An Engine-Agnostic Deep Learning Framework in Java
https://djl.ai
Apache License 2.0

How to disable GPU for inference? #1617

Closed davpapp closed 2 years ago

davpapp commented 2 years ago

Hi, I'm running inference with my model using Java DJL. Is there a way to disable the GPU for inference? I saw in the FAQ that you can disable the GPU for training by using .setDevices() on the training config. Can this similarly be done for inference? If so, do we need to set the devices for the entire pipeline, or just the modelzoo, or something else? Thank you!
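For reference, the training-side approach mentioned in the FAQ looks roughly like this (a sketch, not taken from this thread; the softmax cross-entropy loss is just a placeholder so the config builds):

```java
import ai.djl.Device;
import ai.djl.training.DefaultTrainingConfig;
import ai.djl.training.loss.Loss;

public class CpuTrainingConfig {
    public static void main(String[] args) {
        // Restrict training to the CPU by passing an explicit device list.
        DefaultTrainingConfig config =
                new DefaultTrainingConfig(Loss.softmaxCrossEntropyLoss())
                        .optDevices(new Device[] {Device.cpu()}); // train on CPU only
        System.out.println(config.getDevices().length); // one device: CPU
    }
}
```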

frankfliu commented 2 years ago

You can specify the device when you load the model:

      import java.nio.file.Paths;

      import ai.djl.Device;
      import ai.djl.modality.cv.Image;
      import ai.djl.modality.cv.output.DetectedObjects;
      import ai.djl.repository.zoo.Criteria;

      Criteria<Image, DetectedObjects> criteria =
              Criteria.builder()
                      .setTypes(Image.class, DetectedObjects.class)
                      .optModelPath(Paths.get("/MyModel/mode.pt"))
                      .optDevice(Device.cpu()) // explicitly set to use CPU
                      .optTranslator(new MyTranslator())
                      .build();
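For completeness, a sketch of running inference with such a Criteria pinned to the CPU (MyTranslator is the placeholder from the snippet above, and the image file path is an assumption for illustration; Criteria.loadModel() is available in recent DJL releases):

```java
import java.nio.file.Paths;

import ai.djl.Device;
import ai.djl.inference.Predictor;
import ai.djl.modality.cv.Image;
import ai.djl.modality.cv.ImageFactory;
import ai.djl.modality.cv.output.DetectedObjects;
import ai.djl.repository.zoo.Criteria;
import ai.djl.repository.zoo.ZooModel;

public class CpuInference {
    public static void main(String[] args) throws Exception {
        Criteria<Image, DetectedObjects> criteria =
                Criteria.builder()
                        .setTypes(Image.class, DetectedObjects.class)
                        .optModelPath(Paths.get("/MyModel/mode.pt")) // path from the example above
                        .optDevice(Device.cpu())                     // pin model and predictor to CPU
                        .optTranslator(new MyTranslator())           // placeholder translator
                        .build();

        // try-with-resources closes the model and predictor, releasing native memory
        try (ZooModel<Image, DetectedObjects> model = criteria.loadModel();
                Predictor<Image, DetectedObjects> predictor = model.newPredictor()) {
            // hypothetical input image for illustration
            Image img = ImageFactory.getInstance().fromFile(Paths.get("test.jpg"));
            DetectedObjects detection = predictor.predict(img);
            System.out.println(detection);
        }
    }
}
```

Because the device is fixed at load time, every Predictor created from the model runs on the CPU; no further per-inference configuration is needed.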