Closed: davpapp closed this issue 2 years ago
You can specify the device when you load the model:

```java
Criteria<Image, DetectedObjects> criteria =
        Criteria.builder()
                .setTypes(Image.class, DetectedObjects.class)
                .optModelPath(Paths.get("/MyModel/mode.pt"))
                .optDevice(Device.cpu()) // explicitly set to use CPU
                .optTranslator(new MyTranslator())
                .build();
```
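For context, a minimal end-to-end sketch of how that `Criteria` is typically used for inference (assumptions: DJL is on the classpath, `MyTranslator` and the model path `/MyModel/mode.pt` are placeholders from this thread, and `test.jpg` is a hypothetical input image):

```java
import java.nio.file.Paths;

import ai.djl.Device;
import ai.djl.inference.Predictor;
import ai.djl.modality.cv.Image;
import ai.djl.modality.cv.ImageFactory;
import ai.djl.modality.cv.output.DetectedObjects;
import ai.djl.repository.zoo.Criteria;
import ai.djl.repository.zoo.ZooModel;

public class CpuInference {
    public static void main(String[] args) throws Exception {
        Criteria<Image, DetectedObjects> criteria =
                Criteria.builder()
                        .setTypes(Image.class, DetectedObjects.class)
                        .optModelPath(Paths.get("/MyModel/mode.pt"))
                        .optDevice(Device.cpu()) // pin the model to CPU
                        .optTranslator(new MyTranslator()) // placeholder from the question
                        .build();

        // try-with-resources releases the model's native resources when done
        try (ZooModel<Image, DetectedObjects> model = criteria.loadModel();
             Predictor<Image, DetectedObjects> predictor = model.newPredictor()) {
            Image img = ImageFactory.getInstance().fromFile(Paths.get("test.jpg"));
            DetectedObjects result = predictor.predict(img);
            System.out.println(result);
        }
    }
}
```

As I understand it, the device set on the `Criteria` applies to the loaded model, and predictors created from that model inherit it, so no separate setting on the rest of the pipeline should be needed.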
Hi, I'm running inference with my model using Java DJL. Is there a way to disable the GPU for inference? I saw in the FAQ that you can disable the GPU for training by using .setDevices() on the training config. Can this similarly be done for inference? If so, do we need to set the devices for the entire pipeline, or just the modelzoo, or something else? Thank you!