triton-inference-server / fil_backend

FIL backend for the Triton Inference Server
Apache License 2.0

Add ability to perform prediction on CPU #82

Closed · hcho3 closed this 3 years ago

hcho3 commented 3 years ago

This PR makes it possible to run prediction on the CPU. To enable it, set

instance_group [{ kind: KIND_CPU }]

in the model's config.pbtxt.
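For context, a minimal sketch of a full config.pbtxt with this setting is shown below. Only the instance_group entry comes from this PR; the model name, tensor dimensions, and parameter values are illustrative placeholders, and the other fields follow the usual Triton model-configuration layout for the FIL backend.

# Sketch of a config.pbtxt for a hypothetical XGBoost model served on CPU
name: "example_fil_model"        # placeholder model name
backend: "fil"
max_batch_size: 32768

input [
  {
    name: "input__0"
    data_type: TYPE_FP32
    dims: [ 32 ]                 # number of features (illustrative)
  }
]
output [
  {
    name: "output__0"
    data_type: TYPE_FP32
    dims: [ 1 ]
  }
]

# The new option introduced by this PR: run inference on CPU instead of GPU
instance_group [{ kind: KIND_CPU }]

parameters [
  {
    key: "model_type"
    value: { string_value: "xgboost" }   # format of the serialized model (illustrative)
  }
]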

Closes #41

Replaces #63

hcho3 commented 3 years ago

This PR is now ready for review. I tested it locally and all tests passed.

wphicks commented 3 years ago

Just created #88 as a follow-on item. No need for us to fix it in this PR, but we should clean up the duplication across CPU/GPU tests both for code cleanliness and CI efficiency.

wphicks commented 3 years ago

The CI failure just needs the latest changes from main merged in. Otherwise, the new changes look good! I particularly like the new utility function.