nginyc / rafiki

Rafiki is a distributed system that supports training and deployment of machine learning models using AutoML, built with ease-of-use in mind.
Apache License 2.0

No GPU support for predictor #141

Open vivansxu opened 5 years ago

vivansxu commented 5 years ago

May I confirm that there is no GPU support for the predictor service? If so, how do I implement prediction for models that must use GPUs? Is it possible to add GPU support for the predictor to Rafiki? Also, I notice the documentation says models must be able to train and evaluate with only CPUs if no GPU hardware is available. Since my model only supports a GPU environment, do I have to follow this requirement? Thank you.

nginyc commented 5 years ago

Yes, there is currently no GPU support when the trained model is deployed for inference on Rafiki. At the moment, it is expected that models can always fall back to using the CPU. Is there a workaround for your model? @nudles should we support using GPUs for inference as well?
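For illustration, a CPU-fallback pattern often looks like the sketch below. This is not Rafiki's API: `select_device` and the reliance on `CUDA_VISIBLE_DEVICES` are assumptions for the example, though the environment variable itself is the standard way GPU schedulers expose assigned devices.

```python
import os

def select_device() -> str:
    # Hypothetical helper: treat an unset or empty CUDA_VISIBLE_DEVICES
    # as "no GPU assigned" and fall back to the CPU.
    gpus = os.environ.get('CUDA_VISIBLE_DEVICES', '')
    return 'cuda' if gpus.strip() else 'cpu'
```

A model's `train`/`predict` code could call such a helper once at startup and place its tensors accordingly.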

nudles commented 5 years ago

Yes, it would be good to support GPU inference. The implementation should be straightforward, similar to that for training.


nginyc commented 5 years ago

Shouldn't be that difficult. Will add it as a task.

nginyc commented 5 years ago

Hi @vivansxu, we've recently added this functionality on the branch `add_gpu_for_inference`! Let us know if it works for you. Since it's not on the master branch, refer to the documentation on testing the latest code changes to verify them.

vivansxu commented 5 years ago

Hi @nginyc, thank you so much for adding GPU support! I just tried to run the inference job, and it works well!

By the way, since I want to return a list of strings as the prediction, I changed line 64 of `rafiki/predictor/ensemble.py` from `if isinstance(prediction, Iterable):` to `if isinstance(prediction, list):`.
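The difference the change targets can be sketched standalone (this is an illustration of the two checks, not the actual `ensemble.py` context):

```python
from collections.abc import Iterable

# A list passes both checks, but a string only passes the Iterable one,
# which is what makes the broader check risky for string predictions.
assert isinstance(["cat", "dog"], Iterable)
assert isinstance(["cat", "dog"], list)
assert isinstance("cat", Iterable)   # strings are iterable too
assert not isinstance("cat", list)   # the narrower check excludes them
```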

nudles commented 5 years ago

I think `Iterable` is more general than `list`; a list is also an `Iterable` instance. If there are no other problems, I am going to merge the PR into the dev branch.

vivansxu commented 5 years ago

Hi @nudles, actually, if we just use `Iterable` here, then since a string object is itself iterable (demonstrated below), we get a `RecursionError: maximum recursion depth exceeded in comparison`.

If I run:

```python
for x in 'abc':
    print(isinstance(x, Iterable))
```

the output is:

```
True
True
True
```
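The `RecursionError` arises when code recurses on anything `Iterable`: iterating over a one-character string yields that same string, so a naive recursive walk never reaches a base case. A minimal sketch of the failure mode (hypothetical `flatten` function, not the actual `ensemble.py` code):

```python
from collections.abc import Iterable

def flatten(x):
    # Naive recursion: descend whenever the value is Iterable.
    if isinstance(x, Iterable):
        return [flatten(item) for item in x]
    return x

# 'a' is Iterable, and iterating over 'a' yields 'a' again,
# so flatten("abc") recurses forever and hits the recursion limit.
try:
    flatten("abc")
except RecursionError:
    print("RecursionError")  # → RecursionError
```

Excluding `str` explicitly, e.g. `isinstance(x, Iterable) and not isinstance(x, str)`, is another common way to break this cycle.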

nginyc commented 5 years ago

Hi @vivansxu, thanks for the catch. I have a fix for this bug in #132, which is pending merge into dev.