Closed RavenLeeANU closed 5 years ago
Hi @RavenLeeANU,
I assume you set up a tensorflow-gpu environment on your working machine? You can train your model using tensorflow-gpu, but it isn't possible to use CUDA for inference: when you run the sample, inference happens on the mobile device, not on your machine.
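As a side note, one way to confirm that training is actually using the GPU is to check what devices TensorFlow can see. This is a minimal sketch assuming TensorFlow 2.x (the 1.x API differs):

```python
import tensorflow as tf

# List the GPUs visible to TensorFlow. An empty list means CUDA/cuDNN
# are not being picked up and training will silently fall back to the CPU.
gpus = tf.config.list_physical_devices('GPU')
print("GPUs visible to TensorFlow:", gpus)

# Optionally log which device each op is placed on, so you can verify
# during training that ops actually land on the GPU.
tf.debugging.set_log_device_placement(True)
```

If the list is empty, double-check that the installed CUDA and cuDNN versions match what your TensorFlow build was compiled against.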
Hi @RavenLeeANU,
I hope you resolved your problem. I'll close this issue for now, feel free to reopen if you have more questions.
Hi, thanks for your work. I wonder whether it is possible to use CUDA to speed up the inference time. I set up a tensorflow-gpu environment with CUDA and cuDNN, but I do not know how to invoke the GPU. Can you give me some hints on that?
Thanks