Closed dqj5182 closed 2 months ago
May I ask how we can run the demo or do inference on a custom image (i.e., our own image)?
Also, I would like to ask what the computational cost of running inference with the model on a GPU is.
Hi, you can find instructions and computational requirements here.
Oh, I think I missed that section. I'll try it out!