Open kbrajwani opened 3 years ago
Hi, you can refer to test.py; it supports fp16 inference.
Hey @hhaAndroid, thanks, it works with fp16. I was converting the model to fp16 to save memory, but it's taking more memory than normal. Can you please check?
At the time of model load.
After 2-3 image inferences.
Can you tell me how I can resolve this issue?
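As a sanity check, here is a minimal, hypothetical sketch (plain PyTorch, not the exact MMDetection code path) showing that casting a model to fp16 should halve its parameter memory. If total usage still goes up on GPU, the extra memory usually comes from somewhere else: fp32 copies kept alive alongside the half weights, inputs that were not cast to half, or CUDA's caching allocator holding on to freed blocks. The model architecture below is made up for illustration.

```python
import torch
import torch.nn as nn

def param_bytes(model: nn.Module) -> int:
    """Total bytes occupied by the model's parameters."""
    return sum(p.element_size() * p.nelement() for p in model.parameters())

# Toy stand-in model (hypothetical, just for measuring memory).
model = nn.Sequential(nn.Linear(256, 256), nn.ReLU(), nn.Linear(256, 10))
fp32_bytes = param_bytes(model)

# In-place cast of all parameters/buffers to torch.float16.
model_fp16 = model.half()
fp16_bytes = param_bytes(model_fp16)

# fp16 parameters occupy exactly half the bytes of fp32 parameters.
print(fp32_bytes, fp16_bytes)
```

If the half-precision model still appears to use more GPU memory, compare `torch.cuda.memory_allocated()` (tensors actually in use) against `nvidia-smi` (which also counts the caching allocator's reserved blocks), and try `torch.cuda.empty_cache()` between inferences; holding references to the old fp32 weights or outputs will also keep them resident.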
@kbrajwani have you made any progress?
Hey, I am using this code to reduce memory usage with fp16, but I'm getting the following error.