z-x-yang / CFBI

The official implementation of CFBI(+): Collaborative Video Object Segmentation by (Multi-scale) Foreground-Background Integration.
BSD 3-Clause "New" or "Revised" License

Is an NVIDIA Tesla V100 32GB enough for inference? #11

Closed hitsz-zuoqi closed 4 years ago

z-x-yang commented 4 years ago

Sure.

And 16 GB is enough as well. I think even 12 GB, or much less, is enough if you enable float16 and increase the value of `--global_chunks` during inference.
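For intuition on why a larger `--global_chunks` lowers peak memory: CFBI's global matching compares every current-frame pixel against every first-frame pixel, so the similarity matrix scales with the product of the two pixel counts. Processing the query pixels chunk by chunk avoids materializing that full matrix at once. Below is a rough NumPy sketch of the idea; the function name, arguments, and nearest-neighbor readout are illustrative simplifications, not CFBI's actual API.

```python
import numpy as np

def chunked_global_matching(query_feats, ref_feats, ref_labels, chunks=4):
    """Match each query pixel to its most similar reference pixel,
    processing queries in chunks so the full (Nq x Nr) similarity
    matrix is never held in memory at once.

    query_feats: (Nq, C) features of current-frame pixels
    ref_feats:   (Nr, C) features of reference-frame pixels
    ref_labels:  (Nr,)   per-pixel labels of the reference frame
    """
    outputs = []
    for q in np.array_split(query_feats, chunks, axis=0):
        # (chunk, C) @ (C, Nr) -> (chunk, Nr): peak memory shrinks
        # roughly by a factor of `chunks` versus the full matrix.
        sim = q @ ref_feats.T
        nearest = sim.argmax(axis=1)
        outputs.append(ref_labels[nearest])
    return np.concatenate(outputs)

# Tiny example in float16, mirroring the float16 suggestion above.
query = np.random.randn(10, 8).astype(np.float16)
ref = np.random.randn(20, 8).astype(np.float16)
labels = np.arange(20)
out = chunked_global_matching(query, ref, labels, chunks=3)
```

The chunked result is identical to computing the full similarity matrix in one shot; only the peak memory differs, which is why raising `--global_chunks` trades a little speed for a smaller footprint.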

z-x-yang commented 4 years ago

If there are no more questions, I'll close this issue. THX.