Closed hitsz-zuoqi closed 4 years ago
Sure.
And 16GB is enough as well. I think even 12GB or less is enough if you enable float16 and increase --global_chunks during inference.
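To illustrate why those two knobs cut memory, here is a minimal NumPy sketch (not this repo's actual code; the toy matmul "layer" and chunk count are assumptions): float16 halves the bytes per tensor, and splitting the sequence into more chunks shrinks peak activation memory, since each pass only touches seq_len / n_chunks rows at a time.

```python
import numpy as np

def chunked_infer(x, layer, n_chunks):
    """Run `layer` over `x` one chunk at a time, so peak activation
    memory scales with seq_len / n_chunks instead of seq_len."""
    outs = [layer(chunk) for chunk in np.array_split(x, n_chunks, axis=0)]
    return np.concatenate(outs, axis=0)

# Toy "layer": a matmul, kept in float16 to halve tensor memory.
rng = np.random.default_rng(0)
w = rng.standard_normal((64, 64)).astype(np.float16)
x = rng.standard_normal((1024, 64)).astype(np.float16)

full = x @ w                                   # one big pass
split = chunked_infer(x, lambda c: c @ w, 8)   # eight smaller passes

# Same result either way; chunking only changes peak memory, not output.
assert np.allclose(full, split)
# float16 uses half the bytes of float32 for the same tensor.
assert x.nbytes == x.astype(np.float32).nbytes // 2
```

The same trade-off applies to the real flag: a larger --global_chunks value means more, smaller passes, so inference fits in less GPU memory at some cost in speed.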
If there are no more questions, I'll close this issue. THX.