JunzheJosephZhu / HiFA

Apache License 2.0

Batch size #2

Closed — benoriol closed this issue 1 year ago

benoriol commented 1 year ago

Hello,

Congrats on the great paper. I would appreciate it if you could share some additional implementation details, namely the batch size and what GPU memory was required for your implementation.

Thanks, B

HiFA-team commented 1 year ago

The batch size has always been 1. With AMP and a 512x512 NeRF rendering resolution, it uses 39 GB of VRAM.
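For reference, peak VRAM for a training step like this can be measured with `torch.cuda.max_memory_allocated`. The sketch below is not HiFA's code: the tiny MLP, shapes, and learning rate are placeholder assumptions standing in for the real NeRF, shown only to illustrate an AMP step at batch size 1 with a 512x512 ray budget.

```python
import torch
import torch.nn as nn

device = "cuda" if torch.cuda.is_available() else "cpu"
use_amp = device == "cuda"

# Toy stand-in for a NeRF-style MLP; the real model is far larger.
model = nn.Sequential(nn.Linear(3, 64), nn.ReLU(), nn.Linear(64, 4)).to(device)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
scaler = torch.cuda.amp.GradScaler(enabled=use_amp)

# Batch size 1: one 512x512 image's worth of sample points (x, y, z).
points = torch.rand(1 * 512 * 512, 3, device=device)

with torch.autocast(device_type=device, enabled=use_amp):
    out = model(points)          # per-point RGB + density
    loss = out.float().mean()    # placeholder loss

scaler.scale(loss).backward()
scaler.step(optimizer)
scaler.update()

if device == "cuda":
    peak_gb = torch.cuda.max_memory_allocated() / 1024**3
    print(f"peak VRAM: {peak_gb:.2f} GB")
```

On CPU the memory report is skipped; the point is only where the measurement hook goes relative to the forward/backward pass.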
