Thank you for your great work! I was wondering about the hardware requirements for all three versions of Point Transformer. It seems that Hengshuang's method needs four 24GB GPUs (TITAN RTX); what about the other two versions? Would several RTX 2080 Ti GPUs work? Looking forward to your reply!