Official PyTorch implementation of Superpoint Transformer introduced in [ICCV'23] "Efficient 3D Semantic Segmentation with Superpoint Transformer" and SuperCluster introduced in [3DV'24 Oral] "Scalable 3D Panoptic Segmentation As Superpoint Graph Clustering"
Hi @drprojects, I have a question about the number of workers. I have a system with 53 GB of RAM, and during training I get the error "dataloader's workers are out of shared memory". I would like to know whether I can set the number of workers to a smaller value. Thank you for your effort.
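For context, `num_workers` is a standard argument of `torch.utils.data.DataLoader`, and lowering it (even to `0`, which loads batches in the main process) reduces shared-memory pressure from worker processes. A minimal sketch with a toy dataset (the dataset and values here are hypothetical, not this repository's datamodule):

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

# Hypothetical toy dataset standing in for the project's datamodule.
dataset = TensorDataset(torch.randn(100, 3), torch.randint(0, 10, (100,)))

# Lowering num_workers reduces shared-memory usage;
# num_workers=0 loads batches in the main process, so no worker
# shared memory is allocated at all.
loader = DataLoader(dataset, batch_size=16, num_workers=0)

batches = list(loader)
print(len(batches))  # 100 samples at batch_size=16 -> 7 batches
```

In practice, if reducing `num_workers` is not enough, increasing the container's `/dev/shm` size (e.g. Docker's `--shm-size`) is the other common remedy for this error.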