drprojects/superpoint_transformer

Official PyTorch implementation of Superpoint Transformer, introduced in [ICCV'23] "Efficient 3D Semantic Segmentation with Superpoint Transformer", and SuperCluster, introduced in [3DV'24 Oral] "Scalable 3D Panoptic Segmentation As Superpoint Graph Clustering".

num workers #153

Closed: zeejja closed this 3 weeks ago

zeejja commented 4 weeks ago

Hi @drprojects, I have a question about the number of workers. My system has 53 GB of RAM, and during training I get the error 'dataloader's workers are out of shared memory'. Can I set the number of workers to a smaller value? Thank you for your effort.
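As context for this error: each PyTorch DataLoader worker is a subprocess that ships batches back to the main process through shared memory, so shared-memory pressure grows with the worker count. A workaround sometimes used for this class of error, independent of this repository and stated here only as a hedged suggestion, is switching PyTorch's tensor sharing strategy to the file system. This is a real PyTorch API, but whether it helps depends on your setup:

```python
import torch.multiprocessing

# Possible workaround for "out of shared memory" DataLoader errors:
# share tensors through temporary files instead of /dev/shm file
# descriptors. Call this once, before any DataLoader is created.
torch.multiprocessing.set_sharing_strategy("file_system")
```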

drprojects commented 3 weeks ago

Hi, yes, feel free to reduce the number of workers if that helps on your machine. The parameter is datamodule.num_workers in the config.
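To illustrate the mechanism the parameter controls, here is a minimal, self-contained sketch using a toy stand-in dataset, not this project's actual pipeline. It shows how `num_workers` on a plain PyTorch DataLoader governs how many shared-memory-backed worker processes are spawned:

```python
import torch
from torch.utils.data import DataLoader, TensorDataset


def main():
    # Toy stand-in dataset (placeholder, not this project's real data).
    dataset = TensorDataset(torch.randn(1000, 3, 32))

    # Each DataLoader worker is a subprocess that passes batches back
    # through shared memory (/dev/shm on Linux), so lowering num_workers
    # directly reduces shared-memory pressure; num_workers=0 loads data
    # in the main process and uses no worker shared memory at all.
    loader = DataLoader(dataset, batch_size=4, num_workers=2)

    for (batch,) in loader:
        pass  # a training step would consume `batch` here


if __name__ == "__main__":
    main()
```

If the project exposes a Hydra-style command line (an assumption worth verifying against your checkout), the value can likely also be overridden at launch by appending `datamodule.num_workers=2` to the training command rather than editing the config file.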