drprojects / superpoint_transformer

Official PyTorch implementation of Superpoint Transformer introduced in [ICCV'23] "Efficient 3D Semantic Segmentation with Superpoint Transformer" and SuperCluster introduced in [3DV'24 Oral] "Scalable 3D Panoptic Segmentation As Superpoint Graph Clustering"
MIT License

Empty tensor when running inference on custom dataset #45

Closed JeppeVHolm closed 9 months ago

JeppeVHolm commented 9 months ago

Hi!

We are trying to use the pre-trained dales model to segment our dataset. Our dataset is similar to dales and is set up in SPT in the same way as dales. When we use the dales model to segment the dales dataset, we do not have any issues. (Screenshots attached: dales_example, kirkegaard_example.)

When running inference we get the following error: (screenshot of the error attached)

It seems like the 'data.pos' tensor is loaded correctly at first. However, when we run the script, the tensor 'coords' is empty. Since 'coords' is computed as 'data.pos' / 'grid_size', we tried changing 'grid_size', as 'data.pos' seems to be read correctly. Furthermore, we tried adding our data to the dales dataset in SPT, in case we had made an error when setting up our own dataset. This did not solve the issue. It seems the issue is related to the dataset, but we are having trouble locating the error. Do you have any suggestions?
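As a quick sanity check, one could inspect the point tensor right before voxelization. A minimal sketch, assuming PyTorch; `check_points` is a hypothetical helper, and the `coords` computation simply mirrors the `data.pos / grid_size` step described above:

```python
import torch

def check_points(pos: torch.Tensor, grid_size: float) -> torch.Tensor:
    """Hypothetical sanity checks on a point cloud before grid clustering."""
    assert pos.numel() > 0, "point cloud is empty"
    assert torch.isfinite(pos).all(), "pos contains NaN or Inf values"
    # Mirrors the coords = data.pos / grid_size step, quantized to voxel indices
    coords = (pos / grid_size).floor().long()
    print("points:", pos.shape[0])
    print("bounds:", pos.min(0).values.tolist(), "to", pos.max(0).values.tolist())
    print("unique voxels:", coords.unique(dim=0).shape[0])
    return coords

# Example on random points; with real data, an empty or non-finite
# 'pos' would trip the assertions before grid clustering runs.
pos = torch.rand(100, 3) * 10
coords = check_points(pos, grid_size=0.5)
```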

Once again thank you for the great project! 😃

Best regards Jeppe

drprojects commented 9 months ago

Seems like you need to investigate the nature of your data before calling grid_cluster().

PS: this is your second issue asking for help on this project. If you are using and like the project, don't forget to give us a ⭐, it means a lot to us!

drprojects commented 9 months ago

Have you solved this issue? May I close it?

JeppeVHolm commented 9 months ago

Hi. We have not yet resolved the issue. As described, our data is read as it should be. Something goes wrong when 'coords' and 'pos' are created, as they end up empty. I think the problem is with our dataset and not with SPT, so you can close the issue. Best regards, Jeppe