drprojects / superpoint_transformer

Official PyTorch implementation of Superpoint Transformer introduced in [ICCV'23] "Efficient 3D Semantic Segmentation with Superpoint Transformer" and SuperCluster introduced in [3DV'24 Oral] "Scalable 3D Panoptic Segmentation As Superpoint Graph Clustering"

The number of items in the loss function does not match the number of stages #4

Closed jing-zhao9 closed 1 year ago

jing-zhao9 commented 1 year ago

Hi, when I run train.py, it reports 4 stages but only 2 criteria in the loss.

Error executing job with overrides: []
Traceback (most recent call last):
  File "/home/zhaojing/anaconda3/envs/spt/lib/python3.8/site-packages/hydra/_internal/instantiate/_instantiate2.py", line 92, in _call_target
    return _target_(*args, **kwargs)
  File "/home/zhaojing/code/superpoint_transformer-master/src/models/segmentation.py", line 53, in __init__
    assert len(self.net.out_dim) == len(self.criterion), \
AssertionError: The number of items in the multi-stage loss must match the number of stages in the net. Found 4 stages, but 2 criteria in the loss.

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/home/zhaojing/anaconda3/envs/spt/lib/python3.8/site-packages/hydra/_internal/utils.py", line 394, in _run_hydra
    _run_app(
  File "/home/zhaojing/anaconda3/envs/spt/lib/python3.8/site-packages/hydra/_internal/utils.py", line 457, in _run_app
    run_and_report(
  File "/home/zhaojing/anaconda3/envs/spt/lib/python3.8/site-packages/hydra/_internal/utils.py", line 223, in run_and_report
    raise ex
  File "/home/zhaojing/anaconda3/envs/spt/lib/python3.8/site-packages/hydra/_internal/utils.py", line 220, in run_and_report
    return func()
  File "/home/zhaojing/anaconda3/envs/spt/lib/python3.8/site-packages/hydra/_internal/utils.py", line 458, in <lambda>
    lambda: hydra.run(
  File "/home/zhaojing/anaconda3/envs/spt/lib/python3.8/site-packages/hydra/_internal/hydra.py", line 132, in run
    _ = ret.return_value
  File "/home/zhaojing/anaconda3/envs/spt/lib/python3.8/site-packages/hydra/core/utils.py", line 260, in return_value
    raise self._return_value
  File "/home/zhaojing/anaconda3/envs/spt/lib/python3.8/site-packages/hydra/core/utils.py", line 186, in run_job
    ret.return_value = task_function(task_cfg)
  File "/home/zhaojing/code/superpoint_transformer-master/src/train.py", line 139, in main
    metric_dict, _ = train(cfg)
  File "/home/zhaojing/code/superpoint_transformer-master/src/utils/utils.py", line 48, in wrap
    raise ex
  File "/home/zhaojing/code/superpoint_transformer-master/src/utils/utils.py", line 45, in wrap
    metric_dict, object_dict = task_func(cfg=cfg)
  File "/home/zhaojing/code/superpoint_transformer-master/src/train.py", line 82, in train
    model: LightningModule = hydra.utils.instantiate(cfg.model)
  File "/home/zhaojing/anaconda3/envs/spt/lib/python3.8/site-packages/hydra/_internal/instantiate/_instantiate2.py", line 226, in instantiate
    return instantiate_node(
  File "/home/zhaojing/anaconda3/envs/spt/lib/python3.8/site-packages/hydra/_internal/instantiate/_instantiate2.py", line 347, in instantiate_node
    return _call_target(_target_, partial, args, kwargs, full_key)
  File "/home/zhaojing/anaconda3/envs/spt/lib/python3.8/site-packages/hydra/_internal/instantiate/_instantiate2.py", line 97, in _call_target
    raise InstantiationException(msg) from e
hydra.errors.InstantiationException: Error in call to target 'src.models.segmentation.PointSegmentationModule':
AssertionError('The number of items in the multi-stage loss must match the number of stages in the net. Found 4 stages, but 2 criteria in the loss.')
full_key: model
python-BaseException
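
For context, the failing assertion checks that the multi-stage loss holds exactly one criterion per stage of the network. The following is a minimal sketch of that idea, not the repository's actual classes; the class and variable names are made up for illustration:

import torch.nn as nn

class MultiStageLoss(nn.Module):
    """Toy container holding one criterion per decoder stage."""
    def __init__(self, criteria):
        super().__init__()
        self.criteria = nn.ModuleList(criteria)

    def __len__(self):
        return len(self.criteria)

    def forward(self, outputs, target):
        # One prediction per stage, each scored by its own criterion
        return sum(c(o, target) for c, o in zip(self.criteria, outputs))

num_stages = 4                                  # e.g. len(net.out_dim)
criteria = [nn.CrossEntropyLoss() for _ in range(num_stages)]
loss = MultiStageLoss(criteria)
assert num_stages == len(loss)                  # passes: 4 stages, 4 criteria

The reported error corresponds to a net with 4 stages but a loss configured with only 2 criteria, which is what happens when the default model and dataset configs are mixed (see the reply below).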
drprojects commented 1 year ago

I assume you just ran:

python src/train.py

without specifying any dataset or model config? In that case, Hydra falls back to some default configs, which are not compatible at the moment. I will change this so that the fallback model and dataset are compatible, but I must stress that the above command is not exactly how this code is intended to be used.

Indeed, I invite you to read the information provided in the README for getting started with the project. In particular, you will want to have a look at the suggested commands for training SPT on one of the 3 currently provided datasets.
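
For reference, the README's training commands select a dataset via a Hydra experiment override and take the following form. The exact experiment names below are assumptions based on the three provided datasets (S3DIS, KITTI-360, DALES); please check the README for the current list:

# Hypothetical examples; confirm the experiment config names in the README
python src/train.py experiment=s3dis
python src/train.py experiment=kitti360
python src/train.py experiment=dales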