raviS123-dot opened this issue 1 year ago
Hey, I have the same problem.
Did you find any solution? From my understanding, the SemanticSegmentation pipeline should be equipped with the Adam optimizer by default...
Hey, I also have this problem. Can anyone help?
Hey, I get the same error when I use a custom dataset with Custom3D. @raviS123-dot did you find a solution to your problem? Will you please share it?
Hey @raviS123-dot @RauchLukas @njakuschona @QasimMuhammad,
The problem is in this line:

```python
pipeline = SemanticSegmentation(model=model, dataset=dataset, max_epoch=100)
```

Here the inputs are only the dataset, the model, and the maximum number of epochs; you also need to pass the pipeline configuration (`cfg.pipeline`). Use this instead:

```python
pipeline = SemanticSegmentation(model=model, dataset=dataset, **cfg.pipeline)
```

This passes all the attributes from the config file instead of inserting them manually one by one.
You can find the config file here:
`Open3D-ML/ml3d/configs/randlanet_semantickitti.yml`
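To see why `**cfg.pipeline` fixes the error, here is a minimal, self-contained sketch of the unpacking mechanism (the `DemoPipeline` class and the dict keys are illustrative stand-ins, not the real Open3D-ML classes or the exact YAML keys): `**` expands every key of the pipeline section into a keyword argument, so `optimizer`, `max_epoch`, etc. all reach the constructor at once.

```python
# Illustrative stand-in for how **cfg.pipeline expands a config
# section into keyword arguments (not the real Open3D-ML classes).
class DemoPipeline:
    def __init__(self, model=None, dataset=None, max_epoch=10,
                 optimizer=None, **kwargs):
        self.max_epoch = max_epoch
        self.optimizer = optimizer  # needed later by get_optimizer()

# Pretend this dict came from the "pipeline:" section of
# randlanet_semantickitti.yml (keys here are assumptions).
pipeline_cfg = {
    "max_epoch": 100,
    "optimizer": {"lr": 0.001},
}

# Passing only max_epoch leaves optimizer unset, which is what
# later triggers the AttributeError during run_train():
broken = DemoPipeline(max_epoch=100)
print(broken.optimizer)   # None

# **pipeline_cfg forwards every configured key at once:
fixed = DemoPipeline(**pipeline_cfg)
print(fixed.optimizer)    # {'lr': 0.001}
```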
```
Traceback (most recent call last):
  File "/usr/lib/python3.10/idlelib/run.py", line 578, in runcode
    exec(code, self.locals)
  File "/home/v3/Desktop/kitti/train1.py", line 12, in <module>
    pipeline.run_train()
  File "/home/v3/.local/lib/python3.10/site-packages/open3d/_ml3d/torch/pipelines/semantic_segmentation.py", line 376, in run_train
    self.optimizer, self.scheduler = model.get_optimizer(cfg)
  File "/home/v3/.local/lib/python3.10/site-packages/open3d/_ml3d/torch/models/randlanet.py", line 354, in get_optimizer
    **cfg_pipeline.optimizer)
  File "/home/v3/.local/lib/python3.10/site-packages/open3d/_ml3d/utils/config.py", line 244, in __getattr__
    return getattr(self._cfg_dict, name)
  File "/home/v3/.local/lib/python3.10/site-packages/open3d/_ml3d/utils/config.py", line 27, in __getattr__
    raise ex
AttributeError: 'ConfigDict' object has no attribute 'optimizer'
```
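The traceback shows the mechanism: `get_optimizer` reads `cfg_pipeline.optimizer`, and because the pipeline was constructed without `**cfg.pipeline`, the config object has no such key. A rough sketch of that lookup behavior, with a hypothetical `MiniConfigDict` class standing in for the real `ConfigDict`:

```python
# Hypothetical mimic of a config object whose attribute lookup
# falls through to a dict of configured keys (not the real ConfigDict).
class MiniConfigDict:
    def __init__(self, entries):
        self._entries = dict(entries)

    def __getattr__(self, name):
        # Called only when normal attribute lookup fails.
        try:
            return self._entries[name]
        except KeyError:
            raise AttributeError(
                f"'MiniConfigDict' object has no attribute '{name}'")

# Built as if from SemanticSegmentation(..., max_epoch=100):
cfg = MiniConfigDict({"max_epoch": 100})
print(cfg.max_epoch)  # 100

try:
    cfg.optimizer     # never configured, so the lookup fails
except AttributeError as e:
    print(e)
```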
My code is:
```python
# Import torch and the model to use for training
import open3d.ml.torch as ml3d
from open3d.ml.torch.models import RandLANet
from open3d.ml.torch.pipelines import SemanticSegmentation

# Read a dataset by specifying the path. We are also providing the
# cache directory and training split.
dataset = ml3d.datasets.SemanticKITTI(
    dataset_path='/home/v3/Desktop/kitti/dataset',
    cache_dir='./logs/cache',
    training_split=['00', '01', '02', '03', '04', '05', '06', '07', '09', '10'])

# Initialize the RandLANet model with three layers.
model = RandLANet(dim_input=3)
pipeline = SemanticSegmentation(model=model, dataset=dataset, max_epoch=100)

# Run the training
pipeline.run_train()
```
Dataset is: SemanticKITTI