Closed · hansoogithub closed this issue 11 months ago
Hi @hansoogithub, thanks for your interest in the project.
> I have a problem viewing the performance evaluation numbers when I run
This is normal behavior. KITTI-360's test set has held-out labels, meaning you do not have access to the labels for performance evaluation; those are stored on a benchmarking server (see the official website). So the local performance evaluation of SPT can only be run on the validation set, as communicated in our paper. This is why you see empty test performance when running `python src/eval.py experiment=kitti360`.
> I get this warning during evaluation: "You are using a CUDA device ('NVIDIA GeForce RTX 4090') that has Tensor Cores. To properly utilize them, you should set `torch.set_float32_matmul_precision('medium' | 'high')` which will trade off precision for performance. For more details, read https://pytorch.org/docs/stable/generated/torch.set_float32_matmul_precision.html"
This is unrelated to the above comment. You can safely ignore this warning.
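For reference, the warning can be silenced by setting the float32 matmul precision explicitly before launching evaluation. This is PyTorch's standard API, not anything SPT-specific; a minimal sketch:

```python
import torch

# 'highest' is the default; 'high'/'medium' let matmuls use TF32 or
# bfloat16-based kernels on Tensor Core GPUs, trading precision for speed.
torch.set_float32_matmul_precision("high")  # or "medium"

# The current setting can be queried back:
print(torch.get_float32_matmul_precision())  # prints "high"
```

As Damien notes, this only affects speed/precision of float32 matmuls; it does not change the empty test metrics, which come from the held-out labels.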
Best,
Damien
How do I get evaluation metrics on the validation data (KITTI-360)? Where should I specify this to get evaluation metrics?
I have a problem viewing the performance evaluation numbers when I run
Below is the result I got.
But when I trained a new model from scratch on KITTI-360,
I could view the numbers during training.
I get this warning during evaluation: "You are using a CUDA device ('NVIDIA GeForce RTX 4090') that has Tensor Cores. To properly utilize them, you should set `torch.set_float32_matmul_precision('medium' | 'high')` which will trade off precision for performance. For more details, read https://pytorch.org/docs/stable/generated/torch.set_float32_matmul_precision.html"
I tried using both high and medium precision, but there is no change in the evaluation result.
I used this project in a Docker container with GPU passthrough according to your setup (CUDA 11.8). Please help, and thank you for the project.