PRBonn / semantic-kitti-api

SemanticKITTI API for visualizing dataset, processing data, and evaluating results.
http://semantic-kitti.org
MIT License

Problem using visualize.py #151

Open Shiyao-Xu opened 1 week ago

Shiyao-Xu commented 1 week ago

Dear Team,

Thank you for your work! I'm currently using this API for an occupancy prediction task and have run into a blocking problem. When I launch visualize.py, I get this output in the terminal:

```
**** INTERFACE:
Dataset ../semantic_KITTI/kitti/
Config config/semantic-kitti.yaml
Sequence 00
Predictions None
ignore_semantics False
do_instances False
ignore_images False
link False
ignore_safety False
color_learning_map False
offset 0

Opening config file config/semantic-kitti.yaml
Sequence folder ../semantic_KITTI/kitti/sequences/00/velodyne exists! Using sequence from ../semantic_KITTI/kitti/sequences/00/velodyne
Labels folder ../semantic_KITTI/kitti/sequences/00/labels exists! Using labels from ../semantic_KITTI/kitti/sequences/00/labels
Using semantics in visualizer
To navigate:
	b: back (previous scan)
	n: next (next scan)
	q: quit (exit program)
/media/varda/Dolphin/miniconda3/lib/python3.12/site-packages/vispy/gloo/texture.py:364: UserWarning: GPUs can't support dtypes bigger than 32-bit, but got 'float64'. Precision will be lost due to downcasting to 32-bit.
  data = downcast_to_32bit_if_needed(data, copy=copy)
```

No error occurred, but no visualization window appeared either.

Could you please help me with this problem? Is it because my GPU (a 1080 Ti) is too weak?

Best, Shiyao

jbehley commented 1 week ago

Thanks for your interest in our work. I suspect it has nothing to do with the GPU, since we are only visualizing point clouds; however, I haven't checked in a while whether the code still works with newer Python packages. I suspect the vispy warning is related to this. You might have to use an earlier vispy version.
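If you want to try that, something like the following in a fresh environment might be worth a shot (the version bound is a guess on my part, not a tested combination):

```sh
# Untested sketch: pin vispy to an older release; the exact bound is a guess.
pip install "vispy<0.12"
```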

I have to look into this properly, but I will likely not have time before next week at the earliest.
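In the meantime, whether or not it is related to the missing window, a possible stopgap to at least silence the float64 warning would be to cast the data to 32-bit floats before it reaches vispy. A minimal sketch (the helper and attribute names are placeholders, not part of the actual code base):

```python
import numpy as np

# Hypothetical helper: cast float64 arrays to float32 before they are handed
# to vispy, so the texture upload does not need to downcast and warn.
def to_float32(arr):
    arr = np.asarray(arr)
    return arr.astype(np.float32) if arr.dtype == np.float64 else arr

# Example usage (attribute names are placeholders for wherever the scan data lives):
# scan.points = to_float32(scan.points)
# scan.remissions = to_float32(scan.remissions)
```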