hturki / suds

Scalable Urban Dynamic Scenes

How to generate the demo video shown on the project page, including ego-vehicle shifting, panoptic segmentation, and shortening the camera focal length #13

Closed: rockywind closed this issue 1 year ago

rockywind commented 1 year ago

Hi, thank you for sharing the code. I ran eval.py, but it doesn't generate the demo video shown on the project page. Can you share the code for generating that demo video?

hturki commented 1 year ago

Training with the feature_clusters option should render segmentations. If you look at render.py, you'll see that there are options to shift the camera positions and adjust the focal length.
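As a generic illustration of what a focal-length adjustment does (this is a sketch of the underlying camera math, not suds's actual render.py code): shortening the focal length scales down fx and fy in the pinhole intrinsics, which widens the rendered field of view.

```python
import numpy as np

def scale_focal(K: np.ndarray, scale: float) -> np.ndarray:
    """Return a copy of a 3x3 pinhole intrinsics matrix with the focal
    length scaled by `scale` (scale < 1 shortens the focal length,
    i.e. zooms out / widens the field of view)."""
    K = K.copy()
    K[0, 0] *= scale  # fx
    K[1, 1] *= scale  # fy
    return K

# Illustrative KITTI-like intrinsics (fx, fy, cx, cy are example values).
K = np.array([[721.5,   0.0, 609.6],
              [  0.0, 721.5, 172.9],
              [  0.0,   0.0,   1.0]])

K_wide = scale_focal(K, 0.5)  # halve the focal length for a zoom-out effect
print(K_wide)
```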

rockywind commented 1 year ago

Hi @hturki, when I want to train with feature_clusters, what should the option be set to? I ran the script below:

python suds/train.py suds --experiment-name kitti_06 --pipeline.datamanager.dataparser.metadata_path metadata/kitti-06.json --pipeline.feature_clusters kmeans

but it shows the error below:

return _builtin_open(local_path, mode, buffering=buffering, **open_kwargs)
FileNotFoundError: [Errno 2] No such file or directory: 'kmeans'
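The traceback suggests that --pipeline.feature_clusters is parsed as a path to a saved clusters file, not the name of a clustering method. One plausible way to produce such a file (a sketch only; the feature source, file format, and the "centroids" key are assumptions, so check how suds actually loads this path) is to run k-means over extracted feature vectors and save the centroids:

```python
import torch
from sklearn.cluster import KMeans

# Assumption: `features` holds per-sample feature vectors gathered from
# the training images (e.g. DINO features), shaped (N, D).
features = torch.randn(10000, 64)  # placeholder; replace with real features

# Cluster the features into k pseudo-semantic groups.
k = 16
kmeans = KMeans(n_clusters=k, n_init=10, random_state=0)
kmeans.fit(features.numpy())

# Save the centroids to a file that can then be passed on the command
# line, e.g. --pipeline.feature_clusters clusters.pt. The "centroids"
# key is hypothetical; suds may expect a different structure.
torch.save({"centroids": torch.from_numpy(kmeans.cluster_centers_)},
           "clusters.pt")
```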

rockywind commented 1 year ago

Hi @hturki, can I modify a dynamic object's trajectory? I ran the code below:

python render_images_my.py --load_config /rockywin.wang/NeRF/suds/outputs/kitti_06/suds/2023-04-03_185016/config.yml --output_path /rockywin.wang/NeRF/suds/outputs/eval_add_flow_render_shift/ --generate-ring-view False --pos-shift 0.2

It can only modify the ego-car's trajectory.
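This behavior is expected for a flag like --pos-shift: it can only translate the render cameras, while the dynamic objects' motion is baked into the learned scene representation, so shifting the camera-to-world poses moves the ego viewpoint without touching object trajectories. A minimal sketch of that kind of shift (generic, not suds's actual render code):

```python
import numpy as np

def shift_camera_pose(c2w: np.ndarray, offset: np.ndarray) -> np.ndarray:
    """Translate a 4x4 camera-to-world pose by `offset` in world
    coordinates. Only the camera moves; anything the model has learned
    about the scene (including moving objects) is unchanged."""
    shifted = c2w.copy()
    shifted[:3, 3] += offset
    return shifted

# Example: shift the ego viewpoint 0.2 units along the world x-axis,
# analogous to passing --pos-shift 0.2.
c2w = np.eye(4)
print(shift_camera_pose(c2w, np.array([0.2, 0.0, 0.0])))
```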