Jumpat / SegmentAnythingin3D

Segment Anything in 3D with NeRFs (NeurIPS 2023)
Apache License 2.0

Can we test this model based on the real world data made by ourselves? #59

Open wzq20030207 opened 4 months ago

wzq20030207 commented 4 months ago

Hi, this is amazing work. I want to use this technique to segment real-world data that I captured myself. How can I achieve that? Is there a tutorial for preprocessing our own data?

Jumpat commented 4 months ago

Hello, it is possible to segment your own data with SA3D.

You can process your data with COLMAP to estimate the camera parameters and store the result in the data structure described in our README. Then you can follow the provided instructions to train and segment the NeRF.
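
For reference, here is a minimal sketch of the COLMAP step, driven from Python via `subprocess`. The `data/my_scene` layout below is a placeholder, not SA3D's required structure; match the output folders to whatever the README describes. Only standard COLMAP CLI subcommands (`feature_extractor`, `exhaustive_matcher`, `mapper`) are used.

```python
# Hedged sketch: run the standard COLMAP pipeline (features -> matching ->
# sparse reconstruction) on your own images. Paths are placeholders.
import subprocess
from pathlib import Path

scene_dir = Path("data/my_scene")      # placeholder scene folder
image_dir = scene_dir / "images"       # your captured images
db_path = scene_dir / "database.db"
sparse_dir = scene_dir / "sparse"
sparse_dir.mkdir(parents=True, exist_ok=True)

def run(cmd):
    print(" ".join(cmd))
    subprocess.run(cmd, check=True)

# 1. Detect SIFT features in every image.
run(["colmap", "feature_extractor",
     "--database_path", str(db_path),
     "--image_path", str(image_dir)])

# 2. Match features between all image pairs.
run(["colmap", "exhaustive_matcher",
     "--database_path", str(db_path)])

# 3. Sparse reconstruction: estimates camera intrinsics and per-image poses.
run(["colmap", "mapper",
     "--database_path", str(db_path),
     "--image_path", str(image_dir),
     "--output_path", str(sparse_dir)])
```

If the README asks for an LLFF-style `poses_bounds.npy`, the LLFF `imgs2poses.py` script can run COLMAP and write that file in one step instead.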

wzq20030207 commented 4 months ago

Oh, thank you. Does this mean that running colmap to get the transform.json is enough to train on my own data? Do I need to downscale the images to get images_x? If so, how do I generate them and what rules should I follow? Also, should I write a scene-specific config file like fern.py so that training runs correctly?

Jumpat commented 4 months ago

Yes, the images_x folders are not necessary unless your images are too large.
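
If you do need them, a small sketch like the one below can generate the downscaled copies. It assumes the common convention that `images_x` holds images downscaled by a factor of `x` (e.g. `images_4` is quarter resolution); please verify the naming against the README.

```python
# Hedged sketch: write downscaled copies of the images into images_<factor>.
from pathlib import Path
from PIL import Image

scene_dir = Path("data/my_scene")   # placeholder scene folder
factor = 4                          # downscale factor -> images_4

src_dir = scene_dir / "images"
dst_dir = scene_dir / f"images_{factor}"
dst_dir.mkdir(exist_ok=True)

for img_path in sorted(src_dir.iterdir()):
    if img_path.suffix.lower() not in {".jpg", ".jpeg", ".png"}:
        continue
    img = Image.open(img_path)
    new_size = (img.width // factor, img.height // factor)
    img.resize(new_size, Image.LANCZOS).save(dst_dir / img_path.name)
```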

You do need a config file for training on your own data. If your images are forward-facing, check the configs for the llff dataset; if they are 360-degree, check the configs in nerf_unbounded.
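
As a starting point, a per-scene config might look roughly like the sketch below. The field names and values are placeholders loosely modeled on the existing llff configs (e.g. fern.py); in practice, copy one of the provided configs and only change the scene-specific entries rather than writing one from scratch.

```python
# Hedged sketch of a per-scene config; everything here is a placeholder.
# Copy an existing config such as fern.py and adjust the paths instead.
_base_ = './default.py'

expname = 'my_scene'
basedir = './logs/my_scene'

data = dict(
    datadir='./data/my_scene',   # folder produced by the COLMAP step above
    dataset_type='llff',         # forward-facing capture; use the
                                 # nerf_unbounded-style configs for 360-degree scenes
    factor=4,                    # matches the images_4 downscale folder, if used
)
```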