autonomousvision / convolutional_occupancy_networks

[ECCV'20] Convolutional Occupancy Networks
https://pengsongyou.github.io/conv_onet
MIT License

Test pretrained model on custom dataset #9

Closed raphaelsulzer closed 3 years ago

raphaelsulzer commented 3 years ago

Hi,

thanks for making this great work available here!

I would like to test one of the pretrained models on my own dataset.

However, I am a bit lost in writing a corresponding config file.

For testing, I simply copied a single points.npz file from ShapeNet to a new folder.

I wrote the following myConfig.yaml file:

```yaml
data:
  classes: ['']
  path: /home/raphael/data
  pointcloud_n: 10000
  pointcloud_file: points.npz
  voxels_file: null
  points_file: null
  points_iou_file: null
training:
  out_dir: out/mine
test:
  model_file: https://s3.eu-central-1.amazonaws.com/avg-projects/convolutional_occupancy_networks/models/pointcloud/shapenet_3plane.pt
generation:
  generation_dir: generation
```

When running `python generate.py config/myConfig.yaml` I get the following error:

```
  cfg_special = yaml.load(f)
/home/raphael/remote_python/convolutional_occupancy_networks/src/config.py:33: YAMLLoadWarning: calling yaml.load() without Loader=... is deprecated, as the default Loader is unsafe. Please read https://msg.pyyaml.org/load for full details.
  cfg = yaml.load(f)
Traceback (most recent call last):
  File "generate.py", line 38, in <module>
    dataset = config.get_dataset('test', cfg, return_idx=True)
  File "/home/raphael/remote_python/convolutional_occupancy_networks/src/config.py", line 134, in get_dataset
    inputs_field = get_inputs_field(mode, cfg)
  File "/home/raphael/remote_python/convolutional_occupancy_networks/src/config.py", line 202, in get_inputs_field
    'Invalid input type (%s)' % input_type)
ValueError: Invalid input type (img)
```

Could you give me a hint on how to achieve what I want to do?

Kind regards!

pengsongyou commented 3 years ago

Hi,

I think you should have the following line in your config file.

```yaml
inherit_from: configs/pointcloud/shapenet_3plane.yaml
```

Check out my config file for the pre-trained model.
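Putting that suggestion together with the config from the first post, a minimal custom config might look like the sketch below (paths are the ones from the original attempt; the inherited file supplies the model and training settings). Note that, as clarified later in this thread, the input point cloud lives in pointcloud.npz while points.npz holds the occupancy supervision, so pointing `pointcloud_file` at points.npz is likely part of the problem:

```yaml
inherit_from: configs/pointcloud/shapenet_3plane.yaml
data:
  path: /home/raphael/data
  classes: ['']
  pointcloud_n: 10000
  pointcloud_file: pointcloud.npz   # input point cloud (not points.npz)
training:
  out_dir: out/mine
test:
  model_file: https://s3.eu-central-1.amazonaws.com/avg-projects/convolutional_occupancy_networks/models/pointcloud/shapenet_3plane.pt
generation:
  generation_dir: generation
```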

raphaelsulzer commented 3 years ago

Thanks for the quick reply.

I managed to get a working config file.

Now I have a problem with the test_loader. It loads my single points.npz file into test_loader.dataset.models, but I cannot iterate over the test_loader, i.e. in `for it, data in enumerate(tqdm(test_loader)):` I get the following error:

```
Traceback (most recent call last):
  File "/home/raphael/miniconda3/envs/conv_onet/lib/python3.6/site-packages/tqdm/std.py", line 1165, in __iter__
    for obj in iterable:
  File "/home/raphael/miniconda3/envs/conv_onet/lib/python3.6/site-packages/torch/utils/data/dataloader.py", line 363, in __next__
    data = self._next_data()
  File "/home/raphael/miniconda3/envs/conv_onet/lib/python3.6/site-packages/torch/utils/data/dataloader.py", line 403, in _next_data
    data = self._dataset_fetcher.fetch(index)  # may raise StopIteration
  File "/home/raphael/miniconda3/envs/conv_onet/lib/python3.6/site-packages/torch/utils/data/_utils/fetch.py", line 47, in fetch
    return self.collate_fn(data)
  File "/home/raphael/miniconda3/envs/conv_onet/lib/python3.6/site-packages/torch/utils/data/_utils/collate.py", line 86, in default_collate
    raise TypeError(default_collate_err_msg_format.format(elem_type))
TypeError: default_collate: batch must contain tensors, numpy arrays, numbers, dicts or lists; found <class 'NoneType'>
```

Besides that I also get the following warning in my output:

```
Error occured when loading field points of model points.npz
```

What is the `points` field?

Is there really no easier way of generating a mesh from a custom point cloud with the provided code?
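For what it's worth, the TypeError above occurs because a field that fails to load returns None, and PyTorch's default_collate cannot batch None. A filtering collate function like the sketch below (the repo ships a similar collate_remove_none helper in src/data/core.py, if I remember correctly) skips such samples; the underlying loading error still needs fixing, of course:

```python
import torch
from torch.utils.data.dataloader import default_collate

def collate_remove_none(batch):
    """Drop samples that failed to load (i.e. are None) before collating."""
    batch = [sample for sample in batch if sample is not None]
    return default_collate(batch)

# Toy batch: the second sample failed to load.
batch = [{'points': torch.zeros(3)}, None, {'points': torch.ones(3)}]
out = collate_remove_none(batch)
print(out['points'].shape)  # torch.Size([2, 3])
```

Passing such a function as `collate_fn` to the DataLoader keeps iteration alive when individual models fail to load.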

pengsongyou commented 3 years ago

Hi,

I am not sure about this issue. I suggest you first run the demo and use pdb to check how the whole pipeline works. Also, you can check what is inside those points.npz files, which should give you an idea of what your data should look like.

SrinjaySarkar commented 3 years ago

Did you sample the 100,000 points from the volume and store them in the points.npz file? If so, how did you get the occupancy values for those 100,000 points? Also, can you give a description of the files in the dataset? I was able to reproduce your results on the dataset you mentioned, but could you give a brief description of the main files (pointcloud.npz and points.npz) so I can make sure my custom data looks like yours?

raphaelsulzer commented 3 years ago

Hi,

I used the sample_mesh.py script from the ONet implementation here to get the pointcloud.npz and points.npz files. They should therefore have exactly the same content as the corresponding files from the ShapeNet dataset. However, I still cannot get rid of the error I mentioned in my last post.

Did you manage to run the code on your own dataset?

pengsongyou commented 3 years ago

Can you send me your config file and, if possible, email me your sampled point files? I can try it.

pengsongyou commented 3 years ago

> Did you sample the 100,000 points from the volume and store them in the points.npz file ? If so, how did you get the occupancy values for those 100,000 points? […]

Hi @SrinjaySarkar

pointcloud.npz contains the points sampled from surfaces. It should have ['points', 'normals'] as keys, which are the 3D positions and surface normals of the sampled points.
points.npz contains occupancy information for points sampled uniformly in space. It should have ['points', 'occupancies'] as keys, which are the 3D positions and occupancies.

Hope this helps.
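As a concrete illustration of that layout, the toy snippet below writes the two files with the expected keys (the shapes, dtypes, and packbits encoding are assumptions based on the ONet preprocessing; real files come from sample_mesh.py):

```python
import numpy as np

n = 1024  # toy sample count; the real files use far more points

# pointcloud.npz: surface samples with unit normals.
pts = np.random.rand(n, 3).astype(np.float32) - 0.5
nrm = np.random.randn(n, 3).astype(np.float32)
nrm /= np.linalg.norm(nrm, axis=1, keepdims=True)
np.savez('pointcloud.npz', points=pts, normals=nrm)

# points.npz: uniform volume samples with occupancy labels.
# The ONet script packs the boolean occupancies with np.packbits.
vol = np.random.rand(n, 3).astype(np.float32) - 0.5
occ = np.packbits(np.linalg.norm(vol, axis=1) < 0.4)  # toy "inside sphere" label
np.savez('points.npz', points=vol, occupancies=occ)

# Sanity check: inspect the keys the fields expect.
print(sorted(np.load('pointcloud.npz').files))  # ['normals', 'points']
print(sorted(np.load('points.npz').files))      # ['occupancies', 'points']
```

Inspecting a real sample from the released ShapeNet data the same way is a quick check that custom files match.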

csyhping commented 3 years ago

Hi @pengsongyou , may I ask: are these two .npz files exactly what we need to train the Occupancy Network and the Convolutional Occupancy Network?

csyhping commented 3 years ago

Hi @raphaelsulzer , have you solved the problem? Could you please provide your config file and an example from your own dataset for reference? Thanks!

pengsongyou commented 3 years ago

> Hi @pengsongyou , may I ask, is these two .npz exactly what we need to train the Occupancy Network and Convolution Occupancy Net?

Hi,

Yes. To train the network, you always need the ground-truth occupancies in the PointsField and the input point cloud in the PointCloudField, so both .npz files are required.
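For reference, the loaders expect a ShapeNet-style directory layout; a hypothetical single-model test set could be assembled as sketched below (all names are placeholders, the touched .npz files are stand-ins for real preprocessed ones, and each .lst split file lists one model id per line):

```shell
# Layout: <path>/<class>/<model>/{pointcloud.npz, points.npz},
# plus per-class split files such as test.lst.
mkdir -p data/myclass/model_000
touch data/myclass/model_000/pointcloud.npz
touch data/myclass/model_000/points.npz
echo "model_000" > data/myclass/test.lst
ls -R data
```

The `data: path` entry in the config would then point at `data/`.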

csyhping commented 3 years ago

> Yes, to train the network, we always require to have the ground truth occupancy in the PointField, and the input point cloud in the PointCloudField. Therefore, we need two .npz.

Hi @pengsongyou , thanks for your quick reply! Could you please provide an example of these two .npz files for reference? And may I use the sample_mesh.py script from OccNet to generate them? If not, how can I create these .npz files from my own data?

pengsongyou commented 3 years ago

> Could you please provide one example of these two .npz files for reference? And may i use the sample_mesh.py script from OccNet to generate these two .npz?

Yes, you can use sample_mesh.py to generate the .npz files. For an example, just download the ShapeNet / Synthetic Room dataset that I provide in the README.md.

csyhping commented 3 years ago

> Yes, you can use sample_mesh.py to generate .npz files. To get an example, just download the ShapeNet / Synthetic Room dataset that I provided in the README.md.

Hi @pengsongyou , I used sample_mesh.py and got the .npz files. Here are captures of points.npz and pointcloud.npz; could you please take a look and check whether my files are correct? Thanks!

points.npz: [screenshot]

pointcloud.npz: [screenshot failed to upload]

csyhping commented 3 years ago


Hi @pengsongyou , is there a reference config.yaml for training on point clouds? Should I follow files like configs/pointcloud/shapenet_3plane.yaml? Also, I notice there are both shapenet_3plane and shapenet_3plane_partial; may I ask what the difference is, with and without partial?

pengsongyou commented 3 years ago

> is there a config.yaml for reference to train with pointcloud? Should I follow files like configs/pointcloud/shapenet_3plane.yaml? Besides, I notice there are shapenet_3plane and shapenet_3plane_partial, may I ask the difference with/without partial?

Please follow the instructions in the README to run the code yourself, and use pdb or print to understand the code and the yaml files. As for shapenet_3plane_partial, it corresponds to the 3D reconstruction from partial point clouds experiment described in the supplementary material.