qinglew / PCN-PyTorch

Implementation of PCN(Point Completion Network) in PyTorch.

Possibility of using .pcd files to train the model? #13

Closed MarioCavero closed 1 year ago

MarioCavero commented 2 years ago

I have read the sample and render code, and both use CAD models. Is there any possibility of training the model with point cloud (.pcd) files? (The point clouds are obtained from RGB and depth images in PyBullet and Open3D.)

Since CAD models are mentioned, if .pcd files are not an option, would it also be possible to use the YCB object set CAD models to train the network?
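
For context, the .pcd files I generate can already be loaded into the (N, 3) array form that a PyTorch data loader would typically consume. A minimal sketch, assuming Open3D is installed and a fixed number of points per cloud (the file name and point count below are just examples, not part of this repo):

import numpy as np
import open3d as o3d
import torch

# Hypothetical helper: load a .pcd file and return an (N, 3) float tensor,
# the shape most point-completion data loaders expect per cloud.
def load_pcd_as_tensor(path: str, n_points: int = 2048) -> torch.Tensor:
    pcd = o3d.io.read_point_cloud(path)                # also reads .ply
    pts = np.asarray(pcd.points, dtype=np.float32)
    # Resample to a fixed size so clouds can be batched together.
    idx = np.random.choice(len(pts), n_points, replace=len(pts) < n_points)
    return torch.from_numpy(pts[idx])

partial = load_pcd_as_tensor("outputdir/pcd/00000.pcd")  # example file name
print(partial.shape)  # torch.Size([2048, 3])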

MarioCavero commented 2 years ago

I have managed to do all the building and compiling, and the render process is also done. In sample, when running mesh_sampling inside the build folder, I get the following:


./mesh_sampling ../../outputdir/pcd/
Syntax is: ./mesh_sampling input.{ply,obj} output.pcd <options>
  where options are:
                -n_samples X   = number of samples (default: 100000)
                -leaf_size X   = the XYZ leaf size for the VoxelGrid -- for data reduction (default: 0.010000 m)
                -write_normals = flag to write normals to the output pcd
                -no_vis_result = flag to stop visualizing the generated pcd
                -no_vox_filter = flag to stop downsampling the generated pcd
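
From the usage message it looks like mesh_sampling expects a single input mesh (.ply or .obj) and a single output .pcd per call, not a directory. A rough sketch of driving it over a folder of models from Python (the directory layout and the 16384 sample count here are assumptions, and only the flags shown in the usage above are used):

import subprocess
from pathlib import Path

# Assumed paths: adjust MESH_DIR / OUT_DIR to the actual layout.
# This assumes it is run from inside the build folder, next to ./mesh_sampling.
MESH_DIR = Path("../../ycb_cad_models_specific")
OUT_DIR = Path("../../outputdir/complete_pcd")
OUT_DIR.mkdir(parents=True, exist_ok=True)

# One invocation per mesh: input.{ply,obj} output.pcd, as the usage string says.
for mesh in sorted(MESH_DIR.glob("**/*.obj")):
    out = OUT_DIR / (mesh.stem + ".pcd")
    subprocess.run(
        ["./mesh_sampling", str(mesh), str(out),
         "-n_samples", "16384", "-no_vis_result"],
        check=True,
    )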

It is a bit confusing. To start with, .ply is not needed at any point in the project and is not created by the render process, which produces .pcd files instead. The .obj input probably refers to the CAD model .obj. The way the path for each object has to be specified when running ./mesh_sampling inside build also seems confusing.

[Screenshot from 2022-05-12 12-00-37]

In the picture you can see my output directory (outputdir), with depth, exr, pcd and pose folders, as well as the dataset ycb_cad_models_specific, which contains a nontextured.ply (this came with the dataset; should it be used?). The .obj used to render the dataset was model_watertight_1000def.obj, although I am not sure whether that one should be used or just textured.obj.
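
As a possible alternative to the PCL mesh_sampling binary, Open3D (which I already use to build the .pcd files) can sample a complete cloud directly from either nontextured.ply or one of the .obj models. A rough sketch; the object folder name and point count are placeholders, and whether such clouds match what the training pipeline expects is an open question:

import open3d as o3d

# Load either the .ply or the .obj mesh (path below is a hypothetical example).
mesh = o3d.io.read_triangle_mesh(
    "ycb_cad_models_specific/some_object/nontextured.ply")
# Uniformly sample a complete point cloud from the mesh surface.
pcd = mesh.sample_points_uniformly(number_of_points=16384)
o3d.io.write_point_cloud("outputdir/pcd/some_object_complete.pcd", pcd)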

So the syntax of ./mesh_sampling seems very confusing.