Open hjsg1010 opened 6 years ago
Hi,
It depends on what kind of data you have.
For training:
For testing:
Note that the current state of this repository only contains data for testing; that is why only data_testing.h5 can be generated. You could roll back to a previous commit and add chairs to the training set to see the training data.
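For readers new to the format, the overall shape of the conversion can be sketched with h5py. This is only a hedged illustration, not the repository's actual write_hdf5.py: the dataset names `data`/`label` and the 2048-points-per-shape size follow the common PointNet convention, and the random arrays below are placeholders standing in for your sampled points and .seg labels.

```python
import numpy as np
import h5py

# Placeholder arrays: 10 shapes, 2048 points each, xyz coordinates,
# plus a per-point segmentation label for every point.
points = np.random.rand(10, 2048, 3).astype(np.float32)
labels = np.random.randint(0, 4, size=(10, 2048)).astype(np.uint8)

# Write both arrays into one HDF5 file.
with h5py.File('data_testing.h5', 'w') as f:
    f.create_dataset('data', data=points)
    f.create_dataset('label', data=labels)

# Read back to confirm the file is not empty.
with h5py.File('data_testing.h5', 'r') as f:
    print(f['data'].shape)   # (10, 2048, 3)
    print(f['label'].shape)  # (10, 2048)
```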
Hello, I am sorry to bother you about how to prepare the raw data. It seems that the point clouds (.ply) all need to be the same size (2048 points); I'd like to know why, and how to guarantee that the number of points is 2048.
Hi @weiweimanger, You could use a uniform sampling algorithm (e.g., Poisson disk sampling) to select 2048 points from a point cloud. Instead of implementing such an algorithm yourself, the simplest way is to use the geometry processing tool MeshLab, which provides such point sampling filters.
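If you just need exactly 2048 points and do not require surface-uniform coverage, a much simpler alternative to Poisson disk sampling is random index sampling with NumPy. This is a minimal sketch (the function name `sample_points` is my own, not from the repo); note it samples the existing points uniformly at random, which is not the same as sampling uniformly over the surface:

```python
import numpy as np

def sample_points(points, n=2048):
    """Return exactly n points from an (m, 3) array.

    If the cloud has fewer than n points, sample with replacement
    (i.e., duplicate some points) so the output size is still n.
    """
    replace = points.shape[0] < n
    idx = np.random.choice(points.shape[0], n, replace=replace)
    return points[idx]

cloud = np.random.rand(5000, 3)        # placeholder point cloud
print(sample_points(cloud).shape)      # (2048, 3)
```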
Hi @IsaacGuan, I have .ply files, and as you mentioned I need a .seg file for each .ply. How do I create those?
Hi @dong274, For preparing .seg files for your own shapes, you will need to use a mesh segmentation tool. But if you are using data from ShapeNet, you will find the meshes are already segmented.
@IsaacGuan I am using one .csv file per image, containing multidimensional features. Can I use .csv files instead of .ply files to build the .h5 file? Thanks
Hi @csitaula, Sure, any format of point cloud is fine, as long as you read it into a NumPy array so that it can be written into an HDF5 file using the h5py package.
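Concretely, the CSV route can look like the sketch below. The file name `features.csv` and the dataset name `data` are placeholders (the example even writes its own small CSV first so it is self-contained); adapt both to your pipeline:

```python
import numpy as np
import h5py

# Write a small placeholder CSV (2048 points, 6 feature dims each)
# so this example runs on its own. In practice you already have this file.
np.savetxt('features.csv', np.random.rand(2048, 6), delimiter=',')

# Read the CSV into a NumPy array, then store it in an HDF5 file.
features = np.loadtxt('features.csv', delimiter=',')  # shape (2048, 6)
with h5py.File('data.h5', 'w') as f:
    f.create_dataset('data', data=features.astype(np.float32))

with h5py.File('data.h5', 'r') as f:
    print(f['data'].shape)  # (2048, 6)
```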
Hi @IsaacGuan Thanks for sharing this work. I am new to .ply and HDF5 files. I want to prepare my own dataset for PointNet2, and that is how I found the link to your repo. I am trying to follow the steps you mentioned above but am facing a problem: after running the files sequentially, data_testing.h5 is generated but it is empty, not containing any values.
Hello @IsaacGuan
Please check whether I have understood how you prepare your data.
If not, could you tell me the right order for preparing my own data using your code?
Sorry for my poor English skills; I would really appreciate it if you replied.
I ran your code in the following order, without the ply, hdf5_data, and points_label folders.
However, when I ran write_hdf5.py, it reported that it needs the points_label data, so I added it. Then only data_testing.h5 came out. Can you help me?
I'm trying to classify point cloud data, but I can't convert my own data into HDF5 format :( Please help me.