aniqueakhtar opened this issue 3 years ago
I'm also trying for a different up_ratio. Were you able to do it?
Yes. I got it to work. Looking back at my code now, I see I used this in model.py:
# ground-truth placeholder sized by the configured ratio instead of a hard-coded 4x
self.input_y = tf.placeholder(tf.float32, shape=[self.opts.batch_size, int(self.opts.up_ratio*self.opts.num_point), 3])
I trained multiple versions of PU-GAN with different upsampling ratios.
During testing, I would edit lines 22, 23, and 24 in pu_gan.py. I would also edit the configs.py file for each upsampling ratio.
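Roughly, the settings that have to stay consistent per ratio look like this (a sketch only; the exact option names in configs.py may differ, and the train file path is hypothetical):

from types import SimpleNamespace

# illustrative stand-in for the repo's option object
opts = SimpleNamespace(
    up_ratio=8,                     # upsampling factor for this run
    num_point=256,                  # low-resolution patch size
    train_file='train/data_8x.h5',  # hypothetical: h5 file generated for this ratio
)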
I also found another mistake during the testing phase, in lines 262, 263, and 264. The code measures the number of output points for the first point cloud and then assumes that all the other point clouds being fed in are the same size. So if your first point cloud has 2k points with 4x upsampling, the testing outputs 8k points; even when you then feed it an 8k-point cloud, the output is still 8k points. You should instead compute the number of output points inside the loop. I added these lines after line 270, while commenting out lines 262, 263, and 264:
self.opts.num_point = pc.shape[0]  # size of the current cloud, not the first one
out_point_num = int(self.opts.num_point*self.opts.up_ratio)  # per-cloud output size
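For context, a minimal sketch of where those two lines land inside the test loop (the loop structure and the pc_paths name are my illustration, not the repo's actual code):

# inside the test routine, looping over every input cloud:
for path in pc_paths:
    pc = np.loadtxt(path)[:, :3]  # (N, 3) coordinates for this cloud
    # recompute the output size per cloud instead of reusing the first one's
    self.opts.num_point = pc.shape[0]
    out_point_num = int(self.opts.num_point * self.opts.up_ratio)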
I am not sure what other changes I made to the code since it has been a long time.
Feel free to send me an email and I can share my code (a bit messy), and we can discuss this work further. I was also able to compare PU-GAN to my own point cloud upsampling method, PU-Dense.
Hey!! Thank you for the response!
I tried to do what you've done, but I still got this error:
KeyError: "Unable to open object (object 'poisson_2560' doesn't exist)"
Is it because of the h5 file? Since it is not 4x256 = 1024 (the one that is available) but 10x256 = 2560, I get this error? Did you change something in the data_loader too?
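One quick sanity check is to print the dataset keys the h5 file actually contains, since that KeyError means the loader is looking up a key that was never written (the filename below is a placeholder):

import h5py

with h5py.File('train/data.h5', 'r') as f:
    print(list(f.keys()))  # e.g. ['poisson_256', 'poisson_1024']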
Just comparing the two files, I can't see any changes. I trained it on my own dataset, which I generated with code that looked similar to this:
import h5py
import glob
import open3d as o3d
import numpy as np

data_loc = './ShapeNet/'
save_path = 'train/data_8x.h5'
data_files = sorted(glob.glob(data_loc + '*.h5'))

N = 256   # points per low-resolution patch
r = 8     # upsampling ratio, so each ground-truth patch has N*r points

poisson_256 = []
poisson_2048 = []

for i, m in enumerate(data_files):
    if i % 500 == 0:
        print(i, ' / ', len(data_files))
    # load the xyz coordinates of this point cloud
    coords = h5py.File(m, 'r')['data'][:, :3]
    pcd = o3d.geometry.PointCloud()
    pcd.points = o3d.utility.Vector3dVector(coords)
    pcd_tree = o3d.geometry.KDTreeFlann(pcd)
    # pick one random seed point and take its N*r nearest neighbors as a patch
    n = np.random.randint(0, len(coords), 1)[0]
    [_, idx, _] = pcd_tree.search_knn_vector_3d(coords[n], N * r)
    pc = coords[idx]
    if pc.shape[0] != N * r:   # skip clouds too small to yield a full patch
        print(i)
        print(len(idx))
        continue
    # normalize the patch to the unit cube
    pc -= pc.min(0)
    pc = pc / pc.max()
    # random subsample (with replacement) as the low-resolution input
    pc_down = pc[np.random.randint(0, len(pc), N)]
    poisson_2048.append(pc)
    poisson_256.append(pc_down)

poisson_2048 = np.stack(poisson_2048)
poisson_256 = np.stack(poisson_256)

# the dataset keys must match what data_loader.py looks up
h5f = h5py.File(save_path, 'w')
h5f.create_dataset('poisson_2048', data=poisson_2048)
h5f.create_dataset('poisson_256', data=poisson_256)
h5f.close()
So basically, I randomly choose a single point on each point cloud and extract a single patch around it. I believe I had 24k point clouds, so I ended up with 24k patches.
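A quick check on the resulting file should then show the matching keys and shapes (assuming N=256 and r=8 as above):

import h5py

with h5py.File('train/data_8x.h5', 'r') as f:
    print(f['poisson_256'].shape)   # (num_patches, 256, 3)
    print(f['poisson_2048'].shape)  # (num_patches, 2048, 3)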
Thank you so much!!!
I am trying to train the model for up_ratio=8, but I realized there are a lot of errors in the code that make it impossible to train for that up_ratio.
E.g.: in model.py, line 34:
self.input_y = tf.placeholder(tf.float32, shape=[self.opts.batch_size, int(4*self.opts.num_point),3])
Maybe this should be:
self.input_y = tf.placeholder(tf.float32, shape=[self.opts.batch_size, int(self.opts.up_ratio*self.opts.num_point), 3])
Similarly, in model.py, line 253:
I do not understand this second issue. What is the purpose of this line? Should it not be there?
I also found this issue in data_loader.py, line 36:
num_4X_point = int(opts.num_point*4)
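Maybe, following the same pattern as above, it should use the configured ratio (just my guess):
num_4X_point = int(opts.num_point*opts.up_ratio)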
Any help would be appreciated. Thanks