Hi, thank you for your work! I am implementing the network with custom data and I am struggling with the data loading.
I tried cutting squares, but I ended up with different input sizes and was forced to keep batch size = 1.
I want to cut and save the input files as samples before training - my point clouds are very large and very sparse in some regions - so I was thinking of a strategy to get consistent, fixed-size inputs.
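The strategy I had in mind, roughly (just a sketch of my idea, not code from this repository - the helper name and the target point count are mine): keep the spatial crops as they are, but resample every crop to a fixed number of points, subsampling dense crops and padding sparse ones by duplicating points, so all samples share one shape and batching works.

```python
import numpy as np

def fixed_size_sample(points, n_points=4096, rng=None):
    """Resample a variable-size point set to exactly n_points
    (hypothetical helper, not part of the repository's code).

    points: (M, C) array (xyz + optional features) from one cropped square.
    """
    rng = np.random.default_rng(rng)
    m = len(points)
    if m == 0:
        raise ValueError("empty crop: skip it or merge with a neighbour")
    if m >= n_points:
        # Dense crop: subsample without replacement.
        idx = rng.choice(m, n_points, replace=False)
    else:
        # Sparse crop: pad by duplicating points (sample with replacement).
        idx = rng.choice(m, n_points, replace=True)
    return points[idx]

# Every crop becomes a (4096, C) sample, so batch sizes > 1 work.
crop = np.random.rand(1500, 3)       # a sparse square with 1500 points
sample = fixed_size_sample(crop, 4096)
print(sample.shape)                  # (4096, 3)
```

This way the crops could be precomputed and saved to disk once, before training.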
Looking at the code here, I was wondering if: