vishwa91 / MINER

MINER: Multiscale Implicit Neural Representation

About data processing #3

Closed csyhping closed 7 months ago

csyhping commented 8 months ago

Hi @vishwa91, thanks for your amazing work. May I ask how you processed the data to get the example Lucy.mat? I want to try with my own dataset of .ply or .obj files. Thanks!

vishwa91 commented 7 months ago

@csyhping could you check the link here: https://github.com/vishwa91/wire/issues/11#issuecomment-1773700061

I shared a piece of code there to convert .ply files to .mat files. Hope that helps!
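
(For reference, in case the link goes stale: below is a minimal sketch of this kind of .ply-to-.mat conversion, not the exact script from the linked issue. It assumes trimesh, numpy, and scipy are available; the output key name `'hypercube'` and the grid resolution are assumptions, so check the MINER loader for the exact field and size it expects.)

```python
import numpy as np
import scipy.io as sio
import trimesh

def mesh_to_occupancy_mat(mesh_path, mat_path, resolution=256):
    """Voxelize a mesh into a dense occupancy grid and save it as a .mat file."""
    mesh = trimesh.load(mesh_path, force='mesh')

    # Normalize the mesh so it fits inside the unit cube [0, 1]^3.
    verts = mesh.vertices - mesh.bounds.mean(axis=0)
    verts = verts / mesh.extents.max() + 0.5
    mesh = trimesh.Trimesh(vertices=verts, faces=mesh.faces, process=False)

    # Build a regular grid of query points.
    coords = np.linspace(0, 1, resolution)
    X, Y, Z = np.meshgrid(coords, coords, coords, indexing='ij')
    points = np.stack([X.ravel(), Y.ravel(), Z.ravel()], axis=-1)

    # Inside/outside test. Ray-based containment is slow for large grids,
    # so evaluate in chunks; this is where most of the time goes.
    occ = np.concatenate([mesh.contains(chunk)
                          for chunk in np.array_split(points, 64)])
    occ = occ.reshape(resolution, resolution, resolution)

    # The key name 'hypercube' is an assumption; adjust it to whatever
    # key the training script actually loads.
    sio.savemat(mat_path, {'hypercube': occ.astype(np.uint8)})

if __name__ == '__main__':
    mesh_to_occupancy_mat('input.ply', 'input.mat', resolution=256)
```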

csyhping commented 7 months ago

@vishwa91 Thank you so much!

csyhping commented 7 months ago

@vishwa91 Sorry for asking again, but it seems proper to ask under this issue. In ACORN, 20,000,000 points are sampled. I ran the code you shared on one mesh, but it takes very long (several hours for even one shape). Is this normal, or have you modified the number of samples?

BTW, or did I misunderstand the parameter nsample=0 you used? Does that mean we don't actually need to sample on the surface to generate the .mat file? It seems the MeshIntersector does the occupancy evaluation. If so, may I ask what resolution you use in MeshIntersector?

Thanks!

vishwa91 commented 7 months ago

@csyhping I may have answered this question in the wire repository. In short: yes, it does take a while to sample. I used a computer with more than 128GB of RAM and at least 12 cores, and it took on the order of thirty minutes. Hope that helps!

csyhping commented 7 months ago

Thank you so much!

csyhping commented 7 months ago

@vishwa91, may I ask about the transformation/normalization of the result? The result is shifted and scaled compared to the input mesh, and there also seems to be a 90-degree rotation about the z-axis. What post-processing should I use to make the result match the input mesh? I tried re-centering the result and normalizing it by its bbox/2 * 1.03, which almost matches the input, but is there a better way? Thanks!
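
(For anyone reproducing this, the re-centering and rescaling described above might look roughly like the sketch below. It assumes trimesh; the file names are placeholders, and the 1.03 padding factor is taken from the comment rather than from the MINER code.)

```python
import trimesh

def align_result_to_input(result_path, input_path, out_path, pad=1.03):
    """Re-center the reconstruction and rescale it to the input's bounding box."""
    result = trimesh.load(result_path, force='mesh')
    inp = trimesh.load(input_path, force='mesh')

    # Re-center the reconstruction on its bounding-box center.
    verts = result.vertices - result.bounds.mean(axis=0)

    # Rescale so the result spans the input's bounding box (optionally
    # including the padding factor used when the input was normalized),
    # then move it to the input's center.
    verts = verts / result.extents.max() * inp.extents.max() * pad
    verts = verts + inp.bounds.mean(axis=0)

    aligned = trimesh.Trimesh(vertices=verts, faces=result.faces, process=False)
    aligned.export(out_path)

if __name__ == '__main__':
    align_result_to_input('result.obj', 'input.obj', 'result_aligned.obj')
```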

vishwa91 commented 7 months ago

The result should look similar to the original mesh (albeit with some loss of detail). I did not do any further post-processing, as you may have seen in the example script. It could potentially be rotated 90 degrees; that is something I did not check for.

csyhping commented 7 months ago

test.zip @vishwa91, could you please kindly check this? Attached is a zip of [input.obj, input.mat, result.obj]; you can see there is a different scale and shift (ignore the rotation).

Of course, I can do post-processing to recover the scale and shift, but it is a little weird, as you mentioned.

[Update] About the rotation: it is due to the axis order. For anyone concerned, apply vertices[:, [1, 0, 2]].
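
(Applied to the reconstructed mesh, this is a one-line permutation of the vertex columns; a quick sketch assuming trimesh, with result.obj / result_fixed.obj as placeholder paths:)

```python
import trimesh

# Undo the apparent 90-degree rotation by swapping the first two vertex axes.
result = trimesh.load('result.obj', force='mesh')
result.vertices = result.vertices[:, [1, 0, 2]]
result.export('result_fixed.obj')
```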