jiangzhongshi / SurfaceNetworks

Source code for CVPR 2018 Oral paper "Surface Networks"

Could you share the preprocessing code? #4

Open spk921 opened 5 years ago

spk921 commented 5 years ago

I would like to try running the ModelNet40 classification task, but I'm having a hard time finding the data preprocessing code. Could you share it? Thank you.

jiangzhongshi commented 5 years ago

Hi,

The preprocessing you need is the Laplacian and Dirac matrices for every mesh. A Python version is implemented here, and I suggest trying the Laplacian version first since it's faster: https://github.com/jiangzhongshi/SurfaceNetworks/blob/master/src/utils/mesh.py#L114

Also, if you are familiar with C++/Python bindings, you can use the more efficient implementations from libigl: https://github.com/jiangzhongshi/libigl/blob/pyigl/python/py_igl/py_cotmatrix.cpp for the Laplacian and https://github.com/jiangzhongshi/libigl/blob/pyigl/python/py_igl/py_dirac_operator.cpp for the Dirac operator.
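
For what it's worth, here is a minimal sketch of the Laplacian part using the current libigl Python bindings (`pip install libigl`); note the links above point to the older pyigl bindings, whose API differs, and the mesh path here is hypothetical:

```python
import igl

# Load a triangle mesh and build the cotangent Laplacian plus the
# Voronoi lumped mass matrix, both returned as scipy sparse matrices.
v, f = igl.read_triangle_mesh('mesh.obj')               # hypothetical path
L = igl.cotmatrix(v, f)                                 # n x n, sparse
M = igl.massmatrix(v, f, igl.MASSMATRIX_TYPE_VORONOI)   # n x n, diagonal
```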

jiangzhongshi commented 5 years ago

On a side note, some meshes in ModelNet40 may contain degenerate triangles, etc., and in that case the Laplacian will contain NaN entries. If that happens, you can also try the intrinsic Laplacian as the operator: https://github.com/libigl/libigl/blob/master/include/igl/cotmatrix_intrinsic.h
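
A quick way to detect that situation, assuming the Laplacian is built with the libigl Python bindings as in the sketch above (the area threshold is an arbitrary choice, not anything from the repo):

```python
import igl
import numpy as np

v, f = igl.read_triangle_mesh('mesh.obj')   # hypothetical path
L = igl.cotmatrix(v, f)

# Zero-area faces are the usual source of NaNs in the cotangent weights.
areas = igl.doublearea(v, f) / 2.0
n_degenerate = int(np.sum(areas < 1e-12))
if n_degenerate > 0 or not np.isfinite(L.data).all():
    print('%d degenerate faces; consider the intrinsic Laplacian' % n_degenerate)
```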

SimonPig commented 4 years ago

> On a side note, some meshes in ModelNet40 may contain degenerate triangles, etc., and in that case the Laplacian will contain NaN entries. If that happens, you can also try the intrinsic Laplacian as the operator: https://github.com/libigl/libigl/blob/master/include/igl/cotmatrix_intrinsic.h

When I run add_laplacian.py, an error pops up at `train_data = np.load(...)`: `_pickle.UnpicklingError: invalid load key, ' '`. Is this related to the issue above?

jiangzhongshi commented 4 years ago

@SimonPig No, I think it is most likely a version incompatibility between Python 2 and 3. In np.load, supply the additional parameter encoding='latin1' and see if that helps.
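
For reference, a minimal sketch of that call (the file name is hypothetical; on NumPy >= 1.16.3, allow_pickle=True is also needed when the array contains pickled objects):

```python
import numpy as np

# Pickles written under Python 2 need latin1 decoding when loaded in
# Python 3; allow_pickle=True is required for object arrays on newer NumPy.
train_data = np.load('train.npy', encoding='latin1', allow_pickle=True)
```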

SimonPig commented 4 years ago

@jiangzhongshi I've reproduced train.npy with create_data.py, but my train.npy is 4.5 GB, which is too big for np.load() in add_laplacian (EOFError: Ran out of input). I don't understand why yours is only 1 GB? :(

jiangzhongshi commented 4 years ago

@SimonPig I am really not familiar with the storage scheme, but one solution for the size is to save different sequences to different .npy files; I believe we did that for as_rigid_as_possible.
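
For instance, a hedged sketch of that splitting scheme (the file names and the sequence layout are assumptions, not the repo's actual format):

```python
import numpy as np

# Stand-in data: in practice these would come from create_data.py.
sequences = [np.random.rand(100, 3) for _ in range(10)]

# Instead of pickling everything into one multi-gigabyte train.npy,
# write each sequence to its own file.
for i, seq in enumerate(sequences):
    np.save('train_seq_%04d.npy' % i, seq)

# Later, load only the sequence you need instead of the whole dataset:
seq = np.load('train_seq_0000.npy')
```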

SimonPig commented 4 years ago

@jiangzhongshi Perhaps a module called 'seism' is missing? It's supposed to be imported in utils.mesh, line 138.

jiangzhongshi commented 4 years ago

Sorry about that, it's a leftover import that slipped through from some ad hoc experiments.

SimonPig commented 4 years ago

@jiangzhongshi No problem, thank you for keeping up the replies ;) So you will upload it later?