FreyrS / dMaSIF


Transform Dataset object to list #23

Closed geraseva closed 2 years ago

geraseva commented 2 years ago

https://github.com/FreyrS/dMaSIF/blob/0dcc26c3c218a39d5fe26beb2e788b95fb028896/main_training.py#L55 This operation converts the dataset into a list. IMHO this may lead to the loss of Dataset properties, for example the data transforms (random rotation) that should be re-sampled between epochs.
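To illustrate the concern, here is a minimal, self-contained sketch (a toy dataset, not dMaSIF's actual classes), assuming the usual PyTorch/PyG convention that a Dataset's `transform` runs on every `__getitem__`: converting the dataset to a list evaluates the transform once per sample and freezes the result.

```python
import torch
from torch.utils.data import Dataset

class ToyPointClouds(Dataset):
    """Toy dataset: `transform` runs on every __getitem__, mimicking
    torch_geometric's Dataset(transform=...)."""
    def __init__(self, clouds, transform=None):
        self.clouds = clouds
        self.transform = transform

    def __len__(self):
        return len(self.clouds)

    def __getitem__(self, i):
        x = self.clouds[i]
        return self.transform(x) if self.transform is not None else x

def random_rotate(x):
    # Random orthogonal 3x3 matrix via QR of a Gaussian matrix.
    q, _ = torch.linalg.qr(torch.randn(3, 3))
    return x @ q.T

ds = ToyPointClouds([torch.randn(5, 3)], transform=random_rotate)

print(torch.allclose(ds[0], ds[0]))  # False: fresh rotation on every access

frozen = list(ds)  # what converting the Dataset to a list effectively does
print(torch.allclose(frozen[0], frozen[0]))  # True: one rotation, frozen forever
```

After the conversion, every epoch iterates over the same pre-rotated copies, so the per-epoch augmentation silently stops happening.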

linminhtoo commented 2 years ago

I'm also confused about this. If `args.random_rotation` is set, the transforms are passed to the training dataset. However, inside the training loop the same transforms are applied again, this time by looping over each sample inside the batch.

It seems inefficient and also quite confusing, because I also realised that the rotation operations inside `iterate_surface_precompute` use `P1["rand_rot"].T`, while the rotation operations in the main training loop (`data_iteration.iterate`) use `P1["rand_rot"]`. So are we actually reversing the rotation...? (Then why do we rotate in the first place?)
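If that reading is right, the two multiplications cancel, since for an orthogonal rotation matrix R we have Rᵀ = R⁻¹. A toy sketch of the round trip (toy tensors, not the actual dMaSIF call sites; assumes the coordinates are multiplied in the order shown):

```python
import torch

xyz = torch.randn(10, 3)                   # toy point coordinates
R, _ = torch.linalg.qr(torch.randn(3, 3))  # random orthogonal "rand_rot"

rotated = xyz @ R.T     # rotation as applied in precompute (rand_rot.T)
restored = rotated @ R  # rotation as applied in the training loop (rand_rot)

# R.T @ R == I for an orthogonal matrix, so the net effect is the identity.
print(torch.allclose(restored, xyz, atol=1e-6))  # True
```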

geraseva commented 2 years ago

I guess that these augmentations are used only for surface precomputation. So they are undone in the main training loop. It is rather strange, but I get it.

linminhtoo commented 2 years ago

> I guess that these augmentations are used only for surface precomputation. So they are undone in the main training loop. It is rather strange, but I get it.

Could you please clarify why this makes sense? Why do we need to undo the augmentation? Don't we want the model not to care about rotation and center positions?

geraseva commented 2 years ago

> I guess that these augmentations are used only for surface precomputation. So they are undone in the main training loop. It is rather strange, but I get it.

> Could you please clarify why this makes sense? Why do we need to undo the augmentation? Don't we want the model not to care about rotation and center positions?

I think rotations and translations do not matter when you use MaSIF. But they also use DGCNN for comparison, and that model is not rotation-invariant, so augmentations in the main training loop are necessary. That is why it is strange that they do not use them.
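For what it's worth, a per-batch augmentation for a rotation-sensitive model like DGCNN could look like the following sketch (hypothetical names, plain PyTorch, not dMaSIF's actual training loop): sample a fresh rotation inside the loop so the model sees a different orientation at every step.

```python
import torch

def random_rotation():
    # Random orthogonal 3x3 matrix via QR of a Gaussian matrix
    # (may include a reflection; adequate for this illustration).
    q, _ = torch.linalg.qr(torch.randn(3, 3))
    return q

xyz = torch.randn(32, 3)       # toy batch of surface points
model = torch.nn.Linear(3, 1)  # placeholder for a rotation-sensitive DGCNN

for step in range(3):
    R = random_rotation()
    out = model(xyz @ R.T)  # fresh orientation every step
    # Outputs differ across steps because the model is not rotation-invariant.
    print(step, out.mean().item())
```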

linminhtoo commented 2 years ago

@FreyrS Hi Freyr, could you provide some clarification on this part? It would be very helpful for us.