AutoResearch / EEG-GAN


patch_size padding may not be working #98

Closed chadcwilliams closed 2 months ago

chadcwilliams commented 3 months ago

We need to verify that padding the data works when patch_size cannot cut the sequence equally. It does not work on the main branch, and I suspect it was not working before either.
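A minimal sketch of the behavior under test: zero-pad the time axis until the sequence length is a multiple of patch_size. The helper name and the (batch, sequence_length, channels) layout are assumptions for illustration, not the repo's actual implementation.

```python
import numpy as np

def pad_to_patch_size(data: np.ndarray, patch_size: int) -> np.ndarray:
    """Hypothetical helper: zero-pad the time axis (axis 1) so that
    sequence_length % patch_size == 0. Assumes data is
    (batch, sequence_length, channels)."""
    seq_len = data.shape[1]
    remainder = seq_len % patch_size
    if remainder == 0:
        return data
    padding = patch_size - remainder
    # Pad only the time axis with trailing zeros
    return np.pad(data, ((0, 0), (0, padding), (0, 0)))

# Sanity check: 100 time points with patch_size 7 should pad to 105
padded = pad_to_patch_size(np.zeros((8, 100, 2)), patch_size=7)
assert padded.shape[1] % 7 == 0
```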

whyhardt commented 2 months ago

Automatic padding faces an issue with the AE-GAN. For the normal GAN it was simple: just append n zeros to the real sequence so that sequence_length % patch_size == 0. With the AE-GAN we cannot do that, because adding zeros to the dataset does not work with AE(target=['full', 'time']).

An automatic mechanism that unifies GAN and AE-GAN would be to adjust the generation method to (1) create a sequence whose length satisfies len % patch_size == 0 and (2) remove the unnecessary points with generated[:, :-padding, :] after generation but before returning the sequence. But since this would modify the fundamental generation method and we have not tested it, I would simply remove automatic padding and raise an error stating that it does not work with the given patch size. The user then has two options: adjust the sequence_length of the data or the patch_size. Since patch_size is a simple argument, that would be the preferred solution. If the sequence_length is a prime number -> not our problem; better not to use such a weird format anyway ;)
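The proposed (untested) pad-then-trim mechanism could be sketched as follows. The generator callable and its signature are assumptions for illustration; only the two steps from the comment, rounding up to the next multiple of patch_size and trimming with generated[:, :-padding, :], come from the source.

```python
import numpy as np

def generate_trimmed(generator, seq_len: int, patch_size: int, n: int) -> np.ndarray:
    """Sketch of the unified mechanism: (1) generate at the next multiple
    of patch_size, (2) trim the padded tail before returning.

    `generator` is a hypothetical callable returning (n, length, channels).
    """
    padding = (-seq_len) % patch_size      # zeros needed to reach a multiple
    padded_len = seq_len + padding         # length the patched model can handle
    generated = generator(n, padded_len)   # (n, padded_len, channels)
    if padding:
        # Drop the extra time points so the caller sees the requested length
        generated = generated[:, :-padding, :]
    return generated
```

With seq_len=100 and patch_size=7, the model would generate 105 time points and the last 5 would be discarded.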

whyhardt commented 2 months ago

I added a dynamic error message which distinguishes between GAN and AE-GAN:

For GAN:

ValueError: Sequence length (100) must be a multiple of patch size (7). Please adjust the 'patch_size' or adjust the sequence length of the dataset.

For AE-GAN:

ValueError: Sequence length (10) must be a multiple of patch size (7). Please adjust the 'patch_size' or adjust the output sequence length of the autoencoder ('time_out').
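The dynamic check could look roughly like this. The function name and the is_aegan flag are hypothetical; the error wording mirrors the two messages quoted above.

```python
def check_patch_size(seq_len: int, patch_size: int, is_aegan: bool) -> None:
    """Illustrative version of the dynamic check: raise a ValueError with a
    GAN- or AE-GAN-specific hint when seq_len is not a multiple of patch_size."""
    if seq_len % patch_size != 0:
        hint = (
            "adjust the output sequence length of the autoencoder ('time_out')"
            if is_aegan
            else "adjust the sequence length of the dataset"
        )
        raise ValueError(
            f"Sequence length ({seq_len}) must be a multiple of patch size "
            f"({patch_size}). Please adjust the 'patch_size' or {hint}."
        )
```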