Open nkatebi opened 4 years ago
(I'm not the author, but I have looked at the code in detail, so take the below as just my opinions, not the author's.)
The batch size is a command-line argument and defaults to 200 (see the argument parser in `train.py`).
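For reference, a minimal sketch of what such a parser looks like. The flag name `--batch_size` here is hypothetical; the real `train.py` may spell it differently, but the mechanism is the same:

```python
import argparse

# Hypothetical mirror of the parser in train.py; only the default of 200
# is taken from the actual script.
parser = argparse.ArgumentParser()
parser.add_argument('--batch_size', type=int, default=200)

args = parser.parse_args(['--batch_size', '128'])  # e.g. python train.py --batch_size 128
print(args.batch_size)  # 128
```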
`x_real` needs its first dimension equal to one before it is fed into the `rolling_window` function, where the training samples are created. I suppose the check is useful if, say, you add your own dataset: it verifies that your loading function produces output with the right dimensions.
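To make the shape constraint concrete, here is a sketch of what a function like this typically does, assuming `x_real` has shape `(1, T, C)`: a single recording of length `T` with `C` channels. This is my reading of the code, not the authors' exact implementation:

```python
import numpy as np

def rolling_window(x_real, window_size):
    """Slice one long multichannel series into overlapping windows.

    Assumes x_real has shape (1, T, C): exactly one recording, hence
    the assertion on the first dimension.
    """
    assert x_real.shape[0] == 1, "expects a single recording"
    T = x_real.shape[1]
    # one window per starting position, stride 1
    return np.stack([x_real[0, i:i + window_size]
                     for i in range(T - window_size + 1)])

x = np.random.randn(1, 10, 2)   # one recording, 10 time steps, 2 channels
w = rolling_window(x, 4)
print(w.shape)                  # (7, 4, 2)
```

With two recordings stacked into shape `(2, T, C)`, the assertion fires, which matches the `AssertionError` reported below.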
Thank you for your response. Could you please explain the input dimensions for training the model? For example, in the case of ECG signals, is it possible to use more than one recording as input data? I changed the generator to include recordings 100 and 101: `generator = [('id=100', dict(filenames=['100','101']))]`. However, I got an `AssertionError`.
Does anyone know if it is possible to set the batch size > 1? Also, what is the reason for the check `assert x_real.shape[0] == 1`?