I would like to ask how the hyperparameters of the network are set. I ran your source code and can only reach 80% accuracy on the COVID-19 test set. The paper mentions that during preprocessing all datasets are resized to dimension 1024 by linear interpolation, but I cannot find this in the source code — only the number of channels in the network is set to 1024. Have I misunderstood something? Finally, could the EVs dataset used in the paper be made public? My email is pengjuren99@163.com. Thank you very much!
The hyperparameters of our method are listed in the file config.py, which matches the checkpoint we provide in checkpoints/COV/PACE/924.pth.
If you achieved lower accuracy with the same hyperparameters, the problem may lie in the training strategy, such as the batch size, learning rate, or number of training epochs. Since the COVID-19 dataset is small, performance is sensitive to these settings.
I am sorry for the confusion about the preprocessing: the code here employs adaptive pooling rather than linear interpolation to resize the data. We will check and fix this later.
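For anyone comparing the two, here is a minimal sketch of the difference (the tensor shape and input length are hypothetical, not taken from the repo's actual data loader):

```python
import torch
import torch.nn.functional as F

# Hypothetical raw sample: (batch, channels, length), length != 1024.
x = torch.randn(1, 1, 700)

# What the released code does: adaptive average pooling to 1024 bins.
pooled = F.adaptive_avg_pool1d(x, 1024)

# What the paper describes: linear interpolation to length 1024.
interpolated = F.interpolate(x, size=1024, mode="linear", align_corners=False)

print(pooled.shape)        # torch.Size([1, 1, 1024])
print(interpolated.shape)  # torch.Size([1, 1, 1024])
```

Both produce the same output shape, so the checkpoint loads either way, but the resampled values differ slightly, which could contribute to an accuracy gap if the checkpoint was trained with one and evaluated with the other.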
I apologize that we are unable to release the raw EVs data due to certain confidentiality terms.