Closed — anzosasuke closed this issue 3 years ago
Why does pre-padding increase accuracy?
In our study the input vector has a fixed size of 2048 bytes. The binary file is usually larger than that, in which case there is no problem. However, we need to use padding during training (training only!) so the network can learn how to classify inputs when fewer than 2048 bytes are available.
If at some point during testing or inference the amount of data available is less than 2048 bytes, we can just fill the remainder with zeros. The network already learned how to deal with that during training.
So pre-padding does not increase accuracy; it prevents accuracy drops when fewer bytes are used during testing or inference.
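The fixed-size input preparation described above can be sketched roughly as follows. This is a minimal illustration, not the authors' code: the function name `pre_pad`, the zero padding value, and prepending (rather than appending) the padding are assumptions based on the term "pre-padding" used in this thread.

```python
def pre_pad(data: bytes, size: int = 2048) -> bytes:
    """Return exactly `size` bytes: truncate longer inputs,
    prepend zero bytes to shorter ones (pre-padding)."""
    if len(data) >= size:
        return data[:size]                      # truncate larger files
    return b"\x00" * (size - len(data)) + data  # zero-pad smaller ones

# A short input gets zeros in front; a long one is cut to size.
print(pre_pad(b"\x01\x02", size=4))  # b'\x00\x00\x01\x02'
print(len(pre_pad(bytes(5000))))     # 2048
```

During training the padded samples teach the network what a partially filled input looks like, which is why inference on short inputs does not degrade.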
Thank you so much.
I could not understand the padding section of the paper. Why did you truncate and pad? Could you maybe explain?