Navidfoumani / ConvTran

This is a PyTorch implementation of ConvTran
MIT License

eRPE issue #10

Closed · Bestlzz closed this 2 months ago

Bestlzz commented 4 months ago

Hi, I have two questions:

  1. In your model, when Transformer has L layers, do you need to add an eRPE to each layer?
  2. If so, is the same eRPE shared across all layers?
Navidfoumani commented 2 months ago

Do you need to add an eRPE to each layer? Yes, an eRPE is added to each layer of the Transformer.

Is the same eRPE shared across layers? No, the eRPEs are not shared. Each layer learns its own set of eRPE parameters, since different layers may encode different abstractions of the data. The parameters are therefore independent per layer.
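
To make the per-layer (non-shared) parameterization concrete, here is a minimal sketch of an attention module with a learnable relative position bias, where each Transformer layer constructs its own instance. This is an illustrative formulation that adds the bias to the attention logits; the module name `eRPEAttention` and the exact placement of the bias are assumptions for illustration and may differ from the repository's actual eRPE implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class eRPEAttention(nn.Module):
    """Self-attention with a per-layer learnable relative position bias.
    Illustrative sketch only, not the repository's exact eRPE code."""
    def __init__(self, d_model, n_heads, max_len):
        super().__init__()
        self.n_heads = n_heads
        self.d_head = d_model // n_heads
        self.qkv = nn.Linear(d_model, 3 * d_model)
        self.out = nn.Linear(d_model, d_model)
        # One learnable scalar per head and per relative distance.
        # These parameters belong to THIS layer instance only.
        self.rel_bias = nn.Parameter(torch.zeros(n_heads, 2 * max_len - 1))
        # Precompute the table of relative distances (i - j), shifted to be >= 0.
        idx = torch.arange(max_len)
        self.register_buffer("rel_idx", idx[:, None] - idx[None, :] + max_len - 1)

    def forward(self, x):
        B, L, D = x.shape
        q, k, v = self.qkv(x).chunk(3, dim=-1)
        q = q.view(B, L, self.n_heads, self.d_head).transpose(1, 2)
        k = k.view(B, L, self.n_heads, self.d_head).transpose(1, 2)
        v = v.view(B, L, self.n_heads, self.d_head).transpose(1, 2)
        attn = (q @ k.transpose(-2, -1)) / self.d_head ** 0.5
        # Gather this layer's relative bias for every (i, j) pair and add it.
        bias = self.rel_bias[:, self.rel_idx[:L, :L]]  # (heads, L, L)
        attn = F.softmax(attn + bias, dim=-1)
        out = (attn @ v).transpose(1, 2).reshape(B, L, D)
        return self.out(out)

# Each of the L Transformer layers builds its own eRPEAttention,
# so the relative position parameters are NOT shared across layers.
layers = nn.ModuleList(
    [eRPEAttention(d_model=64, n_heads=8, max_len=128) for _ in range(4)]
)
```

Because each layer gets a fresh `nn.Parameter` inside its own module instance, the optimizer updates each layer's relative position biases independently, which is what allows different layers to encode different abstractions.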