xiemk / SPML-LAC


Some questions about the implementation of label-wise embedding decoder #1

Closed Youngluc closed 1 year ago

Youngluc commented 1 year ago

Good work! But I have some questions about the implementation of the label-wise embedding decoder. In the paper, it is described as "The label-wise embedding decoder consists of a standard self-attention block and a cross-attention block", yet I find that the default value of num_decoder_layers is 2 in encoder.py, and this parameter is not changed in any subsequent definition. Could you provide the specific implementation of the label-wise embedding decoder?

xiemk commented 1 year ago


I hope I have understood your question correctly. This is a naming mistake: the file encoder.py actually contains the full implementation of the label-wise embedding decoder, so it should probably be renamed le_decoder.py. Very sorry for the confusion, and thank you very much!
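
For readers who just want a quick picture of the structure described in the paper (self-attention over the label embeddings, then cross-attention from labels to image features), here is a minimal PyTorch sketch of one such decoder layer. This is illustrative only, not the exact code in encoder.py; the class name, argument names, and dimensions below are hypothetical.

```python
import torch
import torch.nn as nn

class LabelWiseDecoderLayer(nn.Module):
    # One decoder layer: a standard self-attention block over the label
    # embeddings, followed by a cross-attention block in which the label
    # embeddings attend to the image features. (Illustrative sketch.)
    def __init__(self, d_model=512, nhead=8, dim_feedforward=2048, dropout=0.1):
        super().__init__()
        self.self_attn = nn.MultiheadAttention(d_model, nhead, dropout=dropout)
        self.cross_attn = nn.MultiheadAttention(d_model, nhead, dropout=dropout)
        self.ffn = nn.Sequential(
            nn.Linear(d_model, dim_feedforward),
            nn.ReLU(inplace=True),
            nn.Dropout(dropout),
            nn.Linear(dim_feedforward, d_model),
        )
        self.norm1 = nn.LayerNorm(d_model)
        self.norm2 = nn.LayerNorm(d_model)
        self.norm3 = nn.LayerNorm(d_model)
        self.dropout = nn.Dropout(dropout)

    def forward(self, label_emb, img_feat):
        # label_emb: (num_labels, batch, d_model); img_feat: (h*w, batch, d_model)
        out = self.self_attn(label_emb, label_emb, label_emb)[0]
        label_emb = self.norm1(label_emb + self.dropout(out))
        out = self.cross_attn(label_emb, img_feat, img_feat)[0]
        label_emb = self.norm2(label_emb + self.dropout(out))
        return self.norm3(label_emb + self.dropout(self.ffn(label_emb)))

# Stacking two such layers corresponds to the default num_decoder_layers=2
# mentioned above. The sizes here are made up for demonstration.
decoder = nn.ModuleList([LabelWiseDecoderLayer() for _ in range(2)])
labels = torch.randn(80, 4, 512)   # e.g. 80 label queries, batch of 4
feats = torch.randn(49, 4, 512)    # e.g. a 7x7 feature map, flattened
for layer in decoder:
    labels = layer(labels, feats)
```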

Youngluc commented 1 year ago

Thanks for your reply!