Hi, what do you mean exactly? The "+ NE emb." architecture from the paper corresponds to the conformer_with_tags architecture in the code, enabled with the --add-tags-embeddings argument (see the training command in https://github.com/hlt-mt/FBK-fairseq/blob/master/fbk_works/JOINT_ST_NER2023.md#parallel-joint-st-and-ner). In the code, this corresponds to the ConformerWithTagsModel model, and the learned weights for the NE tags can be taken from the decoder (TransformerDecoderWithTags), in particular from the tags_embeddings field.
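As a minimal sketch of how those learned weights might be pulled out of a trained checkpoint (the checkpoint path is a placeholder and the exact state-dict key is an assumption; only the tags_embeddings name comes from the decoder described above):

```python
import torch

# Sketch: inspect the learned NE tag embeddings of a trained
# ConformerWithTagsModel checkpoint. The path is a placeholder and the
# exact parameter name inside the state dict is an assumption.
ckpt = torch.load("checkpoints/checkpoint_best.pt", map_location="cpu")
state_dict = ckpt["model"]  # fairseq checkpoints keep model weights under "model"

for name, tensor in state_dict.items():
    if "tags_embeddings" in name:
        # Expected shape: (num_tags, embed_dim); row i is the vector for tag i
        print(name, tuple(tensor.shape))
        ne_embeddings = tensor
```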
Thanks a lot! The ConformerWithTagsModel model is helpful to me.
Glad that it helped.
Would you mind explaining the 'NE emb.' in more detail? How can I get its vectors? Thanks!