UKPLab / emnlp2017-relation-extraction

Context-Aware Representations for Knowledge Base Relation Extraction
Apache License 2.0

Question about the TimeDistributed #20

Closed · kangbrilliant closed this issue 4 years ago

kangbrilliant commented 5 years ago

I'm a little confused: why did you not choose layers.wrappers.Bidirectional?

daniilsorokin commented 5 years ago

We do; see, for instance, https://github.com/UKPLab/emnlp2017-relation-extraction/blob/bed36b0fdf4dcfeeebf4c7e97b201b7cad1febba/relation_extraction/core/keras_models.py#L147

TimeDistributed is needed to process a batch of relations from the same sentence. It is not used to distribute across time, but rather across the different relations in the sentence. It was simply easier to implement that way.
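To make the idea concrete, here is a minimal sketch (not the repository's actual model, and all layer sizes and names are made up for illustration) of how TimeDistributed can re-apply one sentence encoder to every relation candidate of a sentence, so the second input axis indexes relations rather than time steps:

```python
# Hedged sketch: TimeDistributed applies the same sub-model to each relation
# candidate of a sentence; the "time" axis here is the relations axis.
from tensorflow.keras import layers, models

MAX_RELATIONS = 7    # hypothetical number of relation candidates per sentence
MAX_SENT_LEN = 36    # hypothetical padded sentence length
VOCAB_SIZE = 10000   # hypothetical vocabulary size

# One token-index sequence per relation candidate of the same sentence.
sentence_input = layers.Input(shape=(MAX_RELATIONS, MAX_SENT_LEN), dtype="int32")

# Sub-model that encodes a single (sentence, relation) instance.
token_input = layers.Input(shape=(MAX_SENT_LEN,), dtype="int32")
embedded = layers.Embedding(VOCAB_SIZE, 100)(token_input)
encoded = layers.Bidirectional(layers.LSTM(64))(embedded)  # Bidirectional is still used here
single_relation_encoder = models.Model(token_input, encoded)

# TimeDistributed re-applies the encoder along the relations axis, not a time axis.
per_relation_vectors = layers.TimeDistributed(single_relation_encoder)(sentence_input)
output = layers.TimeDistributed(layers.Dense(5, activation="softmax"))(per_relation_vectors)

model = models.Model(sentence_input, output)
model.summary()  # output shape: (batch, MAX_RELATIONS, 5)
```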

kangbrilliant commented 5 years ago

@daniilsorokin Thank you for your prompt answer, I was too careless. ☺ Another question: in the function def get_entity_indexed_vector(tokens, edge, mode="mark-bi"), I don't quite understand when the marker 4 appears if mode is "mark-bi".

daniilsorokin commented 5 years ago

The token marker 4 in the mark-bi mode was used for some new experiments that we did but that were not included in the paper. The idea was to concatenate more than one sentence together and then mark the tokens in the adjacent sentences, but not in the target sentence, with 4. It is turned off by default for the current models.
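Roughly, the marking could look like the following sketch (a hypothetical illustration, not the repository's get_entity_indexed_vector; the concrete marker values for the target sentence differ in keras_models.py):

```python
# Hypothetical illustration of marker 4: tokens of the target sentence get
# entity-based markers, while tokens of concatenated adjacent sentences are
# all marked with 4.
def mark_concatenated_sentences(target_tokens, adjacent_tokens, entity_positions):
    """Return (token, marker) pairs; marker 4 flags adjacent-sentence tokens."""
    marked = []
    for i, token in enumerate(target_tokens):
        # 1 = entity token, 3 = other target-sentence token (placeholder values;
        # the real mark-bi scheme is defined in the repository code).
        marker = 1 if i in entity_positions else 3
        marked.append((token, marker))
    # Every token from the adjacent (non-target) sentences gets marker 4.
    marked.extend((token, 4) for token in adjacent_tokens)
    return marked

print(mark_concatenated_sentences(
    ["Berlin", "is", "in", "Germany"], ["It", "has", "a", "parliament"], {0, 3}))
```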