dmlc / gluon-nlp

NLP made easy
https://nlp.gluon.ai/
Apache License 2.0

Wrong order of positional parameters in calls to Seq2SeqEncoder #1367

Closed. logogin closed this issue 3 years ago.

logogin commented 3 years ago

Description

Seq2SeqEncoder defines the following signature for __call__:

def __call__(self, inputs, valid_length=None, states=None)

but both child implementations, TransformerEncoder and BERTEncoder, have the signature

def __call__(self, inputs, states=None, valid_length=None)

i.e. the positions of states and valid_length are flipped. Both implementations also call super with positional arguments. The bug is masked by the fact that Seq2SeqEncoder simply passes its parameters through without transformation, so the arguments end up flipped back into their original positions.
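
A minimal, self-contained sketch (not the actual gluon-nlp code, with the class bodies heavily simplified) of how the two flips cancel out:

```python
class Seq2SeqEncoder:
    def __call__(self, inputs, valid_length=None, states=None):
        # The base class forwards its arguments positionally, without using them.
        return self.forward(inputs, valid_length, states)

    def forward(self, inputs, valid_length=None, states=None):
        raise NotImplementedError


class TransformerEncoder(Seq2SeqEncoder):
    # Note the flipped order of `states` and `valid_length` relative to the base class.
    def __call__(self, inputs, states=None, valid_length=None):
        # Positional call to super(): `states` lands in the base class's
        # `valid_length` slot and `valid_length` lands in its `states` slot...
        return super().__call__(inputs, states, valid_length)

    def forward(self, inputs, states=None, valid_length=None):
        # ...but because the base class only passes the values through positionally,
        # they arrive here flipped back into the positions this signature expects.
        return inputs, states, valid_length
```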

sxjscience commented 3 years ago

@logogin I think this should have been solved via https://github.com/dmlc/gluon-nlp/pull/1368, so I closed the issue.