Not a bug if the text model is wrapped as in the qnli example, but something to possibly make users aware of / emphasize:

In lines 390-394 of modelout_functions.TextClassificationModelOutput:
the args tuple passes an ordered set of arguments to the model, and the order of the arguments depends on the model.forward function signature. For example, the forward signature of bert-base in the transformers library is:
The order of args doesn't match the order in the model's forward signature, which leads to a mismatch. In my use case, this mismatch fails quietly, producing features that aren't correct.
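A minimal sketch of this failure mode, using a toy forward function with hypothetical names (not the library code): when the args tuple is built in a different order than the signature, the tensors are swapped without any error being raised.

```python
# Toy forward with the same leading parameter order as BertModel.forward.
def toy_forward(input_ids, attention_mask=None, token_type_ids=None):
    # Return what each parameter actually received, so the binding is visible.
    return {"input_ids": input_ids,
            "attention_mask": attention_mask,
            "token_type_ids": token_type_ids}

ids, mask, types = [101, 2054, 102], [1, 1, 1], [0, 0, 0]

# The caller builds the tuple as (ids, token_type_ids, attention_mask) --
# a plausible ordering that does NOT match the signature.
args = (ids, types, mask)
out = toy_forward(*args)

# No exception: attention_mask silently received the token_type_ids values.
assert out["attention_mask"] == types
assert out["token_type_ids"] == mask
```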
In the example provided for text models, you create a wrapper that correctly sets the order of the arguments. I'm wondering whether there is a more robust way to provide a batch of data to the LLM via a dictionary of data and keyword arguments, which would fix the ordering issue and hopefully line up with the transformers.pipeline framework?
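The dictionary-based approach could be sketched as follows. The wrapper class and names here are hypothetical (mine, not from the library): the batch stays a dict keyed by parameter name, as a Hugging Face tokenizer produces, and is unpacked with **kwargs so binding is by name rather than by position.

```python
# Hypothetical keyword-based wrapper: arguments bind by name,
# so the ordering of the batch dict no longer matters.
class KeywordModelWrapper:
    def __init__(self, model):
        self.model = model

    def __call__(self, batch):
        # batch is a dict shaped like a Hugging Face tokenizer's output:
        # {"input_ids": ..., "attention_mask": ..., "token_type_ids": ...}
        return self.model(**batch)

# Toy stand-in for model.forward, echoing what each parameter received.
def toy_model(input_ids, attention_mask=None, token_type_ids=None):
    return {"input_ids": input_ids,
            "attention_mask": attention_mask,
            "token_type_ids": token_type_ids}

wrapped = KeywordModelWrapper(toy_model)

# Dict insertion order deliberately differs from the signature order;
# keyword binding still routes each entry to the right parameter.
out = wrapped({"token_type_ids": [0, 0],
               "input_ids": [101, 102],
               "attention_mask": [1, 1]})
assert out["attention_mask"] == [1, 1]
assert out["token_type_ids"] == [0, 0]
```

This mirrors the convention transformers.pipeline already follows internally, where tokenizer output dicts are passed to the model as keyword arguments.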