f-lng closed this issue 5 years ago
I haven't tried this, but my first suggestion would be to create a DataLoader that provides both the BERT input and the numerical features as part of X
, and then to create an input module that uses the correct parts of that X
in its forward()
method: BERT operates on the text, your feed-forward NN operates on the numerical features, the two outputs are concatenated and then fed into your second small feed-forward NN, all within forward()
. The output of that procedure would then be the common encoding that the heads of your model use.
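A minimal PyTorch sketch of this wiring, assuming a classification setup. `nn.EmbeddingBag` stands in for the actual BERT encoder here to keep the example self-contained, and all names and dimensions (`TwoBranchModel`, `num_numeric`, etc.) are illustrative, not from any particular library:

```python
import torch
import torch.nn as nn
from torch.utils.data import Dataset, DataLoader

class TextPlusNumericDataset(Dataset):
    """Packs both modalities into each sample, so one DataLoader serves both branches.
    In a real setup the token ids would come from a BERT tokenizer."""
    def __init__(self, token_ids, numeric_feats, labels):
        self.token_ids = token_ids
        self.numeric_feats = numeric_feats
        self.labels = labels

    def __len__(self):
        return len(self.labels)

    def __getitem__(self, i):
        return self.token_ids[i], self.numeric_feats[i], self.labels[i]

class TwoBranchModel(nn.Module):
    def __init__(self, vocab_size=100, text_dim=32, num_numeric=8,
                 hidden=16, num_classes=2):
        super().__init__()
        # Stand-in for BERT: embeds token ids and mean-pools them.
        self.text_encoder = nn.EmbeddingBag(vocab_size, text_dim)
        # Small feed-forward net over the numerical features.
        self.numeric_net = nn.Sequential(nn.Linear(num_numeric, hidden), nn.ReLU())
        # Second small feed-forward net over the concatenated encodings.
        self.combiner = nn.Sequential(nn.Linear(text_dim + hidden, hidden), nn.ReLU())
        # Head consuming the common encoding.
        self.head = nn.Linear(hidden, num_classes)

    def forward(self, token_ids, numeric_feats):
        text_enc = self.text_encoder(token_ids)    # (batch, text_dim)
        num_enc = self.numeric_net(numeric_feats)  # (batch, hidden)
        common = self.combiner(torch.cat([text_enc, num_enc], dim=-1))
        return self.head(common)

ds = TextPlusNumericDataset(torch.randint(0, 100, (8, 10)),  # 8 "sentences", 10 tokens each
                            torch.randn(8, 8),               # 8 numeric feature vectors
                            torch.randint(0, 2, (8,)))
loader = DataLoader(ds, batch_size=4)
model = TwoBranchModel()
tokens, numeric, labels = next(iter(loader))
out = model(tokens, numeric)
print(tuple(out.shape))  # (4, 2): batch of 4, two classes
```

Swapping the placeholder encoder for a real BERT would mean returning the tokenizer's output (ids, attention mask) from the Dataset and pooling BERT's hidden states inside `forward()`; the concatenate-then-combine structure stays the same.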
Thank you very much, I will try this method.
Hello :-)
I wonder if it is possible to train a two-branch multi-modal model with different modalities for the two sets of input features.
It would be raw text on the one hand (which should feed into a BERT model) and a vector of numerical features on the other hand (which should feed into a simple feed-forward network).
So basically the input and body would be different for each branch, and their outputs should be concatenated (and possibly fed into a small feed-forward network as well) before the head.
Best regards, Fabian