Currently, parameters for pretrained embeddings, embedding projection, dropout, and so on are flat parameters to `TextTrainer`. There's also an `embedding_dim` parameter, which is a dict with arbitrary allowed keys. We should fold the flat parameters into this dictionary, so the parameters look something like this: