Closed: bg-uni closed this 1 year ago

Hello, I have a question.
Is there a way to generate a 1D amino acid sequence from the embeddings obtained from `last_hidden_state` in `T5EncoderModel`?
Thanks for your help!

No, there is no direct way to do so. However, it should be relatively easy to add such a module.
I think you should be able to recycle most of the code given in the huggingface examples and replace the BERT/encoder models of the examples with `T5EncoderModel`. You might also need to adjust the mask token (T5 only has those sentinel span tokens, e.g. `<extra_id_0>`).
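One rough way to map encoder embeddings back to tokens is to project each hidden state onto the model's input embedding matrix and take the nearest vocabulary entry (with a real `T5EncoderModel` the tied matrix is `model.shared.weight`). The sketch below illustrates the idea with a toy random embedding table standing in for the model; the sequence, dimensions, and helper names are illustrative assumptions, and real contextual embeddings will not invert this cleanly because the encoder mixes information across positions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for a model's input embedding matrix (vocab_size x d_model).
# With a real T5EncoderModel you would use model.shared.weight instead.
amino_acids = list("ACDEFGHIKLMNPQRSTVWY")  # 20 standard residues
d_model = 256
embed = rng.standard_normal((len(amino_acids), d_model))

def encode(seq):
    """Look up one embedding per residue (stand-in for last_hidden_state)."""
    ids = [amino_acids.index(a) for a in seq]
    return embed[ids]

def decode(hidden):
    """Project hidden states onto the embedding table and take the argmax."""
    logits = hidden @ embed.T            # (seq_len, vocab_size)
    ids = logits.argmax(axis=-1)
    return "".join(amino_acids[i] for i in ids)

seq = "MKTAYIAK"
hidden = encode(seq)
print(decode(hidden))
```

For better reconstructions from real encoder states, the answer above applies: train a small output head (e.g. a linear layer over the vocabulary) on top of the frozen encoder, reusing the masked-LM example code with `T5EncoderModel` swapped in.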