Hi, I wanted to use local models downloaded from Hugging Face, but I haven't seen any issue or anything in the documentation about this. Is this doable right now? You can use a word2vec model saved locally, but can you also use local models with BackTranslation, for example?
For anyone wondering: if you download the model and pass the path to a directory containing both the model and tokenizer files, it works just fine. A minimal sketch is below.
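Here is a rough sketch of what worked for me, assuming nlpaug's `BackTranslationAug` (which forwards the model names to transformers' `from_pretrained`, so a local directory path can stand in for a hub ID). The `facebook/wmt19-*` repo IDs and the `./models/...` paths are just examples, not required values.

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM
import nlpaug.augmenter.word as naw

# One-time download: save model and tokenizer into local directories.
for repo_id, local_dir in [
    ("facebook/wmt19-en-de", "./models/wmt19-en-de"),
    ("facebook/wmt19-de-en", "./models/wmt19-de-en"),
]:
    AutoTokenizer.from_pretrained(repo_id).save_pretrained(local_dir)
    AutoModelForSeq2SeqLM.from_pretrained(repo_id).save_pretrained(local_dir)

# Offline use: point BackTranslationAug at the local directories instead of hub IDs.
aug = naw.BackTranslationAug(
    from_model_name="./models/wmt19-en-de",
    to_model_name="./models/wmt19-de-en",
)
print(aug.augment("The quick brown fox jumps over the lazy dog."))
```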