Open HURIMOZ opened 3 weeks ago

Hi everyone, I want to use 256-dimension-truncated pre-trained embeddings from fastText. I get this error:

Are pre-trained embeddings supported? If so, is word2vec supported?

Here is the relevant part of my config file:

It should in theory be supported, but it has not been extensively retested in quite some time. I think your main issue here is that the embeddings settings should not be in the model's embeddings config, but at the "root" level of the config. (Such things should be made clearer, I agree.) See the corresponding config definitions around here: https://github.com/eole-nlp/eole/blob/5120fdbd06132cd7d16b9fe65384c2affe95b199/eole/config/data.py#L52-L54
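To illustrate the maintainer's point about placement, here is a minimal sketch of a config with the embeddings options at the root level rather than nested under the model section. The key names and file paths below are assumptions for illustration only; verify them against the linked definitions in `eole/config/data.py` before use:

```yaml
# Hypothetical sketch — key names and paths are assumptions,
# not verified against the current eole release.

# Embeddings options at the ROOT of the config, not under `model:`
both_embeddings: path/to/fasttext-256.vec   # hypothetical path
embeddings_type: word2vec                   # fastText .vec files use the word2vec text format

model:
  # ... model settings go here, with NO embeddings keys nested inside
```

Note that fastText's `.vec` export is plain word2vec text format, so a word2vec-style loader can typically read it; the binary `.bin` format is a different matter.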