Since StanfordNLP's pretrained word embeddings appear to already be loaded into memory, this seems to be the easiest way to access them. Another option would be to load them as external vectors into Spacy's vocabulary, but I'm not sure that would offer any advantage, since it would also duplicate the memory.

Note: this PR also updates the Spacy dependency from the 2.1.0 alpha (nightly) to the stable 2.1.0 release.
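For reference, the alternative mentioned above (registering external vectors in Spacy's vocabulary) would look roughly like the sketch below. The token and the 100-dimensional random vector are placeholders, not StanfordNLP's actual embeddings; a real implementation would iterate over the pretrained embedding table instead.

```python
import numpy as np
from spacy.vocab import Vocab

# Hypothetical stand-in for one (token, vector) pair from a
# pretrained embedding table; dimension chosen arbitrarily.
vocab = Vocab()
vec = np.random.rand(100).astype("float32")

# set_vector stores the vector in the vocab's vector table,
# duplicating the memory already held by the source embeddings.
vocab.set_vector("hello", vec)

print(vocab.get_vector("hello").shape)
```

This illustrates the memory-duplication concern: the vectors would exist both in StanfordNLP's model and in Spacy's vector table.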