yongzx opened this issue 4 years ago (status: open)
The issue is as described in the title. I am wondering if this is normal, since English constituency parsing requires far fewer GPU resources (it can be done with less than 12 GB of GPU RAM).
My code (following https://spacy.io/universe/project/self-attentive-parser):

```python
# Download the small German spaCy model first (shell / notebook command):
# !python -m spacy download de_core_news_sm

import spacy
from benepar.spacy_plugin import BeneparComponent
import de_core_news_sm

nlp = de_core_news_sm.load()
nlp.add_pipe(BeneparComponent('benepar_de'))
doc = nlp("Guten Morgen. ")
```