Hyperparticle / udify

A single model that parses Universal Dependencies across 75 languages. Given a sentence, it jointly predicts part-of-speech tags, morphology tags, lemmas, and dependency trees.
https://arxiv.org/abs/1904.02099
MIT License

Multi-GPU training? #9

Closed. antonisa closed this issue 4 years ago.

antonisa commented 4 years ago

Hi, cool work! I was wondering whether the toolkit supports multi-GPU training as-is and, if not, whether there are any plans to support it.

Thanks!

Hyperparticle commented 4 years ago

As of last week, it looks like AllenNLP has added support for distributed training on multiple GPUs. You'll have to install the package from the AllenNLP repo (see the README). I don't have much time to look into it and explain how to set it up, but you can refer here for info on progress with the new multi-GPU features and here for the new changes.

I hope you can set it up using this info without much trouble.
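For illustration, here is a minimal sketch of what the new distributed setup might look like in an AllenNLP training config. This is an assumption on my part based on the new multi-GPU feature (a top-level `distributed` block with `cuda_devices`); the exact key names should be checked against the current AllenNLP docs, and whether udify's existing configs accept it unchanged is untested:

```jsonnet
// Hypothetical excerpt of an AllenNLP training config for multi-GPU runs.
// Everything else (dataset_reader, model, data paths) stays as in the
// existing udify configs; only the "distributed" block is new.
{
  // ... existing udify dataset_reader / model / data path settings ...

  // Assumed behavior: one worker process is spawned per listed CUDA device,
  // and gradients are synchronized across workers each step.
  "distributed": {
    "cuda_devices": [0, 1]
  },

  "trainer": {
    // With "distributed" set, each device processes its own batches, so the
    // effective batch size is roughly multiplied by the number of devices.
    "optimizer": {
      "type": "adam"
    },
    "num_epochs": 80
  }
}
```

If that works, training would then be launched the usual way through the AllenNLP CLI (something like `allennlp train config.jsonnet -s <output_dir> --include-package udify`), but again, I haven't tested this myself, so treat it as a starting point rather than a verified recipe.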

antonisa commented 4 years ago

Sounds good, I'll take a look!