Hyperparticle / udify

A single model that parses Universal Dependencies across 75 languages. Given a sentence, jointly predicts part-of-speech tags, morphology tags, lemmas, and dependency trees.
https://arxiv.org/abs/1904.02099
MIT License

Pretrained proxy unavailable for raw text predictor #19

Open tonytan48 opened 4 years ago

tonytan48 commented 4 years ago

Hi, I'm really impressed by your work. I was trying to use the pretrained multilingual BERT model as a tagger for a downstream task, but I was not able to download the pretrained model at http://hdl.handle.net/11234/1-3042. Is there any other way to obtain it? Thank you!

Hyperparticle commented 4 years ago

Are you having problems downloading the files, or are you having trouble finding them? I can verify that the weights at the bottom of the link download correctly for me. Sometimes it can take a long time for the download to start, but if you try again later it may work.

Thanks for the feedback.

Jivnesh commented 3 years ago

I faced the same problem. I tried downloading from this link, both directly and with this command:

```
curl --remote-name-all https://lindat.mff.cuni.cz/repository/xmlui/bitstream/handle/11234/1-3042{/udify-model.tar.gz,/udify-bert.tar.gz}
```

Either way, the download does not complete; it gets stuck after 50-60 MB. Is there any alternative way to download udify-model.tar.gz and udify-bert.tar.gz?
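One workaround sketch for a download that stalls partway (not an official fix from the maintainers): curl's `-C -` flag resumes from the bytes already on disk, so repeatedly rerunning the same command can finish a transfer that keeps cutting out. The URLs below are the same ones the brace-expanded command above resolves to.

```shell
# Resume partial downloads of the UDify archives.
# -L follows redirects, -C - continues from the existing partial file,
# -O saves under the remote filename. Rerun until both files complete.
curl -L -C - -O "https://lindat.mff.cuni.cz/repository/xmlui/bitstream/handle/11234/1-3042/udify-model.tar.gz"
curl -L -C - -O "https://lindat.mff.cuni.cz/repository/xmlui/bitstream/handle/11234/1-3042/udify-bert.tar.gz"
```

After the downloads finish, comparing the file sizes (or checksums, if the repository publishes them) against those listed on the LINDAT page is a reasonable way to confirm the archives are complete before extracting them.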