thunlp / OpenPrompt

An Open-Source Framework for Prompt-Learning.
https://thunlp.github.io/OpenPrompt/
Apache License 2.0

How to accelerate the downloading of pytorch_model.bin? #181

Closed FelliYang closed 2 years ago

FelliYang commented 2 years ago
[screenshot: very slow download progress for pytorch_model.bin]

Hello, when I try to run the example in the README, I find that the download speed is too slow. I am using Miniconda with Python 3.8 and CUDA 11.1. Are there any ways to speed up the download?

FelliYang commented 2 years ago

I am in China, and I found on the web that Hugging Face has a `mirror='tuna'` option when loading a pretrained model. Can OpenPrompt support this?
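
(For reference, in transformers versions that still shipped the `mirror` argument, it was passed to `from_pretrained` directly; the argument has since been deprecated. A minimal sketch that bypasses OpenPrompt's `load_plm`, which does not appear to expose this option, and uses an example checkpoint name:)

```python
# Minimal sketch: load the pretrained model via transformers directly and
# pass the (now-deprecated) mirror argument. Only works on transformers
# versions that still accept `mirror`.
from transformers import BertModel, BertTokenizer

model_name = "bert-base-cased"  # example checkpoint, adjust as needed
tokenizer = BertTokenizer.from_pretrained(model_name, mirror="tuna")
model = BertModel.from_pretrained(model_name, mirror="tuna")
```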

FelliYang commented 2 years ago

The tuna mirror is not supported anymore; are there any other ways? I tried downloading the pretrained model from the Hugging Face website and then uploading it to my server, but I don't know whether placing it under the `.cache/transformers/` directory will work. If it can work, what file names should I use? The file names in the cache dir seem to be encoded.
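
(One way around the opaque cache file names is to download the checkpoint into a plain local folder and point the loading code at that folder instead of touching the cache. A minimal sketch, assuming the `huggingface_hub` package is available and using a hypothetical target directory:)

```python
# Minimal sketch: fetch all files of a checkpoint with snapshot_download,
# then reuse the returned directory path when loading the model, instead of
# hand-placing files into ~/.cache with hashed names.
from huggingface_hub import snapshot_download

local_dir = snapshot_download(
    repo_id="bert-base-cased",   # example checkpoint
    cache_dir="./models",        # hypothetical local download location
)
print(local_dir)  # pass this path to from_pretrained / load_plm later
```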

FelliYang commented 2 years ago

I have solved this problem: we can download the model into a local directory and then set that path in the original parameters, like this:

[screenshot: code snippet passing the local model directory as the model path]

This works just like using a Hugging Face model name.
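
(For reference, a minimal sketch of this approach with OpenPrompt's `load_plm`, assuming the checkpoint files such as `config.json`, the vocabulary files, and `pytorch_model.bin` have been downloaded into a hypothetical local directory:)

```python
# Minimal sketch: point load_plm's model_path at a local directory containing
# the downloaded checkpoint files instead of a Hub model name.
from openprompt.plms import load_plm

local_path = "./models/bert-base-cased"  # hypothetical local directory
plm, tokenizer, model_config, WrapperClass = load_plm("bert", local_path)
```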