Closed wntg closed 2 years ago
Where can I download the weights for CLIP's image encoder?
Please follow the steps in INSTRUCTION. You have to use the code here to generate the weights.
BTW, if you only want to use the pretrained weights of UniFormerV2, you can set pretrained=False
in the code, so it will skip loading the CLIP pretrained weights.
Thanks for your response! Thanks for your work!
Thanks for trying! As there has been no further activity, I am closing the issue; don't hesitate to reopen it if necessary.
When running extract.ipynb I get the following error — how can I fix it?
CLIP is already installed in my conda environment.
When loading the pretrained model, the CLIP checkpoint has a `.pt` suffix, but your code expects `.pth` — where did you download it? I renamed the suffix and loaded CLIP's `.pt` file, but it still errors later on.

```python
_MODELS = {
    "ViT-B/16": os.path.join(MODEL_PATH, "ViT-B-16.pth"),
    "ViT-L/14": os.path.join(MODEL_PATH, "vit_l14.pth"),
    "ViT-L/14_336": os.path.join(MODEL_PATH, "vit_l14_336.pth"),
}
```
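Renaming the file likely fails because the official CLIP `.pt` download is a TorchScript (JIT) archive, not a plain `state_dict`, so it cannot be read with an ordinary `torch.load`. A minimal sketch of the conversion — using a tiny scripted module as a hypothetical stand-in for the real CLIP checkpoint (the filename `ViT-B-16.pt` here is just for illustration):

```python
import torch

# Hypothetical stand-in: a small scripted module playing the role of the
# downloaded CLIP .pt archive (a TorchScript file, not a raw state_dict).
dummy = torch.jit.script(torch.nn.Linear(4, 4))
torch.jit.save(dummy, "ViT-B-16.pt")  # stands in for the official download

# Renaming .pt to .pth does not change the format. Instead, load the JIT
# archive and re-save its parameters as a plain state_dict.
jit_model = torch.jit.load("ViT-B-16.pt", map_location="cpu")
torch.save(jit_model.state_dict(), "ViT-B-16.pth")

# The .pth file now loads with a plain torch.load as an ordinary state_dict.
state = torch.load("ViT-B-16.pth", map_location="cpu")
print(sorted(state.keys()))
```

For the real checkpoints, the repo's extract script performs this kind of conversion (plus any key renaming it needs), which is why the INSTRUCTION step above is required rather than a simple rename.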