Algolzw / daclip-uir

[ICLR 2024] Controlling Vision-Language Models for Universal Image Restoration. 5th place in the NTIRE 2024 Restore Any Image Model in the Wild Challenge.
https://algolzw.github.io/daclip-uir
MIT License

Pretrained DACLIP in options file #66

Open Lincoln20030413 opened 1 month ago

Lincoln20030413 commented 1 month ago

```
Traceback (most recent call last):
  File "/opt/data/private/daclip-uir-main/universal-image-restoration/config/daclip-sde/train.py", line 356, in <module>
    main()
  File "/opt/data/private/daclip-uir-main/universal-image-restoration/config/daclip-sde/train.py", line 205, in main
    clip_model, preprocess = open_clip.create_model_from_pretrained('daclip_ViT-B-32', pretrained=opt['path']['daclip'])
  File "/opt/data/private/daclip-uir-main/universal-image-restoration/config/daclip-sde/../../open_clip/factory.py", line 379, in create_model_from_pretrained
    model = create_model(
  File "/opt/data/private/daclip-uir-main/universal-image-restoration/config/daclip-sde/../../open_clip/factory.py", line 247, in create_model
    raise RuntimeError(error_str)
RuntimeError: Pretrained weights (pretrained/daclip_ViT-B-32.pt) not found for model daclip_ViT-B-32. Available pretrained tags ([]).
```

Lincoln20030413 commented 1 month ago

Hello author! Thank you for your excellent work. While trying to run the training code, I was a bit confused by the `opt['path']['daclip']` path in the options file. If I want to retrain the whole model from scratch, I should set this path to empty, right?

Lincoln20030413 commented 1 month ago

Why does it still download a model from Hugging Face after I set this path to empty? How do I train the model from scratch?

Lincoln20030413 commented 1 month ago

I think I understand now: first train DA-CLIP in the DA-CLIP folder, then combine the resulting DA-CLIP weights with IR-SDE for training. So it is a two-stage training process, right?

Algolzw commented 1 month ago

Yes, and the pretrained DA-CLIP model can be downloaded from the README.

Lincoln20030413 commented 1 month ago

I have one more question: what exactly is the `laion2b_s34b_b79k` file? Both DA-CLIP and Universal IR mention it, but in Universal IR it is only used when the pretrained path is not provided (an if/else against `daclip_ViT-B-32`), while DA-CLIP passes it directly as the pretrained path. So when training completely from scratch, I should also leave it empty, right?

Algolzw commented 1 month ago

`laion2b_s34b_b79k` is an official pretrained OpenCLIP model (ViT-B-32 trained on LAION-2B); it can be downloaded from Hugging Face. Note that Universal IR only uses the pretrained DA-CLIP, while DA-CLIP training itself starts from the pretrained CLIP weights. (The if/else in the code is leftover from testing and can be ignored.)
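The weight-selection logic described in this exchange can be sketched as a small helper (names are illustrative; the actual logic lives in the repo's training scripts):

```python
def select_clip_weights(daclip_path):
    """Pick which CLIP weights to load, mirroring the if/else
    discussed in this thread (hypothetical helper).

    - If a DA-CLIP checkpoint path is configured, load the custom
      'daclip_ViT-B-32' model from that file.
    - Otherwise fall back to plain ViT-B-32 with the official
      'laion2b_s34b_b79k' OpenCLIP pretrained tag.
    """
    if daclip_path:
        return ("daclip_ViT-B-32", daclip_path)
    return ("ViT-B-32", "laion2b_s34b_b79k")
```

Per the author's reply, only the first branch matters for Universal IR inference; the fallback branch is test leftovers.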

ZG-yuan commented 2 weeks ago

Hello, when I was training DA-CLIP, I found that my loss did not converge: it reached its minimum at about 10 epochs, then increased and stabilized. Is this normal? The figure below shows some metrics from my training run. (My degradation settings are unchanged; I don't know if this has any impact.) Thank you for your reply.

[screenshot: training loss/metric curves]

ZG-yuan commented 2 weeks ago

@Algolzw

Algolzw commented 2 weeks ago

@ZG-yuan Hi! The degradation loss converges more easily, but your training looks fine since the contrastive loss still decreases in the last few epochs.