Algolzw / daclip-uir

[ICLR 2024] Controlling Vision-Language Models for Universal Image Restoration. 5th place in the NTIRE 2024 Restore Any Image Model in the Wild Challenge.
https://algolzw.github.io/daclip-uir
MIT License

Pre-trained weight loading in DA-CLIP #55

Open Trevor-Philips-cbd opened 4 months ago

Trevor-Philips-cbd commented 4 months ago

Hello, could you explain how the pre-trained model loaded during DA-CLIP training was obtained? I noticed that the model “laion2b_s34b_b79k” is used in your training command, but in your README we are asked to download the “daclip_ViT-B-32.pt” file. What is the difference between these two? According to your paper, the pre-trained CLIP weights are loaded here and kept frozen throughout training. I don’t understand the difference between these two sets of weights, and I hope you can explain it. Thank you.

Algolzw commented 4 months ago

Hi! We actually modified the original CLIP model (in the code) with a controller for image degradation. The controller is trained and saved as the weight file "daclip_ViT-B-32.pt". So you need to load our specific weights to handle degraded images.
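
For reference, loading the released checkpoint looks roughly like the sketch below. This follows the usage shown in the repo's README; the `daclip_ViT-B-32` model name only resolves with the modified `open_clip` code bundled in this repo, and the checkpoint path is an assumption:

```python
import open_clip  # must be the DA-CLIP fork shipped with this repo

# Load the modified CLIP (image encoder + degradation controller) from the
# released DA-CLIP checkpoint. The stock OpenCLIP package does not know the
# 'daclip_ViT-B-32' architecture, so this requires the repo's own code.
checkpoint = 'pretrained/daclip_ViT-B-32.pt'  # assumed local path
model, preprocess = open_clip.create_model_from_pretrained(
    'daclip_ViT-B-32', pretrained=checkpoint
)
tokenizer = open_clip.get_tokenizer('ViT-B-32')
```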

Trevor-Philips-cbd commented 4 months ago

Hello, if I want to retrain your controller, what should I do? I saw in your documentation that I need to load the “laion2b_s34b_b79k” weights, but I couldn’t find them. Could you please provide a download link?

Algolzw commented 4 months ago

The training instructions are in this README file. Usually the model weights are downloaded automatically (or you can download them manually from Hugging Face).
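
For what it's worth, with standard OpenCLIP the base weights resolve automatically from the pretrained tag, and the same checkpoint can also be fetched by hand from the Hugging Face Hub. A minimal sketch; the Hub repo ID below is the one LAION publishes for this tag, to the best of my knowledge:

```python
import open_clip

# Downloads and caches the LAION-2B ViT-B/32 weights on first use.
model, _, preprocess = open_clip.create_model_and_transforms(
    'ViT-B-32', pretrained='laion2b_s34b_b79k'
)

# Manual alternative: fetch the checkpoint file from the Hugging Face Hub.
from huggingface_hub import hf_hub_download
path = hf_hub_download(
    repo_id='laion/CLIP-ViT-B-32-laion2B-s34B-b79K',
    filename='open_clip_pytorch_model.bin',
)
```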

Trevor-Philips-cbd commented 4 months ago

Hello, I loaded the CLIP weights according to the instructions, but I get an error saying the weights do not match. I followed the instructions in your project and in OpenCLIP to download the “laion2b_s34b_b79k” weights and load them into the model. I am not sure which step went wrong, and I hope you can give me some guidance.

Algolzw commented 3 months ago

Can you show your test code? Since we changed the OpenCLIP code for image degradation, you need to load the weights with our own DA-CLIP code, as in the evaluate.py script.
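
To illustrate why the mismatch happens: the DA-CLIP checkpoint also contains the controller's parameters, which the stock OpenCLIP architecture does not have, so loading it (or the plain LAION weights) into the wrong code path fails on the state dict. A rough inference sketch with the repo's modified code, following the README usage; the `control=True` argument and the output order are taken from there and may differ slightly from the exact evaluate.py code:

```python
import torch
from PIL import Image
import open_clip  # must be the DA-CLIP fork shipped with this repo

model, preprocess = open_clip.create_model_from_pretrained(
    'daclip_ViT-B-32', pretrained='pretrained/daclip_ViT-B-32.pt'
)
tokenizer = open_clip.get_tokenizer('ViT-B-32')

image = preprocess(Image.open('demo.png')).unsqueeze(0)  # hypothetical input
text = tokenizer(['hazy', 'noisy', 'rainy'])  # a few degradation prompts

with torch.no_grad():
    # control=True routes the image through the degradation controller,
    # returning the content embedding and the degradation embedding.
    image_features, degra_features = model.encode_image(image, control=True)
    text_features = model.encode_text(text)
    degra_features /= degra_features.norm(dim=-1, keepdim=True)
    text_features /= text_features.norm(dim=-1, keepdim=True)
    # Probability of each degradation prompt for the input image.
    probs = (100.0 * degra_features @ text_features.T).softmax(dim=-1)
```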