Closed Asthestarsfalll closed 2 years ago
I noticed that there is a warmup step for CLIP. Could you please tell me what the issue is?
```python
with torch.no_grad():
    self.model(self.encoded_texts["max"][0])  # one warmup step due to issue with CLIP and CUDA
```
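The source doesn't say what the CLIP/CUDA issue is, but warmup calls like this are typically there because the first forward pass triggers one-time work (CUDA kernel compilation, memory allocation, lazy module initialization), which would otherwise skew timing or cause a stall mid-run. A minimal sketch of the pattern, with a hypothetical `LazyModel` standing in for CLIP so it runs without torch or a GPU:

```python
import time

class LazyModel:
    """Mimics a model whose first call pays a one-time setup cost,
    analogous to CUDA kernel compilation / lazy init on a first forward pass."""
    def __init__(self):
        self._initialized = False

    def __call__(self, x):
        if not self._initialized:
            time.sleep(0.05)  # stand-in for expensive one-time setup
            self._initialized = True
        return [v * 2 for v in x]  # stand-in for the real forward pass

model = LazyModel()

# Warmup: pay the one-time cost up front, as the snippet above does for CLIP.
model([0.0])

# Subsequent calls skip the setup and reflect steady-state latency.
t0 = time.perf_counter()
out = model([1.0, 2.0])
elapsed = time.perf_counter() - t0

print(out)
print(elapsed < 0.05)
```

With the warmup in place, later calls measure only the forward pass, not the initialization.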