Open · OLDDDDDDD opened this issue 1 year ago
This code:

```python
unet, optimizer, train_dataloader, lr_scheduler = accelerator.prepare(
    unet, optimizer, train_dataloader, lr_scheduler
)
```

raises the following error:

```
ValueError: Using `torch.compile` requires PyTorch 2.0 or higher.
```

But requirements.txt pins torch == 1.12.1.
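A likely explanation, sketched under assumptions: `Accelerator.prepare` attempts to enable `torch.compile` when a TorchDynamo backend is configured (e.g. via `accelerate config`), and `torch.compile` only exists from PyTorch 2.0 onward, while requirements.txt pins torch 1.12.1. Either upgrading torch to >= 2.0 or setting the dynamo backend to "no" in the Accelerate config should avoid the error. A minimal, stdlib-only check of whether a given version string meets the `torch.compile` requirement (the helper name `supports_torch_compile` is hypothetical, for illustration only):

```python
def supports_torch_compile(torch_version: str) -> bool:
    """Return True if the torch version string is >= 2.0,
    the minimum required for torch.compile."""
    # Strip any local build suffix such as "+cu117" before parsing.
    numeric = torch_version.split("+")[0]
    major = int(numeric.split(".")[0])
    return major >= 2

# The version pinned in requirements.txt fails the check:
print(supports_torch_compile("1.12.1"))  # False
print(supports_torch_compile("2.0.0"))   # True
```

Running this with the pinned version confirms the mismatch that produces the `ValueError`.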