Open badreddinemerabet opened 6 months ago
Try https://github.com/mit-han-lab/torchquantum/tree/main/examples/quanvolution (with #147 to increase accuracy); I’ll ask about updating the link!
Thank you, it worked.
```python
from torch.utils.data import DataLoader, random_split
from torchvision import datasets, transforms

transform = transforms.Compose([
    transforms.Grayscale(num_output_channels=1),
    transforms.Resize((28, 28)),
    transforms.ToTensor(),
    transforms.Normalize((0.5,), (0.5,)),
])

data_dir = 'OAM28'  # Replace with the path to your dataset
full_dataset = datasets.ImageFolder(root=data_dir, transform=transform)

train_size = int(0.9 * len(full_dataset))
test_size = len(full_dataset) - train_size
train_dataset, test_dataset = random_split(full_dataset, [train_size, test_size])

batch_size = 16
train_loader = DataLoader(train_dataset, batch_size=batch_size, shuffle=True,
                          num_workers=8, pin_memory=True)
test_loader = DataLoader(test_dataset, batch_size=batch_size, shuffle=False,
                         num_workers=8, pin_memory=True)

dataflow = {'train': train_loader, 'test': test_loader}
```
Hello, do you have a follow-up? Happy to help, but I'm a bit unsure what you are asking!
When I tried to run the example from the YouTube video (https://www.youtube.com/watch?v=-Grfxkg3-DI), I got this error in the training cell at Epoch 1:
```
RuntimeError                              Traceback (most recent call last)
<ipython-input> in <cell line: 46>()
     47 # train
     48 print(f"Epoch {epoch}:")
---> 49 train(dataflow, model, device, optimizer)
     50 print(optimizer.param_groups[0]['lr'])
     51

13 frames
/content/torchquantum/torchquantum/functional/gate_wrapper.py in apply_unitary_bmm(state, mat, wires)
    129     if len(mat.shape) > 2:
    130         # both matrix and state are in batch mode
--> 131         new_state = mat.bmm(permuted)
    132     else:
    133         # matrix no batch, state in batch mode

RuntimeError: Expected size for first two dimensions of batch2 tensor to be: [10, 2] but got: [1, 2].
```
I didn't change anything.