Hello,

I tried to run a posteriori learning using your code. Could you give some hints on how to build the `DynamicalDataset` and its related dataloader? The constructor looks like this:

```python
class DynamicalDataset(torch.utils.data.Dataset):
    def __init__(self, inputs, labels, steps, iters, dt, t0):
        self.inputs = inputs
        self.labels = labels
        self.iters = iters
```

More specifically: what are the shapes of `inputs` and `labels` at `__init__`, and is `iters` equal to `len(inputs)`?

For a priori learning, one example of data from the dataloader would have shape `[batch_size, 2, Ny, Nx]`.

Thank you.
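In case it helps to make the question concrete, here is a minimal, self-contained sketch of what I currently assume. The tensor shapes, the meaning of `steps` and `iters`, the dummy `dt`/`t0` values, and the `__len__`/`__getitem__` bodies are my guesses rather than your actual implementation, so please correct whatever is wrong:

```python
import torch

class DynamicalDataset(torch.utils.data.Dataset):
    """My assumed layout (not necessarily yours):
    inputs: [N, 2, Ny, Nx]         -- N initial states, 2 channels
    labels: [N, steps, 2, Ny, Nx]  -- for each initial state, the next `steps` snapshots
    """
    def __init__(self, inputs, labels, steps, iters, dt, t0):
        self.inputs = inputs
        self.labels = labels
        self.steps = steps   # rollout length per sample (assumption)
        self.iters = iters   # solver iterations between saved snapshots? (assumption)
        self.dt = dt
        self.t0 = t0

    def __len__(self):
        # one sample per initial condition (assumption)
        return self.inputs.shape[0]

    def __getitem__(self, idx):
        # an initial condition and its target trajectory
        return self.inputs[idx], self.labels[idx]

# Usage sketch with random data and made-up sizes:
N, Ny, Nx, steps = 16, 64, 64, 10
inputs = torch.randn(N, 2, Ny, Nx)
labels = torch.randn(N, steps, 2, Ny, Nx)
dataset = DynamicalDataset(inputs, labels, steps=steps, iters=1, dt=1.0, t0=0.0)
loader = torch.utils.data.DataLoader(dataset, batch_size=4, shuffle=True)
x, y = next(iter(loader))
print(x.shape, y.shape)  # [4, 2, 64, 64] and [4, 10, 2, 64, 64]
```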