Closed yaseryacoob closed 3 years ago
I think your issue can be solved. I had the same question myself, but then I found what may be a reasonable solution.
If you look at the code in the dataio.py
source file, specifically the implementation of the def get_mgrid(sidelen, dim=2) function, you can see that its sidelen
argument is simply the sidelength
argument passed to class Implicit2DWrapper(torch.utils.data.Dataset) through its def __init__(self, dataset, sidelength=None, compute_diff=None)
constructor.
So, reading the first lines of that class implementation, you can see that sidelength
can be either an integer, if you provide a square image, or a list or tuple describing the width and height of your input image, which may therefore be rectangular.
I've tried it myself, and it seems to work.
Otherwise, you should rewrite get_mgrid,
which in my opinion was simplified in its implementation within explore_siren.ipynb, so that sidelen can also be a list or tuple and not just an integer.
In other words, you should make the get_mgrid
function in explore_siren.ipynb as similar as possible to the function of the same name in the dataio.py source file.
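For reference, here is a minimal sketch of a get_mgrid that accepts either an int or a per-axis tuple. This is an illustration of the idea, not the repo's exact implementation:

```python
import torch

def get_mgrid(sidelen, dim=2):
    """Flattened grid of coordinates in [-1, 1]^dim.

    Sketch only: `sidelen` may be an int (square grid) or a
    tuple/list with one length per dimension, e.g. (height, width).
    """
    if isinstance(sidelen, int):
        sidelen = dim * (sidelen,)
    # One linspace per axis, then the cartesian grid over all axes.
    axes = [torch.linspace(-1, 1, steps=n) for n in sidelen]
    grid = torch.stack(torch.meshgrid(*axes, indexing="ij"), dim=-1)
    return grid.reshape(-1, dim)
```

With this, get_mgrid(256) behaves as before, while get_mgrid((480, 640)) produces a 480*640 x 2 coordinate grid for a rectangular image.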
If I'm wrong, let me know, but so far I hope this description is helpful for your issue.
Many thanks, I followed your suggestion:
I replaced the get_mgrid in explore_siren.py with the one defined in dataio.py.
I made sure to remove the Resize from the Compose that creates the transform.
I had to flip the image dimensions (given how I load the images).
All worked well.
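The flip in the last step likely comes from the two size conventions in play: PIL's Image.size reports (width, height), while a coordinate grid wants per-axis lengths in row-major order, i.e. (height, width). A small illustration (the tensor shapes here are made up for the example):

```python
import torch

# An image tensor in PyTorch is (C, H, W); PIL's Image.size reports
# (W, H). One of the two orderings must be reversed before building
# the coordinate grid, hence the "flip" mentioned above.
img = torch.rand(3, 480, 640)                 # C, H, W
pil_style_size = (640, 480)                   # W, H, as PIL would report
sidelength = tuple(reversed(pil_style_size))  # (480, 640) == (H, W)
assert sidelength == tuple(img.shape[1:])
```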
I noticed in all the examples, and looking at the code, that images and videos have equal row and column counts (e.g., 512x512). I am considering arbitrarily sized images (without resizing); how do I do that? Thanks