tiny-smart / dysample

(ICCV'23) Learning to Upsample by Learning to Sample
MIT License

Hi, I am getting the following error after I introduced DySample to the network and would like to hear from you. #8

Closed MMagicLoren closed 2 months ago

MMagicLoren commented 5 months ago

```
TracerWarning: torch.tensor results are registered as constants in the trace. You can safely ignore this warning if you use this function to create tensors out of constant variables that would be the same every time you call this function. In any other case, this might cause the trace to be incorrect.
  normalizer = torch.tensor([W, H], dtype=x.dtype, device=x.device).view(1, 2, 1, 1, 1)
```

```
[00:00<?, ?it/s]D:\SoftWare\Python38\lib\site-packages\torch\autograd\__init__.py:204: UserWarning: grid_sampler_2d_backward_cuda does not have a deterministic implementation, but you set 'torch.use_deterministic_algorithms(True, warn_only=True)'. You can file an issue at https://github.com/pytorch/pytorch/issues to help us prioritize adding deterministic support for this operation. (Triggered internally at C:\actions-runner_work\pytorch\pytorch\builder\windows\pytorch\aten\src\ATen\Context.cpp:75.)
  Variable._execution_engine.run_backward(  # Calls into the C++ engine to run the backward pass
```

poppuppy commented 5 months ago

It seems the warning is caused by `torch.tensor()`. You can try dividing by the scale directly instead of building a normalizer tensor. For example,

```python
coords = 2 * (coords + offset)
coords[:, 0, ...] = coords[:, 0, ...] / W
coords[:, 1, ...] = coords[:, 1, ...] / H
coords = coords - 1
```

may help.
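To illustrate the suggestion, here is a minimal sketch (not code from the repo) comparing the original normalizer-tensor approach with the direct per-channel division, assuming `coords` has shape `(B, 2, G, H, W)` with channel 0 along x and channel 1 along y, matching the `view(1, 2, 1, 1, 1)` in the warning. The two produce identical values; the second just avoids creating a tensor inside the traced function:

```python
import torch

def normalize_with_tensor(coords, offset, W, H):
    # Original style: builds a fresh constant tensor on every call, which
    # torch.jit.trace records as a baked-in constant (hence the TracerWarning).
    normalizer = torch.tensor([W, H], dtype=coords.dtype,
                              device=coords.device).view(1, 2, 1, 1, 1)
    return 2 * (coords + offset) / normalizer - 1

def normalize_direct(coords, offset, W, H):
    # Suggested fix: divide each channel by a Python scalar directly,
    # so no intermediate tensor is created during tracing.
    coords = 2 * (coords + offset)
    coords[:, 0, ...] = coords[:, 0, ...] / W
    coords[:, 1, ...] = coords[:, 1, ...] / H
    return coords - 1

# Hypothetical shapes for illustration only.
coords = torch.rand(1, 2, 1, 4, 4)
offset = torch.rand(1, 2, 1, 4, 4)
same = torch.allclose(normalize_with_tensor(coords, offset, 4, 4),
                      normalize_direct(coords, offset, 4, 4))
```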

MMagicLoren commented 5 months ago

Thank you for your reply. Two of the above warnings have been resolved; the other was resolved by adding the following line of code to `forward`:

```python
def forward(self, x):
    torch.use_deterministic_algorithms(False)
    if self.style == 'pl':
        return self.forward_pl(x)
    return self.forward_lp(x)
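One caveat with that workaround: it turns deterministic algorithms off globally for the rest of the run. A less invasive variant (a hypothetical helper, not part of the DySample repo) restores the previous setting once the call finishes:

```python
import torch

def run_without_determinism(fn, *args):
    # Temporarily lift the deterministic-algorithms requirement for one call
    # (grid_sample's CUDA backward has no deterministic implementation),
    # then restore whatever mode was active before.
    prev = torch.are_deterministic_algorithms_enabled()
    prev_warn = torch.is_deterministic_algorithms_warn_only_enabled()
    torch.use_deterministic_algorithms(False)
    try:
        return fn(*args)
    finally:
        torch.use_deterministic_algorithms(prev, warn_only=prev_warn)
```

Inside `forward` this would be used as, e.g., `return run_without_determinism(self.forward_lp, x)`, so code outside DySample still runs with the user's chosen determinism setting.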

poppuppy commented 5 months ago

Thank you. Your solution may help others, too.