Hi, I used test_warpingSO3.lua to achieve three-dimensional rotation of an image. Next I intend to implement scale transformation and translation of the image. I used the following code:
require 'image'
require 'nn'
require 'torch'
dofile('imagewarpingSO3.lua')
x = image.loadPNG('linen1.png')
input = torch.Tensor(1,1,240,320)
input[1] = x
input = nn.Transpose({2,3},{3,4}):forward(input)
r = torch.Tensor(1,6):zero()
r[1][1] = 1
r[1][5] = 1
out_r = nn.AffineTransformMatrixGenerator(true, true, true):forward(r)
out_grid = nn.AffineGridGeneratorBHWD(240, 320):forward(out_r)
t = {input, out_grid}
out_img = nn.BilinearSamplerBHWD():forward(t)
out_img = nn.Transpose({3,4},{2,3}):forward(out_img)
out_img = out_img[1]
image.display(out_img)
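For reference, here is my understanding of what the grid generator and bilinear sampler should compute. This is only a NumPy sketch of the BHWD convention as I understand it (normalized coordinates in [-1, 1], theta as a row-major 2x3 affine matrix, so the identity is r[1][1] = 1 and r[1][5] = 1), not the actual module code:

```python
import numpy as np

def affine_grid(theta, H, W):
    # Normalized (y, x) coordinates in [-1, 1], as I assume
    # AffineGridGeneratorBHWD produces them.
    ys = np.linspace(-1, 1, H)
    xs = np.linspace(-1, 1, W)
    yy, xx = np.meshgrid(ys, xs, indexing='ij')        # each H x W
    coords = np.stack([yy, xx, np.ones((H, W))], -1)   # H x W x 3
    return coords @ theta.T                            # H x W x 2 sample locations

def bilinear_sample(img, grid):
    H, W = img.shape
    # Map normalized coordinates back to pixel indices.
    y = (grid[..., 0] + 1) * (H - 1) / 2
    x = (grid[..., 1] + 1) * (W - 1) / 2
    y0 = np.clip(np.floor(y).astype(int), 0, H - 1)
    x0 = np.clip(np.floor(x).astype(int), 0, W - 1)
    y1 = np.clip(y0 + 1, 0, H - 1)
    x1 = np.clip(x0 + 1, 0, W - 1)
    wy = y - np.floor(y)
    wx = x - np.floor(x)
    # Standard bilinear interpolation of the four neighbors.
    return (img[y0, x0] * (1 - wy) * (1 - wx) +
            img[y1, x0] * wy * (1 - wx) +
            img[y0, x1] * (1 - wy) * wx +
            img[y1, x1] * wy * wx)

# Identity affine in row-major order: parameters 1 and 5 are 1, rest 0.
theta = np.array([[1.0, 0.0, 0.0],
                  [0.0, 1.0, 0.0]])
img = np.arange(12, dtype=float).reshape(3, 4)
out = bilinear_sample(img, affine_grid(theta, 3, 4))
assert np.allclose(out, img)  # identity transform reproduces the image
```

With this layout, scaling would go into the diagonal entries and translation into the third column of theta. If the real modules use a different parameter ordering or count, that could be where my output goes wrong.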
But I got a bad image and I don't know where it went wrong. Thank you!