Hello,

I was trying to generate the same angles as in the ground truth dataset (so 72 images in total) for PSNR calculation with render_xray_G.py, but the generated images seem a bit shifted in the rotation angle.
As far as I understand, I should keep N_poses = 1 if I am using only one target image, but then I use 72 (or another number if I want a different number of frames) in lines 47, 50 and 75:

```python
phi_rot = min(int(range_phi[1] - range_phi[0]), 72)  # at least 1 frame per degree
zrot = z[0].clone().unsqueeze(1).expand(-1, 72, -1).flatten(0, 1)
reshape = lambda x: x.view(N_samples, 72, *x.shape[1:])
```
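For context, the expand/flatten on line 50 just repeats the single latent code once per rendered frame. A minimal NumPy sketch of the same shape manipulation (the latent size of 256 is an assumption):

```python
import numpy as np

N_samples, N_frames, z_dim = 1, 72, 256  # z_dim = 256 is an assumed latent size

z = np.random.randn(N_samples, z_dim)

# Equivalent of z.unsqueeze(1).expand(-1, 72, -1).flatten(0, 1):
# insert a frame axis, tile the latent across it, then merge the sample
# and frame axes so the generator sees one latent per rendered pose.
zrot = np.broadcast_to(z[:, None, :], (N_samples, N_frames, z_dim)).reshape(-1, z_dim)

print(zrot.shape)  # (72, 256): one identical latent per pose
```

So the 72 here only controls how many copies of the latent are made; the pose angles themselves come from get_render_poses().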
My target image is 01_xray0000.png, and the angles look ok for get_render_poses() at line 49 because I get the desired angle values:

```python
np.linspace(angle_range[0], angle_range[1], N + 1)[:-1] = array([ 0., 5., 10., 15., 20., 25., 30., 35., 40., 45., 50., 55., 60., 65., 70., 75., .... 300., 305., 310., 315., 320., 325., 330., 335., 340., 345., 350., 355.])
```
I picked 6 images to show the shift (image indexes 0, 15, 30, 45, 60 and 71). These should correspond to angles 0°, 75°, 150°, 225°, 300° and 355°.
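As a sanity check, the index-to-angle mapping can be reproduced directly (assuming the [0, 360) range and N = 72 from above):

```python
import numpy as np

# Same pose angles as the linspace in get_render_poses: 72 frames, 5 degrees apart.
N = 72
angles = np.linspace(0.0, 360.0, N + 1)[:-1]

for idx in [0, 15, 30, 45, 60, 71]:
    print(f"index {idx:2d} -> {angles[idx]:.0f} deg")
# index  0 ->   0 deg
# index 15 ->  75 deg
# index 30 -> 150 deg
# index 45 -> 225 deg
# index 60 -> 300 deg
# index 71 -> 355 deg
```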
I used the pretrained model from this repository to generate these images, but as you can see the angles don't correspond.
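For reference, the PSNR I am computing between each ground-truth/generated pair is the standard one. A minimal sketch (the 0–255 image range is an assumption; image loading is omitted):

```python
import numpy as np

def psnr(gt: np.ndarray, pred: np.ndarray, max_val: float = 255.0) -> float:
    """Peak signal-to-noise ratio between two images of the same shape."""
    mse = np.mean((gt.astype(np.float64) - pred.astype(np.float64)) ** 2)
    if mse == 0.0:
        return float("inf")  # identical images
    return 10.0 * np.log10(max_val ** 2 / mse)

# Toy example: a uniform pixel error of 16 gives MSE = 256.
gt = np.zeros((128, 128), dtype=np.uint8)
pred = np.full((128, 128), 16, dtype=np.uint8)
print(round(psnr(gt, pred), 2))  # 24.05
```

Of course, if the rendered angles are shifted relative to the ground truth, each pair compares different views and the PSNR is misleadingly low.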
And for the chest dataset it is similar. The ground truth...
And the generated images...
Thank you for your advice!