jzbontar / mc-cnn

Stereo Matching by Training a Convolutional Neural Network to Compare Image Patches
BSD 2-Clause "Simplified" License
707 stars 232 forks

Trying to resize storage that is not resizable #37

Open GoodStudyDayUpUp opened 7 years ago

GoodStudyDayUpUp commented 7 years ago

Hi everyone,

I'm trying to run the bin2png.lua script from https://github.com/jzbontar/mc-cnn (I am trying to transform left.bin and right.bin into .png files that can be displayed), and it produces the following error:

```
$ ...luajit samples/bin2png.lua
Writing left.png
luajit: ...ocal/torch_update/install/share/lua/5.1/torch/Tensor.lua:462: Trying to resize storage that is not resizable at /usr/local/torch_update/pkg/torch/lib/TH/generic/THStorage.c:183
stack traceback:
        [C]: in function 'set'
        ...ocal/torch_update/install/share/lua/5.1/torch/Tensor.lua:462: in function 'view'
        samples/bin2png.lua:9: in main chunk
        [C]: at 0x00406670
```

Does anyone know what is wrong? I found someone who said:

"""This happens because of this commit https://github.com/torch/torch7/pull/389 I think that the author of the mc-cnn code should update his normalize script to take into account this change in torch"""

If that is the reason, can I update the normalize script (Normalize2.lua?) myself to make the code work? How?
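For context, the torch7 pull request above made file-backed storages non-resizable, so constructing a tensor straight from a file and then calling :view() on it fails; the workaround is to allocate a storage of the right size first and read the raw floats into it. The same flat-buffer-then-reshape idea, sketched in Python/NumPy (the file path and the small dimensions are made up for illustration; the real script uses e.g. d=70, h=370, w=1226):

```python
import os
import tempfile
import numpy as np

# Small made-up dimensions so the demo file stays tiny.
d, h, w = 4, 3, 5
path = os.path.join(tempfile.mkdtemp(), 'left_demo.bin')

# Write a dummy left.bin-style file: d*h*w raw float32 values, no header.
np.arange(d * h * w, dtype=np.float32).tofile(path)

# Preallocate-and-read, then reshape: the analogue of creating a
# FloatStorage of size s, readFloat()-ing the file into it, and
# calling :view(1, d, h, w) on the resulting tensor.
buf = np.fromfile(path, dtype=np.float32, count=d * h * w)
left = buf.reshape(1, d, h, w)
```

The key point is that the buffer is sized up front from known dimensions, rather than asking the file-backed storage to resize itself.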

I am new to machine learning and GPU programming! I use Ubuntu 16.04 and a server (on our intranet) with a GeForce GTX TITAN installed. I appreciate any hints! ^_^

BroadDong commented 7 years ago

I have also run into the same issue. Did you solve it?

rohanchabra commented 7 years ago

I guess the newer updates to Torch cause these issues. A fix for bin2png.lua:9:

```lua
s = d * h * w

left = torch.FloatTensor(torch.FloatStorage(s))
torch.DiskFile('left.bin', 'r'):binary():readFloat(left:storage())
```

You might have to do a similar thing in the fromfile(fname) function in main.lua if you want to train the network. Here is my fix:

```lua
function fromfile(fname)
   local file = io.open(fname .. '.dim')
   local dim = {}
   for line in file:lines() do
      table.insert(dim, tonumber(line))
   end
   if #dim == 1 and dim[1] == 0 then
      return torch.Tensor()
   end

   local file = io.open(fname .. '.type')
   local type = file:read('*all')

   local d = torch.LongStorage(dim)

   local s = 1
   for i = 1, d:size() do
      s = s * d[i]
   end

   local x
   if type == 'float32' then
      --x = torch.FloatTensor(torch.FloatStorage(fname))
      x = torch.FloatTensor(torch.FloatStorage(s))
      torch.DiskFile(fname, 'r'):binary():readFloat(x:storage())
   elseif type == 'int32' then
      --x = torch.IntTensor(torch.IntStorage(fname))
      x = torch.IntTensor(torch.IntStorage(s))
      torch.DiskFile(fname, 'r'):binary():readInt(x:storage())
   elseif type == 'int64' then
      --x = torch.LongTensor(torch.LongStorage(fname))
      x = torch.LongTensor(torch.LongStorage(s))
      torch.DiskFile(fname, 'r'):binary():readLong(x:storage())
   else
      print(fname, type)
      assert(false)
   end

   --x = x:reshape(torch.LongStorage(dim))
   x = x:reshape(d)
   return x
end
```
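The fix above relies on the sidecar files the repo writes next to each .bin: fname.dim holds one dimension per line and fname.type names the element type. A minimal reader for that layout, sketched in Python/NumPy (the demo file and its contents are made up; only the .dim/.type convention comes from the code above):

```python
import os
import tempfile
import numpy as np

def fromfile(fname):
    """Read an mc-cnn style raw .bin given its .dim/.type sidecar files."""
    with open(fname + '.dim') as f:
        dim = [int(line) for line in f if line.strip()]
    if dim == [0]:
        return np.empty(0, dtype=np.float32)
    with open(fname + '.type') as f:
        dtype = {'float32': np.float32,
                 'int32': np.int32,
                 'int64': np.int64}[f.read().strip()]
    return np.fromfile(fname, dtype=dtype,
                       count=int(np.prod(dim))).reshape(dim)

# Demo with a tiny made-up file and its sidecars.
base = os.path.join(tempfile.mkdtemp(), 'demo.bin')
np.arange(6, dtype=np.float32).reshape(2, 3).tofile(base)
with open(base + '.dim', 'w') as f:
    f.write('2\n3\n')
with open(base + '.type', 'w') as f:
    f.write('float32')

x = fromfile(base)
```

As in the Lua fix, the element count is computed from the .dim file so the storage can be sized before reading.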

GoodStudyDayUpUp commented 7 years ago

@BroadDong Thanks for your reply; unfortunately, I haven't solved the problem yet. Following @rohanchabra's suggestion, I changed the code. However, the following problem appeared when I tried to run bin2png.lua:

```
luajit: samples/bin2png.lua:15: unexpected symbol near '='
```

So I added a "_" before the comma in line 15 of bin2png.lua, to restore the line as it originally was (i.e., `_, left = left:min(2)`), but another problem happened:

```
Writing left.png
luajit: samples/bin2png.lua:16: attempt to index global 'left_' (a nil value)
stack traceback:
        samples/bin2png.lua:16: in main chunk
        [C]: at 0x00406670
```

GoodStudyDayUpUp commented 7 years ago

@rohanchabra Thanks a lot for your suggestion. I changed the code. However, the following problem appeared when I tried to run bin2png.lua:

```
luajit: samples/bin2png.lua:15: unexpected symbol near '='
```

So I added a "_" before the comma in line 15 of bin2png.lua, to restore the line as it originally was (i.e., `_, left = left:min(2)`), but another problem happened:

```
Writing left.png
luajit: samples/bin2png.lua:16: attempt to index global 'left_' (a nil value)
stack traceback:
        samples/bin2png.lua:16: in main chunk
        [C]: at 0x00406670
```

rohanchabra commented 7 years ago

@GoodStudyDayUpUp Sorry, I might not have given you the whole code that I updated.

```lua
s = d * h * w

print('Writing left.png')
left = torch.FloatTensor(torch.FloatStorage(s))
torch.DiskFile('left.bin', 'r'):binary():readFloat(left:storage())
left = left:view(1, d, h, w):cuda()
```

The rest should be the same. Maybe this will help.

GoodStudyDayUpUp commented 7 years ago

@rohanchabra Thanks! Actually, you did give me the whole code last time. (I changed both main.lua and bin2png.lua according to your suggestion.) As for main.lua, I made it 100% the same as yours:

```lua
function fromfile(fname)
   local file = io.open(fname .. '.dim')
   local dim = {}
   for line in file:lines() do
      table.insert(dim, tonumber(line))
   end
   if #dim == 1 and dim[1] == 0 then
      return torch.Tensor()
   end

   local file = io.open(fname .. '.type')
   local type = file:read('*all')

   local d = torch.LongStorage(dim)

   local s = 1
   for i = 1, d:size() do
      s = s * d[i]
   end

   local x
   if type == 'float32' then
      --x = torch.FloatTensor(torch.FloatStorage(fname))
      x = torch.FloatTensor(torch.FloatStorage(s))
      torch.DiskFile(fname, 'r'):binary():readFloat(x:storage())
   elseif type == 'int32' then
      --x = torch.IntTensor(torch.IntStorage(fname))
      x = torch.IntTensor(torch.IntStorage(s))
      torch.DiskFile(fname, 'r'):binary():readInt(x:storage())
   elseif type == 'int64' then
      --x = torch.LongTensor(torch.LongStorage(fname))
      x = torch.LongTensor(torch.LongStorage(s))
      torch.DiskFile(fname, 'r'):binary():readLong(x:storage())
   else
      print(fname, type)
      assert(false)
   end

   --x = x:reshape(torch.LongStorage(dim))
   x = x:reshape(d)
   return x
end
```

Then I revised bin2png.lua as (this is what I actually ran, copied exactly):

```lua
require 'cutorch'
require 'image'
require 'torch'

d = 70   --48
h = 370  --512
w = 1226 --612

s = d * h * w

print('Writing left.png')
left = torch.FloatTensor(torch.FloatStorage(s))
torch.DiskFile('left.bin', 'r'):binary():readFloat(left:storage())
left = left:view(1, d, h, w):cuda()
, left = left:min(2)
image.save('left.png', left_[1]:float():div(d))

print('Writing right.png')
right = torch.FloatTensor(torch.FloatStorage(s))
torch.DiskFile('right.bin', 'r'):binary():readFloat(right:storage())
right = right:view(1, d, h, w):cuda()
, right = right:min(2)
image.save('right.png', right_[1]:float():div(d))

print('Writing disp.png')
disp = torch.FloatTensor(torch.FloatStorage(h * w))
torch.DiskFile('disp.bin', 'r'):binary():readFloat(disp:storage())
disp = disp:view(1, 1, h, w)
image.save('disp.png', disp[1]:div(d))
```

THE PROBLEM HAPPENED:

```
luajit: samples/bin2png.lua:15: unexpected symbol near '='
```

So I added a "_" myself before the comma in lines 15 and 22, changing them to:

```lua
_, left = left:min(2)
_, right = right:min(2)
```

THEN THE PROBLEM BECAME:

```
Writing left.png
luajit: samples/bin2png.lua:16: attempt to index global 'left_' (a nil value)
stack traceback:
        samples/bin2png.lua:16: in main chunk
        [C]: at 0x00406670
```

I don't know why this error happens on my server.
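An aside on what lines 15 and 16 are doing, which may explain the nil value: in torch7, min(2) returns two tensors, the minimum cost over the disparity dimension and the index (argmin) where it occurs; line 16 expects that index tensor in left_, so assigning the second return value to left instead leaves left_ nil. The argmin-over-disparity step, sketched in Python/NumPy with a tiny made-up cost volume:

```python
import numpy as np

# Tiny made-up cost volume: 1 x d x h x w, lower cost = better match.
d, h, w = 4, 2, 3
cost = np.random.default_rng(0).random((1, d, h, w)).astype(np.float32)

# torch's `_, left_ = left:min(2)` keeps the index of the minimum along
# the disparity axis (axis 1 here); NumPy's argmin is the 0-based analogue.
disp = cost.argmin(axis=1)                 # shape (1, h, w)

# bin2png.lua then saves left_[1]:div(d), i.e. disparity scaled into [0, 1]
# so it can be written out as a grayscale png.
img = disp[0].astype(np.float32) / d
```

So the saved left.png is really a disparity map: per-pixel argmin over the cost volume, normalized by the disparity range.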

jzbontar commented 7 years ago

Hey guys, can you try again? The bin2png.lua script broke when a newer version of torch came out. It should work now.

GoodStudyDayUpUp commented 7 years ago

@jzbontar Hi Jure Zbontar,

Thanks a lot for your update! I have written my own code to transform bin to png, but I will try your updated code later. Now I have met an interesting problem: when I use the Middlebury nets (I have tried both net_mbfast-a_train_all and net_mbslow-a_train_all), the left side of the disparity map is totally white. (I have checked disp.bin: the first several columns all have the same value, which is the largest in the whole binary file, and my own bin2png code assigns 255 to these pixels for suitable display in the black-and-white range.) The more interesting thing is: the number of white columns on the left is exactly disp_max - 1 (disp_max is the disparity range you defined, e.g. ./main.lua kitti fast -a predict -net_fname net/net_kittifast-a_train_all.t7 -left samples/input/kittiL.png -right samples/input/kittiR.png -disp_max 70). Do you know what is wrong?
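One possible explanation for the band width (my reading, not confirmed in this thread): a pixel at column x in the left image is compared against right-image columns x - disparity, so every column with x below the largest disparity has at least one candidate match that falls outside the right image, and the costs there are unreliable. A toy NumPy check that exactly disp_max - 1 columns are affected (all numbers made up):

```python
import numpy as np

disp_max, w = 5, 8   # made-up disparity range and image width

# out_of_range[disp, x] is True when left column x at disparity disp
# would look at right-image column x - disp < 0, i.e. outside the image.
x = np.arange(w)[None, :]
disp = np.arange(disp_max)[:, None]
out_of_range = (x - disp) < 0

# Columns with at least one out-of-range disparity form a left band:
# columns 0 .. disp_max - 2, i.e. disp_max - 1 columns in total.
band = out_of_range.any(axis=0)
```

That count matches the disp_max - 1 white columns described above, which suggests the band is the region where no full set of valid matches exists rather than a bug in the conversion code.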

Best regards!