DigitalMediaProfessionals / tool

DV network conversion tool
Apache License 2.0

Pad convolution weight matrix if the kernel size is even. #5

Closed ZongHong-Lyu closed 5 years ago

ZongHong-Lyu commented 5 years ago

Hi Tung,

The original branch you created was not very clean, so I created a new branch and cleaned up the code a bit. Can you continue your work from here?

nhatuan84 commented 5 years ago

Hi Lyu, https://dmprofs-my.sharepoint.com/:i:/g/personal/tuan_nhanh_dmprof_com/Eb6vTURDggRJlBeVOEg9HUIBNQUTm9B-dbKvV5SuFv2Cqw?e=hlKBxd

From the message above, did you test the (1, 3) case where the p value would be 0x301? I tried hard-coding it as `conf.run[0].p = 0x301;`, but the output value is still not correct.

ZongHong-Lyu commented 5 years ago

No, I didn't test it.

ZongHong-Lyu commented 5 years ago

Hi Tuan,

So is the 2x2 case still correct? Maybe my implementation is not correct for all padding cases?

nhatuan84 commented 5 years ago

Hi Lyu, it is:

- 2x2, padding='same' => OK
- 2x2, padding='valid' => OK
- if using strides => generated wrongly: conf.run[0].conv_pad = 0x-10001;

ZongHong-Lyu commented 5 years ago

Hi Tuan,

Note that the kernel size parameter in Keras is in (height, width) order, so if your kernel_size = (1, 3), it actually means kernel_width = 3 and kernel_height = 1, and the config should be conf.run[0].p = 0x103;. The lower byte is the kernel width and the upper byte is the kernel height, so maybe you should test with the opposite setting if that is the case.
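
As a rough illustration of that encoding (the pack_p helper below is just for this sketch, not something in the tool):

```python
def pack_p(kernel_size):
    """Pack a Keras-style (height, width) kernel size into the p config value.

    Assumes the lower byte holds the kernel width and the upper byte holds
    the kernel height, as described above.
    """
    kernel_height, kernel_width = kernel_size  # Keras order is (height, width)
    return (kernel_height << 8) | kernel_width

assert pack_p((1, 3)) == 0x103  # kernel_size=(1, 3): width 3, height 1
assert pack_p((3, 1)) == 0x301
```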

nhatuan84 commented 5 years ago

Hi Lyu,

I tested both cases. Neither is correct :(

ZongHong-Lyu commented 5 years ago

Hi Tuan,

There are several issues:

  1. There is a bug when parsing the kernel sizes of a Keras network. As I said in my previous message, the Keras kernel size is in (height, width) order, but the tool currently treats it as (width, height) order.
  2. I think your test program also has a bug. The input image needs to be transposed, so the transpose parameter you pass to the preproc_image function should be set to true. The output is also transposed, so to compare with the Keras output you need to transpose it as well; see the sketch after this list.
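
Something like this (a minimal sketch only; keras_model, input_image, and run_on_hardware are placeholder names, not part of the tool):

```python
import numpy as np

# Keras works in (height, width, channels) order, while the tool/hardware
# side works in (width, height, channels), so both the input and the
# output need a transpose when comparing the two.
keras_out = keras_model.predict(input_image[np.newaxis, ...])[0]  # (H, W, C)

hw_in = np.transpose(input_image, (1, 0, 2))   # feed the transposed image
hw_out = run_on_hardware(hw_in)                # placeholder for the HW run

# Transpose the hardware output back to Keras order before comparing.
print(np.allclose(keras_out, np.transpose(hw_out, (1, 0, 2)), atol=1e-3))
```
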
nhatuan84 commented 5 years ago

Hi Lyu,

Ah, I created an input picture that is a square matrix with all values set to 1, so whether transpose is true or false, the input is the same. Am I right? Moreover, some values in the output do not match.

ZongHong-Lyu commented 5 years ago

Hi Tuan,

~~I pushed one commit to fix problem 1 from my last message. Can you try the (1x3) kernel again?~~

Note: the conf.run[0].p output generated by the tool is not fixed yet, so you need to fix it manually.

EDIT: I removed the commit since it is not complete, and it should be in a separate pull request. I'll rebase this branch and let you know.

nhatuan84 commented 5 years ago

Hi Lyu,

1x3 with padding='valid' and padding='same' is correct now, but 3x1 is not correct.

ZongHong-Lyu commented 5 years ago

Hi Tuan,

Can you try to find out why 3x1 is not working and fix it? I am going on a business trip starting tomorrow, so I will not be able to respond or investigate the issue.

ZongHong-Lyu commented 5 years ago

Hi Tuan,

To help you with debugging, you should first compare the output of the tool with Keras. In Keras, you can get the dimensions of each layer with the following command:

model.summary()

You should compare this with the tool's output and make sure the output dimensions of each layer match. If they don't match, there is a bug in the network parsing or in the network conversion.

Note that Keras outputs the dimensions in (height, width, channels) order, while the tool outputs them in (width, height, channels) order.
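
For example, a minimal sketch of dumping the Keras layer shapes in the tool's (width, height, channels) order (the model file name is a placeholder, and only 4D layer outputs are handled here):

```python
from tensorflow import keras  # or standalone keras, depending on the setup

model = keras.models.load_model("network.h5")  # placeholder file name
model.summary()  # prints each layer with its output shape (None, H, W, C)

# Print per-layer shapes in (width, height, channels) order so they can be
# compared line by line with the tool's output.
for layer in model.layers:
    shape = layer.output_shape
    if isinstance(shape, tuple) and len(shape) == 4:
        _, h, w, c = shape
        print(layer.name, (w, h, c))
```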

nhatuan84 commented 5 years ago

Hi Lyu,

Yes, i will try it soon after current task.

ZongHong-Lyu commented 5 years ago

Hi Tuan,

I edited the message about the last commit. I feel that fix should be in a different pull request, so I removed it. I'll rebase this pull request after I merge that fix to the master branch.

EDIT: I have merged the fix to the master branch, and this branch has been rebased on master again.

nhatuan84 commented 5 years ago

Hi Lyu,

With the current padding (top-right), I compared the output dimensions of both, and they are the same. I tried padding with 0 at different positions; the output looks better but is still not correct.

ZongHong-Lyu commented 5 years ago

Hi Tuan,

I discussed this with Benjamin, and we decided it is faster for me to fix this issue myself. You can go ahead and work on your other tasks.

nhatuan84 commented 5 years ago

Hi Lyu,

Please let me know when you have fixed this issue. I just want to learn more.

Thanks.

ZongHong-Lyu commented 5 years ago

These changes seem to handle non-square kernels correctly. I am going to merge and close this pull request.