Open · 497866364 · opened 6 years ago
Hi Wilson, I am so glad that you reviewed the code so carefully.
Thanks again for your review. If you have any questions, please feel free to contact me.
Best regards, Yuqi Ding
Hi Yuqi,
Sorry, I am still not clear about your loss function. Based on your project, I added an inference.m file used to test the result. The project is attached; could you help test it?
I am an FPGA engineer and just want to implement LeNet on a chip. I found your network to be the most detailed one on GitHub.
Thanks Wilson
Hi Wilson, I carefully reviewed my code today and fixed some bugs. The updated code will be committed later. First, I am so sorry about the loss energy; I didn't give you the right explanation. I read the paper again and found that this loss energy should participate in updating the parameters. I confused it before because I wrote another algorithm that doesn't need the loss energy to update the parameters. Second, I think you have given me a good opportunity to review my code. Thanks. Third, I recommend you read the original paper if you haven't. I think reading papers and implementing their algorithms is a good skill for a graduate student.
Best regards, Yuqi Ding
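For readers following this thread: the point above, that the loss must participate in the parameter update, can be illustrated with a tiny gradient-descent step on the last layer. This is a NumPy sketch, not code from the repository, and it uses softmax plus cross-entropy as a common stand-in loss (the original LeNet-5 paper actually uses RBF output units with an energy-based criterion); all variable names here are illustrative.

```python
import numpy as np

def softmax(z):
    # Subtract the max for numerical stability before exponentiating.
    e = np.exp(z - z.max())
    return e / e.sum()

rng = np.random.default_rng(1)
W = rng.standard_normal((10, 84)) * 0.1   # hypothetical last-layer weights, 84 -> 10
f6 = rng.standard_normal(84)              # hypothetical activations of the 84-unit F6 layer
label = 3                                 # ground-truth class index

probs = softmax(W @ f6)
loss = -np.log(probs[label])              # cross-entropy loss for this sample

# The loss enters backprop through its gradient: for softmax + cross-entropy,
# d(loss)/d(logits) = probs - one_hot(label).
grad_logits = probs.copy()
grad_logits[label] -= 1.0
grad_W = np.outer(grad_logits, f6)        # chain rule back to the weights

lr = 0.01
W -= lr * grad_W                          # SGD step: the label, via the loss, changes W
```

Without this gradient flowing backward, the labels could not affect training, which is exactly the concern raised in the original question below.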
Hi Yuqi,
Last week I was too busy; sorry for replying to you so late. I will read the paper and learn the training algorithm by following your code. Let's keep going and make this project better.
Thanks Wilson
Hi Sir,
I have some questions about your LeNet-5 MATLAB project. Thanks in advance for your answers. 1. In backpropagation.m, why isn't the loss used as an input? Since the loss is not used, the labels will not take effect in the network training process. 2. As for the fully connected layer, I want to know how you transform the 84 outputs to 10; I found there is no softmax layer in your project.
I hope you continue to work on this project; it is really a very nice project. Thank you!
Regards Wilson
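For context on question 2: in LeNet-5, the 84-unit F6 layer is mapped to 10 class outputs by one final layer. The original paper uses Euclidean RBF output units, while most modern reimplementations use a fully connected layer followed by softmax. A minimal NumPy sketch of the common fully-connected-plus-softmax variant (not code from this repository; weights and activations below are random placeholders):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical weights and bias for a final fully connected layer: 84 -> 10.
W = rng.standard_normal((10, 84)) * 0.1
b = np.zeros(10)

f6 = rng.standard_normal(84)        # stand-in for the 84-unit F6 activations
logits = W @ f6 + b                 # 10 raw class scores

def softmax(z):
    # Subtract the max for numerical stability before exponentiating.
    e = np.exp(z - z.max())
    return e / e.sum()

probs = softmax(logits)             # 10 class probabilities summing to 1
predicted_class = int(probs.argmax())
```

If the repository has no softmax layer, the 10 outputs may instead be raw scores (or RBF distances, as in the original paper) compared directly against the target encoding.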