LBB-Liuwl opened this issue 7 years ago
Hi! Thank you for your interest! This project is built on the Caffe framework (https://github.com/BVLC/caffe) with only minor changes to the code. We only modify parts of the network structure (e.g., adding attention modules), which can be done simply by editing the prototxt files. The details of the framework can be found in the paper, and if you have any questions about the structure, please feel free to ask me.
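As a rough illustration of what an attention module computes, here is a minimal NumPy sketch: attention maps produced by one branch re-weight the feature maps of another by element-wise multiplication. The function name, shapes, and the number of attention maps are illustrative assumptions, not the project's actual code.

```python
import numpy as np

def apply_attention(features, attention):
    """Element-wise attention (illustrative sketch, not the repo's code).

    features:  (C, H, W) feature map from one branch
    attention: (K, H, W) attention maps from another branch
    returns:   (K, C, H, W) attended features
    """
    # Broadcast each of the K attention maps over all C feature channels.
    return attention[:, None, :, :] * features[None, :, :, :]

# Toy example with random tensors.
feat = np.random.rand(256, 28, 28).astype(np.float32)
att = np.random.rand(8, 28, 28).astype(np.float32)  # e.g., 8 attention maps
print(apply_attention(feat, att).shape)  # (8, 256, 28, 28)
```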
Hi! In the paper, it says that the HP-net was trained in a stage-wise fashion. Which loss is used when training the M-net and fine-tuning the AF-net? Could you share the Caffe prototxt files? Thank you!
Hi! In each stage of training we always use the weighted sigmoid cross-entropy loss, as described in the paper. The weights balance the loss contributions of positive and negative samples. We will release the detailed code and prototxt files later. Thank you!
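For readers who want the idea made concrete, here is a minimal NumPy sketch of a weighted sigmoid cross-entropy loss. The paper specifies how the positive/negative weights are chosen; the values below are placeholders, not the authors' settings.

```python
import numpy as np

def weighted_sigmoid_cross_entropy(logits, labels, w_pos, w_neg):
    """Sigmoid cross-entropy with separate weights for positive and
    negative samples (sketch only; see the paper for the exact weights).

    logits: (N, A) raw scores for A attributes
    labels: (N, A) binary ground-truth attributes
    w_pos, w_neg: (A,) per-attribute weights
    """
    p = 1.0 / (1.0 + np.exp(-logits))  # sigmoid
    eps = 1e-12                        # numerical stability
    loss = -(w_pos * labels * np.log(p + eps)
             + w_neg * (1.0 - labels) * np.log(1.0 - p + eps))
    return loss.mean()

# Toy example: two attributes, weights chosen arbitrarily here.
logits = np.array([[2.0, -1.0]])
labels = np.array([[1.0, 0.0]])
print(weighted_sigmoid_cross_entropy(
    logits, labels, w_pos=np.array([1.5, 1.5]), w_neg=np.array([0.5, 0.5])))
```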
@xh-liu Thanks!
Hi, I'm very interested in your paper. Could you share your code with me? @xh-liu
Could you share the detailed code and prototxt files? What I have implemented so far cannot reproduce your results. Thank you very much! @xh-liu
@Li1991 How do you combine the results?
We will release the detailed code later. Thank you for your interest!
Hi @xh-liu, will the model or the prototxt files be released soon?
I have added some example prototxts in the folder prototxt_example. a0 and a3 are two of the nine branches in total; you can re-implement the other branches based on them. fusion denotes the whole net that fuses the features from the nine branches and the main branch. For computational simplicity, we extract the features of the nine branches offline and use the extracted features to train the final fusion layer and classifiers.
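A minimal pycaffe sketch of the offline feature-extraction step might look like the following. The file names, the blob name 'feat', and the omission of mean subtraction are assumptions; adapt them to the released prototxts and your own preprocessing.

```python
import numpy as np
import caffe

caffe.set_mode_cpu()  # or caffe.set_mode_gpu()

# Placeholder file names; substitute the released branch prototxt
# (e.g., from prototxt_example) and the corresponding trained weights.
net = caffe.Net('a0_deploy.prototxt', 'a0.caffemodel', caffe.TEST)

feats = []
for line in open('image_list.txt'):
    im = caffe.io.load_image(line.strip())             # HWC, RGB in [0, 1]
    h, w = net.blobs['data'].data.shape[2:]            # input size from the net
    im = caffe.io.resize_image(im, (h, w))
    # NOTE: mean subtraction / scaling omitted; add your own preprocessing.
    net.blobs['data'].data[0] = im.transpose(2, 0, 1)  # HWC -> CHW
    net.forward()
    feats.append(net.blobs['feat'].data[0].copy())     # 'feat' is an assumed blob name

np.save('a0_features.npy', np.stack(feats))
```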
Hi, after looking at your example prototxts, I found that you use a layer called NNInterp, but I cannot find its implementation. Could you please provide the original code for this layer? Thank you! @xh-liu
@Li1991 I have added the code for the NNInterp layer in the folder 'layers'.
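For context, an NNInterp layer performs nearest-neighbor upsampling (e.g., to enlarge a coarse map back to a feature map's spatial resolution). Here is a NumPy sketch of that computation; the repo provides it as a Caffe layer.

```python
import numpy as np

def nn_interp(x, zoom):
    """Nearest-neighbor upsampling, the operation an NNInterp layer
    computes (NumPy sketch; the repo provides a Caffe layer).

    x:    (N, C, H, W) feature map
    zoom: integer upsampling factor
    """
    # Repeat each spatial element `zoom` times along H and then W.
    return x.repeat(zoom, axis=2).repeat(zoom, axis=3)

x = np.arange(4, dtype=np.float32).reshape(1, 1, 2, 2)
print(nn_interp(x, 2)[0, 0])
# [[0. 0. 1. 1.]
#  [0. 0. 1. 1.]
#  [2. 2. 3. 3.]
#  [2. 2. 3. 3.]]
```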
Thank you for your kindness, @xh-liu. And where is your Python layer, FeatureConcatDataLayer?
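While waiting for the official release, here is a guess at what a Caffe Python data layer serving concatenated pre-extracted features could look like. Only the class name comes from this thread; the .npy file format, param_str convention, and batch logic are all assumptions.

```python
import json
import numpy as np
import caffe

class FeatureConcatDataLayer(caffe.Layer):
    """Sketch of a Python data layer that concatenates pre-extracted branch
    features and serves them with labels. Only the class name comes from
    this thread; file names, param_str format, and shapes are assumptions."""

    def setup(self, bottom, top):
        # param_str, e.g.: '["a0_features.npy", "a3_features.npy"]'
        paths = json.loads(self.param_str)
        # Each file holds (num_samples, feat_dim); concatenate along features.
        self.feats = np.concatenate([np.load(p) for p in paths], axis=1)
        self.labels = np.load('labels.npy')  # (num_samples, num_attributes)
        self.batch = 32
        self.idx = 0

    def reshape(self, bottom, top):
        top[0].reshape(self.batch, self.feats.shape[1])
        top[1].reshape(self.batch, self.labels.shape[1])

    def forward(self, bottom, top):
        for i in range(self.batch):
            top[0].data[i] = self.feats[self.idx]
            top[1].data[i] = self.labels[self.idx]
            self.idx = (self.idx + 1) % self.feats.shape[0]

    def backward(self, top, propagate_down, bottom):
        pass  # data layers propagate no gradients
```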
Hi, I'm very interested in your results. Could you share your code with me?