hengck23 opened this issue 8 years ago
Hi, then you need to follow the 4-step training described in the paper to train a region proposal network and an R-FCN network, and to share features between them.
@daijifeng001 Thank you for the reply. Can I confirm that the demo code does not provide end-to-end detection at test time? Because in the faster-rcnn demo, we can just input an image and get both the proposals and the final detections as output.
(Note: I am referring to detection, not training. I want to test my own images, but I do not have their proposal .mat files.)
The demo does not provide proposal generation, but proposal generation is very fast at inference time and would not be a computational burden.
Can you try the Dropbox link?
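For reference, a minimal sketch of what standalone RPN proposal generation could look like in this MATLAB/Caffe setup, assuming the faster_rcnn-style helpers proposal_config, proposal_im_detect, and boxes_filter are available on the path (they are not part of the R-FCN demo as shipped); the file names and thresholds below are placeholders:

```matlab
% Sketch only: assumes faster_rcnn-style helpers and an RPN prototxt/caffemodel.
% None of the file names here ship with the R-FCN demo; substitute your own.
caffe.set_mode_gpu();
rpn_net = caffe.Net('rpn_test.prototxt', 'test');
rpn_net.copy_from('rpn_final.caffemodel');

conf = proposal_config();   % assumed helper from the faster_rcnn codebase; set image_means etc. as needed
im   = imread('000001.jpg');

% Forward the image through the RPN to score and regress the anchor boxes
[boxes, scores] = proposal_im_detect(conf, rpn_net, im);

% NMS and keep the top proposals (thresholds follow common Faster R-CNN settings)
aboxes = boxes_filter([boxes, scores], 6000, 0.7, 300, true);
proposal_boxes = aboxes(:, 1:4);
```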
@daijifeng001 I can download the model from the "Demo R-FCN model: DropBox, BaiduYun" link.
If these models already share features between the RPN and the R-FCN detection network, then I can write the code for end-to-end detection and share it here. Currently I am converting your script_rfcn_demo.m to C++ and running the demo with the Windows C++ Caffe only. I converted the proposal .mat files to text files so that they can be read by C++. This now works.
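For anyone doing the same port, the .mat-to-text step can be a couple of lines of MATLAB; the field name boxes and the output file name here are assumptions based on the demo's _boxes.mat files:

```matlab
% Dump the demo's proposal boxes to a whitespace-delimited text file that a
% C++ program can read with std::ifstream. Assumes the .mat file stores an
% N x 4 matrix in a field named 'boxes'.
s = load(fullfile(demo_dir, [im_names{j}, '_boxes.mat']));
dlmwrite(fullfile(demo_dir, [im_names{j}, '_boxes.txt']), s.boxes, ...
         'delimiter', ' ', 'precision', '%.2f');
```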
Let me try to write some code using the demo model to generate proposals, and I will get back to you later. Do I use the same anchor boxes as Faster R-CNN (i.e., 9 anchors at each location, from 3 scales and 3 aspect ratios)? Thanks!
Yes.
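For reference, "the same anchor boxes as Faster R-CNN" means 9 anchors per feature-map location: 3 scales (roughly 128, 256, and 512 pixel boxes on a base size of 16) crossed with 3 aspect ratios (0.5, 1, 2). A small illustrative sketch of generating them, centered at the origin (the original generate_anchors code rounds slightly differently):

```matlab
% Illustrative: build the 9 Faster R-CNN-style anchors (3 scales x 3 aspect
% ratios), centered at (0, 0), in [x1 y1 x2 y2] form.
base_size = 16;                  % feature stride of the conv feature map
scales    = [8, 16, 32];         % -> anchors of roughly 128, 256, 512 px
ratios    = [0.5, 1, 2];         % height / width

anchors = zeros(numel(scales) * numel(ratios), 4);
k = 1;
for r = ratios
    for s = scales
        area = (base_size * s)^2;
        w = sqrt(area / r);      % keep the area fixed while varying h/w
        h = w * r;
        anchors(k, :) = [-w/2, -h/2, w/2, h/2];
        k = k + 1;
    end
end
disp(anchors);                   % shifting over the feature-map grid happens elsewhere
```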
Should I get the proposals the same way as in faster-rcnn, i.e., in a separate function with a separate Caffe prototxt network definition and model? Or can I use the current model directly, and if so, how? @hengck23 @daijifeng001
I was running the demo code: https://github.com/daijifeng001/R-FCN/blob/master/experiments/script_rfcn_demo.m
The proposals are loaded from a file: proposals = load(fullfile(demo_dir, [im_names{j}, '_boxes.mat']));
How can I run detection on one image end-to-end?
Thank you very much
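To answer the end-to-end question as I understand it: once you have proposal boxes for the image (either from the demo's .mat files or from an RPN forward pass as sketched earlier in the thread), the remaining step is the detection call that script_rfcn_demo.m already performs, followed by per-class NMS. A rough sketch, with the rfcn_im_detect and nms signatures assumed from a reading of the demo script rather than verified:

```matlab
% Sketch of the detection half of an end-to-end run. 'proposal_boxes' is an
% N x 4 matrix, from the demo .mat files or from an RPN as sketched above.
% rfcn_im_detect and nms are assumed to be the framework helpers used by
% script_rfcn_demo.m.
im = imread('my_image.jpg');
max_rois_num_in_gpu = 10000;   % illustrative; the demo computes this from GPU memory

[det_boxes, det_scores] = rfcn_im_detect(conf, rfcn_net, im, ...
                                         proposal_boxes, max_rois_num_in_gpu);

% R-FCN uses class-agnostic box regression, so det_boxes is assumed to be
% N x 4 and shared across classes; det_scores is N x num_classes.
score_thresh = 0.3;            % illustrative display threshold
for cls = 1:size(det_scores, 2)
    cls_dets = [det_boxes, det_scores(:, cls)];
    keep = nms(cls_dets, 0.3); % NMS overlap threshold of 0.3
    cls_dets = cls_dets(keep, :);
    cls_dets = cls_dets(cls_dets(:, 5) >= score_thresh, :);
    % ... draw or save cls_dets for this class
end
```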