LCSL / incremental_multiclass_RLSC

Camoriano, R.^, Pasquale, G.^, Ciliberto, C., Natale, L., Rosasco, L. and Metta, G., Incremental robot learning of new objects with fixed update time. In IEEE International Conference on Robotics and Automation (ICRA), May 2017.
https://ieeexplore.ieee.org/stamp/stamp.jsp?arnumber=7989364
MIT License

RGB-D Washington and iCubWorld28 features #1

Open · raffaello-camoriano opened this issue 7 years ago

raffaello-camoriano commented 7 years ago

If there is enough interest, RGB-D Washington and iCubWorld28 features can be added.

@GiuliaP

paulojunqueira commented 5 years ago

Hello, I am new to this area and I am studying your paper; it is very interesting! Is there a way to test other datasets in the MATLAB code? Thanks.

GiuliaP commented 5 years ago

Hello @paulojunqueira

As you have probably noticed, the code was released only for the MNIST dataset, but nothing prevents you from adapting it to the iCW or RGB-D Washington datasets (i.e., the ones used in the paper), or even to other datasets.

I (and maybe also @raffaello-camoriano) can try to provide some support for issues that may arise, in case you are interested in adding them and making a contribution.

Giulia

paulojunqueira commented 5 years ago

Hello @GiuliaP, thank you, I appreciate it. There is a MATLAB file in the dataset folder called MNIST.m, but its function is still unclear to me. Is it meant to prepare the MNIST dataset for training? Am I able to adapt it to prepare another dataset?

Could you give me some guidance on which part or function I should start adapting in order to use another dataset?

Thank you.

Paulo

GiuliaP commented 5 years ago

@paulojunqueira Yes, the code is structured around a generic dataset class, from which the specific MNIST class inherits. You can try to follow this logic, i.e., "clone" the MNIST class and customize it for your own data. I think @raffaello-camoriano can confirm this.
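Following this logic, a new dataset could be added by subclassing the same generic class that MNIST.m extends. A minimal MATLAB skeleton of what that might look like, where `dataset` stands in for the repository's generic superclass and all other names are placeholders, not taken from the code:

```matlab
% Hypothetical skeleton of a custom dataset class modeled on MNIST.m.
% 'dataset' is a placeholder for the repo's generic dataset superclass;
% 'MyDataset' and 'dataRoot' are illustrative names.
classdef MyDataset < dataset
    methods
        function obj = MyDataset(dataRoot)
            % Load your own features X (n x d) and labels Y (n x 1)
            % from dataRoot here, then populate the same properties
            % that MNIST.m fills in, so the training pipeline in
            % main.m can consume the new dataset unchanged.
        end
    end
end
```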

paulojunqueira commented 5 years ago

Thanks @GiuliaP. I have one more doubt: in the dataConf_MNIST_inc.m file, there are some variables that are used, like ntr, ntem, nLow..., and there are no comments in the code. Is ntr the number of training samples for all classes? What about nLow? I am trying to identify which variables correspond to the ones you cite in your paper, like nbal, nimb and ntest.

Thank you.

raffaello-camoriano commented 5 years ago

Hi @paulojunqueira and thank you for your interest.

Here are some comments on the variables in dataConf_MNIST_inc.m, which is then used to produce the dataset in main.m (line 103):

The proportion of samples from the underrepresented class can also be manipulated by commenting out nLow and setting the relative factor lowFreq, depending on your needs.
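For instance, the two alternatives described above might look roughly like this in dataConf_MNIST_inc.m; the values are illustrative only and not taken from the repository:

```matlab
% Option A: fix the number of samples of the underrepresented class directly
nLow = 100;          % illustrative value

% Option B: comment out nLow and set the relative factor lowFreq instead
% nLow = 100;
% lowFreq = 0.1;     % illustrative value
```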