GoogleCodeExporter opened 8 years ago
Yup, RFs can do what decision trees can do, including inherent support for
multi-class classification. I guess you meant multi-class classification, right?
Original comment by abhirana
on 11 Dec 2012 at 10:49
Thank you for your fast answer, Sir, and yes, I meant multi-class support. I have
looked at the example provided with the implementation (twonorm), and only two
labels are used there, -1 and +1, so I thought that only binary classification was
available. Now I understand that I can use labels like 1, 2, 3, ... (am I right?).
Thank you very much.
Original comment by alain.ti...@gmail.com
on 12 Dec 2012 at 9:30
Yup, you can use any numerical labels you want, that is, any integer labels.
Original comment by abhirana
on 12 Dec 2012 at 9:48
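A minimal sketch of multi-class training with integer labels, assuming the classRF_train interface mentioned in this thread (the toy data and the classRF_predict call are my assumptions, not something stated here):

    % Toy three-class problem: 300 examples, 4 features
    X = [randn(100,4)+2; randn(100,4); randn(100,4)-2];
    Y = [ones(100,1); 2*ones(100,1); 3*ones(100,1)];   % integer labels 1, 2 and 3

    ntree = 100;                          % number of trees
    model = classRF_train(X, Y, ntree);   % multi-class training, no special setup

    % Sanity check on the training data (classRF_predict is the package's usual
    % companion prediction function; treat the exact call as an assumption)
    Yhat = classRF_predict(X, model);
    fprintf('training accuracy: %.2f%%\n', 100*mean(Yhat == Y));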
Hi, I am using a 3804729*60 double matrix (841705 KB) for training, and when I
run the program it produces an out-of-memory error within the classRF_train
function (line 347). However, I have a 64-bit machine with 8 GB of RAM. When I
type memory in MATLAB it returns:
Maximum possible array: 12026 MB (1.261e+10 bytes) *
Memory available for all arrays: 12026 MB (1.261e+10 bytes) *
Memory used by MATLAB: 2345 MB (2.459e+09 bytes)
Physical Memory (RAM): 8174 MB (8.571e+09 bytes)
Is there a limit on the size of the input matrix, Sir? I am using only 100 trees.
Original comment by alain.ti...@gmail.com
on 13 Dec 2012 at 12:50
You will require more memory for internal bookkeeping, roughly 6 x N x Ntree.
Anyway, I think you may also run into computational issues, as RF may not scale
to that many examples and give you results in a reasonable time.
Original comment by abhirana
on 13 Dec 2012 at 9:22
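To put that heuristic in numbers for the matrix above (the bytes-per-entry figures are my assumptions about the internal bookkeeping arrays, not something stated in this thread):

    N     = 3804729;          % training examples
    Ntree = 100;              % trees requested
    entries = 6 * N * Ntree;  % rough bookkeeping size from the 6 x N x Ntree estimate

    bytes_int32  = entries * 4;   % if the bookkeeping arrays are 4-byte integers
    bytes_double = entries * 8;   % if they end up stored as doubles
    fprintf('~%.1f GB as int32, ~%.1f GB as double\n', ...
            bytes_int32/2^30, bytes_double/2^30);
    % prints roughly 8.5 GB as int32 and 17.0 GB as double -- either way more
    % than the 8 GB of physical RAM reported above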
I avoided the out-of-memory problem by increasing my virtual memory. The
training process took 6 hours, and I saved the model as a MATLAB variable
using this command:
save -v7.3 LModel.mat model
but it results in a very big file, ~5 GB. Is this normal?
Original comment by alain.ti...@gmail.com
on 14 Dec 2012 at 1:41
Yup, this is normal.
Right now the variables are stored in such a way that, although they are mainly
zeros, they are not compressed. You could maybe convert the larger variables to
sparse before saving and then convert them back to full when you load them.
Original comment by abhirana
on 18 Dec 2012 at 4:58
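A rough sketch of that sparse/full round trip (model.nodestatus is only an illustrative field name; pick whichever fields of your trained model are actually large and mostly zero, and note that sparse() only accepts 2-D double or logical matrices, so integer-typed fields need a cast first):

    % Before saving: convert the big, mostly-zero fields to sparse
    model.nodestatus = sparse(double(model.nodestatus));   % illustrative field name
    save -v7.3 LModel.mat model

    % After loading: convert them back to full (and cast back to the original
    % integer type if the prediction code expects it)
    load LModel.mat
    model.nodestatus = full(model.nodestatus);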
Original issue reported on code.google.com by
alain.ti...@gmail.com
on 11 Dec 2012 at 7:19