Hellwalker / randomforest-matlab

Automatically exported from code.google.com/p/randomforest-matlab

Memory management #32

Status: Open. Opened by GoogleCodeExporter 8 years ago

GoogleCodeExporter commented 8 years ago
What steps will reproduce the problem?
1. With large datasets I get an out-of-memory error. Is there any fix for this in MATLAB?

Original issue reported on code.google.com by mahdieh....@gmail.com on 8 Apr 2012 at 5:10

GoogleCodeExporter commented 8 years ago
Could you provide more information?

Is it a 32-bit or a 64-bit OS? On 32-bit, a process can use at most about 2 GB on Windows and around 3 GB on Linux; 64-bit does not have that issue. A quick way to check the architecture and available memory from within MATLAB is sketched below.

What dataset size are you using: the number of examples, the number of dimensions, the number of trees, and the type of algorithm (classification/regression)?
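
In case it helps, here is a minimal MATLAB sketch for gathering that information. Note that the memory function is Windows-only, so the try/catch simply skips it on other platforms:

% Report whether this MATLAB build is 32- or 64-bit
disp(computer('arch'))        % e.g. 'win32', 'win64', 'glnxa64'

% On Windows, report how much memory the MATLAB process can actually use
try
    userview = memory;
    fprintf('Largest possible array: %.2f GB\n', userview.MaxPossibleArrayBytes / 2^30);
    fprintf('Memory available for all arrays: %.2f GB\n', userview.MemAvailableAllArrays / 2^30);
catch
    disp('memory() is only available on Windows.');
end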

Original comment by abhirana on 10 Apr 2012 at 1:11

GoogleCodeExporter commented 8 years ago
I recently used the random forest to classify traffic signs. The training subset has 39210 feature vectors and the test subset has 12630. I run the code from Windows-Precompiled-RF_MexStandalone-v0.02 as follows:

% Load HOG/LDA features and labels, then train the forest
X = importdata('F:\硕士论文\GTSRB\Random Forest\RF_Class_C\Train_HOG2_LDAData_noLabel_41dim.txt');
Y = importdata('F:\硕士论文\GTSRB\Random Forest\RF_Class_C\Train_HOG2_LabelData.txt');
ntree = 400;
model = classRF_train(X, Y, ntree);

But if I set ntree to more than 300, for instance 400, the following error appears:

Error using ==> mexClassRF_train
Out of memory. Type HELP MEMORY for your options.
Error in ==> classRF_train at 347
    [nrnodes,ntree,xbestsplit,classwt,cutoff,treemap,nodestatus,nodeclass,bestvar,ndbigtree,mtry ...
Error in ==> Untitled at 13
model=classRF_train(X,Y,ntree);

I would like to ask why this error occurs. Thank you very much.
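
Not a definitive fix, but one workaround sketch: the memory consumed inside mexClassRF_train grows with ntree, so it is worth checking whether a smaller forest is already sufficient. Assuming the model returned by classRF_train exposes the per-tree out-of-bag error in model.errtr, as the package's tutorial script does, something like the following shows whether the error curve flattens long before 300-400 trees:

ntree = 200;                         % a forest size that still fits in memory
model = classRF_train(X, Y, ntree);

% Out-of-bag classification error as a function of the number of trees;
% if the curve is flat well before ntree, adding more trees is unlikely to help.
plot(model.errtr(:, 1));
xlabel('Number of trees');
ylabel('Out-of-bag error');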

Original comment by 563514...@qq.com on 14 Apr 2012 at 9:17

GoogleCodeExporter commented 8 years ago
??

Original comment by 563514...@qq.com on 15 Apr 2012 at 1:42

GoogleCodeExporter commented 8 years ago
To comments 2 and 3:

I will need the information requested in comment 1, and additionally the amount of memory on your machine.

Original comment by abhirana on 15 Apr 2012 at 7:28

GoogleCodeExporter commented 8 years ago
To answer comment 1:
Windows 64-bit OS, MATLAB 2011, input feature matrix of 700000 x 10. However, even when I sample every 50th point I still get the out-of-memory message. We usually need machine learning for very large datasets, so what is the data size limit for each algorithm?
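
For what it is worth, a 700000 x 10 double matrix is only about 56 MB, so the input data itself is unlikely to be the problem; most of the memory goes into the trees (roughly proportional to ntree and the number of nodes per tree). A minimal subsampling sketch, with illustrative variable names (Xfull and Yfull are not from the package):

% Keep every 50th example (~14000 rows) before training
step = 50;
idx  = 1:step:size(Xfull, 1);
% Or a random subsample of the same size (works in older MATLAB versions too):
% p = randperm(size(Xfull, 1)); idx = p(1:ceil(size(Xfull, 1) / step));

Xs = Xfull(idx, :);
Ys = Yfull(idx);

ntree = 100;                         % fewer trees also lowers peak memory
model = classRF_train(Xs, Ys, ntree);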

Original comment by mahdieh....@gmail.com on 18 Apr 2012 at 3:55