daugsbi / randomforest-matlab

Automatically exported from code.google.com/p/randomforest-matlab

How to do voting with random forest? #25

Closed GoogleCodeExporter closed 8 years ago

GoogleCodeExporter commented 8 years ago
Hi all,
I am doing some project work using random forest where I need to use the forest for voting: every tree would vote for a desired feature, and the best feature is taken into account at the end.
How can I use this random forest code for that purpose? Would the given code help me do so? If not, how should I approach it?
Kindly reply to guide me.

Thank you.

Original issue reported on code.google.com by abhi4emb...@gmail.com on 1 Feb 2012 at 7:04

GoogleCodeExporter commented 8 years ago
I am a bit confused by your issue.

So are the features individual examples? And do you have test features for which you want a label?

Votes for each example can be obtained from the predict function for both classification and regression, e.g.
http://code.google.com/p/randomforest-matlab/source/browse/trunk/RF_Class_C/tutorial_ClassRF.m#225
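
As a sketch (assuming classRF_predict also returns the per-class vote counts as a second output, as that tutorial section suggests):

[Y_hat, votes] = classRF_predict(X_tst, model);
% votes(i,j): number of trees that voted class j for example i
[~, winner] = max(votes, [], 2);  % majority-vote class index per example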

Original comment by abhirana on 1 Feb 2012 at 7:29

GoogleCodeExporter commented 8 years ago
Thank you for your reply.
Actually, I am producing multiple feature-based classifications of the same subject and storing the results in different variables.
Now, how do I use random forest to vote for the best classification available? Where should I put my variables and how do I use them?
I have seen the source code but could not take it further.
Any help would be highly appreciated!

Original comment by abhi4emb...@gmail.com on 1 Feb 2012 at 7:37

GoogleCodeExporter commented 8 years ago
I am still confused.

So you have multiple examples per subject?

The simplest way is to take a look at the tutorial files and at the representation of the datasets (either the twonorm or the diabetes dataset is run in those tutorial files).

Assume X is of size N x D, where N is the number of examples and D is the number of features/variables, and Y is of size N x 1.

Then you can use classRF_train(X,Y) and it will return the forest structure, say model, which you can use with classRF_predict(X,model) to get predictions for a different X matrix.

Does that help?
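
As a concrete sketch of that layout (the numbers here are made up for illustration):

X = [1.2 0.5 3.1; ...   % example 1: D = 3 features
     0.7 2.2 0.9];      % example 2, so N = 2
Y = [1; 2];             % N x 1 vector of class labels
model = classRF_train(X, Y);         % train the forest
Y_hat = classRF_predict(X, model);   % predict labels for any N' x 3 matrix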

Original comment by abhirana on 1 Feb 2012 at 7:43

GoogleCodeExporter commented 8 years ago
You still do not get my question.
I am extracting features from my subject in different ways, and each extracted feature is stored in a variable. Now I want to use random forest in such a way that it takes those feature variables as input, classifies them based on the features, and gives me the output for the best possible features.
So how can we use random forest in this case?
I hope I made myself clear this time.

Original comment by abhi4emb...@gmail.com on 1 Feb 2012 at 7:51

GoogleCodeExporter commented 8 years ago
I think it's a representation issue. All I can say is that random forests, or general-purpose classifiers in general, are useful if you can represent your input data as a matrix (which I am guessing is what is in the variables right now) and your targets as a vector.
I don't know how you can use random forests if you cannot represent the data in matrix/vector form.
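
If each variable holds one feature set with one row per subject, one common approach (a sketch; feat_a, feat_b, and labels are hypothetical names) is to concatenate them column-wise into a single matrix:

% feat_a: N x d1, feat_b: N x d2 -- one row per subject (hypothetical names)
X = [feat_a, feat_b];   % N x (d1+d2) combined input matrix
Y = labels;             % N x 1 ground-truth labels
model = classRF_train(X, Y);
% with variable importance enabled (see the extra_options argument in the
% tutorial), model.importance then ranks which features mattered most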

Original comment by abhirana on 1 Feb 2012 at 7:56

GoogleCodeExporter commented 8 years ago
The variables that I have are in matrix form.
What next?

Original comment by abhi4emb...@gmail.com on 1 Feb 2012 at 9:10

GoogleCodeExporter commented 8 years ago
Do take a look at the tutorial file:
http://code.google.com/p/randomforest-matlab/source/browse/trunk/RF_Class_C/tutorial_ClassRF.m#38

model = classRF_train(X_trn,Y_trn);   %to train
Y_hat = classRF_predict(X_tst,model); %to get the labels for X_tst

Original comment by abhirana on 1 Feb 2012 at 9:24

GoogleCodeExporter commented 8 years ago
The input data in matrix form would be given as input to the variable X_trn, I guess. What would be the input to Y_trn?
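
Following the N x D / N x 1 layout described above, Y_trn is the vector of known class labels, one per training example. A minimal sketch with made-up data:

X_trn = rand(100, 5);                 % 100 training examples, 5 features (made-up)
Y_trn = [ones(50,1); 2*ones(50,1)];   % known class label (1 or 2) for each example
model = classRF_train(X_trn, Y_trn);  % learns to map feature rows to labels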

Original comment by abhi4emb...@gmail.com on 7 Mar 2012 at 4:05

GoogleCodeExporter commented 8 years ago

Original comment by abhirana on 31 Mar 2012 at 8:40