LizzieHerman / MachineLearningExperimentation

Third project for Artificial Intelligence

Our 5x2 cross validation needs a small fix #4

Open freivalds1 opened 8 years ago

freivalds1 commented 8 years ago

After running the training and test sets, we need to run them again, reversed, before shuffling again (use the training set as the test set and the test set as the training set). Should be relatively easy to fix in the main alg method. "Alternately, the 5x2 fold cross-validation can be employed. It is generally better at detecting which algorithm is better (K-fold is generally better for determining approximate average error). In this case, randomly divide the data into 2 blocks (or, randomly divide each category into two blocks if doing stratified cross-validation). Then, train on block A and evaluate on B. Next, reverse it (train on B and evaluate on A). Then, repeat the process. Divide the data randomly into two blocks (use a different seed value). Do the two evaluations. Repeat again (actually, until you have done this 5 times). The statistical test performed is a bit different."
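A minimal sketch of the intended loop, assuming a Python codebase (the function names here are hypothetical, not taken from the repo): five repetitions, each with a fresh shuffle and split, and each half used once for training and once for evaluation.

```python
import random

def five_by_two_cv(data, train_and_eval, seed=0):
    """5x2 cross-validation sketch.

    `train_and_eval(train_set, test_set)` is a hypothetical callback
    that fits on `train_set`, evaluates on `test_set`, and returns a score.
    Returns the 10 scores (2 per repetition, 5 repetitions).
    """
    rng = random.Random(seed)
    scores = []
    for rep in range(5):
        shuffled = data[:]
        rng.shuffle(shuffled)            # new random split each repetition
        mid = len(shuffled) // 2
        a, b = shuffled[:mid], shuffled[mid:]
        scores.append(train_and_eval(a, b))  # train on A, evaluate on B
        scores.append(train_and_eval(b, a))  # swap roles: train on B, evaluate on A
    return scores
```

The key point for the fix is the second `train_and_eval` call: the same split is reused with the roles reversed before the next shuffle happens.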