Hi, sorry about the previous question; gcForest was actually working fine in my environment. However, I am confused by the results I get when using gcForest on a multi-class classification problem. Here is the code related to my issue.
In my case, I only changed the number of classes to three. The input is a matrix of shape (1280, 320) and the labels are an array of shape (1280,). The leave-one-group-out accuracy turned out as shown.
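For context, my evaluation loop looks roughly like the sketch below. This is not my exact code: the estimator settings, file names, and group ids are simplified placeholders, and the config dict just follows the style of the demo config shipped with the gcForest repo.

```python
import numpy as np
from sklearn.model_selection import LeaveOneGroupOut
from sklearn.metrics import accuracy_score
from gcforest.gcforest import GCForest


def get_config(n_classes=3):
    # Cascade-only config adapted from the repo's demo; the estimators
    # listed here are placeholders, not my actual hyper-parameters.
    ca_config = {
        "random_state": 0,
        "max_layers": 100,
        "early_stopping_rounds": 3,
        "n_classes": n_classes,
        "estimators": [
            {"n_folds": 5, "type": "RandomForestClassifier",
             "n_estimators": 100, "max_depth": None, "n_jobs": -1},
            {"n_folds": 5, "type": "ExtraTreesClassifier",
             "n_estimators": 100, "max_depth": None, "n_jobs": -1},
        ],
    }
    return {"cascade": ca_config}


# X: (1280, 320) features, y: (1280,) labels in {0, 1, 2},
# groups: (1280,) group ids for leave-one-group-out (file names are placeholders)
X = np.load("features.npy")
y = np.load("labels.npy")
groups = np.load("groups.npy")

scores = []
for train_idx, test_idx in LeaveOneGroupOut().split(X, y, groups):
    gc = GCForest(get_config(n_classes=3))
    gc.fit_transform(X[train_idx], y[train_idx])   # train the cascade
    y_pred = gc.predict(X[test_idx])               # predict on the held-out group
    scores.append(accuracy_score(y[test_idx], y_pred))

print("per-group accuracy:", scores)
print("mean accuracy:", np.mean(scores))
```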
I also tried an MLP on the same data, and its results are much better than gcForest's. Do you have any idea what might cause this? Perhaps the gcForest hyper-parameters? Thanks for your patience.
Best regards
Irving