-
Currently, moabb stratifies the data and picks random samples from X for the training (T) and validation (V) sets. A simple assignment vector could look like this:
```
[T,T,T,V,T,V,V,T,T,V,T,V,T,T]
```
…
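A stratified split producing an assignment vector like the one above can be sketched with scikit-learn's `train_test_split` (the labels `y` and the split ratio below are illustrative stand-ins, not moabb's actual values):

```python
import numpy as np
from sklearn.model_selection import train_test_split

# Hypothetical binary labels for 14 samples, mirroring the vector above
y = np.array([0, 1, 0, 1, 0, 1, 0, 1, 0, 1, 0, 1, 0, 1])
idx = np.arange(len(y))

# stratify=y keeps the class proportions equal in both parts
train_idx, val_idx = train_test_split(
    idx, test_size=0.4, stratify=y, random_state=0
)

# Build the T/V assignment vector
assignment = np.array(["T"] * len(y))
assignment[val_idx] = "V"
print("".join(assignment))
```

Each run with a different `random_state` yields a different random T/V pattern, but the class balance within T and V is preserved.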
-
Hello
I'm trying to understand the code, and I have some questions about the model-evaluation parts.
About evaluating the RF classifier with the training data...
As I understand, the Random Forest cl…
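One common way to get an honest accuracy estimate for a Random Forest from the training data alone is the out-of-bag (OOB) score; whether that is what this code does is an assumption, but a minimal sketch with scikit-learn and a synthetic stand-in dataset looks like this:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

# Synthetic stand-in dataset
X, y = make_classification(n_samples=500, n_features=10, random_state=0)

# Each tree is fit on a bootstrap sample; oob_score_ evaluates every sample
# using only the trees that did NOT see it during training
clf = RandomForestClassifier(n_estimators=100, oob_score=True, random_state=0)
clf.fit(X, y)
print(f"OOB accuracy: {clf.oob_score_:.3f}")
```

Because each sample is scored only by trees that never trained on it, the OOB score behaves like a built-in validation estimate, without a separate hold-out set.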
-
Test on the (k+1)-th fold after training on the first k folds, and compute the accuracy.
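This expanding-window scheme (train on the first k blocks, test on block k+1) can be sketched with scikit-learn's `TimeSeriesSplit`; the model and data here are placeholders:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import TimeSeriesSplit

# Synthetic ordered data standing in for the real series
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 4))
y = (X[:, 0] + rng.normal(scale=0.5, size=100) > 0).astype(int)

scores = []
for train_idx, test_idx in TimeSeriesSplit(n_splits=5).split(X):
    # Train on the first k blocks, test on the (k+1)-th block
    model = LogisticRegression(max_iter=1000).fit(X[train_idx], y[train_idx])
    scores.append(model.score(X[test_idx], y[test_idx]))
print([round(s, 2) for s in scores])
```

Unlike plain K-Fold, every training window here precedes its test window, so no future data leaks into training.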
-
Hi,
I have two questions:
First: Why didn't you use K-Fold Cross-Validation?
Second: What is the reason for using a different learning rate for the classifier? Is it for faster convergence?
I am tryin…
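For reference, the K-Fold cross-validation asked about in the first question can be sketched with scikit-learn's `cross_val_score`; the iris dataset and logistic-regression model below are placeholders, not the repository's actual setup:

```python
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

X, y = load_iris(return_X_y=True)

# 5-fold CV: the data is split into 5 parts, and each part is used
# for testing exactly once while the other 4 are used for training
scores = cross_val_score(LogisticRegression(max_iter=1000), X, y, cv=5)
print(scores.mean())
```

The mean of the fold scores gives a lower-variance estimate of generalization than a single train/test split.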
-
**What would you like to submit?** (put an 'x' inside the bracket that applies)
- [x] question
- [ ] bug report
- [x] feature request
**Issue description**
Hi. I need to figure out wh…
-
# Tweet summary
For model robustness, ensure an entity in the test data is **NOT** also present in the training data set, e.g. the state feature in mobile carrier churn data, or the year feature in graduate test results.
…
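Keeping all rows of one entity on the same side of the split is exactly what scikit-learn's `GroupKFold` does; a minimal sketch, where the group labels stand in for the state or year feature:

```python
import numpy as np
from sklearn.model_selection import GroupKFold

# Toy data: 6 rows belonging to 3 hypothetical entities (e.g. states)
X = np.arange(12).reshape(6, 2)
y = np.array([0, 1, 0, 1, 0, 1])
groups = np.array(["A", "A", "B", "B", "C", "C"])

# Each fold holds out entire groups, so no entity leaks across the split
splits = list(GroupKFold(n_splits=3).split(X, y, groups))
for train_idx, test_idx in splits:
    print(groups[train_idx], "->", groups[test_idx])
```

With a plain random split, rows of the same state could land in both sets and the model would partly memorize entities instead of generalizing.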
-
Version: Online ebook (2020-02-01)
I.2.4.1 k-fold cross validation
The code below fails because `x` and `y` were not defined before calling the function:
```
# Example using h2o
h2o.cv
```
-
The portion of the data used for evaluating and fine-tuning the model is called the validation set.
Try to split off the test portion of the data in the .lvm file-concatenation script (Tind_Tindn_Gama.py).…
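A three-way train/validation/test split along these lines can be sketched by splitting twice (the 60/20/20 ratios, seed, and toy arrays are arbitrary illustrations, not the script's actual values):

```python
import numpy as np
from sklearn.model_selection import train_test_split

# Toy stand-in data: 50 samples
X = np.arange(100).reshape(50, 2)
y = np.arange(50)

# First carve off the test set, then split the remainder into train/validation
X_tmp, X_test, y_tmp, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0
)
X_train, X_val, y_train, y_val = train_test_split(
    X_tmp, y_tmp, test_size=0.25, random_state=0  # 0.25 of 80% = 20% overall
)
print(len(X_train), len(X_val), len(X_test))
```

The validation set is used for tuning; the test set is touched only once, at the very end, for the final estimate.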
-
I need help understanding the k_fold part that uses the integer variable kk on line 108.
-
Welcome to the 'DSWP' team, good to see you here!
This issue will help readers by giving all the guidance one needs to learn about Cross-Validation techniques. Tutorial to Cross Validation Techniq…