gksb88 opened 5 years ago
Currently the only method of comparison is cross-validation scores. Once accuracy is added, that will be another method.
Cross-entropy loss = -Sum(y * log yhat). Create a table with accuracy and the result of the 10-fold cross-entropy loss. https://www.openml.org/a/estimation-procedures/1 https://ml-cheatsheet.readthedocs.io/en/latest/loss_functions.html#cross-entropy
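The table described above could be produced with scikit-learn's multi-metric cross-validation. This is a minimal sketch, not the project's actual code: the dataset, model, and fold count are placeholder assumptions, and `neg_log_loss` is scikit-learn's negated cross-entropy, so the sign is flipped for display.

```python
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_validate

# Placeholder data and model; any classifier with predict_proba works here.
X, y = load_iris(return_X_y=True)
model = LogisticRegression(max_iter=1000)

# 10-fold CV collecting both metrics in one pass.
results = cross_validate(model, X, y, cv=10,
                         scoring=("accuracy", "neg_log_loss"))

# Tabulate per-fold accuracy and cross-entropy (negate neg_log_loss).
for fold, (acc, nll) in enumerate(zip(results["test_accuracy"],
                                      results["test_neg_log_loss"])):
    print(f"fold {fold}: accuracy={acc:.3f}  cross-entropy={-nll:.3f}")
```

Reporting both columns side by side gives the second comparison method the issue asks for, alongside the existing cross-validation scores.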