HollowPrincess / GradBoostOptim
Distributed optimization of hyperparameters of gradient boosting algorithms
Finalize the list of datasets
#79
Closed
HollowPrincess closed this issue 3 years ago
HollowPrincess commented 3 years ago
Regression:

- [x] Facebook OK
- [x] Boston OK
- [ ] ~~https://www.kaggle.com/alessandrosolbiati/using-xgboost-for-time-series-prediction-top-20~~
- [x] https://www.kaggle.com/robikscube/tutorial-time-series-forecasting-with-xgboost OK
- [ ] Datasets 2 and 3 from https://machinelearningmastery.com/time-series-datasets-for-machine-learning/
Classification:

- [x] https://www.kaggle.com/c/flavours-of-physics/leaderboard OK (accuracy)
- [ ] * https://www.kaggle.com/c/grasp-and-lift-eeg-detection/code
- [ ] ~~https://www.kaggle.com/c/airbnb-recruiting-new-user-bookings/data~~
- [x] https://www.kaggle.com/c/homesite-quote-conversion/data OK
- [ ] https://www.kaggle.com/c/otto-group-product-classification-challenge/data
      https://www.kaggle.com/c/dato-native/leaderboard
      https://www.kaggle.com/c/avito-context-ad-clicks/data
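A minimal sketch (not from the repo) of how the finalized list above could be pinned in code as a single benchmark registry, so every tuning run references the same datasets and metrics. The identifiers, sources, and RMSE metrics below are assumptions for illustration; only the accuracy metric for flavours-of-physics comes from the checklist.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Benchmark:
    name: str    # short identifier used in result tables (illustrative)
    task: str    # "regression" or "classification"
    source: str  # where the data comes from (Kaggle URL, sklearn, etc.)
    metric: str  # evaluation metric for this dataset

# Confirmed (checked) datasets from the issue; metrics other than accuracy are assumed.
BENCHMARKS = [
    Benchmark("facebook", "regression", "Facebook dataset (per checklist)", "rmse"),
    Benchmark("boston", "regression", "Boston housing (sklearn / OpenML)", "rmse"),
    Benchmark("pjm_energy", "regression",
              "https://www.kaggle.com/robikscube/tutorial-time-series-forecasting-with-xgboost",
              "rmse"),
    Benchmark("flavours_of_physics", "classification",
              "https://www.kaggle.com/c/flavours-of-physics", "accuracy"),
    Benchmark("homesite_quote_conversion", "classification",
              "https://www.kaggle.com/c/homesite-quote-conversion", "accuracy"),
]

def by_task(task: str) -> list:
    """Return the fixed benchmark list for one task type."""
    return [b for b in BENCHMARKS if b.task == task]

if __name__ == "__main__":
    for b in by_task("regression"):
        print(b.name, b.metric)
```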