-
Hi, thank you for creating such a great package. I was wondering if it makes sense to create a function for hyperparameter tuning of the `boosted_regression_forest` function.
If not, feel free to clos…
-
Dear xgboost developers, I am "testing the waters" to see whether some of my research is of interest and could potentially be implemented in xgboost, either now or in the future.
**TLDR:**
Approximate t…
-
**Describe the solution you'd like**
Add PerpetualBooster as an additional algorithm. It does not need hyperparameter tuning and supports multi-output and multi-class cases.
https://github.com/perpetual-ml/perpetual
-
# Problem
It would be good to have some *hyperparameter tuning* tools available for finding optimal *hyperparameters* for the different autoencoder variants (a rough sketch follows the references below).
## References
+ https://neptune.ai/blog/…
-
We aim to implement a system that leverages distillation and quantization to create a "child" neural network by combining parameters from two "parent" neural networks. The child network should inherit…
-
The technique described in the paper "AutoNE: Hyperparameter Optimization for Massive Network Embedding" is interesting. Similar techniques should be incorporated into DGL-KE to tune hyperparameters o…
-
Add PerpetualBooster as an additional algorithm.
https://github.com/perpetual-ml/perpetual
It does not need hyperparameter tuning and supports multi-output and multi-class cases.
I can create…
-
Running [tutorials/data_driven/LSTM/hyperbola_calibration_mixed_hypertuning.py](https://github.com/GrainLearning/grainLearning/blob/68-rnn-integration-to-gl/tutorials/data_driven/LSTM/hyperbola_calibr…
-
**Describe the bug**
I want to optimize the hyperparameters of a simple pipeline. According to the [MLJ docs](https://alan-turing-institute.github.io/MLJ.jl/dev/tuning_models/#Tuning-multiple-neste…
-
https://arxiv.org/abs/1810.05934
Looks like the strongest HPT algorithm in the world so far?