Closed leidadpig closed 4 years ago
I'm guessing you mean the Tree Parzen Estimator (TPE) algorithm, right? Afaik, ML.NET doesn't have it. But I'm moving this issue to the ML.NET repo for further discussion.
Can you provide the scenario where you'd like to use TPE with ML.NET? Thanks,
Also, regarding tuning hyperparameters (for classification and regression ML tasks, so far), we have the following, all of them based on AutoML.NET, which is part of ML.NET:
- AutoML.NET: https://docs.microsoft.com/en-us/dotnet/machine-learning/how-to-guides/how-to-use-the-automl-api
- ML.NET CLI: https://docs.microsoft.com/en-us/dotnet/machine-learning/automate-training-with-cli
- Model Builder in Visual Studio: https://docs.microsoft.com/en-us/dotnet/machine-learning/automate-training-with-model-builder
Sorry, there is no support for TPE in ML.NET. As commented above, we would like to hear about your use case though.
Thank you both, I think AutoML.NET is what I want.
I use TPE to tune hyperparameters a lot in Python, but my current work requires a lot of C#, so I have to train and use my model only in C#; it would be much appreciated if TPE were included in ML.NET. Maybe I can train the model in Python and then use ONNX to convert it.
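For readers who haven't used TPE before, here is a minimal, self-contained 1-D sketch of the idea using only the Python standard library. This is an illustration of the algorithm, not hyperopt's or any library's actual implementation; the objective function, bounds, and all parameter names (`gamma`, `bw`, etc.) are made up for the example.

```python
import math
import random

def tpe_suggest(trials, bounds, gamma=0.25, n_candidates=24, bw=0.1):
    """Suggest the next hyperparameter value with a 1-D Tree Parzen Estimator.

    trials: list of (x, loss) pairs already evaluated.
    Splits trials into a "good" set (the lowest-loss gamma fraction) and a
    "bad" set, fits a Parzen (kernel density) estimator to each, then returns
    the candidate x maximizing l(x) / g(x): likely under the good density,
    unlikely under the bad one.
    """
    lo, hi = bounds
    if len(trials) < 4:                      # warm-up: random exploration
        return random.uniform(lo, hi)
    ordered = sorted(trials, key=lambda t: t[1])
    n_good = max(1, int(gamma * len(ordered)))
    good = [x for x, _ in ordered[:n_good]]
    bad = [x for x, _ in ordered[n_good:]]

    def density(x, centers):                 # Gaussian Parzen estimator
        h = bw * (hi - lo)
        return sum(math.exp(-0.5 * ((x - c) / h) ** 2)
                   for c in centers) / (len(centers) * h)

    # Sample candidates from the "good" density, score by the l/g ratio
    cands = [random.gauss(random.choice(good), bw * (hi - lo))
             for _ in range(n_candidates)]
    cands = [min(hi, max(lo, c)) for c in cands]
    return max(cands, key=lambda c: density(c, good) / (density(c, bad) + 1e-12))

# Toy objective with its minimum at x = 3
random.seed(0)
objective = lambda x: (x - 3.0) ** 2
trials = []
for _ in range(60):
    x = tpe_suggest(trials, bounds=(0.0, 10.0))
    trials.append((x, objective(x)))
best_x, best_loss = min(trials, key=lambda t: t[1])
print(best_x, best_loss)
```

After the first few random trials, suggestions concentrate around the region where losses have been low, which is what makes TPE sample-efficient compared to grid search.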
I went through AutoML's docs, but I'm still confused about how it tunes the hyperparameters: is grid search or some other method used?
Or is it possible to use ML.NET to load a pretrained model trained with the LightGBM Python library, so that there's no need to tune the hyperparameters in C#?
You can use Python to train your model, convert it to ONNX using Python, and then use ML.NET to load the model and run inference on it.
Thanks
> I went through AutoML's docs, but I'm still confused about how it tunes the hyperparameters: is grid search or some other method used?
We are using SMAC for hyperparameter optimization. We are optimizing three trainers in parallel, each has a warm-up of 20 random iterations.
Basic sweep/search strategy:
- Test default hyperparameters on each trainer for the task ("power of defaults") -- iterations 1 to ~10
  - Initial ordering is based on the speed and general accuracy of the trainers, in case time expires before all trainers finish; this also lets ties sway toward the generally faster/better trainer
  - Cull to the top three trainers
- Hyperparameter optimization
  - Random sweeping as a warm-up for SMAC -- iterations ~11 to ~70 (3 trainers * 20 iterations)
  - SMAC -- iteration 71+ (until time expires)
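The schedule above can be sketched in a few dozen lines. This is only an illustration of the phases (defaults, cull, random warm-up, then guided search), not AutoML.NET's actual code: the trainer names and the toy objective are invented, and the final "guided" phase is a trivial perturb-the-incumbent stand-in rather than real SMAC, which fits a random-forest surrogate over past trials.

```python
import random

def sweep(trainers, budget, warmup_per_trainer=20):
    """Illustrative schedule: (1) try each trainer's defaults, (2) cull to
    the top three, (3) random warm-up per surviving trainer, (4) a simple
    exploit-around-the-best loop (a stand-in for SMAC) until budget expires."""
    history = []  # (trainer, params, score); higher score is better

    def evaluate(name, params):
        # Toy objective: each trainer peaks at its own optimum in [0, 1]^2
        opt = trainers[name]
        return -sum((p - o) ** 2 for p, o in zip(params, opt))

    # Phase 1: power of defaults
    for name in trainers:
        params = (0.5, 0.5)
        history.append((name, params, evaluate(name, params)))

    # Phase 2: cull to the top three trainers
    ranked = sorted(history, key=lambda h: h[2], reverse=True)
    survivors = [h[0] for h in ranked[:3]]

    # Phase 3: random warm-up for each surviving trainer
    for name in survivors:
        for _ in range(warmup_per_trainer):
            params = (random.random(), random.random())
            history.append((name, params, evaluate(name, params)))

    # Phase 4: exploit around the incumbent until the budget expires
    while len(history) < budget:
        name, best_p, _ = max(history, key=lambda h: h[2])
        params = tuple(min(1.0, max(0.0, p + random.gauss(0, 0.05)))
                       for p in best_p)
        history.append((name, params, evaluate(name, params)))

    return max(history, key=lambda h: h[2])

random.seed(1)
trainers = {"lightgbm": (0.8, 0.2), "sdca": (0.3, 0.7),
            "fasttree": (0.6, 0.4), "averaged_perceptron": (0.1, 0.1)}
best = sweep(trainers, budget=120)
print(best)
```

The point of the warm-up phases is that the model-guided step only pays off once there are enough observations to learn from, which is why AutoML.NET spends its first iterations on defaults and random sweeping.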
Thank you very much
@leidadpig I am assuming your questions have been answered and am closing the issue. Please feel free to reopen if necessary. Thanks.
Is the TPE algorithm available in ML.NET? Or is there any library that implements TPE in C# for tuning hyperparameters? Thanks!