Closed QqQss closed 5 years ago
Hi,
We are working on automatic hyperparameter tuning. In the meantime, you can train on fewer cells to choose hyperparameters (train_size < 1) and then re-train with the optimal parameters on the full training set.
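The suggested workflow — hold out part of the cells for model selection, then re-train on everything — can be sketched generically. Note this is a toy illustration, not scVI code: `fit` and `heldout_likelihood` are hypothetical placeholders standing in for a real training run and a real held-out likelihood evaluation.

```python
import random

def fit(cells, n_epochs, lr):
    """Hypothetical stand-in for a training run (e.g. an scVI model fit)."""
    return {"lr": lr, "n_epochs": n_epochs, "n_cells": len(cells)}

def heldout_likelihood(model, cells):
    """Hypothetical stand-in for a held-out log-likelihood evaluation.

    Toy score that happens to favor lr=1e-3; real code would compute
    the model's likelihood/ELBO on the held-out cells."""
    return -abs(model["lr"] - 1e-3) * 1000

random.seed(0)
all_cells = list(range(1000))           # stand-in for the full set of cells
random.shuffle(all_cells)
split = int(0.9 * len(all_cells))       # i.e. train_size = 0.9
train_cells, test_cells = all_cells[:split], all_cells[split:]

# 1) Model selection using the held-out cells (train_size < 1).
candidates = [1e-2, 1e-3, 1e-4]
best_lr = max(
    candidates,
    key=lambda lr: heldout_likelihood(fit(train_cells, 100, lr), test_cells),
)

# 2) Re-train with the chosen hyperparameters on *all* cells (train_size = 1).
final_model = fit(all_cells, 100, best_lr)
print(best_lr, final_model["n_cells"])  # -> 0.001 1000
```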
That's a solution! Thanks for your reply :)
You can check out PR #299 to follow along with our work on automatic hyperparameter tuning
Hi, guys, thanks for your great tools!
I'm a little confused about hyper-parameter tuning. In most cases, we can hold out a part of the cells as a test set and compute the likelihood error periodically. From the change in likelihood we can tell whether the number of epochs is enough and which combination of hyper-parameters gives the minimum error at the last epoch. Is that right?
But in some of your tutorials, e.g. 'harmonization' and 'interaction with scanpy', you use all of the cells as the training set, and then the likelihood error info is no longer available! Perhaps the training likelihood alone cannot be computed? (i.e. setting frequency=5 and train_size=1.0 raises a zero division error.) In that case, if we cannot trace the likelihood history, how can we know whether the number of epochs is enough, and how do we tune the hyper-parameters?
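For the "is the number of epochs enough" part of the question, one generic recipe (independent of scVI) is to record the held-out likelihood every few epochs and stop once it has failed to improve for a few checks in a row. A minimal sketch — the loss history here is made-up toy data, and `early_stop_epoch` is a hypothetical helper, not part of any library:

```python
def early_stop_epoch(val_losses, patience=3, frequency=5):
    """Return the epoch at which training could have stopped.

    `val_losses` holds one validation-loss value per check, recorded every
    `frequency` epochs. We stop once the loss has not improved for
    `patience` consecutive checks, and report the epoch of the best check."""
    best, best_idx, since_best = float("inf"), 0, 0
    for i, loss in enumerate(val_losses):
        if loss < best:
            best, best_idx, since_best = loss, i, 0
        else:
            since_best += 1
            if since_best >= patience:
                break
    return best_idx * frequency

# Toy validation-loss history, one value every 5 epochs: improves,
# plateaus around epoch 15, then starts to overfit.
history = [3.0, 2.5, 2.2, 2.1, 2.1, 2.15, 2.2, 2.3]
print(early_stop_epoch(history))  # -> 15
```

The curve's minimum (here at epoch 15) is what the likelihood history is for; with train_size=1.0 there is no held-out set, so no such curve can be computed — which is why hyperparameters are chosen with train_size < 1 first.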