-
Currently, every optimizer comes with its own `config` that manages that optimizer's hyperparameters.
This is done for the following reasons:
* A lot of hyp…
-
Hello,
I've been trying to run this model with the provided code.
I used the same parameters as in the sample Train script, running on multiple GPUs as shown below.
```
export NGPU=2;
python -m torch.distribut…
-
There are several kinds of hyperparameters:
1. teacher-model hyperparameters
2. student-model hyperparameters
3. KD hyperparameters (e.g., the balance weight between the different losses)
4. Training hype…
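Not part of the original issue, but to make item 3 concrete: the balance weight in knowledge distillation is commonly a convex combination of a hard-label loss and a temperature-softened teacher/student divergence. A minimal NumPy sketch, where the function name, the weight `alpha`, and the temperature `T` are my own illustrative choices, not this repo's API:

```python
import numpy as np

def softmax(x, T=1.0):
    # Numerically stable softmax with temperature T.
    z = x / T
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

def kd_loss(student_logits, teacher_logits, label, alpha=0.5, T=2.0):
    # Hard-label cross-entropy on the student's predictions.
    ce = -np.log(softmax(student_logits)[label])
    # Softened KL divergence between teacher and student distributions.
    p_t = softmax(teacher_logits, T)
    p_s = softmax(student_logits, T)
    kl = np.sum(p_t * (np.log(p_t) - np.log(p_s)))
    # alpha balances the two losses; T**2 keeps gradient magnitudes comparable.
    return alpha * ce + (1.0 - alpha) * (T ** 2) * kl
```

With `alpha = 1` this reduces to plain supervised training; with `alpha = 0` it trains purely against the teacher's softened outputs.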
-
I recently ran your code and found that the performance was not nearly as good as reported in the relevant papers.
Here is my result.
![result](https://user-images.githubusercontent.com/46221821/236816802-5c…
-
Dear all,
I am trying to use exact GPs, and the hyperparameter optimization keeps failing with errors such as `AssertionError: isfinite(phi_c) && isfinite(dphi_c)`.
Consider this minimal example:
`N…
-
For example, I'm trying to use the `dirichlet` multivariate continuous distribution from `scipy.stats` to generate an array of random floats as one of the hyperparameters I'm tuning. However, if I …
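For reference (this is not from the original post), a draw from the real `scipy.stats.dirichlet` API might look like the sketch below; the concentration parameters `alpha` are an arbitrary symmetric choice for illustration:

```python
import numpy as np
from scipy.stats import dirichlet

# Concentration parameters: an arbitrary symmetric choice for illustration.
alpha = [1.0, 1.0, 1.0]

# One draw: a length-3 array of non-negative floats that sums to 1.
sample = dirichlet.rvs(alpha, size=1, random_state=0)[0]
print(sample)
```

Note that the samples live on the simplex (non-negative, summing to one), which matters if the tuning framework expects independent per-dimension ranges.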
-
Nice work! But I am confused about some of the hyperparameters.
-
I was trying to plot the SGOOP vectors (SGOOP1 vs. SGOOP2) and the timescales from a maxcal.traj. I have 28 reaction coordinates, and they are already **sin- and cos-transformed**. I got a bit lost with th…
-
This looks like an awesome project.
It would be great if there were a way to report hyperparameters with each submission.
-
Hi,
I am using this package to reweight MC to look like sPlotted data, and I would like to scan the hyperparameters to find the best configuration.
scikit-learn tools are available for this (e.g. GridS…
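As a sketch only (I don't know the reweighter's actual estimator class or parameter names, so a stock scikit-learn classifier and toy data stand in here), a scikit-learn grid scan over hyperparameters typically looks like:

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import GridSearchCV

# Toy data standing in for the MC vs. sPlotted-data samples.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
y = rng.integers(0, 2, size=200)

# Candidate hyperparameter values to scan (illustrative choices).
param_grid = {
    "n_estimators": [50, 100],
    "max_depth": [2, 3],
    "learning_rate": [0.05, 0.1],
}

# Exhaustive cross-validated scan over the grid.
search = GridSearchCV(GradientBoostingClassifier(random_state=0),
                      param_grid, cv=3)
search.fit(X, y)
print(search.best_params_)
```

This works with any estimator exposing the scikit-learn `fit`/`get_params`/`set_params` interface, so if the reweighter follows that convention it should plug in directly.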