-
### Title
PHOTONAI
### Short description and the goals for the OHBM BrainHack
## INTRODUCTION
With photonai, we have designed machine learning software that abstracts and condenses machine learn…
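For context, here is a minimal sketch of the kind of condensed pipeline definition this aims at, using PHOTONAI's `Hyperpipe`/`PipelineElement` interface; the specific elements, optimizer, and hyperparameter grid are illustrative assumptions, not the project's canonical example:
```
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import KFold
from photonai.base import Hyperpipe, PipelineElement

# Hyperpipe bundles nested cross-validation, hyperparameter
# search, and metric tracking into one object
pipe = Hyperpipe('example_pipe',
                 optimizer='grid_search',
                 metrics=['accuracy'],
                 best_config_metric='accuracy',
                 outer_cv=KFold(n_splits=3),
                 inner_cv=KFold(n_splits=3))

# pipeline elements wrap scikit-learn estimators by name
pipe += PipelineElement('StandardScaler')
pipe += PipelineElement('SVC', hyperparameters={'C': [0.1, 1, 10]})

X, y = load_breast_cancer(return_X_y=True)
pipe.fit(X, y)
```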
-
```
$ git clone https://github.com/abdur75648/Deep-Learning-Specialization-Coursera.git
Cloning into 'Deep-Learning-Specialization-Coursera'...
remote: Enumerating objects: 1028, done.
remote: Cou…
```
-
It would be nice to see a short description in the README of what additional benefits this repo offers relative to pymatgen, [matminer](https://hackingmaterials.lbl.gov/matminer/) (featurization)…
-
Hi,
I am encountering an error in the code that says **"Maximum parces depth exceed"** in the strategy named **NostalgiaForInfinityX4.py**.
I am trying to solve it, but unfortu…
-
Hi Yachen,
I used the Bayesian Optimization package to optimize hyperparameters in a Kaggle contest. I noticed that the same hyperparameters can be sampled repeatedly, which can be a waste on …
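One workaround I have been experimenting with, sketched below under the assumption of an ask-style interface: track evaluated points and fall back to a random sample when the optimizer repeats itself. `suggest_fn`, `seen`, and `bounds` are hypothetical names for illustration, not the package's API:
```
import random

def dedup_suggest(suggest_fn, seen, bounds, ndigits=6, max_retries=10):
    # ask the optimizer for a candidate; skip points already evaluated
    # (compared up to `ndigits` decimals), retrying a few times
    for _ in range(max_retries):
        params = suggest_fn()
        key = tuple((k, round(v, ndigits)) for k, v in sorted(params.items()))
        if key not in seen:
            seen.add(key)
            return params
    # every retry was a duplicate: explore a uniform random point instead
    params = {k: random.uniform(lo, hi) for k, (lo, hi) in bounds.items()}
    seen.add(tuple((k, round(v, ndigits)) for k, v in sorted(params.items())))
    return params
```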
-
I actually found this book by googling for Bayesian hyperparameter optimization while trying to move beyond grid and random searches to something more sophisticated.
Any chance of new material o…
-
I currently have the problem that the results Optuna optimization produces are often suboptimal, due to the stochastic nature of RL training. For example, training 3 agents with…
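One mitigation is to average the objective over several seeds inside each trial, so the optimizer sees a less noisy signal. A minimal sketch of that idea; `train_agent` is a hypothetical stand-in (here a stub) for the actual RL training run:
```
import random
import statistics
import optuna

N_SEEDS = 3  # repeat each trial with several seeds to damp training noise

def train_agent(lr, gamma, seed):
    # stand-in for the real training run; replace with actual RL training
    random.seed(seed)
    return random.gauss(lr * gamma, 0.1)

def objective(trial):
    lr = trial.suggest_float('lr', 1e-5, 1e-2, log=True)
    gamma = trial.suggest_float('gamma', 0.9, 0.9999)
    scores = [train_agent(lr=lr, gamma=gamma, seed=s) for s in range(N_SEEDS)]
    return statistics.mean(scores)

study = optuna.create_study(direction='maximize')
study.optimize(objective, n_trials=50)
```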
-
Noting these down for the [neurips bbo challenge](http://bbochallenge.com/leaderboard)
- idea 1: generate more suggestions and only send the top `n_suggestions` ranked by value (see the sketch after this list).
- idea 2: gener…
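
A sketch of idea 1, assuming two hypothetical hooks: `sample_candidates` (draws random configurations) and `surrogate_value` (the surrogate model's predicted objective for a configuration):
```
import numpy as np

def top_k_suggestions(sample_candidates, surrogate_value, n_suggestions, oversample=8):
    # draw many more candidates than needed, rank them by predicted
    # value, and send back only the best `n_suggestions`
    candidates = sample_candidates(oversample * n_suggestions)
    values = np.asarray([surrogate_value(c) for c in candidates])
    best = np.argsort(values)[-n_suggestions:][::-1]  # highest predicted value first
    return [candidates[i] for i in best]
```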
-
I am using two HPOs on the same task:
```
from clearml.automation import HyperParameterOptimizer, DiscreteParameterRange

first_optimizer = HyperParameterOptimizer(
    hyper_parameters=[
        DiscreteParameterRange('Args/method__training/regularization_parameter', values=…
```
-
Hi all,
If possible, I would like to add support for saving optimization results in HDF5 format. I have been running many hyperparameter optimization procedures and have been thinking that, rather …