Open lucpaoli opened 11 months ago
How can we design our neural net to initialise in the "correct" region of parameter space?
I haven't encountered this since, but there are interesting discussions to be had about bounding the outputs of the NN, e.g. using tanh to constrain the parameters to the nominal bounds used in the Cambridge ML_SAFT paper.
Two types of error:
We should understand what regions of parameter space cause this and how to avoid them. Would we want to bound the parameters directly, or introduce some form of barrier function in the loss?
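For the barrier-function alternative, a minimal sketch (the log-barrier form and the weight are my assumptions, not anything from this thread): instead of hard-clamping the parameters, add a penalty to the loss that grows steeply as any parameter approaches its bounds, steering the optimiser away from the bad regions while keeping the interior nearly penalty-free.

```python
import numpy as np

def log_barrier_penalty(params, lower, upper, weight=1e-3):
    """Soft penalty that diverges as params approach their bounds.

    Assumes all params are strictly inside (lower, upper); the
    -log terms blow up at either bound, so gradient descent is
    pushed back toward the interior.
    """
    p = np.asarray(params, dtype=float)
    return -weight * (np.sum(np.log(p - lower)) + np.sum(np.log(upper - p)))

# The penalty is small at the centre of the box and large near a bound.
centre = log_barrier_penalty([0.5], lower=0.0, upper=1.0)
near_edge = log_barrier_penalty([0.99], lower=0.0, upper=1.0)
print(centre, near_edge)  # near_edge is much larger than centre
```

Compared with the tanh bound, this keeps the raw parameterisation (no saturation) but only discourages, rather than forbids, leaving the region, so the two approaches trade off hard guarantees against optimisation behaviour.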