BjarkeCK opened this issue 4 years ago
Hi @EliasCK,
Sadly, there is currently no easy way to do continued learning via the NeuralNet package. The training loop is hidden inside the NeuralNetLearner class, with dependencies on local methods that copy the next minibatch and, worse, on a call to Initialize on all layers at the very beginning of the loop. I have been planning to refactor the NeuralNet project so it can train from a separate minibatch source in an open loop, which would support more flexible use cases like the one you need. However, that is a bit more long term.
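To make the distinction concrete, here is a minimal self-contained sketch of what an open training loop means, in plain C# with no SharpLearning types at all (the model is a toy linear regression, purely illustrative): the parameters live outside the learner and persist across minibatches, so each update continues from the previous state instead of reinitializing.

```csharp
using System;

class OpenLoopSketch
{
    static void Main()
    {
        var rng = new Random(42);

        // Parameters are created once, outside any training call,
        // and survive across minibatches.
        double w = 0.0, b = 0.0;
        const double lr = 0.1;

        for (int step = 0; step < 1000; step++)
        {
            // The minibatch comes from an external source; here a
            // synthetic stream of samples from y = 3x + 1.
            double x = rng.NextDouble();
            double y = 3.0 * x + 1.0;

            // One SGD update on the squared error. Nothing is
            // re-initialized; w and b carry over to the next step.
            double error = (w * x + b) - y;
            w -= lr * error * x;
            b -= lr * error;
        }

        Console.WriteLine($"w = {w:F2}, b = {b:F2}"); // approaches 3.00 and 1.00
    }
}
```

This is the structure that the current Learn() call hides: in SharpLearning today, the loop, the minibatch copying, and the weight initialization all happen inside NeuralNetLearner, so there is no hook for resuming from existing weights.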
Best regards,
Mads
Hey, first off, thanks for a fantastic library!
The library is really simple to use if you have a large dataset and want to train a network in one go.
But I'm building a DQN (deep Q-network), and I want to continuously improve a neural network from small batches of training data with as little overhead as possible. Is that something that's easily possible in SharpLearning?
Right now, the only way I can see to achieve it is by doing something like this:
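Roughly, the idea was something like the following hypothetical reconstruction (the exact snippet is not preserved in this thread; SharpLearning.Neural type names are used, but constructor signatures may differ, and a real DQN would need vector-valued targets rather than the single-output regression shown): accumulate all experience in a buffer, then retrain a brand-new net on a full copy of it every time new samples arrive.

```csharp
using System.Collections.Generic;
using System.Linq;
using SharpLearning.Containers.Matrices;
using SharpLearning.Neural;
using SharpLearning.Neural.Layers;
using SharpLearning.Neural.Learners;
using SharpLearning.Neural.Loss;

class RetrainFromScratch
{
    // Replay buffer holding every (state, target value) pair seen so far.
    readonly List<double[]> observationBuffer = new List<double[]>();
    readonly List<double> targetBuffer = new List<double>();

    public void Add(double[] observation, double target)
    {
        observationBuffer.Add(observation);
        targetBuffer.Add(target);
    }

    // Called after each small batch of new experience: copies the WHOLE
    // buffer into fresh matrices and retrains a brand-new net from scratch.
    public RegressionNeuralNetModel TrainOnBuffer()
    {
        var rows = observationBuffer.Count;
        var cols = observationBuffer[0].Length;

        // Flatten the buffer into the F64Matrix format Learn() expects;
        // this full copy happens on every call.
        var flat = observationBuffer.SelectMany(o => o).ToArray();
        var observations = new F64Matrix(flat, rows, cols);
        var targets = targetBuffer.ToArray();

        // A fresh net and learner on every call: Learn() initializes the
        // layer weights itself, so all previous training is discarded.
        var net = new NeuralNet();
        net.Add(new InputLayer(cols));
        net.Add(new DenseLayer(32));
        net.Add(new SquaredErrorRegressionLayer());

        var learner = new RegressionNeuralNetLearner(net, loss: new SquareLoss(), iterations: 5);
        return learner.Learn(observations, targets);
    }
}
```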
However, there's a lot of overhead and data copying going on there. Are there better ways to go about it?
Thanks :)
Edit 1: It seems my example doesn't work either, since the weights are re-randomized whenever learning begins.
Edit 2: My second attempt throws a NullReferenceException on net.Forward(input, output). (Although I imagine this is not a very good way to go about it either, and it's probably wrong on many levels 😊)