etotheipluspi opened 7 years ago
So the first thing that would be required is some kind of literature review to figure out what the minimal interface would be. From what I've seen, I think the basic interface would be similar to MXNet's, in that you would have a forward pass, and a backward pass where you could specify the gradient that gets backpropagated (e.g. the action-value gradient in DDPG).
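To make the forward/backward idea concrete, here is a minimal sketch (in Python, with a toy linear model in place of a real network). All names are illustrative, not from any existing package; the key point is that `backward` takes the gradient from the caller rather than computing a loss internally, which is exactly the DDPG actor-update pattern where the critic supplies dQ/da.

```python
import numpy as np

class LinearPolicy:
    """Toy model exposing the forward/backward interface discussed above."""

    def __init__(self, n_in, n_out, lr=0.01, seed=0):
        rng = np.random.default_rng(seed)
        self.W = rng.normal(scale=0.1, size=(n_out, n_in))
        self.lr = lr

    def forward(self, x):
        self._x = x               # cache input for the backward pass
        return self.W @ x

    def backward(self, out_grad):
        # The caller supplies the gradient to backpropagate,
        # e.g. dQ/da from the critic in DDPG.
        grad_W = np.outer(out_grad, self._x)
        self.W += self.lr * grad_W    # ascend the supplied gradient
        return self.W.T @ out_grad    # gradient w.r.t. the input

policy = LinearPolicy(4, 2)
action = policy.forward(np.ones(4))
input_grad = policy.backward(np.array([1.0, -1.0]))
```

A real backend would replace the hand-written gradient with its autograd, but the two-method surface area is the same.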
Two things I've come across that might complicate things:
For the second point, I couldn't use mxnet.fit because there was a lot of overhead there that I didn't need.
Long story short, I think if you just want something specifically for DRL, then you can just include it in the package, since it has some specific needs that I don't think cross over to more general machine learning.
It would be great to be able to switch between deep learning backends, so users aren't tied to mxnet. The three that come to mind are mxnet, tensorflow, and Knet. This could also simplify model building.
Something like this is probably a big endeavor, but I don't think we need a complete framework like Keras to make it work.
It would be good to start by abstracting away the deep learning model building, backprop, etc. into something that can serve as a common interface for multiple backends.
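One way such a common interface could look, sketched in Python: an abstract base that each backend wrapper implements, so agent code never touches mxnet/tensorflow/Knet directly. The class and method names here are hypothetical, and the numpy "backend" is just a stand-in to show where a real wrapper would plug in.

```python
import numpy as np
from abc import ABC, abstractmethod

class Backend(ABC):
    """Hypothetical common interface; names are illustrative, not from any package."""

    @abstractmethod
    def forward(self, x):
        """Run the model on input x and return its output."""

    @abstractmethod
    def backward(self, out_grad):
        """Backpropagate a caller-supplied output gradient; return parameter gradients."""

class NumpyBackend(Backend):
    """Stand-in backend; an mxnet/tensorflow/Knet wrapper would implement the same methods."""

    def __init__(self, W):
        self.W = W

    def forward(self, x):
        self._x = x
        return self.W @ x

    def backward(self, out_grad):
        # Hand-written gradient here; real backends would invoke their autograd.
        return np.outer(out_grad, self._x)

# Agent code only sees the Backend interface:
backend = NumpyBackend(np.zeros((2, 3)))
y = backend.forward(np.ones(3))
gW = backend.backward(np.array([1.0, -1.0]))
```

The payoff is that swapping backends becomes a constructor change rather than a rewrite of the agent logic.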
The question is, should something like this have its own package?