Closed eaplatanios closed 6 years ago
I have only created a few relatively simple sequential CNNs for image classification so far, so I think I can only comment on the first point, about the Model class, and even there I've only used one variant.
When I tried it the first time, I was a bit confused by the Model constructor parameters, especially trainInput and trainInputLayer. It took a while until I realized that they are just the dual of the feature input, but for the labels.
Looking at it now, I actually like how it's done, because it really is quite flexible. I think the main issue is missing documentation (I'd be happy to help where I can). Without it, all the overloaded model constructor variants and type parameter lists like [IT, IO, IDA, ID, IS, I, TT, TO, TDA, TD, TS, T] can be a bit scary. :)
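To make that duality concrete, here is a minimal self-contained sketch. The names (Input, SupervisedModel, lossOn) are hypothetical stand-ins, not the actual tensorflow_scala API; the point is just that trainInput mirrors the feature input, but carries the labels:

```scala
// Toy Input/Model types (hypothetical, NOT the tensorflow_scala API)
// illustrating how `trainInput` is the dual of the feature input.
final case class Input[T](name: String, example: T)

final case class SupervisedModel[I, L, P](
    input: Input[I],        // feature input (e.g. images)
    trainInput: Input[L],   // dual of `input`, carrying the labels
    predict: I => P,        // the layer stack: features -> predictions
    loss: (P, L) => Double  // compares predictions against labels
) {
  def lossOn(features: I, labels: L): Double =
    loss(predict(features), labels)
}

// Toy usage: "predict" the sign of a number and compare to a label.
val model = SupervisedModel[Double, Int, Int](
  input = Input("Features", 0.0),
  trainInput = Input("Labels", 0),
  predict = x => if (x >= 0) 1 else -1,
  loss = (p, l) => if (p == l) 0.0 else 1.0
)
```

In the real API the types are of course far richer (hence the long type parameter lists), but the symmetry between the two inputs is the same.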
It might also make sense to create convenience functions for things that are often used in this way, like different loss functions. So instead of

val loss = tf.learn.SparseSoftmaxCrossEntropy("Loss/CrossEntropy") >>
tf.learn.Mean("Loss/Mean") >> tf.learn.ScalarSummary("Loss/Summary", "Loss")

you could just do something like

val loss = SoftmaxCrossEntropyLoss()
You could still use the first variant to compose it yourself if you need to; the second would just be a wrapper for common cases.
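A self-contained sketch of both variants, using hypothetical stand-in types rather than tensorflow_scala itself (the real Layer trait carries op-creation context; here `>>` is plain function composition):

```scala
// Minimal Layer trait with `>>` composition (hypothetical stand-in).
trait Layer[A, B] { self =>
  def apply(a: A): B
  def >>[C](next: Layer[B, C]): Layer[A, C] = new Layer[A, C] {
    def apply(a: A): C = next(self(a))
  }
}

// Stand-ins for a per-example loss and a mean reduction.
val perExampleLoss: Layer[Seq[Double], Seq[Double]] =
  new Layer[Seq[Double], Seq[Double]] {
    def apply(xs: Seq[Double]): Seq[Double] = xs.map(x => -math.log(x))
  }

val mean: Layer[Seq[Double], Double] =
  new Layer[Seq[Double], Double] {
    def apply(xs: Seq[Double]): Double = xs.sum / xs.size
  }

// First variant: composed by hand.
val composedLoss = perExampleLoss >> mean

// Second variant: a convenience wrapper bundling the common pipeline.
def softmaxCrossEntropyLoss(): Layer[Seq[Double], Double] =
  perExampleLoss >> mean
```

The wrapper adds nothing new; it just names a composition that almost every classification model repeats verbatim.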
I will close this because plans have changed significantly, and I will soon publish a proposal for a lot of new features I've been working on over the summer. They involve compile-time support for TensorFlow, similar to what Swift for TensorFlow does, but (hopefully :)) with much stronger type guarantees.
@sbrunk The parameter lists like [IT, IO, IDA, ID, IS, I, TT, TO, TDA, TD, TS, T] have now been simplified dramatically with #131. This is a good example. :)
This is a place to discuss issues with the current design of the learn API and ways to improve it, so that we can have a new implementation ready (hopefully) by the end of May.
The main issues I see currently are:
1. The Model class feels quite awkward to newcomers. Is there a nicer way to define models, while staying flexible?
2. Layers. Implementing a new layer adds overhead and some boilerplate, even for simple models. We should probably allow functions to be usable as layers. Also, variable scopes and the rest of the op creation context do not work really well with layers at this point.

@sbrunk Do you have any thoughts based on use cases you might have tried?
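On the second point, one way the functions-as-layers idea could look is a lifting helper, sketched here with hypothetical stand-in types (Layer, fromFunction are illustrations, not the actual API):

```scala
// Hypothetical sketch: lift plain functions into layers so simple
// transformations need no Layer subclass boilerplate.
trait Layer[A, B] { self =>
  def apply(a: A): B
  def >>[C](next: Layer[B, C]): Layer[A, C] = new Layer[A, C] {
    def apply(a: A): C = next(self(a))
  }
}

object Layer {
  // Lift any A => B into a Layer, so lambdas compose with `>>` directly.
  def fromFunction[A, B](f: A => B): Layer[A, B] =
    new Layer[A, B] { def apply(a: A): B = f(a) }
}

// Plain functions used directly in a layer pipeline:
val relu  = Layer.fromFunction((x: Double) => math.max(0.0, x))
val scale = Layer.fromFunction((x: Double) => x * 2.0)
val pipeline = relu >> scale
```

The open question from the issue remains: lifted functions would still need some way to participate in variable scopes and the op creation context, which plain closures do not carry.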