Closed jimfleming closed 5 years ago
I like this. Would it be worth extending `featurize` as a layer? As in:
```python
class MyModel(tf.keras.Model):
    def __init__(self):
        super().__init__()
        self.feature_layers = pynr.layers.Featurize({
            'my_scalar': pynr.layers.StandardizeLayer(loc=<loc>, scale=<scale>),
            'my_categorical': pynr.layers.OneHotLayer(depth=<depth>),
            'my_other_categorical': tf.keras.layers.Embedding(bins=<bins>),
        })

    def call(self, features):
        return self.feature_layers(features)
```
This removes the need for wrapping.
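The dispatch idea behind a `Featurize`-style container can be sketched without any framework: a dict maps each named feature to its own handler, and the outputs are concatenated in a deterministic order. This is only an illustrative sketch, not pynr's actual implementation; the `Featurize`, `standardize`, and `one_hot` names here are hypothetical stand-ins, and in Keras the handlers would be layers inside a `tf.keras.layers.Layer` subclass.

```python
class Featurize:
    """Toy sketch: apply a per-feature handler, then concatenate results."""

    def __init__(self, handlers):
        # handlers: dict mapping feature name -> callable returning a list of floats
        self.handlers = handlers

    def __call__(self, features):
        out = []
        # Iterate in sorted key order so the output layout is deterministic.
        for name, handler in sorted(self.handlers.items()):
            out.extend(handler(features[name]))
        return out


def standardize(loc, scale):
    # Scalar standardization: (x - loc) / scale
    return lambda x: [(x - loc) / scale]


def one_hot(depth):
    # Integer index -> one-hot vector of length `depth`
    return lambda i: [1.0 if j == i else 0.0 for j in range(depth)]


featurize = Featurize({
    'my_scalar': standardize(loc=10.0, scale=2.0),
    'my_categorical': one_hot(depth=3),
})

# Keys sort as my_categorical, then my_scalar:
print(featurize({'my_scalar': 12.0, 'my_categorical': 1}))
# -> [0.0, 1.0, 0.0, 1.0]
```

A real layer version would return a concatenated tensor (e.g. via `tf.concat`) rather than a Python list, but the name-to-handler mapping is the part the proposal above is about.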
Oh, I think I like that. I'll try it out in my implementation.
EDIT: Yep, that works very well.
Closing in favor of #6
TensorFlow briefly experimented with `tf.feature_column` as part of estimators. While `tf.feature_column` is not supported by Eager, we've experimented with a custom implementation for a few client projects and it worked very well.

Pseudo-code:

Features:

Example Handlers (e.g. `tf.keras.layers.embeddings.Embedding`).

FAQ: