fomorians-oss / pyoneer

Tensor utilities, reinforcement learning, and more!
https://pypi.org/project/fomoro-pyoneer/

Features module #1

Closed. jimfleming closed this 5 years ago.

jimfleming commented 5 years ago

TensorFlow briefly experimented with tf.feature_column as part of estimators. While tf.feature_column is not supported in eager execution, we've experimented with a custom implementation for a few client projects and it worked very well.

Pseudo-code:

class MyModel(tf.keras.Model):
    def __init__(self):
        super().__init__()
        self.feature_layers = pynr.layers.Featurize({
            'my_scalar': pynr.layers.StandardizeLayer(loc=<loc>, scale=<scale>),
            'my_categorical': pynr.layers.OneHotLayer(depth=<depth>),
            'my_other_categorical': keras.Embedding(bins=<bins>),
        })

    def call(self, features):
        return self.feature_layers(features)

some_features = {
    'my_scalar': [3.14],
    'my_categorical': [7],
    'my_other_categorical': [13],
}
my_model = MyModel()
my_model(some_features)
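At its core, the proposed Featurize layer is a keyed dispatch: each feature name maps to its own transform, and the layer applies each transform to the matching entry of the input dict. A minimal, framework-free sketch of that pattern (names and behavior here are illustrative, not the actual pyoneer API):

```python
class Featurize:
    """Apply a per-feature transform to each entry of a feature dict.

    Illustrative sketch only: the real layer would subclass
    tf.keras.layers.Layer and operate on tensors.
    """

    def __init__(self, feature_transforms):
        self.feature_transforms = feature_transforms

    def __call__(self, features):
        return {
            name: transform(features[name])
            for name, transform in self.feature_transforms.items()
        }


def standardize(loc, scale):
    # Analogous to a StandardizeLayer: (x - loc) / scale, elementwise.
    return lambda xs: [(x - loc) / scale for x in xs]


def one_hot(depth):
    # Analogous to a OneHotLayer: integer index -> one-hot vector.
    return lambda xs: [[1.0 if i == x else 0.0 for i in range(depth)] for x in xs]


featurize = Featurize({
    'my_scalar': standardize(loc=3.0, scale=2.0),
    'my_categorical': one_hot(depth=3),
})

out = featurize({'my_scalar': [5.0], 'my_categorical': [1]})
# out['my_scalar'] == [1.0]; out['my_categorical'] == [[0.0, 1.0, 0.0]]
```

The dict-of-layers design keeps each feature's preprocessing declared next to its name, which is what makes both the wrapped-in-a-model and direct-layer usages below read the same.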


wenkesj commented 5 years ago

I like this. Would it be worth extending Featurize as a layer itself? That removes the need for wrapping it in a model.

jimfleming commented 5 years ago

Oh, I think I like that. I'll try it out in my implementation.

EDIT: Yep, that works very well.

wenkesj commented 5 years ago

Closing in favor of #6