tensorflow / tfjs

A WebGL accelerated JavaScript library for training and deploying ML models.
https://js.tensorflow.org
Apache License 2.0
18.44k stars · 1.92k forks

Feature Request: Support for portable Lambda layers in tfjs-layers #283

Closed bileschi closed 3 years ago

bileschi commented 6 years ago

In Keras, Lambda layers are custom layers that allow a user to specify the function of a layer with arbitrary Python code. This provides great flexibility, allowing users to perform operations beyond those possible with the provided stock layers. Unfortunately, Python-defined Lambda layers are not portable to the JS runtime.

The current Keras Lambda layer signature offers 'function', 'output_shape', 'mask', and 'arguments' parameters.

One solution would be to recapitulate the Lambda layer in JS, allowing for JS implementations of 'function' and 'output_shape'. This would allow for additional flexibility, but such lambda layers would not be portable: if a model containing one were saved, it would not execute in a Python environment.

Ideally, we should be able to provide the flexible escape hatch of the Lambda layer, but maintain cross-environment compatibility.

We could imagine a version of Lambda that achieves portability by constraining the 'function' and 'output_shape' implementations to use only TensorFlow ops, and 'arguments' to allow only TensorFlow primitives. Since these ops are implemented in both the JS and Python runtimes (not to mention Swift and Rust), we can expect that such a Lambda would be portable as well.
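A toy sketch of that idea (not the tfjs API; all names here such as PORTABLE_OPS and evalPortableFn are made up for illustration): the lambda body is stored as data, restricted to a whitelist of ops every runtime is known to implement, so any runtime can replay it.

```javascript
// Toy illustration of a "portable Lambda": the function body is a
// JSON-able list of whitelisted op steps rather than Python bytecode.
// All names here are hypothetical.
const PORTABLE_OPS = {
  add:  (x, c) => x.map((v) => v + c),
  mul:  (x, c) => x.map((v) => v * c),
  relu: (x)    => x.map((v) => Math.max(0, v)),
};

// Replay a serialized lambda: each step names a whitelisted op plus a
// primitive argument. Any op outside the whitelist is rejected.
function evalPortableFn(steps, input) {
  return steps.reduce((acc, step) => {
    const op = PORTABLE_OPS[step.op];
    if (!op) throw new Error(`Non-portable op: ${step.op}`);
    return op(acc, step.arg);
  }, input);
}

// e.g. f(x) = relu(2 * x + 1), expressible in any runtime with these ops
const spec = [{ op: 'mul', arg: 2 }, { op: 'add', arg: 1 }, { op: 'relu' }];
console.log(evalPortableFn(spec, [-3, 0, 2])); // [0, 1, 5]
```

Because the spec is plain data and the op vocabulary is shared, the same serialized function could in principle run in the JS, Python, or any other runtime that implements the whitelist.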

davidsoergel commented 6 years ago

IIUC the proposal is effectively to add a GraphDefLambda layer (which might be a cleaner way to think about it than overloading the existing Lambda representation).

I feel this is not something we can address in the context of TF.js alone; it's about the exchange format (e.g., keras.json). It's certainly worth tracking here because we are thinking about it, but actually making it work would require updating Python Keras as well. @fchollet

fchollet commented 6 years ago

I'm generally interested in serializing Lambda layers at the level of the graph of operations rather than as Python bytecode. However, I would see the following requirements:

davidsoergel commented 6 years ago

This is not exactly a dup but still closely related to #254.

chenqing commented 5 years ago

any progress?

bileschi commented 5 years ago

I don't believe anyone is actively working on this, though several people have thoughts on how it would be best accomplished. In the short term, are you more interested in porting Lambda layers from a Python env to a JS env, or running custom layers defined in one JS env in a different JS env?

In the longer term, we could prioritize this if we found there is stronger demand here than for other priorities.

lkuich commented 5 years ago

I'm using TF Hub + Lambda to train my model and am trying to convert it for TFjs. Would you know of an alternative approach to the layers.Lambda usage below? I'm not an expert with TensorFlow and not sure where to start. Can I run the function (feature_extractor) and turn the result into another kind of layer supported by TFjs? Should I look into a custom layer? Any advice?

I notice Hub for TF2 has a KerasLayer I wish I could use: https://www.tensorflow.org/hub/api_docs/python/hub/KerasLayer

...
def feature_extractor(x):
    feature_extractor_module = hub.Module(feature_extractor_url)
    return feature_extractor_module(x)

features_extractor_layer = layers.Lambda(feature_extractor, input_shape=input_shape)
features_extractor_layer.trainable = False

model = tf.keras.Sequential([
    features_extractor_layer,
    layers.Dense(image_data.num_classes, activation='softmax', name='softmax_input')
])
...

Thanks!

BenjaminWegener commented 4 years ago

https://twitter.com/BenjaminWegener/status/1231010778100178945

Ouwen commented 4 years ago

@bileschi I'm interested in loading a python keras model into a javascript environment. The typical workflow for us is to train a more complex model in python keras, then ideally export the model for inference in javascript.

When loading a keras model with custom layers, is it possible for the user to just supply the javascript native implementation? Similar to: https://github.com/tensorflow/tfjs-examples/blob/master/custom-layer/custom_layer.js#L34

This would be similar to the python tf.keras.models.load_model method of providing custom objects.

bileschi commented 4 years ago

Yes, you would need to take responsibility for the JavaScript layer implementation and ensure that it matches the Python implementation.

Ouwen commented 4 years ago

For the initial model.json exported by the tfjs Python library, would we rename the Lambda layer to the class name we use in JavaScript for deserialization?

{
    "class_name": "Lambda",
    "config": {
        "name": "lambda",
        "trainable": true,
        "dtype": "float32",
        "function": ["4wEAAAAAAAAAAQAAAAYAAABDAAAAcxQAAAB0AGoBfABkAWQBZAFkAmcEgwJTACkDTukBAAAA6QMA\nAAApAtoCdGbaBHRpbGUpAdoBeKkAcgYAAAD6HzxpcHl0aG9uLWlucHV0LTIxLTg3OGE0ZjE0MjEy\nZD7aCDxsYW1iZGE+DwAAAHMAAAAA\n", null, null],
        "function_type": "lambda",
        "module": "__main__",
        "output_shape": null,
        "output_shape_type": "raw",
        "output_shape_module": null,
        "arguments": {}
    },
    "name": "lambda",
    "inbound_nodes": [
        [
            ["input_4", 0, 0, {}]
        ]
    ]
}

So, for example, if the Lambda layer used here is supposed to be the Antirectifier, could we just rename the class_name to Antirectifier so that it gets handled by tf.serialization.registerClass(Antirectifier)?
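That renaming step could be sketched as a post-processing pass over the exported model.json topology. This is a hedged sketch, not a supported API: the function name renameLambdaLayers is made up, and a real export may have additional fields to adjust.

```javascript
// Hypothetical post-processing pass over an exported model.json topology:
// rewrite each Lambda layer's class_name so that tfjs deserialization
// dispatches to a registered custom class instead.
function renameLambdaLayers(modelTopology, newClassName) {
  const layers = modelTopology.config.layers || [];
  for (const layer of layers) {
    if (layer.class_name === 'Lambda') {
      layer.class_name = newClassName;
      // Python-only fields such as the pickled `function` cannot be used
      // in JS; the registered class must reimplement the behavior itself.
      delete layer.config.function;
      delete layer.config.function_type;
      delete layer.config.module;
    }
  }
  return modelTopology;
}

// Usage sketch: after rewriting, tf.serialization.registerClass(Antirectifier)
// lets tf.loadLayersModel resolve "Antirectifier" during deserialization.
const topo = {
  config: {
    layers: [{ class_name: 'Lambda', config: { name: 'lambda', function: '...' } }],
  },
};
renameLambdaLayers(topo, 'Antirectifier');
console.log(topo.config.layers[0].class_name); // Antirectifier
```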

bileschi commented 4 years ago

That's the gist of it, yes. There may be more fields here that need to change.


Ouwen commented 4 years ago

Sounds good, we'll go through the process and try it out. If you think it makes sense, we can put together a quick documentation example for this as well, unless the API for handling custom Python Keras layers is currently in the works/changing.

lychrel commented 4 years ago

Apologies if this is somewhat off-topic, but: my understanding is that tfjs supports importing GraphDef-based models. Could a model requiring a Lambda layer be successfully imported if it were instead defined as a TF graph with the same operations? Or could a TF graph involving a functional model with a Lambda layer be imported into tfjs?

However, while I know imported GraphDef-based models can be retrained in Python, I've read that they can't be retrained in tfjs. What would be the simplest (if not best-practices) way of transferring the gradient calculations from Python to tfjs? (Saving the trainable variable names in Python and somehow mapping them to those in the frozen imported graph?)

juiceboxjoe commented 4 years ago

@bileschi @Ouwen it worked without changing any of the other fields! Thanks for the info! The model I converted has lambda functions that depend on parameters outside the scope of the lambda function. To receive those parameters within my translated lambda classes (SpaceToDepth and ZScoreNorm), I added them to the config object that accompanies those classes' entries in the model.json file and passed a config object to their constructors in the JavaScript code. This worked out nicely because my parameters were known (block size, mean, etc.), but it was a hassle.
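That workaround might look roughly like the following. This is a hedged sketch only: the tf.layers.Layer subclassing is omitted to keep it self-contained, and the parameter names (mean, std) are illustrative, matching the kinds of values mentioned above.

```javascript
// Sketch of the workaround: parameters the Python lambda captured from
// its enclosing scope become explicit, serializable fields in the layer's
// config entry inside model.json, and the JS constructor reads them back.
// (In real tfjs code this class would extend tf.layers.Layer.)
class ZScoreNorm /* extends tf.layers.Layer */ {
  constructor(config = {}) {
    // Values that were implicit in the Python closure are now explicit
    // config fields, with defaults for safety.
    this.mean = config.mean ?? 0;
    this.std = config.std ?? 1;
  }
  // call(x) would compute x.sub(this.mean).div(this.std) using tf ops.
}

// The matching model.json entry would carry the same fields, e.g.:
// { "class_name": "ZScoreNorm",
//   "config": { "name": "zscore", "mean": 127.5, "std": 64.0 } }
const layer = new ZScoreNorm({ mean: 127.5, std: 64.0 });
console.log(layer.mean, layer.std); // 127.5 64
```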

@bileschi is this feature request being considered for the python -> js use case?

tngan commented 3 years ago

@Ouwen It is working, and I am wondering whether we can tweak the component/layer name inside the model.json when we save the model.

BenjaminWegener commented 3 years ago

Is it implemented now? Thanks.

Dhruvagwal commented 3 years ago

@BenjaminWegener not yet

sickopickle commented 1 year ago

Not implemented yet? I would rather be able to train my model within my app than send it through websockets back to my computer and then send back the newly trained model :|