Description
I implemented a custom translation decoding algorithm on top of the Transformer, but I ran into a problem when exporting the model.

Is it possible to pass a Python list to a model server (using TensorFlow Serving)? I noticed that the serving_input_fn only accepts tensors. I want to use some extra information in my custom translation decoding, so I tried passing the list in as a tensor and then recovering its static value. This works fine in an eager-execution script, but when I export my custom model, the static-value lookup returns None. Is there any way to work around this?
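For context, here is a minimal sketch of why this happens and one possible workaround. During export, `tf.function` traces the model with symbolic placeholder tensors, so `tf.get_static_value` has no concrete data to read and returns None; it only works eagerly or on constants. If the extra information is known at export time, one workaround is to capture the Python list as a closure constant when building the traced function, rather than feeding it through a serving input. The names `make_decode` and `decode_fn` below are hypothetical, and the addition stands in for whatever your decoding logic does with the extra values:

```python
import tensorflow as tf

# During tracing the input is symbolic, so get_static_value returns None.
seen = []

@tf.function(input_signature=[tf.TensorSpec([None], tf.int32)])
def probe(extra_info):
    seen.append(tf.get_static_value(extra_info))  # None while tracing
    return extra_info

probe.get_concrete_function()  # force a trace, as export would

# Workaround sketch: bake the Python list into the traced function as a
# closure-captured constant instead of a serving input. The list's values
# are then fixed at export time, not supplied per request.
def make_decode(extra_list):
    extra_const = tf.constant(extra_list, dtype=tf.int32)

    @tf.function(input_signature=[tf.TensorSpec([None], tf.int32)])
    def decode_fn(tokens):
        # Placeholder for the real decoding step that consumes extra_list.
        return tokens + extra_const

    return decode_fn
```

The trade-off is that the extra information becomes part of the SavedModel; if it must vary per request, it has to arrive as a tensor input and be used with graph ops (e.g. `tf.gather`, `tf.cond`) rather than as a static Python value.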