tensorflow / serving

A flexible, high-performance serving system for machine learning models
https://www.tensorflow.org/serving
Apache License 2.0

Error with tf.estimator.export.ServingInputReceiver() #1553

Closed psyyip closed 1 year ago

psyyip commented 4 years ago

Bug Report

If this is a bug report, please fill out the following form in full:

System information

Describe the problem

I followed every instruction I could find to build a serving_input_receiver_fn(), but no solution works even though this is a very simple function. I believe the problem is not in my code but in some unknown issue.

Exact Steps to Reproduce

%%time
!gcloud ai-platform local train \
  --package-path=deploy_model \
  --module-name=deploy_model.task \
  -- \
  --train_data_path=train.csv \
  --eval_data_path=eval.csv \
  --test_data_path=test.csv \
  --train_steps=2500 \
  --output_dir=deploy_trained

Source code / logs

This is the full function

def serving_input_receiver_fn():
    inputs = {
        'job' : tf.compat.v1.placeholder(dtype = tf.string, shape = [None]),
        'marital' : tf.compat.v1.placeholder(dtype = tf.string, shape = [None]),
        'education' : tf.compat.v1.placeholder(dtype = tf.string, shape = [None]),
        'default' : tf.compat.v1.placeholder(dtype = tf.int64, shape = [None]),
        'housing' : tf.compat.v1.placeholder(dtype = tf.int64, shape = [None]),
        'loan' : tf.compat.v1.placeholder(dtype = tf.int64, shape = [None]),
        'contact' : tf.compat.v1.placeholder(dtype = tf.string, shape = [None]),
        'dayofmonth' : tf.compat.v1.placeholder(dtype = tf.string, shape = [None]),
        'month' : tf.compat.v1.placeholder(dtype = tf.string, shape = [None]),
        'duration' : tf.compat.v1.placeholder(dtype = tf.int64, shape = [None]),
        'campaign' : tf.compat.v1.placeholder(dtype = tf.int64, shape = [None]),
        'pdays' : tf.compat.v1.placeholder(dtype = tf.int64, shape = [None]),
        'previous' : tf.compat.v1.placeholder(dtype = tf.int64, shape = [None]),
        'poutcome' : tf.compat.v1.placeholder(dtype = tf.string, shape = [None])
    }

    return tf.estimator.export.ServingInputReceiver(inputs, inputs)

This is the error message:

INFO:tensorflow:Signatures EXCLUDED from export because they cannot be be served via TensorFlow Serving APIs:
INFO:tensorflow:'serving_default' : Classification input must be a single string Tensor; got {'poutcome': <tf.Tensor 'Placeholder_13:0' shape=(?,) dtype=string>, 'campaign': <tf.Tensor 'Placeholder_10:0' shape=(?,) dtype=int64>, 'loan': <tf.Tensor 'Placeholder_5:0' shape=(?,) dtype=int64>, 'month': <tf.Tensor 'Placeholder_8:0' shape=(?,) dtype=string>, 'job': <tf.Tensor 'Placeholder:0' shape=(?,) dtype=string>, 'duration': <tf.Tensor 'Placeholder_9:0' shape=(?,) dtype=int64>, 'education': <tf.Tensor 'Placeholder_2:0' shape=(?,) dtype=string>, 'marital': <tf.Tensor 'Placeholder_1:0' shape=(?,) dtype=string>, 'dayofmonth': <tf.Tensor 'Placeholder_7:0' shape=(?,) dtype=string>, 'default': <tf.Tensor 'Placeholder_3:0' shape=(?,) dtype=int64>, 'pdays': <tf.Tensor 'Placeholder_11:0' shape=(?,) dtype=int64>, 'housing': <tf.Tensor 'Placeholder_4:0' shape=(?,) dtype=int64>, 'contact': <tf.Tensor 'Placeholder_6:0' shape=(?,) dtype=string>, 'previous': <tf.Tensor 'Placeholder_12:0' shape=(?,) dtype=int64>}
INFO:tensorflow:'regression' : Regression input must be a single string Tensor; got {'poutcome': <tf.Tensor 'Placeholder_13:0' shape=(?,) dtype=string>, 'campaign': <tf.Tensor 'Placeholder_10:0' shape=(?,) dtype=int64>, 'loan': <tf.Tensor 'Placeholder_5:0' shape=(?,) dtype=int64>, 'month': <tf.Tensor 'Placeholder_8:0' shape=(?,) dtype=string>, 'job': <tf.Tensor 'Placeholder:0' shape=(?,) dtype=string>, 'duration': <tf.Tensor 'Placeholder_9:0' shape=(?,) dtype=int64>, 'education': <tf.Tensor 'Placeholder_2:0' shape=(?,) dtype=string>, 'marital': <tf.Tensor 'Placeholder_1:0' shape=(?,) dtype=string>, 'dayofmonth': <tf.Tensor 'Placeholder_7:0' shape=(?,) dtype=string>, 'default': <tf.Tensor 'Placeholder_3:0' shape=(?,) dtype=int64>, 'pdays': <tf.Tensor 'Placeholder_11:0' shape=(?,) dtype=int64>, 'housing': <tf.Tensor 'Placeholder_4:0' shape=(?,) dtype=int64>, 'contact': <tf.Tensor 'Placeholder_6:0' shape=(?,) dtype=string>, 'previous': <tf.Tensor 'Placeholder_12:0' shape=(?,) dtype=int64>}
INFO:tensorflow:'classification' : Classification input must be a single string Tensor; got {'poutcome': <tf.Tensor 'Placeholder_13:0' shape=(?,) dtype=string>, 'campaign': <tf.Tensor 'Placeholder_10:0' shape=(?,) dtype=int64>, 'loan': <tf.Tensor 'Placeholder_5:0' shape=(?,) dtype=int64>, 'month': <tf.Tensor 'Placeholder_8:0' shape=(?,) dtype=string>, 'job': <tf.Tensor 'Placeholder:0' shape=(?,) dtype=string>, 'duration': <tf.Tensor 'Placeholder_9:0' shape=(?,) dtype=int64>, 'education': <tf.Tensor 'Placeholder_2:0' shape=(?,) dtype=string>, 'marital': <tf.Tensor 'Placeholder_1:0' shape=(?,) dtype=string>, 'dayofmonth': <tf.Tensor 'Placeholder_7:0' shape=(?,) dtype=string>, 'default': <tf.Tensor 'Placeholder_3:0' shape=(?,) dtype=int64>, 'pdays': <tf.Tensor 'Placeholder_11:0' shape=(?,) dtype=int64>, 'housing': <tf.Tensor 'Placeholder_4:0' shape=(?,) dtype=int64>, 'contact': <tf.Tensor 'Placeholder_6:0' shape=(?,) dtype=string>, 'previous': <tf.Tensor 'Placeholder_12:0' shape=(?,) dtype=int64>}
WARNING:tensorflow:Export includes no default signature!
INFO:tensorflow:Restoring parameters from deploy_trained/model.ckpt-2500
INFO:tensorflow:Assets added to graph.
INFO:tensorflow:No assets to write.
INFO:tensorflow:SavedModel written to: deploy_trained/export/exporter/temp-1581352036/saved_model.pb
INFO:tensorflow:Loss for final step: 15.162572.
CPU times: user 1.44 s, sys: 184 ms, total: 1.62 s
Wall time: 52.6 s

Below is the signature definition. The model is created and I can use it to make predictions, but there is no default signature in the export.

signature_def['predict']:
  The given SavedModel SignatureDef contains the following input(s):
    inputs['campaign'] tensor_info:
        dtype: DT_INT64
        shape: (-1)
        name: Placeholder_10:0
    inputs['contact'] tensor_info:
        dtype: DT_STRING
        shape: (-1)
        name: Placeholder_6:0
    inputs['dayofmonth'] tensor_info:
        dtype: DT_STRING
        shape: (-1)
        name: Placeholder_7:0
    inputs['default'] tensor_info:
        dtype: DT_INT64
        shape: (-1)
        name: Placeholder_3:0
    inputs['duration'] tensor_info:
        dtype: DT_INT64
        shape: (-1)
        name: Placeholder_9:0
    inputs['education'] tensor_info:
        dtype: DT_STRING
        shape: (-1)
        name: Placeholder_2:0
    inputs['housing'] tensor_info:
        dtype: DT_INT64
        shape: (-1)
        name: Placeholder_4:0
    inputs['job'] tensor_info:
        dtype: DT_STRING
        shape: (-1)
        name: Placeholder:0
    inputs['loan'] tensor_info:
        dtype: DT_INT64
        shape: (-1)
        name: Placeholder_5:0
    inputs['marital'] tensor_info:
        dtype: DT_STRING
        shape: (-1)
        name: Placeholder_1:0
    inputs['month'] tensor_info:
        dtype: DT_STRING
        shape: (-1)
        name: Placeholder_8:0
    inputs['pdays'] tensor_info:
        dtype: DT_INT64
        shape: (-1)
        name: Placeholder_11:0
    inputs['poutcome'] tensor_info:
        dtype: DT_STRING
        shape: (-1)
        name: Placeholder_13:0
    inputs['previous'] tensor_info:
        dtype: DT_INT64
        shape: (-1)
        name: Placeholder_12:0
  The given SavedModel SignatureDef contains the following output(s):
    outputs['all_class_ids'] tensor_info:
        dtype: DT_INT32
        shape: (-1, 2)
        name: dnn/head/predictions/Tile:0
    outputs['all_classes'] tensor_info:
        dtype: DT_STRING
        shape: (-1, 2)
        name: dnn/head/predictions/Tile_1:0
    outputs['class_ids'] tensor_info:
        dtype: DT_INT64
        shape: (-1, 1)
        name: dnn/head/predictions/ExpandDims:0
    outputs['classes'] tensor_info:
        dtype: DT_STRING
        shape: (-1, 1)
        name: dnn/head/predictions/str_classes:0
    outputs['logistic'] tensor_info:
        dtype: DT_FLOAT
        shape: (-1, 1)
        name: dnn/head/predictions/logistic:0
    outputs['logits'] tensor_info:
        dtype: DT_FLOAT
        shape: (-1, 1)
        name: dnn/logits/BiasAdd:0
    outputs['probabilities'] tensor_info:
        dtype: DT_FLOAT
        shape: (-1, 2)
        name: dnn/head/predictions/probabilities:0
  Method name is: tensorflow/serving/predict 
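For what it is worth, this 'predict' signature does take the raw named tensors, so it can be exercised directly. A rough sketch of how it could be fed (assuming the TF 1.x tf.contrib.predictor API; the export path is illustrative, not copied from the logs):

import tensorflow as tf

# Illustrative path; the actual timestamped export directory will differ.
export_dir = 'deploy_trained/export/exporter/1581352036'

# Select the 'predict' SignatureDef explicitly, since this export has no
# serving_default signature.
predict_fn = tf.contrib.predictor.from_saved_model(
    export_dir, signature_def_key='predict')

# Feed one row of raw feature values keyed by the input names shown above.
predictions = predict_fn({
    'job': ['entrepreneur'], 'marital': ['married'], 'education': ['secondary'],
    'default': [1], 'housing': [1], 'loan': [1], 'contact': ['unknown'],
    'dayofmonth': ['5'], 'month': ['may'], 'duration': [127], 'campaign': [1],
    'pdays': [-1], 'previous': [0], 'poutcome': ['unknown'],
})
print(predictions['probabilities'])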

I tried using another method in the serving function, as follows:

    serialized_tf_example = tf.compat.v1.placeholder(tf.string, name='input_example_tensor')
    tf_example = tf.io.parse_example(serialized=serialized_tf_example, features=tf.feature_column.make_parse_example_spec(feature_spec))
    receiver_tensors = {'examples': serialized_tf_example}

The original error disappears, but another error comes up:

  File "/usr/local/lib/python2.7/dist-packages/tensorflow_estimator/python/estimator/estimator.py", line 925, in _add_meta_graph_for_mode
    input_receiver = input_receiver_fn()
  File "deploy_model/model.py", line 85, in serving_input_receiver_fn
    tf_example = tf.io.parse_example(serialized=serialized_tf_example, features=tf.feature_column.make_parse_example_spec(feature_spec))
  File "/usr/local/lib/python2.7/dist-packages/tensorflow_core/python/feature_column/feature_column.py", line 806, in make_parse_example_spec
    'Given: {}'.format(column))
ValueError: All feature_columns must be _FeatureColumn instances. Given: poutcome
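If I read this error correctly, make_parse_example_spec() wants the FeatureColumn objects themselves, not column-name strings, so my feature_spec was probably built from the wrong thing. A rough sketch of what I think it expects (only a few columns shown, not verified against my full model):

import tensorflow as tf

# feature_spec must be derived from FeatureColumn instances, not name strings.
feature_columns = [
    tf.feature_column.numeric_column('duration'),
    tf.feature_column.numeric_column('previous'),
    tf.feature_column.categorical_column_with_vocabulary_list(
        'poutcome', ['failure', 'success', 'other', 'unknown']),
    # remaining columns omitted in this sketch
]
feature_spec = tf.feature_column.make_parse_example_spec(feature_columns)

# The built-in helper then wraps the parse_example boilerplate written above.
serving_input_receiver_fn = tf.estimator.export.build_parsing_serving_input_receiver_fn(
    feature_spec)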

Can anyone tell me how to fix it? Thank you

gowthamkpr commented 4 years ago

Please refer to the following issue and the doc

If you have any more questions, please post them on Stack Overflow, as there is a wider community to respond. Thanks!

psyyip commented 4 years ago

I did post my question on Stack Overflow, but there has been no solution so far.

I tried to modify my code according to your suggestions, but there are still errors. I wondered whether this was caused by a problem in my own code, so I took your suggested code and replaced the data source and data specification with my own data. Here are the results.

  1. First, I reused Ji Zhang's code.
COLUMN_NAMES = ['job', 'marital', 'education', 'default', 'housing', 'loan', 'contact', 'dayofmonth', 'month', 'duration', 'campaign', 'pdays',\
            'previous', 'poutcome', 'subscribe']
BATCH_SIZE = 512
STEPS = 128
OUTDIR = 'test_trained'

# load data
y_name = 'subscribe'

train = pd.read_csv('train.csv', names=COLUMN_NAMES, header=0)
train_x, train_y = train, train.pop(y_name)

test = pd.read_csv('test.csv', names=COLUMN_NAMES, header=0)
test_x, test_y = test, test.pop(y_name)

train_steps = (10 * len(train_y)) / BATCH_SIZE

job_column = tf.feature_column.categorical_column_with_vocabulary_list("job", ["blue-collar", "management", "technician","admin.","services","retired", \
                                                                               "self-employed", "entrepreneur","unemployed", "housemaid", "student", "unknown"])
marital_column = tf.feature_column.categorical_column_with_vocabulary_list("marital", ["married", "single", "divorced"])
edu_column = tf.feature_column.categorical_column_with_vocabulary_list("education", ["secondary", "primary", "tertiary", "unknown"])
contact_column = tf.feature_column.categorical_column_with_vocabulary_list(key="contact", vocabulary_list=["cellular", "telephone", "unknown"], default_value=0)
day_column = tf.feature_column.crossed_column(["month", 'dayofmonth'], 12*31)
out_column = tf.feature_column.categorical_column_with_vocabulary_list("poutcome", ["failure", "success", "other", "unknown"])

INPUT_COLUMNS = [
    tf.feature_column.indicator_column(job_column),
    tf.feature_column.indicator_column(marital_column),
    tf.feature_column.indicator_column(edu_column),
    tf.feature_column.numeric_column("default"),
    tf.feature_column.numeric_column("housing"),
    tf.feature_column.numeric_column("loan"),
    tf.feature_column.indicator_column(contact_column),
    tf.feature_column.indicator_column(day_column),
    tf.feature_column.numeric_column("duration"),
    tf.feature_column.numeric_column("campaign"),
    tf.feature_column.numeric_column("pdays"),
    tf.feature_column.numeric_column("previous"),
    tf.feature_column.indicator_column(out_column)
]

def _make_input_parser(with_target=True):
    def _decode_csv(line):
        column_header = COLUMN_NAMES if with_target else COLUMN_NAMES[:14]
        record_defaults = [[''] for x in range(0,3)]+[[0] for x in range(0,3)]+[[''] for x in range(0,3)]+[[0] for x in range(0,4)]+[['']]
        # Pass label as integer.
        if with_target:
            record_defaults.append([0])
        columns = tf.decode_csv(line, record_defaults=record_defaults)
        features = dict(zip(column_header, columns))
        target = features.pop(COLUMN_NAMES[14]) if with_target else None
        return features, target
    return _decode_csv

def serving_input_receiver_fn():
    # This is used to define inputs to serve the model.
    # Returns:
    #   A ServingInputReceiver object.
    csv_row = tf.placeholder(shape=[None], dtype=tf.string)
    features, _ = _make_input_parser(with_target=False)(csv_row)
    return tf.estimator.export.ServingInputReceiver(features,{'csv_row': csv_row})

# prepare input / eval fn
def train_input_fn(features, labels, batch_size):
    dataset = tf.data.Dataset.from_tensor_slices((dict(features), labels))
    dataset = dataset.shuffle(1000).repeat().batch(batch_size)
    return dataset

def eval_input_fn(features, labels, batch_size):
    features = dict(features)
    inputs = (features, labels) if labels is not None else features
    dataset = tf.data.Dataset.from_tensor_slices(inputs)
    dataset = dataset.batch(batch_size)
    return dataset

train_spec = tf.estimator.TrainSpec(input_fn=lambda: train_input_fn(train_x, train_y, batch_size=BATCH_SIZE), max_steps=2500)
exporter = tf.estimator.FinalExporter('exporter_test2', serving_input_receiver_fn)
eval_spec = tf.estimator.EvalSpec(
      input_fn=lambda: eval_input_fn(test_x, test_y, batch_size=len(test_y)),
      steps=None,
      exporters=exporter)

estimator = tf.estimator.DNNClassifier(model_dir = OUTDIR,
                                       hidden_units = [30, 10],
                                       feature_columns=INPUT_COLUMNS,
                                       dropout = 0.1,
                                       config = tf.estimator.RunConfig(model_dir = OUTDIR),
                                       # The model must choose between 2 classes.
                                       n_classes=2)

tf.compat.v1.logging.set_verbosity(tf.compat.v1.logging.INFO)
shutil.rmtree(OUTDIR, ignore_errors = True) # start fresh each time
tf.summary.FileWriterCache.clear() # ensure filewriter cache is clear for TensorBoard events file

tf.estimator.train_and_evaluate(estimator, train_spec, eval_spec)

There is no error when running the code above. This is the signature def:
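For reference, a signature def like this can be printed with saved_model_cli, e.g.:

!saved_model_cli show --dir test_trained/export/exporter_test2/1581829703 --all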

The given SavedModel SignatureDef contains the following input(s):
  inputs['inputs'] tensor_info:
      dtype: DT_STRING
      shape: (-1)
      name: Placeholder:0
The given SavedModel SignatureDef contains the following output(s):
  outputs['classes'] tensor_info:
      dtype: DT_STRING
      shape: (-1, 2)
      name: dnn/head/Tile:0
  outputs['scores'] tensor_info:
      dtype: DT_FLOAT
      shape: (-1, 2)
      name: dnn/head/predictions/probabilities:0
Method name is: tensorflow/serving/classify

But when I try to use it for prediction, errors appear.

!saved_model_cli run --dir test_trained/export/exporter_test2/1581829703 \
  --tag_set serve --signature_def serving_default \
  --input_examples 'inputs=[{"job":["entrepreneur"],"marital":["married"],"education":["secondary"],"default":[1],"housing":[1],"loan":[1],"contact":["unknown"],"dayofmonth":["5"],"month":["may"],"duration":[127],"campaign":[1],"pdays":[-1],"previous":[0],"poutcome":["unknown"]}]'
Traceback (most recent call last):
  File "/usr/local/bin/saved_model_cli", line 8, in <module>
    sys.exit(main())
  File "/usr/local/lib/python3.5/dist-packages/tensorflow_core/python/tools/saved_model_cli.py", line 990, in main
    args.func(args)
  File "/usr/local/lib/python3.5/dist-packages/tensorflow_core/python/tools/saved_model_cli.py", line 720, in run
    args.inputs, args.input_exprs, args.input_examples)
  File "/usr/local/lib/python3.5/dist-packages/tensorflow_core/python/tools/saved_model_cli.py", line 632, in load_inputs_from_input_arg_string
    input_examples = preprocess_input_examples_arg_string(input_examples_str)
  File "/usr/local/lib/python3.5/dist-packages/tensorflow_core/python/tools/saved_model_cli.py", line 552, in preprocess_input_examples_arg_string
    _create_example_string(example) for example in example_list
  File "/usr/local/lib/python3.5/dist-packages/tensorflow_core/python/tools/saved_model_cli.py", line 552, in <listcomp>
    _create_example_string(example) for example in example_list
  File "/usr/local/lib/python3.5/dist-packages/tensorflow_core/python/tools/saved_model_cli.py", line 569, in _create_example_string
    feature_list)
TypeError: 'unknown' has type str, but expected one of: bytes

Because of the error message, I changed the strings into bytes. The error becomes:

!saved_model_cli run --dir test_trained/export/exporter_test2/1581829703 \
  --tag_set serve --signature_def serving_default \
  --input_examples 'inputs=[{"job":[b"entrepreneur"],"marital":[b"married"],"education":[b"secondary"],"default":[1],"housing":[1],"loan":[1],"contact":[b"unknown"],"dayofmonth":[b"5"],"month":[b"may"],"duration":[127],"campaign":[1],"pdays":[-1],"previous":[0],"poutcome":[b"unknown"]}]'
Traceback (most recent call last):
  File "/usr/local/bin/saved_model_cli", line 8, in <module>
    sys.exit(main())
  File "/usr/local/lib/python3.5/dist-packages/tensorflow_core/python/tools/saved_model_cli.py", line 990, in main
    args.func(args)
  File "/usr/local/lib/python3.5/dist-packages/tensorflow_core/python/tools/saved_model_cli.py", line 720, in run
    args.inputs, args.input_exprs, args.input_examples)
  File "/usr/local/lib/python3.5/dist-packages/tensorflow_core/python/tools/saved_model_cli.py", line 632, in load_inputs_from_input_arg_string
    input_examples = preprocess_input_examples_arg_string(input_examples_str)
  File "/usr/local/lib/python3.5/dist-packages/tensorflow_core/python/tools/saved_model_cli.py", line 552, in preprocess_input_examples_arg_string
    _create_example_string(example) for example in example_list
  File "/usr/local/lib/python3.5/dist-packages/tensorflow_core/python/tools/saved_model_cli.py", line 552, in <listcomp>
    _create_example_string(example) for example in example_list
  File "/usr/local/lib/python3.5/dist-packages/tensorflow_core/python/tools/saved_model_cli.py", line 576, in _create_example_string
    (type(feature_list[0]), feature_list[0]))
ValueError: Type <class 'bytes'> for value b'entrepreneur' is not supported for tf.train.Feature.

When I served the SavedModel, there was another error message.

predict_fn = tf.contrib.predictor.from_saved_model(export_dir)

# Test inputs represented by Pandas DataFrame.
inputs = pd.DataFrame({
    "job": ["entrepreneur"], 
    "marital": ["married"], 
    "education": ["secondary"], 
    "default": [0], 
    "housing": [1], 
    "loan": [1], 
    "contact": ["unknown"], 
    "dayofmonth": ["5"], 
    "month": ["may"], 
    "duration": [127], 
    "campaign": [1], 
    "pdays": [-1], 
    "previous": [0], 
    "poutcome": ["unknown"]
})

# Convert input data into serialized Example strings.
examples = []
for index, row in inputs.iterrows():
    feature = {}
    for col, value in row.iteritems():
        feature[col] = tf.train.Feature(float_list=tf.train.FloatList(value=[value]))
    example = tf.train.Example(
        features=tf.train.Features(
            feature=feature
        )
    )
    examples.append(example.SerializeToString())

# Make predictions.
predictions = predict_fn({'inputs': examples})
InvalidArgumentError: Name: <unknown>, Key: previous, Index: 0.  Data types don't match. Data type: int64 but expected type: float
     [[{{node ParseExample/ParseExample}}]]

During handling of the above exception, another exception occurred:

InvalidArgumentError                      Traceback (most recent call last)
<ipython-input-8-76346f872536> in <module>
     46 
     47 # Make predictions.
---> 48 predictions = predict_fn({'inputs': examples})

/usr/local/lib/python3.5/dist-packages/tensorflow_core/contrib/predictor/predictor.py in __call__(self, input_dict)
     75       if value is not None:
     76         feed_dict[self.feed_tensors[key]] = value
---> 77     return self._session.run(fetches=self.fetch_tensors, feed_dict=feed_dict)

/usr/local/lib/python3.5/dist-packages/tensorflow_core/python/client/session.py in run(self, fetches, feed_dict, options, run_metadata)
    954     try:
    955       result = self._run(None, fetches, feed_dict, options_ptr,
--> 956                          run_metadata_ptr)
    957       if run_metadata:
    958         proto_data = tf_session.TF_GetBuffer(run_metadata_ptr)

/usr/local/lib/python3.5/dist-packages/tensorflow_core/python/client/session.py in _run(self, handle, fetches, feed_dict, options, run_metadata)
   1178     if final_fetches or final_targets or (handle and feed_dict_tensor):
   1179       results = self._do_run(handle, final_targets, final_fetches,
-> 1180                              feed_dict_tensor, options, run_metadata)
   1181     else:
   1182       results = []

/usr/local/lib/python3.5/dist-packages/tensorflow_core/python/client/session.py in _do_run(self, handle, target_list, fetch_list, feed_dict, options, run_metadata)
   1357     if handle is None:
   1358       return self._do_call(_run_fn, feeds, fetches, targets, options,
-> 1359                            run_metadata)
   1360     else:
   1361       return self._do_call(_prun_fn, handle, feeds, fetches)

/usr/local/lib/python3.5/dist-packages/tensorflow_core/python/client/session.py in _do_call(self, fn, *args)
   1382                     '\nsession_config.graph_options.rewrite_options.'
   1383                     'disable_meta_optimizer = True')
-> 1384       raise type(e)(node_def, op, message)
   1385 
   1386   def _extend_graph(self):

InvalidArgumentError: Name: <unknown>, Key: previous, Index: 0.  Data types don't match. Data type: int64 but expected type: float
     [[node ParseExample/ParseExample (defined at /usr/local/lib/python3.5/dist-packages/tensorflow_core/python/framework/ops.py:1748) ]]

Original stack trace for 'ParseExample/ParseExample':
  File "/usr/lib/python3.5/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.5/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "/usr/local/lib/python3.5/dist-packages/ipykernel_launcher.py", line 16, in <module>
    app.launch_new_instance()
  File "/usr/local/lib/python3.5/dist-packages/traitlets/config/application.py", line 664, in launch_instance
    app.start()
  File "/usr/local/lib/python3.5/dist-packages/ipykernel/kernelapp.py", line 563, in start
    self.io_loop.start()
  File "/usr/local/lib/python3.5/dist-packages/tornado/platform/asyncio.py", line 132, in start
    self.asyncio_loop.run_forever()
  File "/usr/lib/python3.5/asyncio/base_events.py", line 421, in run_forever
    self._run_once()
  File "/usr/lib/python3.5/asyncio/base_events.py", line 1424, in _run_once
    handle._run()
  File "/usr/lib/python3.5/asyncio/events.py", line 126, in _run
    self._callback(*self._args)
  File "/usr/local/lib/python3.5/dist-packages/tornado/ioloop.py", line 758, in _run_callback
    ret = callback()
  File "/usr/local/lib/python3.5/dist-packages/tornado/stack_context.py", line 300, in null_wrapper
    return fn(*args, **kwargs)
  File "/usr/local/lib/python3.5/dist-packages/tornado/gen.py", line 1233, in inner
    self.run()
  File "/usr/local/lib/python3.5/dist-packages/tornado/gen.py", line 1147, in run
    yielded = self.gen.send(value)
  File "/usr/local/lib/python3.5/dist-packages/ipykernel/kernelbase.py", line 361, in process_one
    yield gen.maybe_future(dispatch(*args))
  File "/usr/local/lib/python3.5/dist-packages/tornado/gen.py", line 326, in wrapper
    yielded = next(result)
  File "/usr/local/lib/python3.5/dist-packages/ipykernel/kernelbase.py", line 268, in dispatch_shell
    yield gen.maybe_future(handler(stream, idents, msg))
  File "/usr/local/lib/python3.5/dist-packages/tornado/gen.py", line 326, in wrapper
    yielded = next(result)
  File "/usr/local/lib/python3.5/dist-packages/ipykernel/kernelbase.py", line 541, in execute_request
    user_expressions, allow_stdin,
  File "/usr/local/lib/python3.5/dist-packages/tornado/gen.py", line 326, in wrapper
    yielded = next(result)
  File "/usr/local/lib/python3.5/dist-packages/ipykernel/ipkernel.py", line 300, in do_execute
    res = shell.run_cell(code, store_history=store_history, silent=silent)
  File "/usr/local/lib/python3.5/dist-packages/ipykernel/zmqshell.py", line 536, in run_cell
    return super(ZMQInteractiveShell, self).run_cell(*args, **kwargs)
  File "/usr/local/lib/python3.5/dist-packages/IPython/core/interactiveshell.py", line 2855, in run_cell
    raw_cell, store_history, silent, shell_futures)
  File "/usr/local/lib/python3.5/dist-packages/IPython/core/interactiveshell.py", line 2881, in _run_cell
    return runner(coro)
  File "/usr/local/lib/python3.5/dist-packages/IPython/core/async_helpers.py", line 68, in _pseudo_sync_runner
    coro.send(None)
  File "/usr/local/lib/python3.5/dist-packages/IPython/core/interactiveshell.py", line 3058, in run_cell_async
    interactivity=interactivity, compiler=compiler, result=result)
  File "/usr/local/lib/python3.5/dist-packages/IPython/core/interactiveshell.py", line 3249, in run_ast_nodes
    if (await self.run_code(code, result,  async_=asy)):
  File "/usr/local/lib/python3.5/dist-packages/IPython/core/interactiveshell.py", line 3326, in run_code
    exec(code_obj, self.user_global_ns, self.user_ns)
  File "<ipython-input-8-76346f872536>", line 1, in <module>
    predict_fn = tf.contrib.predictor.from_saved_model(export_dir)
  File "/usr/local/lib/python3.5/dist-packages/tensorflow_core/contrib/predictor/predictor_factories.py", line 153, in from_saved_model
    config=config)
  File "/usr/local/lib/python3.5/dist-packages/tensorflow_core/contrib/predictor/saved_model_predictor.py", line 153, in __init__
    loader.load(self._session, tags.split(','), export_dir)
  File "/usr/local/lib/python3.5/dist-packages/tensorflow_core/python/util/deprecation.py", line 324, in new_func
    return func(*args, **kwargs)
  File "/usr/local/lib/python3.5/dist-packages/tensorflow_core/python/saved_model/loader_impl.py", line 269, in load
    return loader.load(sess, tags, import_scope, **saver_kwargs)
  File "/usr/local/lib/python3.5/dist-packages/tensorflow_core/python/saved_model/loader_impl.py", line 422, in load
    **saver_kwargs)
  File "/usr/local/lib/python3.5/dist-packages/tensorflow_core/python/saved_model/loader_impl.py", line 352, in load_graph
    meta_graph_def, import_scope=import_scope, **saver_kwargs)
  File "/usr/local/lib/python3.5/dist-packages/tensorflow_core/python/training/saver.py", line 1477, in _import_meta_graph_with_return_elements
    **kwargs))
  File "/usr/local/lib/python3.5/dist-packages/tensorflow_core/python/framework/meta_graph.py", line 809, in import_scoped_meta_graph_with_return_elements
    return_elements=return_elements)
  File "/usr/local/lib/python3.5/dist-packages/tensorflow_core/python/util/deprecation.py", line 507, in new_func
    return func(*args, **kwargs)
  File "/usr/local/lib/python3.5/dist-packages/tensorflow_core/python/framework/importer.py", line 405, in import_graph_def
    producer_op_list=producer_op_list)
  File "/usr/local/lib/python3.5/dist-packages/tensorflow_core/python/framework/importer.py", line 517, in _import_graph_def_internal
    _ProcessNewOps(graph)
  File "/usr/local/lib/python3.5/dist-packages/tensorflow_core/python/framework/importer.py", line 243, in _ProcessNewOps
    for new_op in graph._add_new_tf_operations(compute_devices=False):  # pylint: disable=protected-access
  File "/usr/local/lib/python3.5/dist-packages/tensorflow_core/python/framework/ops.py", line 3561, in _add_new_tf_operations
    for c_op in c_api_util.new_tf_operations(self)
  File "/usr/local/lib/python3.5/dist-packages/tensorflow_core/python/framework/ops.py", line 3561, in <listcomp>
    for c_op in c_api_util.new_tf_operations(self)
  File "/usr/local/lib/python3.5/dist-packages/tensorflow_core/python/framework/ops.py", line 3451, in _create_op_from_tf_operation
    ret = Operation(c_op, self)
  File "/usr/local/lib/python3.5/dist-packages/tensorflow_core/python/framework/ops.py", line 1748, in __init__
    self._traceback = tf_stack.extract_stack()
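Assuming the export being served here was created with the parsing receiver (i.e., it contains a ParseExample op fed serialized tf.Example strings), the mismatch is between the dtypes encoded into the Example features and the parse spec implied by the feature columns: numeric_column defaults to float32 and the vocabulary columns expect strings. A rough per-dtype encoding sketch, reusing the inputs DataFrame above (an illustration, not verified against this exact export):

import tensorflow as tf

# Columns backed by string/vocabulary feature columns in this model.
STRING_COLS = {'job', 'marital', 'education', 'contact',
               'dayofmonth', 'month', 'poutcome'}

def row_to_serialized_example(row):
    feature = {}
    for col, value in row.iteritems():
        if col in STRING_COLS:
            # String features go into a bytes_list (utf-8 encoded).
            feature[col] = tf.train.Feature(
                bytes_list=tf.train.BytesList(value=[str(value).encode('utf-8')]))
        else:
            # numeric_column defaults to float32, so encode numbers as float_list.
            feature[col] = tf.train.Feature(
                float_list=tf.train.FloatList(value=[float(value)]))
    example = tf.train.Example(features=tf.train.Features(feature=feature))
    return example.SerializeToString()

examples = [row_to_serialized_example(row) for _, row in inputs.iterrows()]
predictions = predict_fn({'inputs': examples})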

  2. Next, I reused the code you provided on Stack Overflow and replaced the data with mine.

COLUMN_NAMES = ['job', 'marital', 'education', 'default', 'housing', 'loan', 'contact', 'dayofmonth', 'month', 'duration', 'campaign', 'pdays',\
            'previous', 'poutcome', 'subscribe']
BATCH_SIZE = 200
STEPS = 128

# load data
y_name = 'subscribe'

train = pd.read_csv('train.csv', names=COLUMN_NAMES, header=0)
train_x, train_y = train, train.pop(y_name)

test = pd.read_csv('test.csv', names=COLUMN_NAMES, header=0)
test_x, test_y = test, test.pop(y_name)

# prepare input / eval fn
def train_input_fn(features, labels, batch_size):
    dataset = tf.data.Dataset.from_tensor_slices((dict(features), labels))
    dataset = dataset.shuffle(1000).repeat().batch(batch_size)
    return dataset

def eval_input_fn(features, labels, batch_size):
    features = dict(features)
    inputs = (features, labels) if labels is not None else features
    dataset = tf.data.Dataset.from_tensor_slices(inputs)
    dataset = dataset.batch(batch_size)
    return dataset

# create classifier

# feature columns

job_column = tf.feature_column.categorical_column_with_vocabulary_list("job", ["blue-collar", "management", "technician","admin.","services","retired", "self-employed", \
                                                                          "entrepreneur","unemployed", "housemaid", "student", "unknown"])
marital_column = tf.feature_column.categorical_column_with_vocabulary_list("marital", ["married", "single", "divorced"])
edu_column = tf.feature_column.categorical_column_with_vocabulary_list("education", ["secondary", "primary", "tertiary", "unknown"])
contact_column = tf.feature_column.categorical_column_with_vocabulary_list(key="contact", vocabulary_list=["cellular", "telephone", "unknown"], default_value=0)
day_column = tf.feature_column.crossed_column(["month", 'dayofmonth'], 12*31)
out_column = tf.feature_column.categorical_column_with_vocabulary_list("poutcome", ["failure", "success", "other", "unknown"])

feature_columns = [
    tf.feature_column.indicator_column(job_column),
    tf.feature_column.indicator_column(marital_column),
    tf.feature_column.indicator_column(edu_column),
    tf.feature_column.numeric_column("default"),
    tf.feature_column.numeric_column("housing"),
    tf.feature_column.numeric_column("loan"),
    tf.feature_column.indicator_column(contact_column),
    tf.feature_column.indicator_column(day_column),
    tf.feature_column.numeric_column("duration"),
    tf.feature_column.numeric_column("campaign"),
    tf.feature_column.numeric_column("pdays"),
    tf.feature_column.numeric_column("previous"),
    tf.feature_column.indicator_column(out_column)
]

classifier = tf.estimator.DNNClassifier(
    feature_columns=feature_columns,
    hidden_units=[20, 10],
    n_classes=2)

classifier.train(input_fn=lambda: train_input_fn(train_x, train_y, batch_size=BATCH_SIZE), steps=STEPS)

# evaluate
eval_result = classifier.evaluate(
    input_fn=lambda: eval_input_fn(test_x, test_y, batch_size=BATCH_SIZE))

print('Test set accuracy: {accuracy:0.3f}'.format(**eval_result))

# predict
expected = [0,1]
predict_x = {
        "job": ["entrepreneur"], 
        "marital": ["married"], 
        "education": ["secondary"], 
        "default": [0], 
        "housing": [1], 
        "loan": [1], 
        "contact": ["unknown"], 
        "dayofmonth": ["5"], 
        "month": ["may"], 
        "duration": [127], 
        "campaign": [1], 
        "pdays": [-1], 
        "previous": [0], 
        "poutcome": ["unknown"]
}

predictions = classifier.predict(
    input_fn=lambda: eval_input_fn(predict_x, labels=None, batch_size=BATCH_SIZE))

for prediction, expect in zip(predictions, expected):
    class_id = prediction['class_ids'][0]
    probability = prediction['probabilities'][class_id]
    print('Prediction is "{}" ({:.1f}%), expected "{}"'.format(
        class_id, 100 * probability, expect))

# export model

# feature specification

feature_spec = tf.feature_column.make_parse_example_spec(feature_columns)
serving_input_receiver_fn = tf.estimator.export.build_parsing_serving_input_receiver_fn(feature_spec)
export_dir = classifier.export_savedmodel('export2', serving_input_receiver_fn)
print('Exported to {}'.format(export_dir))

Similarly, there is no error when running the code above. This is the signature:

The given SavedModel SignatureDef contains the following input(s):
  inputs['inputs'] tensor_info:
      dtype: DT_STRING
      shape: (-1)
      name: input_example_tensor:0
The given SavedModel SignatureDef contains the following output(s):
  outputs['classes'] tensor_info:
      dtype: DT_STRING
      shape: (-1, 2)
      name: dnn/head/Tile:0
  outputs['scores'] tensor_info:
      dtype: DT_FLOAT
      shape: (-1, 2)
      name: dnn/head/predictions/probabilities:0
Method name is: tensorflow/serving/classify

Identical errors appear when I try to make a prediction.

Traceback (most recent call last):
  File "/usr/local/bin/saved_model_cli", line 8, in <module>
    sys.exit(main())
  File "/usr/local/lib/python3.5/dist-packages/tensorflow_core/python/tools/saved_model_cli.py", line 990, in main
    args.func(args)
  File "/usr/local/lib/python3.5/dist-packages/tensorflow_core/python/tools/saved_model_cli.py", line 720, in run
    args.inputs, args.input_exprs, args.input_examples)
  File "/usr/local/lib/python3.5/dist-packages/tensorflow_core/python/tools/saved_model_cli.py", line 632, in load_inputs_from_input_arg_string
    input_examples = preprocess_input_examples_arg_string(input_examples_str)
  File "/usr/local/lib/python3.5/dist-packages/tensorflow_core/python/tools/saved_model_cli.py", line 552, in preprocess_input_examples_arg_string
    _create_example_string(example) for example in example_list
  File "/usr/local/lib/python3.5/dist-packages/tensorflow_core/python/tools/saved_model_cli.py", line 552, in <listcomp>
    _create_example_string(example) for example in example_list
  File "/usr/local/lib/python3.5/dist-packages/tensorflow_core/python/tools/saved_model_cli.py", line 569, in _create_example_string
    feature_list)
TypeError: 'entrepreneur' has type str, but expected one of: bytes
Traceback (most recent call last):
  File "/usr/local/bin/saved_model_cli", line 8, in <module>
    sys.exit(main())
  File "/usr/local/lib/python3.5/dist-packages/tensorflow_core/python/tools/saved_model_cli.py", line 990, in main
    args.func(args)
  File "/usr/local/lib/python3.5/dist-packages/tensorflow_core/python/tools/saved_model_cli.py", line 720, in run
    args.inputs, args.input_exprs, args.input_examples)
  File "/usr/local/lib/python3.5/dist-packages/tensorflow_core/python/tools/saved_model_cli.py", line 632, in load_inputs_from_input_arg_string
    input_examples = preprocess_input_examples_arg_string(input_examples_str)
  File "/usr/local/lib/python3.5/dist-packages/tensorflow_core/python/tools/saved_model_cli.py", line 552, in preprocess_input_examples_arg_string
    _create_example_string(example) for example in example_list
  File "/usr/local/lib/python3.5/dist-packages/tensorflow_core/python/tools/saved_model_cli.py", line 552, in <listcomp>
    _create_example_string(example) for example in example_list
  File "/usr/local/lib/python3.5/dist-packages/tensorflow_core/python/tools/saved_model_cli.py", line 576, in _create_example_string
    (type(feature_list[0]), feature_list[0]))
ValueError: Type <class 'bytes'> for value b'5' is not supported for tf.train.Feature.

Do you have any idea where the problem is?

Is the format of my input to the prediction correct?

Thank you!

gowthamkpr commented 4 years ago

Have you figured out the error @psyyip ?

psyyip commented 4 years ago

No. I tried all the solutions I found and those proposed by others. Each solution solves one problem but then creates another, taking me back to one of the scenarios I described above. I have no further ideas right now. I have to move on to learn and practice, so I have set this issue aside for now.

Do you have any solution for me?

gowthamkpr commented 4 years ago

What other solutions did you try and what issues are you facing @psyyip ?

psyyip commented 4 years ago

My code is based on a Coursera course. What I did was open a Jupyter notebook I downloaded from the course and run it with my own GCP project and bucket, with everything else unchanged. The first error is the same as mine. Therefore, I think the error is not due to the code itself but to something else, such as a dependency. This is as far as I can go for now.

This is the notebook I mentioned: https://github.com/psyyip/taxifare/blob/master/feateng.ipynb

ChildishChange commented 2 years ago

maybe you can try build_raw_serving_input_receiver_fn @psyyip https://stackoverflow.com/questions/55422537/testing-tf-serving-model-fails-with-bytes-as-strings-and-strings-as-bytes-confus
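A rough sketch of that helper, assuming the TF 1.x estimator export API and the raw placeholders from the first post (note that such an export only gets predict-style signatures, because classify/regress still require a single serialized tf.Example string):

import tensorflow as tf

def make_raw_serving_input_receiver_fn():
    # Same raw feature tensors as in the original serving_input_receiver_fn;
    # only their dtypes/shapes matter here. Two columns shown for brevity.
    features = {
        'job': tf.compat.v1.placeholder(tf.string, [None]),
        'duration': tf.compat.v1.placeholder(tf.int64, [None]),
        # remaining columns as in the first post
    }
    return tf.estimator.export.build_raw_serving_input_receiver_fn(features)

# Illustrative exporter name.
exporter = tf.estimator.FinalExporter('exporter_raw',
                                      make_raw_serving_input_receiver_fn())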

ChildishChange commented 2 years ago

saved_model_cli may have been written for py2.

The new bytes type is 3.x only. The 2.x bytes built-in is just an alias to the str type. https://stackoverflow.com/questions/5901706/the-bytes-type-in-python-2-7-and-pep-358

I edited some code in _create_example_string to:

...
elif isinstance(feature_list[0], str):
    example.features.feature[feature_name].bytes_list.value.extend(
        [f.encode("utf-8") for f in feature_list])
...

and it worked.

No need to use build_raw_serving_input_receiver_fn.


singhniraj08 commented 1 year ago

@psyyip,

tf.estimator.export.TensorServingInputReceiver is deprecated. Also, TF 1.x is no longer actively supported; please upgrade to the latest TF version and refer to the migration doc to know more about this.

You can follow tutorials [1] [2] to serve TensorFlow models with TensorFlow Serving, and follow the TFRecord and tf.train.Example guide for the supported data types in tf.train.Example.
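For completeness, a minimal sketch of querying such a classify export once it is running behind TensorFlow Serving's REST API; the model name, port, and feature values are illustrative assumptions, and numeric fields are sent as floats to match the numeric_column (float32) parse spec:

import json
import requests

# Assumes a tensorflow/serving instance hosting the export as model "bank_model"
# with the REST endpoint on the default port 8501.
url = 'http://localhost:8501/v1/models/bank_model:classify'
payload = {
    'examples': [{
        'job': 'entrepreneur', 'marital': 'married', 'education': 'secondary',
        'default': 1.0, 'housing': 1.0, 'loan': 1.0, 'contact': 'unknown',
        'dayofmonth': '5', 'month': 'may', 'duration': 127.0, 'campaign': 1.0,
        'pdays': -1.0, 'previous': 0.0, 'poutcome': 'unknown',
    }]
}
response = requests.post(url, data=json.dumps(payload))
print(response.json())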

Thank you!

github-actions[bot] commented 1 year ago

This issue has been marked stale because it has no recent activity since 7 days. It will be closed if no further activity occurs. Thank you.

github-actions[bot] commented 1 year ago

This issue was closed due to lack of activity after being marked stale for past 7 days.

google-ml-butler[bot] commented 1 year ago

Are you satisfied with the resolution of your issue?