I have trained a model using Python 3.7 and TF 2.7 and saved it in the SavedModel format. `saved_model_cli` shows this signature:
MetaGraphDef with tag-set: 'serve' contains the following SignatureDefs:

signature_def['serving_default']:
  The given SavedModel SignatureDef contains the following input(s):
    inputs['examples'] tensor_info:
        dtype: DT_STRING
        shape: (-1)
        name: input_example_tensor:0
  The given SavedModel SignatureDef contains the following output(s):
    outputs['cf_1'] tensor_info:
        dtype: DT_INT64
        shape: (-1, 1)
        name: ParseExample/ParseExampleV2:0
    outputs['cf_2'] tensor_info:
        dtype: DT_INT64
        shape: (-1, 1)
        name: ParseExample/ParseExampleV2:1
    outputs['cf_label'] tensor_info:
        dtype: DT_INT64
        shape: (-1, 1)
        name: ParseExample/ParseExampleV2:2
    outputs['cf_id'] tensor_info:
        dtype: DT_INT64
        shape: (-1, 1)
        name: ParseExample/ParseExampleV2:3
    outputs['score'] tensor_info:
        dtype: DT_FLOAT
        shape: (-1, 1)
        name: score:0
  Method name is: tensorflow/serving/predict
I want to load it and run predictions from Spark, but I am confused by the model input: a serialized Example object. I can load the model and predict in Python, like this:

def model_predict(example_proto):
    exam_input = tf.constant([example_proto.SerializeToString()])
    return model.signatures['serving_default'](exam_input)
Here, example_proto is my Example object, serialized with SerializeToString, and this works. But when I do the same thing from Spark, it always throws an error. My Spark code:
val result = sparkEnv.spark.read.parquet(inputPath).map(item => {
  val example = convert2Example(schemaInfo, item)
  val map = new java.util.HashMap[String, Tensor]()
  val tensor = TString.vectorOf(new String(example.toByteArray, Charset.forName("UTF-8")))
  map.put("examples", tensor)
  val score = model.value.call(map).get("score")
  score.toString
}).rdd
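One thing I suspect: decoding the serialized proto with `new String(..., "UTF-8")` may corrupt the bytes before they ever reach `TString.vectorOf`, since a serialized protobuf is arbitrary binary, not valid UTF-8. A minimal plain-Java check of that round trip (no TensorFlow dependency):

```java
import java.nio.charset.StandardCharsets;
import java.util.Arrays;

public class Utf8RoundTrip {
    public static void main(String[] args) {
        // Arbitrary binary data, like a serialized protobuf would contain;
        // 0x89 and 0xFE are not valid standalone UTF-8 sequences.
        byte[] original = new byte[] { 0x0A, (byte) 0x89, 0x03, (byte) 0xFE, 0x7F };

        // What the Spark code effectively does: bytes -> String -> bytes.
        byte[] roundTripped = new String(original, StandardCharsets.UTF_8)
                .getBytes(StandardCharsets.UTF_8);

        // Invalid sequences get replaced with U+FFFD, so the bytes change
        // and TensorFlow can no longer parse the Example proto.
        System.out.println("lossless = " + Arrays.equals(original, roundTripped));
        // prints: lossless = false
    }
}
```

If that is the cause, building the tensor from the raw bytes instead of a String might help, e.g. `TString.tensorOfBytes(NdArrays.vectorOfObjects(example.toByteArray))` in tensorflow-java, but I am not certain that is the right API to use here.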
Is there any way to deploy an Estimator model whose input is a serialized Example object, using Java?