intel-analytics / analytics-zoo

Distributed Tensorflow, Keras and PyTorch on Apache Spark/Flink & Ray
https://analytics-zoo.readthedocs.io/
Apache License 2.0

Using InferenceModel to load a TensorFlow pretrained model fails with "Method inferenceModelTensorFlowLoadTF doesn't exist" #999

Closed linjiaqin closed 4 years ago

linjiaqin commented 5 years ago

code:

from zoo.pipeline.inference import InferenceModel

model = InferenceModel()
model_path = "/software/hadoop3/Flink_export_20190812"
model.load_tf(model_path)

error:

py4j.protocol.Py4JError: An error occurred while calling o41.inferenceModelTensorFlowLoadTF. Trace:
py4j.Py4JException: Method inferenceModelTensorFlowLoadTF([class com.intel.analytics.zoo.pipeline.inference.InferenceModel, class java.lang.String, class java.lang.Integer, class java.lang.Integer, class java.lang.Boolean]) does not exist
        at py4j.reflection.ReflectionEngine.getMethod(ReflectionEngine.java:318)
        at py4j.reflection.ReflectionEngine.getMethod(ReflectionEngine.java:326)
        at py4j.Gateway.invoke(Gateway.java:274)
        at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
        at py4j.commands.CallCommand.execute(CallCommand.java:79)
        at py4j.GatewayConnection.run(GatewayConnection.java:238)
        at java.lang.Thread.run(Thread.java:745)

glorysdj commented 5 years ago

Not enough parameters; please check the load_tf API:

def load_tf(self, model_path, backend="tensorflow",
            intra_op_parallelism_threads=1,
            inter_op_parallelism_threads=1,
            use_per_session_threads=True,
            model_type=None,
            ov_pipeline_config_path=None,
            ov_extensions_config_path=None):
    """
    Load a TensorFlow model using the tensorflow or openvino backend.

    :param model_path: String. The file path to the TensorFlow model.
    :param backend: String. The backend to use for inference. Either 'tensorflow' or
           'openvino'. For the 'tensorflow' backend, only specify
           intra_op_parallelism_threads, inter_op_parallelism_threads and
           use_per_session_threads. For the 'openvino' backend, only specify either
           model_type or ov_pipeline_config_path together with
           ov_extensions_config_path. Default is 'tensorflow'.
    :param intra_op_parallelism_threads: For 'tensorflow' backend only. Int. The number
           of intraOpParallelismThreads. Default is 1.
    :param inter_op_parallelism_threads: For 'tensorflow' backend only. Int. The number
           of interOpParallelismThreads. Default is 1.
    :param use_per_session_threads: For 'tensorflow' backend only. Boolean. Whether to
           use perSessionThreads. Default is True.
    :param model_type: For 'openvino' backend only. The type of the TensorFlow model,
           e.g. faster_rcnn_resnet101_coco, ssd_inception_v2_coco, etc.
    :param ov_pipeline_config_path: For 'openvino' backend only. String. The file path
           to the pipeline configure file.
    :param ov_extensions_config_path: For 'openvino' backend only. String. The file
           path to the extensions configure file. Both ov_pipeline_config_path and
           ov_extensions_config_path are needed for the 'openvino' backend if
           model_type is not specified.
    """
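Per the signature above, only model_path is required; every other parameter has a default. The following stand-in class (hypothetical, not part of Analytics Zoo) mirrors that signature to show how the defaults fill in when only the path is passed:

```python
# Hypothetical stub mirroring the load_tf signature quoted above; it only
# echoes the 'tensorflow'-backend arguments to show which defaults apply.
class InferenceModelStub:
    def load_tf(self, model_path, backend="tensorflow",
                intra_op_parallelism_threads=1,
                inter_op_parallelism_threads=1,
                use_per_session_threads=True,
                model_type=None,
                ov_pipeline_config_path=None,
                ov_extensions_config_path=None):
        # The real method would hand these values to the JVM side.
        return (model_path, backend, intra_op_parallelism_threads,
                inter_op_parallelism_threads, use_per_session_threads)

m = InferenceModelStub()
print(m.load_tf("/software/hadoop3/Flink_export_20190812"))
```

So a one-argument call is valid at the Python level, which suggests the failure happens on the JVM side rather than in the Python wrapper's argument handling.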

gjy1992 commented 4 years ago

It seems only one parameter is needed, and the other parameters all have default values. I tried model.load_tf("./SavedModel", "tensorflow") and got the same error as linjiaqin.

I use Python 2.7 and analytics-zoo 0.6.0 on Ubuntu 16; the code is the same:

model_zoo=InferenceModel()
model_zoo.load_tf("./SavedModel", "tensorflow")

and the error message:
py4j.protocol.Py4JError: An error occurred while calling o38.inferenceModelTensorFlowLoadTF. Trace:
py4j.Py4JException: Method inferenceModelTensorFlowLoadTF([class com.intel.analytics.zoo.pipeline.inference.InferenceModel, class java.lang.String, class java.lang.Integer, class java.lang.Integer, class java.lang.Boolean]) does not exist
        at py4j.reflection.ReflectionEngine.getMethod(ReflectionEngine.java:318)
        at py4j.reflection.ReflectionEngine.getMethod(ReflectionEngine.java:326)
        at py4j.Gateway.invoke(Gateway.java:274)
        at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
        at py4j.commands.CallCommand.execute(CallCommand.java:79)
        at py4j.GatewayConnection.run(GatewayConnection.java:238)
        at java.lang.Thread.run(Thread.java:748)
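For context on what this trace means: Py4J resolves the JVM method by its name plus the argument-type list the Python wrapper sends, and raises "Method ... does not exist" when no matching overload is found, which typically points to a mismatch between the Python package and the Analytics Zoo jar on the classpath. The sketch below is a much-simplified illustration of that failure mode (not real Py4J internals; the dictionary and arity-based matching are invented for the example):

```python
# Simplified model of Py4J method resolution: the "jar" here only exposes a
# six-argument overload, so a five-argument call from an out-of-sync Python
# wrapper fails with "Method ... does not exist" instead of a clearer
# version-mismatch message.
JVM_METHODS = {
    ("inferenceModelTensorFlowLoadTF", 6): "loaded",
}

def invoke(name, args):
    key = (name, len(args))
    if key not in JVM_METHODS:
        # Mirrors py4j.Py4JException's wording for a missing overload.
        raise RuntimeError(f"Method {name}({len(args)} args) does not exist")
    return JVM_METHODS[key]
```

Under this reading, checking that the pip-installed analytics-zoo version matches the jar version would be a reasonable first debugging step.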
qiyuangong commented 4 years ago

@gjy1992

Python 2.7 will reach end of life in the near future, and we are in the process of removing support for it. Can you try Python 3.x with analytics-zoo 0.6.0?

gjy1992 commented 4 years ago

@qiyuangong I switched to Python 3 and the problem is still there. Also, when I try to load a ResNet SavedModel, it reports "Openvino optimize tf image classification model error". The full code is below:

import tensorflow as tf
from zoo.pipeline.inference import InferenceModel
from zoo.common.nncontext import *

sc = init_nncontext()
base_model = tf.keras.applications.ResNet50(include_top=False, weights="imagenet",
                                            input_tensor=None, input_shape=(224, 224, 3),
                                            pooling='avg')
x = base_model.output
predictions = tf.keras.layers.Dense(100, activation='softmax')(x)
model_tf = tf.keras.Model(inputs=base_model.input, outputs=predictions)
model_tf.compile(optimizer='Adam', loss=tf.losses.softmax_cross_entropy, metrics=['accuracy'])
tf.keras.experimental.export_saved_model(model_tf, "./SavedModel")
model_zoo = InferenceModel()
model_zoo.load_tf_image_classification_as_openvino("./SavedModel", "", "", (224, 224, 3),
                                                   True, (123.68, 116.78, 103.94), 1.0)

And the error is:
py4j.protocol.Py4JJavaError: An error occurred while calling o46.inferenceModelOpenVINOLoadTF.
: com.intel.analytics.zoo.pipeline.inference.InferenceRuntimeException: Openvino optimize tf image classification model error: 1,

The version of TensorFlow is 1.14, and the version of analytics-zoo is 0.6.0.

qiyuangong commented 4 years ago

It seems that you are using the wrong API. load_tf_image_classification_as_openvino is designed for *.pb and *.ckpt files; see the load_tf_image_classification_as_openvino example and the load_tf example.

Actually, load_tf for SavedModel is not implemented in the Python API. I will check the details ASAP.
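Since the two formats need different load APIs, a quick check of which format a path actually contains can save a confusing JVM-side error. The helper below is hypothetical (not part of Analytics Zoo); it relies only on the convention that a TF SavedModel is a directory containing a saved_model.pb file, while a frozen graph is a single *.pb file:

```python
import os

def detect_tf_format(path):
    """Guess whether 'path' is a TF SavedModel directory, a frozen *.pb
    graph, or something else. Hypothetical helper for illustration only."""
    # A SavedModel export is a directory holding a saved_model.pb manifest
    # (plus a variables/ subdirectory when the model has weights).
    if os.path.isdir(path) and os.path.isfile(os.path.join(path, "saved_model.pb")):
        return "saved_model"
    # A frozen graph is a single protobuf file.
    if os.path.isfile(path) and path.endswith(".pb"):
        return "frozen_pb"
    return "unknown"
```

With a check like this, "./SavedModel" would be flagged as a SavedModel directory, i.e. not a valid input for the *.pb/*.ckpt-oriented OpenVINO loader.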

qiyuangong commented 4 years ago

Hi @gjy1992

We are adding a SavedModel load API for Python: https://github.com/intel-analytics/analytics-zoo/pull/1723

qiyuangong commented 4 years ago

The API has been changed. Closing this issue.