**nonew** opened this issue 1 year ago (status: Open)
I hope this still works; it has been a while, so I've forgotten whether this step is still required.
Maybe you should fork bojone/bert4keras and modify its snippets.py: delete one line and add three new lines.
```diff
- prediction = predict(self, inputs, output_ids, states)
+ prediction = predict(self, inputs, output_ids, states)[0]
+ if prediction.shape[0] > 1:
+     prediction = np.expand_dims(prediction[-1], 0)
```
Refer to: jackie930/t5-pegasus-textsummary
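If it helps, here is a minimal NumPy sketch of what the added lines do; the `(3, 5)` array is a made-up stand-in for the model's output, not the actual bert4keras tensor:

```python
import numpy as np

# Toy stand-in for the decoder output: a batch of 3 score vectors of length 5.
prediction = np.random.rand(3, 5)

# The patched lines keep only the last row of scores,
# re-adding a leading batch axis of size 1.
if prediction.shape[0] > 1:
    prediction = np.expand_dims(prediction[-1], 0)

print(prediction.shape)  # (1, 5)
```

So after the patch, downstream code always sees a single prediction with shape `(1, vocab_size)` regardless of how many rows the model returned.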
Hi Zhaoqi,

I installed all the dependencies but ran into an error. Can you help me out here?
```
(tf15) D:\title-generation-v4-master>pip install -r requirements.txt
Successfully installed Keras-2.3.1 bert4keras-0.11.4 coloredlogs-15.0.1 h5py-2.10.0
humanfriendly-10.0 jieba-0.42.1 onnx-1.11.0 onnxruntime-1.5.1 onnxruntime-tools-1.7.0
packaging-21.3 psutil-5.9.4 py-cpuinfo-9.0.0 py3nvml-0.2.7 pyparsing-3.0.9 pyreadline-2.1
pyyaml-6.0 tensorboard-1.15.0 tensorflow-estimator-1.15.1 xmltodict-0.13.0
```
```
(tf15) D:\title-generation-v4-master>python main.py
Using TensorFlow backend.
Building prefix dict from the default dictionary ...
Loading model from cache C:\Users\nonew\AppData\Local\Temp\jieba.cache
Loading model cost 0.483 seconds.
Prefix dict has been built successfully.
Traceback (most recent call last):
  File "main.py", line 22, in <module>
    encoder_session = onnxruntime.InferenceSession(encoder_onnx, providers=['CPUExecutionProvider'])
  File "C:\Users\nonew\.conda\envs\tf15\lib\site-packages\onnxruntime\capi\session.py", line 195, in __init__
    self._create_inference_session(providers, provider_options)
  File "C:\Users\nonew\.conda\envs\tf15\lib\site-packages\onnxruntime\capi\session.py", line 200, in _create_inference_session
    sess = C.InferenceSession(session_options, self._model_path, True, self._read_config_from_model)
onnxruntime.capi.onnxruntime_pybind11_state.InvalidProtobuf: [ONNXRuntimeError] : 7 : INVALID_PROTOBUF : Load model from ./onnx/encoder.onnx failed:Protobuf parsing failed.
```
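For what it's worth, `INVALID_PROTOBUF` usually means `./onnx/encoder.onnx` is not a real ONNX model on disk, e.g. it was cloned as a Git LFS pointer text file or the download was truncated. A rough sanity check you could run (the path is just the one from the traceback; the helper name is my own):

```python
import os

def diagnose_onnx(path):
    """Rough check that a .onnx file is real binary data, not an LFS pointer."""
    size = os.path.getsize(path)
    with open(path, "rb") as f:
        head = f.read(120)
    # Git LFS pointer files are small text files starting with this header.
    if head.startswith(b"version https://git-lfs.github.com"):
        return "git-lfs pointer: run `git lfs pull` to fetch the real file"
    if size < 1024:
        return f"only {size} bytes: the download looks truncated"
    return "looks like binary model data; try re-exporting the model"

if os.path.exists("./onnx/encoder.onnx"):
    print(diagnose_onnx("./onnx/encoder.onnx"))
```

If the file turns out to be an LFS pointer or truncated, re-downloading it (or re-running the ONNX export step) should clear the protobuf parsing error.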