Some of this information can be collected via this script.
OS Platform and Distribution (e.g., Linux Ubuntu 16.04):
TensorFlow installed from (source or binary):
TensorFlow version: 2.0
Python version: 3.6
bert-as-service version: v1.10.0
GPU model and memory:
CPU model and memory:
Description
Hi, I'm wondering: is there a way to load a .pb-format (SavedModel) model for bert-as-service instead of a .ckpt checkpoint?
I would like to switch to another pre-trained BERT model released here: https://tfhub.dev/tensorflow/bert_zh_L-12_H-768_A-12/1
which appears to be a more recent Chinese BERT model, but it is distributed in .pb format.
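For context, here is the layout I believe bert-as-service's -model_dir option expects (this matches the original Google BERT checkpoint releases; the TF Hub module above ships a saved_model.pb instead, so its weights would presumably need converting into this layout first — directory and file names below are illustrative):

```
chinese_bert/                          # passed as: bert-serving-start -model_dir chinese_bert/
├── bert_config.json                   # model hyperparameters
├── vocab.txt                          # WordPiece vocabulary
├── bert_model.ckpt.index              # TF1 checkpoint files (what ".ckpt" refers to)
├── bert_model.ckpt.meta
└── bert_model.ckpt.data-00000-of-00001
```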
Thank you very much.