JulianGerhard21 opened this issue 5 years ago
@JulianGerhard21 Hello! Have you fixed this bug? I have the same problem.
@qiunian711
First, rename
bert-base-german.data-00000-of-00001
to
bert-base-german-cased.data-00000-of-00001
so that the index, meta, and data files all share the same name. The content of the folder should look like this:
bert-base-german-cased.index
bert-base-german-cased.meta
bert-base-german-cased.data-00000-of-00001
bert_config.json
vocab.txt
Now, you need to pass an additional parameter to bert-serving-start:
bert-serving-start -model_dir <folder-name> -ckpt_name bert-base-german-cased
This is because the server expects ckpt_name = bert_model.ckpt by default, as shown in the ARG VALUE table in the startup log.
Another option would be to rename the index, meta, and data files to bert_model.ckpt.index, etc.
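If you go with the second option, a minimal Python sketch of the rename step could look like this; it assumes the files already carry the bert-base-german-cased prefix from the step above, and MODEL_DIR is a placeholder you would have to adjust:

import os

# Placeholder: folder that contains the German BERT checkpoint files.
MODEL_DIR = "/path/to/bert-base-german-cased"

# Map the existing file names to the default names the server looks for.
renames = {
    "bert-base-german-cased.index": "bert_model.ckpt.index",
    "bert-base-german-cased.meta": "bert_model.ckpt.meta",
    "bert-base-german-cased.data-00000-of-00001": "bert_model.ckpt.data-00000-of-00001",
}

for old_name, new_name in renames.items():
    old_path = os.path.join(MODEL_DIR, old_name)
    if os.path.exists(old_path):
        os.rename(old_path, os.path.join(MODEL_DIR, new_name))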
It worked, but in my case you have to rename the three files (index, meta, and data) to the default bert_model.ckpt prefix. If you add the additional parameter -ckpt_name bert-base-german-cased, the server assumes there is a subfolder named bert-base-german-cased inside the pretrained model directory.
Thanks anyway
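Before restarting the server, one way to sanity-check the renamed checkpoint is to list its variables with TensorFlow's checkpoint utilities; this is just a sketch, and the prefix path below is a placeholder. If the index/meta/data files are missing or inconsistently named, this fails immediately:

import tensorflow as tf

# Placeholder: checkpoint prefix inside your model folder (no file extension).
ckpt_prefix = "/path/to/bert-base-german-cased/bert_model.ckpt"

# Print every variable name and shape stored in the checkpoint.
for name, shape in tf.train.list_variables(ckpt_prefix):
    print(name, shape)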
Hello! Have you fixed this bug? I have the same problem.
@meilanglang I made it work for me:
1) tensorflow==1.15.0 (tensorflow-gpu==1.15.0); make sure to have CUDA 10.0 installed
2) bert-serving-server==1.9.9
3) rename the checkpoint files to:
Then, in \venv\Lib\site-packages\bert_serving\server\graph.py, replace lines 61-63 with:
# placeholders whose batch and sequence-length dimensions are fixed to the server arguments
input_ids = tf.placeholder(tf.int32, (args.max_batch_size, args.max_seq_len), 'input_ids')
input_mask = tf.placeholder(tf.int32, (args.max_batch_size, args.max_seq_len), 'input_mask')
input_type_ids = tf.placeholder(tf.int32, (args.max_batch_size, args.max_seq_len), 'input_type_ids')
For debugging purposes, I used the following script from https://www.programmersought.com/article/20774372335/:
Server:
from bert_serving.server import BertServer
from bert_serving.server.helper import get_args_parser

def main():
    args = get_args_parser().parse_args([
        '-model_dir', r'C:\Users\annaz\PycharmProjects\bert-server\bert-base-german-cased',
        '-port', '86500',
        '-port_out', '86501',
        '-max_seq_len', '512',
        '-num_worker', '1',
        '-mask_cls_sep',
        # '-cpu',
        '-max_batch_size', '1'
    ])
    bs = BertServer(args)
    print("bert server start....")
    bs.start()

if __name__ == "__main__":
    main()
Client:
from bert_serving.client import BertClient
bc = BertClient(port=86500, port_out=86501, show_server_config=True, timeout=100000)
vec = bc.encode(['test', 'test 2'])
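As a quick check that is not part of the original snippet: vec is a plain NumPy array, and assuming the default pooling strategy and a BERT-base model (hidden size 768), encoding the two test sentences should yield a (2, 768) array:

print(vec.shape)  # expected (2, 768) under the assumptions above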
Running the command from the terminal still doesn't work for me, but this client-server code in .py files does its job.
System information
bert-as-service version: newest
CPU model and memory: Intel(R) Core(TM) i7-7700 CPU @ 3.60GHz
Description
I'm using this command to start the server:
Then this issue shows up:
The content of the model dir:
bert-base-german-cased.index
bert-base-german-cased.meta
bert-base-german.data-00000-of-00001
bert_config.json
vocab.txt
I followed the Chinese Law Tutorial on the same machine and it works perfectly. Since I want to evaluate document classification, especially for German, I want to serve the features of this pretrained BERT (TensorFlow version).
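To illustrate the intended downstream use (not part of the original report), a rough sketch of German document classification on top of the served features might look like this; the documents, labels, and ports are placeholders, and scikit-learn is just one possible classifier:

from bert_serving.client import BertClient
from sklearn.linear_model import LogisticRegression

# Placeholder German documents and labels, purely for illustration.
docs = ['Das Gericht wies die Klage ab.', 'Der Vertrag wurde fristgerecht gekuendigt.']
labels = [0, 1]

# Connect to the running bert-serving-server (ports are placeholders;
# use whatever your server was started with).
bc = BertClient(port=5555, port_out=5556)

# Each document becomes one fixed-size feature vector (768 dims for BERT-base).
features = bc.encode(docs)

# Any off-the-shelf classifier can be trained on the pooled BERT features.
clf = LogisticRegression(max_iter=1000)
clf.fit(features, labels)
print(clf.predict(features))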
...