jina-ai / clip-as-service

🏄 Scalable embedding, reasoning, ranking for images and sentences with CLIP
https://clip-as-service.jina.ai

Failed to start bert-serving-start #403

Open mahyoub opened 5 years ago

mahyoub commented 5 years ago

ISSUE: I got to the stage where I run `bert-serving-start -model_dir=/uncased_L-12_H-768_A-12/ -num_worker=4` to start the BERT server, but for some reason it fails. I searched online and there are very few pages addressing this issue, and none of their suggestions worked for me; they point to the Python and TensorFlow versions, but I am using the latest versions of both.

Here is the error output:

    c:\users\USERNAME\anaconda3\lib\site-packages\bert_serving\server\helper.py:170: UserWarning: Tensorflow 2.0.0-alpha0 is not tested! It may or may not work. Feel free to submit an issue at https://github.com/hanxiao/bert-as-service/issues/
      'Feel free to submit an issue at https://github.com/hanxiao/bert-as-service/issues/' % tf.__version__)
    usage: C:\Users\USERNAME\Anaconda3\Scripts\bert-serving-start -model_dir=/uncased_L-12_H-768_A-12/ -num_worker=4
                      ARG   VALUE
                ckpt_name = bert_model.ckpt
              config_name = bert_config.json
                     cors = *
                      cpu = False
               device_map = []
            do_lower_case = True
       fixed_embed_length = False
                     fp16 = False
      gpu_memory_fraction = 0.5
            graph_tmp_dir = None
         http_max_connect = 10
                http_port = None
             mask_cls_sep = False
           max_batch_size = 256
              max_seq_len = 25
                model_dir = /uncased_L-12_H-768_A-12/
               num_worker = 4
            pooling_layer = [-2]
         pooling_strategy = REDUCE_MEAN
                     port = 5555
                 port_out = 5556
            prefetch_size = 10
      priority_batch_size = 16
    show_tokens_to_client = False
          tuned_model_dir = None
                  verbose = False
                      xla = False

    I:VENTILATOR:freeze, optimize and export graph, could take a while...
    c:\users\USERNAME\anaconda3\lib\site-packages\bert_serving\server\helper.py:170: UserWarning: Tensorflow 2.0.0-alpha0 is not tested! It may or may not work. Feel free to submit an issue at https://github.com/hanxiao/bert-as-service/issues/
      'Feel free to submit an issue at https://github.com/hanxiao/bert-as-service/issues/' % tf.__version__)
    E:GRAPHOPT:fail to optimize the graph!
    Traceback (most recent call last):
      File "c:\users\USERNAME\anaconda3\lib\runpy.py", line 193, in _run_module_as_main
        "__main__", mod_spec)
      File "c:\users\USERNAME\anaconda3\lib\runpy.py", line 85, in _run_code
        exec(code, run_globals)
      File "C:\Users\USERNAME\Anaconda3\Scripts\bert-serving-start.exe\__main__.py", line 9, in <module>
      File "c:\users\USERNAME\anaconda3\lib\site-packages\bert_serving\server\cli\__init__.py", line 4, in main
        with BertServer(get_run_args()) as server:
      File "c:\users\USERNAME\anaconda3\lib\site-packages\bert_serving\server\__init__.py", line 71, in __init__
        self.graph_path, self.bert_config = pool.apply(optimize_graph, (self.args,))
    TypeError: cannot unpack non-iterable NoneType object

bipedalBit commented 5 years ago

You should notice this: "UserWarning: Tensorflow 2.0.0-alpha0 is not tested!"
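bert-as-service is built against TensorFlow 1.x (as far as I know its README asks for TensorFlow >= 1.10, not 2.x), so the 2.0.0-alpha build flagged in that warning is the likely reason graph optimization fails here. A minimal sketch of a workaround from an Anaconda Prompt, assuming a TF 1.x build is acceptable for your setup; the environment name and pinned versions below are only examples:

```
REM create a fresh environment with a TensorFlow 1.x build
conda create -n bert-serving python=3.6 -y
conda activate bert-serving

REM any 1.10-1.15 build should do; use tensorflow-gpu instead on a GPU machine
pip install tensorflow==1.15.0
pip install -U bert-serving-server bert-serving-client

REM point -model_dir at the folder where the BERT checkpoint was unzipped
bert-serving-start -model_dir /uncased_L-12_H-768_A-12/ -num_worker=4
```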

avinashok commented 4 years ago

I have the same error too. Is there any way to fix this? Is it because I'm using the latest version of TensorFlow?

    I:VENTILATOR:freeze, optimize and export graph, could take a while...
    c:\users\avok\appdata\local\continuum\anaconda3\lib\site-packages\bert_serving\server\helper.py:174: UserWarning: Tensorflow 2.0.0 is not tested! It may or may not work. Feel free to submit an issue at https://github.com/hanxiao/bert-as-service/issues/
      'Feel free to submit an issue at https://github.com/hanxiao/bert-as-service/issues/' % tf.__version__)
    E:GRAPHOPT:fail to optimize the graph!
    Traceback (most recent call last):
      File "c:\users\avok\appdata\local\continuum\anaconda3\lib\runpy.py", line 193, in _run_module_as_main
        "__main__", mod_spec)
      File "c:\users\avok\appdata\local\continuum\anaconda3\lib\runpy.py", line 85, in _run_code
        exec(code, run_globals)
      File "C:\Users\avok\AppData\Local\Continuum\anaconda3\Scripts\bert-serving-start.exe\__main__.py", line 9, in <module>
      File "c:\users\avok\appdata\local\continuum\anaconda3\lib\site-packages\bert_serving\server\cli\__init__.py", line 4, in main
        with BertServer(get_run_args()) as server:
      File "c:\users\avok\appdata\local\continuum\anaconda3\lib\site-packages\bert_serving\server\__init__.py", line 71, in __init__
        self.graph_path, self.bert_config = pool.apply(optimize_graph, (self.args,))
    TypeError: cannot unpack non-iterable NoneType object
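The warning at the top of that trace ("Tensorflow 2.0.0 is not tested!") already points at the version. A quick way to confirm which TensorFlow build the server will import, plus one possible fix, assuming the TF 1.x requirement mentioned above (the pinned version is just an example):

```
REM print the TensorFlow version in the current environment
python -c "import tensorflow as tf; print(tf.__version__)"

REM downgrade in place, or use a separate environment as sketched earlier
pip install tensorflow==1.15.0
```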

ansas1985 commented 4 years ago

I have a similar issue when running the BERT cased model; I'm running it on CPU.

    (base) C:\Users\Ishwar>bert-serving-start -model_dir "C:\Bert Models\cased_L-12_H-768_A-12\cased_L-12_H-768_A-12\" -cpu -max_batch_size 16 -num_worker=1
    usage: C:\Users\Ishwar\Anaconda3\Scripts\bert-serving-start -model_dir C:\Bert Models\cased_L-12_H-768_A-12\cased_L-12_H-768_A-12" -cpu -max_batch_size 16 -num_worker=1
                        ARG   VALUE
                  ckpt_name = bert_model.ckpt
                config_name = bert_config.json
                       cors = *
                        cpu = False
                 device_map = []
              do_lower_case = True
         fixed_embed_length = False
                       fp16 = False
        gpu_memory_fraction = 0.5
              graph_tmp_dir = None
           http_max_connect = 10
                  http_port = None
               mask_cls_sep = False
             max_batch_size = 256
                max_seq_len = 25
                  model_dir = C:\Bert Models\cased_L-12_H-768_A-12\cased_L-12_H-768_A-12" -cpu -max_batch_size 16 -num_worker=1
     no_position_embeddings = False
            no_special_token = False
                  num_worker = 1
               pooling_layer = [-2]
            pooling_strategy = REDUCE_MEAN
                        port = 5555
                    port_out = 5556
               prefetch_size = 10
         priority_batch_size = 16
      show_tokens_to_client = False
             tuned_model_dir = None
                     verbose = False
                         xla = False

    I:VENTILATOR:freeze, optimize and export graph, could take a while...
    WARNING:tensorflow:From c:\users\ishwar\anaconda3\lib\site-packages\bert_serving\server\helper.py:186: The name tf.logging.set_verbosity is deprecated. Please use tf.compat.v1.logging.set_verbosity instead.

    WARNING:tensorflow:From c:\users\ishwar\anaconda3\lib\site-packages\bert_serving\server\helper.py:186: The name tf.logging.ERROR is deprecated. Please use tf.compat.v1.logging.ERROR instead.

    I:GRAPHOPT:model config: C:\Bert Models\cased_L-12_H-768_A-12\cased_L-12_H-768_A-12" -cpu -max_batch_size 16 -num_worker=1\bert_config.json
    I:GRAPHOPT:checkpoint: C:\Bert Models\cased_L-12_H-768_A-12\cased_L-12_H-768_A-12" -cpu -max_batch_size 16 -num_worker=1\bert_model.ckpt
    E:GRAPHOPT:fail to optimize the graph!
    Traceback (most recent call last):
      File "c:\users\ishwar\anaconda3\lib\runpy.py", line 193, in _run_module_as_main
        "__main__", mod_spec)
      File "c:\users\ishwar\anaconda3\lib\runpy.py", line 85, in _run_code
        exec(code, run_globals)
      File "C:\Users\Ishwar\Anaconda3\Scripts\bert-serving-start.exe\__main__.py", line 9, in <module>
      File "c:\users\ishwar\anaconda3\lib\site-packages\bert_serving\server\cli\__init__.py", line 4, in main
        with BertServer(get_run_args()) as server:
      File "c:\users\ishwar\anaconda3\lib\site-packages\bert_serving\server\__init__.py", line 71, in __init__
        self.graph_path, self.bert_config = pool.apply(optimize_graph, (self.args,))
    TypeError: cannot unpack non-iterable NoneType object
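The root cause in this trace looks different from the TensorFlow 2.x problem above: the arg dump shows that everything after the path was folded into model_dir (`model_dir = C:\Bert Models\cased_L-12_H-768_A-12\cased_L-12_H-768_A-12" -cpu -max_batch_size 16 -num_worker=1`), because on the Windows command line a backslash immediately before the closing quote escapes that quote, so the rest of the line becomes part of the argument. A sketch of the same command with the trailing backslash dropped (path copied from the log above, adjust as needed):

```
REM no backslash before the closing quote, so -cpu, -max_batch_size and -num_worker are parsed as separate flags
bert-serving-start -model_dir "C:\Bert Models\cased_L-12_H-768_A-12\cased_L-12_H-768_A-12" -cpu -max_batch_size 16 -num_worker=1
```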

ansas1985 commented 4 years ago

TypeError: cannot unpack non-iterable NoneType object. I'm getting the same error message every time I run the BERT service on CPU. Can anyone explain?
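The TypeError itself is only a downstream symptom: once graph optimization fails (the `E:GRAPHOPT:fail to optimize the graph!` line), `optimize_graph` returns nothing, and the traceback line `self.graph_path, self.bert_config = pool.apply(optimize_graph, (self.args,))` then tries to unpack None, which is exactly what "cannot unpack non-iterable NoneType object" means. To see the real cause of the graph-optimization failure, re-running with -verbose (the flag appears as `verbose = False` in the arg dumps above) may log more detail. A hedged sketch, reusing the corrected command from the note above:

```
REM re-run with -verbose to get more detailed logging of what failed during graph optimization
bert-serving-start -model_dir "C:\Bert Models\cased_L-12_H-768_A-12\cased_L-12_H-768_A-12" -cpu -max_batch_size 16 -num_worker=1 -verbose
```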