My request side:

import requests

ner_api_url = 'http://127.0.0.1:5005/stream'
word = '由匈牙利政府派出,計劃運輸由中方出口的新冠疫苗的包機,今日凌晨飛抵北京並在機場完成裝箱後,即啟程返航,預計於今天傍晚抵達匈牙利。這是中方向匈牙利出口的首批疫苗。'
word = word.encode('utf-8').strip()
url_response = requests.post(ner_api_url, data=word)
if url_response.status_code != 200:
    print('status code error:', url_response.status_code)
else:
    print(url_response.content)
My server side:
loading NER labels and arguments
loading BERT Config
loading BERT tokenizer
loading BERT Model
* Running on http://127.0.0.1:5005/ (Press CTRL+C to quit)
Model successfully loaded.
* Serving Flask app "service_streamer_test" (lazy loading)
* Environment: production
WARNING: This is a development server. Do not use it in a production deployment.
Use a production WSGI server instead.
* Debug mode: off
receive inputs: 由匈牙利政府派出,計劃運輸由中方出口的新冠疫苗的包機,今日凌晨飛抵北京並在機場完成裝箱後,即啟程返航,預計於今天傍晚抵達匈牙利。這是中方向匈牙利出口的首批疫苗。
Exception in thread thread_worker:
Traceback (most recent call last):
  File "C:\Program Files\Python37\lib\threading.py", line 926, in _bootstrap_inner
    self.run()
  File "C:\Program Files\Python37\lib\threading.py", line 870, in run
    self._target(*self._args, **self._kwargs)
  File "C:\Users\marcus\Desktop\boc_app_nlp\lib\site-packages\service_streamer\service_streamer.py", line 154, in run_forever
    handled = self._run_once()
  File "C:\Users\marcus\Desktop\boc_app_nlp\lib\site-packages\service_streamer\service_streamer.py", line 189, in _run_once
    self._send_response(client_id, task_id, request_id, model_outputs[i])
KeyError: 0
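For context, the failing line indexes the batched result positionally (`model_outputs[i]`). A `KeyError: 0` there usually means the wrapped predict function returned something that cannot be indexed by position, e.g. a dict keyed by field names instead of a list with one entry per input. The sketch below illustrates that contract and reproduces the failure mode; the function names are illustrative, not from the original post:

```python
def predict(batch):
    """service_streamer-style batched predict: must return a list
    with exactly one output per input in `batch`."""
    return ['entities for: ' + text for text in batch]

def bad_predict(batch):
    """Wrong shape: a dict result makes outputs[0] raise KeyError: 0."""
    return {'entities': [text for text in batch]}

outputs = predict(['sentence one', 'sentence two'])
print(outputs[0])  # positional indexing works, one result per request

bad = bad_predict(['sentence one'])
try:
    bad[0]
except KeyError as exc:
    print('KeyError:', exc)  # same error the worker thread reports
```

If the NER model's predict returns a dict (or a single object) rather than a per-input list, that would explain the traceback above.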