ShannonAI / service-streamer

Boosting your Web Services of Deep Learning Applications.
Apache License 2.0
1.22k stars 187 forks

Error on multi-threading !!! #84

Open marcusau opened 3 years ago

marcusau commented 3 years ago

From my server side:

from backend.bert_test import NERServeHandler
from flask import Flask, request, jsonify
from service_streamer import ThreadedStreamer

app = Flask(__name__)
model = None
streamer = None

@app.route("/stream", methods=["POST"])
def stream_predict():
    inputs = request.data.strip()
    inputs = inputs.decode('utf-8')
    print(f'receive inputs: {inputs}')
    outputs = streamer.predict([inputs])
    return jsonify(outputs)

if __name__ == "__main__":
    model=NERServeHandler()
    model.initialize()
    streamer = ThreadedStreamer(model.predict, batch_size=64, max_latency=0.1)
    app.run(host='127.0.0.1',port=5005, debug=False)
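For reference, a minimal sketch of the batch contract that the streamer expects (an assumption based on the worker loop indexing `model_outputs[i]`, not code from this thread): the function passed to `ThreadedStreamer` receives a list of inputs and must return a list of outputs of exactly the same length, in order. `dummy_ner` and `checked_predict` below are hypothetical names, with a stand-in for `NERServeHandler.predict`.

```python
def dummy_ner(batch):
    # hypothetical stand-in for NERServeHandler.predict:
    # returns exactly one output per input, in input order
    return [{"text": text, "entities": []} for text in batch]

def checked_predict(predict_fn, batch):
    """Wrap a batch predict function and fail loudly if the
    one-output-per-input contract is broken."""
    outputs = predict_fn(batch)
    if len(outputs) != len(batch):
        raise ValueError(
            f"predict returned {len(outputs)} outputs for {len(batch)} inputs"
        )
    return outputs

outputs = checked_predict(dummy_ner, ["sentence one", "sentence two"])
print(len(outputs))  # 2
```

Wrapping `model.predict` this way before handing it to `ThreadedStreamer` turns a later `KeyError` inside the worker thread into an immediate, readable error.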

From my request side:

import requests
ner_api_url='http://127.0.0.1:5005/stream'

word='由匈牙利政府派出,計劃運輸由中方出口的新冠疫苗的包機,今日凌晨飛抵北京並在機場完成裝箱後,即啟程返航,預計於今天傍晚抵達匈牙利。這是中方向匈牙利出口的首批疫苗。'

word = word.encode('utf-8').strip()

url_response = requests.post(ner_api_url, data=word)
if url_response.status_code != 200:
    print('status code error:', url_response.status_code)
else:
    print(url_response.content)

Error:

    loading NER labels and arguments
    loading BERT Config
    loading BERT tokenizer
    loading BERT Model
     * Running on http://127.0.0.1:5005/ (Press CTRL+C to quit)
    Model successfully loaded.
     * Serving Flask app "service_streamer_test" (lazy loading)
     * Environment: production
       WARNING: This is a development server. Do not use it in a production deployment.
       Use a production WSGI server instead.
     * Debug mode: off

    receive inputs: 由匈牙利政府派出,計劃運輸由中方出口的新冠疫苗的包機,今日凌晨飛抵北京並在機場完成裝箱後,即啟程返航,預計於今天傍晚抵達匈牙利。這是中方向匈牙利出口的首批疫苗。
    Exception in thread thread_worker:
    Traceback (most recent call last):
      File "C:\Program Files\Python37\lib\threading.py", line 926, in _bootstrap_inner
        self.run()
      File "C:\Program Files\Python37\lib\threading.py", line 870, in run
        self._target(*self._args, **self._kwargs)
      File "C:\Users\marcus\Desktop\boc_app_nlp\lib\site-packages\service_streamer\service_streamer.py", line 154, in run_forever
        handled = self._run_once()
      File "C:\Users\marcus\Desktop\boc_app_nlp\lib\site-packages\service_streamer\service_streamer.py", line 189, in _run_once
        self._send_response(client_id, task_id, request_id, model_outputs[i])
    KeyError: 0
kuangdd commented 2 years ago

I ran into the same problem. What is the cause? Is it a bug in the code, are we using it the wrong way, does some environment variable need to be set specifically, or is it something else?

kuangdd commented 2 years ago

I found the cause in my case: the logic inside model.predict must consist entirely of tensor-computation steps. If you add any non-tensor-computation logic, you get this error.
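One way to make sense of the comment above (a hypothetical reproduction, not the thread author's actual model code): the worker thread fetches results with `model_outputs[i]` for each position in the batch, so any predict function whose return value is not indexable by 0..n-1 — for example, a dict keyed by the input text — fails with exactly `KeyError: 0`.

```python
def bad_predict(batch):
    # hypothetical broken predict: returns a dict keyed by input text
    # instead of a positional list of outputs
    return {text: {"entities": []} for text in batch}

batch = ["first sentence", "second sentence"]
outputs = bad_predict(batch)
try:
    # what the streamer worker effectively does with the return value
    _ = [outputs[i] for i in range(len(batch))]
except KeyError as e:
    print("KeyError:", e)  # KeyError: 0
```

Returning a plain list with one entry per input avoids this failure mode, whatever computation happens inside predict.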