PaddlePaddle / FastDeploy

⚡️An easy-to-use and fast deep learning model deployment toolkit for ☁️Cloud, 📱Mobile and 📹Edge. Covers 20+ mainstream scenarios across image, video, text and audio, and 150+ SOTA models, with end-to-end optimization and multi-platform, multi-framework support.
https://www.paddlepaddle.org.cn/fastdeploy
Apache License 2.0

fastdeploy UIEModel: tried thread pool and async for parallel execution --> everything still runs serially #2106

Open lzh1998-jansen opened 1 year ago

lzh1998-jansen commented 1 year ago

Friendly reminder: according to informal community statistics, asking questions following the template speeds up replies and issue resolution.


Environment

Problem log and steps to reproduce

The code is as follows:

```python
def func_information_extract(item_dict):
    print('fun1 start')
    information_extraion_model.set_schema(item_dict['prompt_ie'])
    ie_result = information_extraion_model.predict(item_dict['data'], return_dict=True)
    return ie_result


def func_sentiment(item_dict):
    print('fun2 start')
    sentiment_analysis_model.set_schema(item_dict['prompt_se'])
    se_result = sentiment_analysis_model.predict(item_dict['data'], return_dict=True)
    return se_result


def func_cls(item_dict):
    print('fun3 start')
    cls_model.set_schema(item_dict['prompt_cls'])
    cls_result = cls_model.predict(item_dict['data'], return_dict=True)
    return cls_result


@app.post('/test/')
async def parallel_run(item: Item_merge):
    try:
        item_dict = item.dict()
        with concurrent.futures.ThreadPoolExecutor(max_workers=3) as executor:
            futures = [
                executor.submit(func_information_extract, item_dict),
                executor.submit(func_sentiment, item_dict),
                executor.submit(func_cls, item_dict)
            ]

            results = []
            # Note: as_completed yields futures in completion order, not submission
            # order, so results[0]/[1]/[2] are not guaranteed to match the three tasks above.
            for future in concurrent.futures.as_completed(futures):
                result = future.result()
                results.append(result)

        # Wrap into a format FastAPI can return
        response = {
            'information_extraction': results[0][0],
            'sentiment_analysis': results[1][0],
            'classification': results[2][0]
        }
        return JSONResponse(status_code=200, content={'status': 200, 'msg': 'Execution succeeded', 'code': 200, 'data': response})

    except Exception as e:
        return JSONResponse(status_code=418, content={'status': 500, 'msg': 'Runtime error', 'code': 500, 'data': str(e)})
```

The three models are UIEModel instances loaded from fastdeploy.text; they are three different models, each loaded with different weights. I am trying to run inference for the three models in parallel inside a single function. I tried both a thread pool and Python async, and in both cases execution is serial, so neither approach helps. Is the underlying implementation of the FastDeploy UIE model preventing parallel execution?
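To make the serial behavior easier to observe outside of FastAPI, here is a minimal timing sketch (not from the original report) that submits the three predict calls to a thread pool and prints each task's wall-clock span. It assumes the three `func_*` helpers and the model objects defined above, and an `item_dict` with the same keys as in the endpoint.

```python
import time
import concurrent.futures

def timed(name, fn, item_dict):
    # Record wall-clock start/end so overlapping (parallel) runs are visible.
    start = time.perf_counter()
    out = fn(item_dict)
    end = time.perf_counter()
    print(f'{name}: {start:.3f} -> {end:.3f} ({end - start:.3f}s)')
    return out

tasks = [
    ('information_extract', func_information_extract),
    ('sentiment', func_sentiment),
    ('cls', func_cls),
]

with concurrent.futures.ThreadPoolExecutor(max_workers=3) as executor:
    futures = [executor.submit(timed, name, fn, item_dict) for name, fn in tasks]
    results = [f.result() for f in futures]

# If the printed intervals do not overlap at all, the predict() calls are holding
# the Python GIL (or an internal lock) for their full duration, and threads alone
# cannot make them run concurrently.
```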

leiqing1 commented 1 year ago

Please refer to FastDeploy's multi-threading tutorial to implement this: https://github.com/PaddlePaddle/FastDeploy/tree/develop/tutorials/multi_thread
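For reference, a rough sketch of the pattern that tutorial describes (not its exact code): each worker thread gets its own model handle instead of sharing one, typically created with `clone()`, which the tutorial's examples use to obtain lightweight copies that share weights. Whether `fastdeploy.text` UIEModel exposes `clone()` in the installed version should be verified; if it does not, constructing a separate UIEModel per thread is the fallback. Names such as `texts` below are placeholders for this issue's input data.

```python
import concurrent.futures

# Sketch of the per-thread-model pattern from the multi_thread tutorial, applied to
# one of the UIE models in this issue. Assumptions: 'information_extraion_model' is
# the already-loaded UIEModel from the post, 'texts' is a list of input strings, and
# clone() behaves as in the tutorial's examples (a cheap copy that shares weights).

def worker(thread_model, batch, schema):
    thread_model.set_schema(schema)
    return thread_model.predict(batch, return_dict=True)

num_threads = 3
schema = item_dict['prompt_ie']  # same schema the original post uses

# One model handle per thread, instead of one handle shared by every thread.
thread_models = [information_extraion_model.clone() for _ in range(num_threads)]

# Split the inputs round-robin across the threads.
chunks = [texts[i::num_threads] for i in range(num_threads)]

with concurrent.futures.ThreadPoolExecutor(max_workers=num_threads) as executor:
    futures = [
        executor.submit(worker, m, chunk, schema)
        for m, chunk in zip(thread_models, chunks)
    ]
    results = [f.result() for f in futures]
```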

lzh1998-jansen commented 1 year ago

When running the UIE C++ example with ./infer_demo uie-base 1, it fails with an error (screenshot attached). I have tried reinstalling fastdeploy and tried manually exporting the library's path into the environment variables, but neither worked.