PaddlePaddle / Serving

A flexible, high-performance carrier for machine learning models (the PaddlePaddle serving deployment framework)
Apache License 2.0

Deploying serving fails after installing Paddle via conda on local Ubuntu 16.04 #823

Closed (liangruofei closed this issue 3 years ago)

liangruofei commented 4 years ago

It crashes at this point:

```
(paddle) liangruofei@liangruofei-System-Product-Name:~/paddle/serving_model/cascade_rcnn_dcn_r50_vd_fpn_3x_server_side$ python -m paddle_serving_server_gpu.serve --model serving_server --port 9999 --gpu_id 0 --use_multilang
Going to Run Comand
/home/liangruofei/anaconda3/envs/paddle/lib/python3.7/site-packages/paddle_serving_server_gpu/serving-gpu-cuda9-0.3.2/serving -enable_model_toolkit -inferservice_path workdir_0 -inferservice_file infer_service.prototxt -max_concurrency 0 -num_threads 2 -port 12000 -reload_interval_s 10 -resource_path workdir_0 -resource_file resource.prototxt -workflow_path workdir_0 -workflow_file workflow.prototxt -bthread_concurrency 2 -gpuid 0 -max_body_size 536870912
WARNING: Logging before InitGoogleLogging() is written to STDERR
I0904 22:27:13.266221 3173 naming_service_thread.cpp:209] brpc::policy::ListNamingService("0.0.0.0:12000"): added 1
Segmentation fault (core dumped)
```

MRXLT commented 4 years ago

Hi, before starting the server, run `export GLOG_v=3` to get more detailed logging, then look in log/serving.INFO and paste the relevant error messages here.
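The suggestion above can be sketched as a short shell session. The serve command and flags are the ones used in this thread, and `log/serving.INFO` is assumed to be created under the directory the server is launched from:

```shell
# Raise glog verbosity; must be exported in the same shell
# *before* launching the server process.
export GLOG_v=3
echo "GLOG_v=$GLOG_v"

# Then start the server as in this thread, e.g.:
#   python -m paddle_serving_server_gpu.serve \
#       --model serving_server --port 9999 --gpu_id 0 --use_multilang
# and follow the verbose log:
#   tail -f log/serving.INFO
```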

liangruofei commented 4 years ago

The output looks like this:

```
(paddle) liangruofei@liangruofei-System-Product-Name:~/paddle/serving_model/cascade_rcnn_dcn_r50_vd_fpn_3x_server_side$ python -m paddle_serving_server_gpu.serve --model serving_server --port 9999 --gpu_id 0 --use_multilang
mkdir: cannot create directory 'workdir_0': File exists
Going to Run Comand
/home/liangruofei/anaconda3/envs/paddle/lib/python3.7/site-packages/paddle_serving_server_gpu/serving-gpu-cuda9-0.3.2/serving -enable_model_toolkit -inferservice_path workdir_0 -inferservice_file infer_service.prototxt -max_concurrency 0 -num_threads 2 -port 12000 -reload_interval_s 10 -resource_path workdir_0 -resource_file resource.prototxt -workflow_path workdir_0 -workflow_file workflow.prototxt -bthread_concurrency 2 -gpuid 0 -max_body_size 536870912
WARNING: Logging before InitGoogleLogging() is written to STDERR
I0907 20:11:25.472918 15419 general_model.cpp:71] feed var num: 3fetch_var_num: 1
I0907 20:11:25.472955 15419 general_model.cpp:75] feed alias name: image index: 0
I0907 20:11:25.472970 15419 general_model.cpp:78] feed[0] shape:
I0907 20:11:25.472980 15419 general_model.cpp:82] shape[0]: 3
I0907 20:11:25.472992 15419 general_model.cpp:85] feed[0] feed type: 1
I0907 20:11:25.473006 15419 general_model.cpp:75] feed alias name: im_info index: 1
I0907 20:11:25.473019 15419 general_model.cpp:78] feed[1] shape:
I0907 20:11:25.473032 15419 general_model.cpp:82] shape[0]: 3
I0907 20:11:25.473044 15419 general_model.cpp:85] feed[1] feed type: 1
I0907 20:11:25.473060 15419 general_model.cpp:75] feed alias name: im_shape index: 2
I0907 20:11:25.473073 15419 general_model.cpp:78] feed[2] shape:
I0907 20:11:25.473085 15419 general_model.cpp:82] shape[0]: 3
I0907 20:11:25.473099 15419 general_model.cpp:85] feed[2] feed type: 1
I0907 20:11:25.473117 15419 general_model.cpp:93] fetch [0] alias name: multiclass_nms_0.tmp_0
I0907 20:11:25.473237 15419 general_model.cpp:53] Init commandline: dummy /home/liangruofei/anaconda3/envs/paddle/lib/python3.7/site-packages/paddle_serving_server_gpu/serve.py --tryfromenv=profile_client,profile_server,max_body_size
I0907 20:11:25.473467 15419 predictor_sdk.cpp:34] [binary SDK config dump: baidu.paddle_serving.predictor.general_model.GeneralModelService / WeightedRandomRender / default_tag_139926840719568 / list://0.0.0.0:12000]
I0907 20:11:25.473532 15419 predictor_sdk.cpp:28] Succ register all components!
I0907 20:11:25.473563 15419 config_manager.cpp:217] Not found key in configue: cluster
I0907 20:11:25.473577 15419 config_manager.cpp:234] Not found key in configue: split_tag_name
I0907 20:11:25.473590 15419 config_manager.cpp:235] Not found key in configue: tag_candidates
I0907 20:11:25.473604 15419 config_manager.cpp:263] split info not set, skip...
I0907 20:11:25.473628 15419 abtest.cpp:55] Succ read weights list: 100, count: 1, normalized: 100
I0907 20:11:25.473642 15419 config_manager.cpp:202] Not found key in configue: connect_timeout_ms
I0907 20:11:25.473655 15419 config_manager.cpp:203] Not found key in configue: rpc_timeout_ms
I0907 20:11:25.473668 15419 config_manager.cpp:205] Not found key in configue: hedge_request_timeout_ms
I0907 20:11:25.473681 15419 config_manager.cpp:207] Not found key in configue: connect_retry_count
I0907 20:11:25.473695 15419 config_manager.cpp:209] Not found key in configue: hedge_fetch_retry_count
I0907 20:11:25.473708 15419 config_manager.cpp:211] Not found key in configue: max_connection_per_host
I0907 20:11:25.473721 15419 config_manager.cpp:212] Not found key in configue: connection_type
I0907 20:11:25.473735 15419 config_manager.cpp:219] Not found key in configue: load_balance_strategy
I0907 20:11:25.473749 15419 config_manager.cpp:221] Not found key in configue: cluster_filter_strategy
I0907 20:11:25.473762 15419 config_manager.cpp:226] Not found key in configue: protocol
I0907 20:11:25.473775 15419 config_manager.cpp:227] Not found key in configue: compress_type
I0907 20:11:25.473788 15419 config_manager.cpp:228] Not found key in configue: package_size
I0907 20:11:25.473801 15419 config_manager.cpp:230] Not found key in configue: max_channel_per_request
I0907 20:11:25.473814 15419 config_manager.cpp:234] Not found key in configue: split_tag_name
I0907 20:11:25.473826 15419 config_manager.cpp:235] Not found key in configue: tag_candidates
I0907 20:11:25.473839 15419 config_manager.cpp:263] split info not set, skip...
I0907 20:11:25.473853 15419 config_manager.cpp:186] Succ load one endpoint, name: general_model, count of variants: 1.
I0907 20:11:25.473873 15419 config_manager.cpp:85] Success reload endpoint config file, id: 1
I0907 20:11:25.476961 15419 naming_service_thread.cpp:209] brpc::policy::ListNamingService("0.0.0.0:12000"): added 1
I0907 20:11:25.479220 15419 stub_impl.hpp:376] Succ create parallel channel, count: 3
I0907 20:11:25.479265 15419 stub_impl.hpp:42] Create stub without tag, ep general_model
I0907 20:11:25.480393 15419 variant.cpp:69] Succ create default debug
I0907 20:11:25.480413 15419 endpoint.cpp:38] Succ create variant: 0, endpoint:general_model
I0907 20:11:25.480422 15419 predictor_sdk.cpp:69] Succ create endpoint instance with name: general_model
Segmentation fault (core dumped)
```


liangruofei commented 4 years ago

```
(paddle) liangruofei@liangruofei-System-Product-Name:~/cascade$ python -m paddle_serving_server_gpu.serve --thread 10 --model serving_server --port 8888 --gpu_id 0 --use_multilang
mkdir: cannot create directory 'workdir_0': File exists
WARNING: Logging before InitGoogleLogging() is written to STDERR
I0908 00:11:18.977010 2546 general_model.cpp:73] feed var num: 3fetch_var_num: 1
I0908 00:11:18.977036 2546 general_model.cpp:77] feed alias name: image index: 0
I0908 00:11:18.977043 2546 general_model.cpp:80] feed[0] shape:
I0908 00:11:18.977048 2546 general_model.cpp:84] shape[0]: 3
I0908 00:11:18.977053 2546 general_model.cpp:87] feed[0] feed type: 1
I0908 00:11:18.977059 2546 general_model.cpp:77] feed alias name: im_info index: 1
I0908 00:11:18.977064 2546 general_model.cpp:80] feed[1] shape:
I0908 00:11:18.977074 2546 general_model.cpp:84] shape[0]: 3
I0908 00:11:18.977083 2546 general_model.cpp:87] feed[1] feed type: 1
I0908 00:11:18.977094 2546 general_model.cpp:77] feed alias name: im_shape index: 2
I0908 00:11:18.977103 2546 general_model.cpp:80] feed[2] shape:
I0908 00:11:18.977111 2546 general_model.cpp:84] shape[0]: 3
I0908 00:11:18.977120 2546 general_model.cpp:87] feed[2] feed type: 1
I0908 00:11:18.977131 2546 general_model.cpp:95] fetch [0] alias name: multiclass_nms_0.tmp_0
I0908 00:11:18.977571 2546 general_model.cpp:55] Init commandline: dummy /home/liangruofei/anaconda3/envs/paddle/lib/python3.7/site-packages/paddle_serving_server_gpu/serve.py --tryfromenv=profile_client,profile_server,max_body_size
I0908 00:11:18.977820 2546 predictor_sdk.cpp:34] [binary SDK config dump: baidu.paddle_serving.predictor.general_model.GeneralModelService / WeightedRandomRender / default_tag_140187620328912 / list://0.0.0.0:12000]
I0908 00:11:18.977841 2546 predictor_sdk.cpp:28] Succ register all components!
I0908 00:11:18.977867 2546 config_manager.cpp:217] Not found key in configue: cluster
I0908 00:11:18.977875 2546 config_manager.cpp:234] Not found key in configue: split_tag_name
I0908 00:11:18.977882 2546 config_manager.cpp:235] Not found key in configue: tag_candidates
I0908 00:11:18.977890 2546 config_manager.cpp:263] split info not set, skip...
I0908 00:11:18.977906 2546 abtest.cpp:55] Succ read weights list: 100, count: 1, normalized: 100
I0908 00:11:18.977918 2546 config_manager.cpp:202] Not found key in configue: connect_timeout_ms
I0908 00:11:18.977927 2546 config_manager.cpp:203] Not found key in configue: rpc_timeout_ms
I0908 00:11:18.977936 2546 config_manager.cpp:205] Not found key in configue: hedge_request_timeout_ms
I0908 00:11:18.977944 2546 config_manager.cpp:207] Not found key in configue: connect_retry_count
I0908 00:11:18.977952 2546 config_manager.cpp:209] Not found key in configue: hedge_fetch_retry_count
I0908 00:11:18.977959 2546 config_manager.cpp:211] Not found key in configue: max_connection_per_host
I0908 00:11:18.977967 2546 config_manager.cpp:212] Not found key in configue: connection_type
I0908 00:11:18.977975 2546 config_manager.cpp:219] Not found key in configue: load_balance_strategy
I0908 00:11:18.977984 2546 config_manager.cpp:221] Not found key in configue: cluster_filter_strategy
I0908 00:11:18.977993 2546 config_manager.cpp:226] Not found key in configue: protocol
I0908 00:11:18.977999 2546 config_manager.cpp:227] Not found key in configue: compress_type
I0908 00:11:18.978008 2546 config_manager.cpp:228] Not found key in configue: package_size
I0908 00:11:18.978015 2546 config_manager.cpp:230] Not found key in configue: max_channel_per_request
I0908 00:11:18.978024 2546 config_manager.cpp:234] Not found key in configue: split_tag_name
I0908 00:11:18.978030 2546 config_manager.cpp:235] Not found key in configue: tag_candidates
I0908 00:11:18.978039 2546 config_manager.cpp:263] split info not set, skip...
I0908 00:11:18.978049 2546 config_manager.cpp:186] Succ load one endpoint, name: general_model, count of variants: 1.
I0908 00:11:18.978063 2546 config_manager.cpp:85] Success reload endpoint config file, id: 1
I0908 00:11:18.984396 2546 naming_service_thread.cpp:209] brpc::policy::ListNamingService("0.0.0.0:12000"): added 1
I0908 00:11:18.984692 2546 stub_impl.hpp:376] Succ create parallel channel, count: 3
I0908 00:11:18.984704 2546 stub_impl.hpp:42] Create stub without tag, ep general_model
I0908 00:11:18.985944 2546 variant.cpp:69] Succ create default debug
I0908 00:11:18.985960 2546 endpoint.cpp:38] Succ create variant: 0, endpoint:general_model
I0908 00:11:18.985971 2546 predictor_sdk.cpp:69] Succ create endpoint instance with name: general_model
```

It just stays in this state; I don't know whether that means the server started successfully.


liangruofei commented 4 years ago

I've solved the earlier problem and the setup now works. On the server side I run:

```
(paddle) liangruofei@liangruofei-System-Product-Name:~/cascade$ python -m paddle_serving_server_gpu.serve --thread 10 --model serving_server --port 8888 --gpu_id 0 --use_multilang
mkdir: cannot create directory 'workdir_0': File exists
WARNING: Logging before InitGoogleLogging() is written to STDERR
I0908 20:27:05.811311 10181 general_model.cpp:73] feed var num: 3fetch_var_num: 1
I0908 20:27:05.811334 10181 general_model.cpp:77] feed alias name: image index: 0
I0908 20:27:05.811340 10181 general_model.cpp:80] feed[0] shape:
I0908 20:27:05.811347 10181 general_model.cpp:84] shape[0]: 3
I0908 20:27:05.811352 10181 general_model.cpp:87] feed[0] feed type: 1
I0908 20:27:05.811357 10181 general_model.cpp:77] feed alias name: im_info index: 1
I0908 20:27:05.811362 10181 general_model.cpp:80] feed[1] shape:
I0908 20:27:05.811367 10181 general_model.cpp:84] shape[0]: 3
I0908 20:27:05.811372 10181 general_model.cpp:87] feed[1] feed type: 1
I0908 20:27:05.811378 10181 general_model.cpp:77] feed alias name: im_shape index: 2
I0908 20:27:05.811383 10181 general_model.cpp:80] feed[2] shape:
I0908 20:27:05.811388 10181 general_model.cpp:84] shape[0]: 3
I0908 20:27:05.811393 10181 general_model.cpp:87] feed[2] feed type: 1
I0908 20:27:05.811398 10181 general_model.cpp:95] fetch [0] alias name: multiclass_nms_0.tmp_0
I0908 20:27:05.811477 10181 general_model.cpp:55] Init commandline: dummy /home/liangruofei/anaconda3/envs/paddle/lib/python3.7/site-packages/paddle_serving_server_gpu/serve.py --tryfromenv=profile_client,profile_server,max_body_size
I0908 20:27:05.811640 10181 predictor_sdk.cpp:34] [binary SDK config dump: baidu.paddle_serving.predictor.general_model.GeneralModelService / WeightedRandomRender / default_tag_139745050093072 / list://0.0.0.0:12000]
I0908 20:27:05.811657 10181 predictor_sdk.cpp:28] Succ register all components!
I0908 20:27:05.811672 10181 config_manager.cpp:217] Not found key in configue: cluster
I0908 20:27:05.811678 10181 config_manager.cpp:234] Not found key in configue: split_tag_name
I0908 20:27:05.811681 10181 config_manager.cpp:235] Not found key in configue: tag_candidates
I0908 20:27:05.811686 10181 config_manager.cpp:263] split info not set, skip...
I0908 20:27:05.811693 10181 abtest.cpp:55] Succ read weights list: 100, count: 1, normalized: 100
I0908 20:27:05.811698 10181 config_manager.cpp:202] Not found key in configue: connect_timeout_ms
I0908 20:27:05.811702 10181 config_manager.cpp:203] Not found key in configue: rpc_timeout_ms
I0908 20:27:05.811705 10181 config_manager.cpp:205] Not found key in configue: hedge_request_timeout_ms
I0908 20:27:05.811709 10181 config_manager.cpp:207] Not found key in configue: connect_retry_count
I0908 20:27:05.811713 10181 config_manager.cpp:209] Not found key in configue: hedge_fetch_retry_count
I0908 20:27:05.811717 10181 config_manager.cpp:211] Not found key in configue: max_connection_per_host
I0908 20:27:05.811720 10181 config_manager.cpp:212] Not found key in configue: connection_type
I0908 20:27:05.811724 10181 config_manager.cpp:219] Not found key in configue: load_balance_strategy
I0908 20:27:05.811728 10181 config_manager.cpp:221] Not found key in configue: cluster_filter_strategy
I0908 20:27:05.811733 10181 config_manager.cpp:226] Not found key in configue: protocol
I0908 20:27:05.811735 10181 config_manager.cpp:227] Not found key in configue: compress_type
I0908 20:27:05.811739 10181 config_manager.cpp:228] Not found key in configue: package_size
I0908 20:27:05.811743 10181 config_manager.cpp:230] Not found key in configue: max_channel_per_request
I0908 20:27:05.811746 10181 config_manager.cpp:234] Not found key in configue: split_tag_name
I0908 20:27:05.811750 10181 config_manager.cpp:235] Not found key in configue: tag_candidates
I0908 20:27:05.811754 10181 config_manager.cpp:263] split info not set, skip...
I0908 20:27:05.811758 10181 config_manager.cpp:186] Succ load one endpoint, name: general_model, count of variants: 1.
I0908 20:27:05.811767 10181 config_manager.cpp:85] Success reload endpoint config file, id: 1
I0908 20:27:05.814249 10181 naming_service_thread.cpp:209] brpc::policy::ListNamingService("0.0.0.0:12000"): added 1
I0908 20:27:05.814441 10181 stub_impl.hpp:376] Succ create parallel channel, count: 3
I0908 20:27:05.814448 10181 stub_impl.hpp:42] Create stub without tag, ep general_model
I0908 20:27:05.815412 10181 variant.cpp:69] Succ create default debug
I0908 20:27:05.815429 10181 endpoint.cpp:38] Succ create variant: 0, endpoint:general_model
I0908 20:27:05.815438 10181 predictor_sdk.cpp:69] Succ create endpoint instance with name: general_model
```

Then I run the program on the client side; it outputs:

```
(paddle) liangruofei@liangruofei-System-Product-Name:~$ python test.py
{'serving_status_code': <StatusCode.DEADLINE_EXCEEDED: (4, 'deadline exceeded')>, 'image': '/home/liangruofei/darknet_uav/test.jpg'}
```

However, the server side shows the same startup log as above (same process, PID 10181), followed by:

```
I0908 20:28:07.590495 10250 general_model.cpp:366] batch size: 1
I0908 20:28:07.590517 10250 stub_impl.hpp:149] Succ thread initialize stub impl!
I0908 20:28:07.590524 10250 endpoint.cpp:53] Succ thrd initialize all vars: 1
I0908 20:28:07.590530 10250 predictor_sdk.cpp:129] Succ thrd initialize endpoint:general_model
I0908 20:28:07.590725 10250 general_model.cpp:377] fetch general model predictor done.
I0908 20:28:07.590732 10250 general_model.cpp:378] float feed name size: 3
I0908 20:28:07.590737 10250 general_model.cpp:379] int feed name size: 0
I0908 20:28:07.590744 10250 general_model.cpp:380] max body size : 536870912
I0908 20:28:07.590750 10250 general_model.cpp:388] prepare batch 0
I0908 20:28:07.590759 10250 general_model.cpp:401] batch [0] int_feed_name and float_feed_name prepared
I0908 20:28:07.590764 10250 general_model.cpp:405] tensor_vec size 3 float shape 3
I0908 20:28:07.590770 10250 general_model.cpp:410] prepare float feed image shape size 3
I0908 20:28:07.608628 10250 general_model.cpp:410] prepare float feed im_info shape size 1
I0908 20:28:07.608654 10250 general_model.cpp:410] prepare float feed im_shape shape size 1
I0908 20:28:07.608662 10250 general_model.cpp:462] batch [0] float feed value prepared
I0908 20:28:07.608667 10250 general_model.cpp:545] batch [0] int feed value prepared
W0908 20:28:07.627852 10250 predictor.hpp:129] inference call failed, message: [E112]1/1 channels failed, fail_limit=1 [C0][E111]Fail to connect SocketId=113@0.0.0.0:12000: Connection refused [R1][E112]Fail to select server from list://0.0.0.0:12000 lb=la [R2][E112]Fail to select server from list://0.0.0.0:12000 lb=la
I0908 20:28:07.728153 10200 socket.cpp:2370] Checking SocketId=0@0.0.0.0:12000
E0908 20:28:10.197585 10250 general_model.cpp:567] failed call predictor with req: insts { tensor_array { float_data: 1.3271606 float_data: 1.3271606 ... [several thousand float_data values follow; the paste is truncated mid-dump]
```
2.1017914 float_data: 2.0896878 float_data: 2.0775843 float_data: 2.0775843 float_data: 2.0775843 float_data: 2.0775843 float_data: 2.0775843 float_data: 2.0775843 float_data: 2.0775843 float_data: 2.0775843 float_data: 2.0775843 float_data: 2.0775843 float_data: 2.0775843 float_data: 2.0775843 float_data: 2.0775843 float_data: 2.0775843 float_data: 2.0775843 float_data: 2.0775843 float_data: 2.0775843 float_data: 2.0775843 float_data: 2.0775843 float_data: 2.0775843 float_data: 2.0775843 float_data: 2.0775843 float_data: 2.0775843 float_data: 2.0775843 float_data: 2.0775843 float_data: 2.0775843 float_data: 2.0775843 float_data: 2.0775843 float_data: 2.0775843 float_data: 2.0775843 float_data: 2.0732272 float_data: 2.0533772 float_data: 2.0533772 float_data: 2.0533772 float_data: 2.0533772 float_data: 2.0533772 float_data: 2.0533772 float_data: 2.0775843 float_data: 2.0775843 float_data: 2.0775843 float_data: 2.0775843 float_data: 2.0775843 float_data: 2.0775843 float_data: 2.0775843 float_data: 2.0775843 float_data: 2.0775843 float_data: 2.0775843 float_data: 2.0775843 float_data: 2.0775843 float_data: 2.0775843 float_data: 2.0775843 float_data: 2.0775843 float_data: 2.0775843 float_data: 2.0775843 float_data: 2.0775843 float_data: 2.0775843 float_data: 2.0775843 float_data: 2.0775843 float_data: 2.0775843 float_data: 2.0775843 float_data: 2.0775843 float_data: 2.0775843 float_data: 2.0775843 float_data: 2.0775843 float_data: 2.0775843 float_data: 2.0775843 float_data: 2.0775843 float_data: 2.0775843 float_data: 2.0775843 float_data: 2.0775843 float_data: 2.0775843 float_data: 2.0775843 float_data: 2.0775843 float_data: 2.0775843 float_data: 2.0775843 float_data: 2.0775843 float_data: 2.0775843 float_data: 2.0775843 float_data: 2.0775843 float_data: 2.0993712 float_data: 2.1017914 float_data: 2.1017914 float_data: 2.1017914 float_data: 2.1013069 float_data: 2.0775843 float_data: 2.0775843 float_data: 2.0775843 float_data: 2.0775843 float_data: 2.0775843 
float_data: 2.0775843 float_data: 2.0775843 float_data: 2.0775843 float_data: 2.0775843 float_data: 2.0775843 float_data: 2.0775843 float_data: 2.0775843 float_data: 2.0775843 float_data: 2.0775843 float_data: 2.0775843 float_data: 2.0775843 float_data: 2.0775843 float_data: 2.0775843 float_data: 2.0775843 float_data: 2.0775843 float_data: 2.0775843 float_data: 2.0775843 float_data: 2.0775843 float_data: 2.0775843 float_data: 2.0775843 float_data: 2.0775843 float_data: 2.0775843 float_data: 2.0775843 float_data: 2.1017914 float_data: 2.1017914 float_data: 2.1017914 float_data: 2.1017914 float_data: 2.1017914 float_data: 2.1017914 float_data: 2.1017914 float_data: 2.1017914 float_data: 2.1017914 float_data: 2.1017914 float_data: 2.1017914 float_data: 2.1017914 float_data: 2.0896878 float_data: 2.0775843 float_data: 2.0775843 float_data: 2.0775843 float_data: 2.0775843 float_data: 2.0775843 float_data: 2.0775843 float_data: 2.0775843 float_data: 2.0775843 float_data: 2.0775843 float_data: 2.0775843 float_data: 2.0775843 float_data: 2.0775843 float_data: 2.0775843 float_data: 2.0775843 float_data: 2.0775843 float_data: 2.0775843 float_data: 2.0775843 float_data: 2.0775843 float_data: 2.0775843 float_data: 2.0775843 float_data: 2.0775843 float_data: 2.0775843 float_data: 2.0775843 float_data: 2.0775843 float_data: 2.0654807 float_data: 2.0533772 float_data: 2.0533772 float_data: 2.0533772 float_data: 2.0533772 float_data: 2.0533772 float_data: 2.0533772 float_data: 2.0533772 float_data: 2.0533772 float_data: 2.0533772 float_data: 2.0533772 float_data: 2.0533772 float_data: 2.0533772 float_data: 2.0533772 float_data: 2.0533772 float_data: 2.0533772 float_data: 2.0533772 float_data: 2.0533772 float_data: 2.0533772 float_data: 2.0732257 float_data: 2.0775843 float_data: 2.0775843 float_data: 2.0775843 float_data: 2.0775843 float_data: 2.0775843 float_data: 2.0775843 float_data: 2.0775843 float_data: 2.0775843 float_data: 2.0693545 float_data: 2.0533772 float_data: 
2.0533772 float_data: 2.0533772 float_data: 2.0533772 float_data: 2.0533772 float_data: 2.0533772 float_data: 2.0533772 float_data: 2.0533772 float_data: 2.0533772 float_data: 2.0533772 float_data: 2.0533772 float_data: 2.0533772 float_data: 2.0533772 float_data: 2.0533772 float_data: 2.0533772 float_data: 2.0533772 float_data: 2.0533772 float_data: 2.0533772 float_data: 2.0533772 float_data: 2.0533772 float_data: 2.0533772 float_data: 2.0533772 float_data: 2.0533772 float_data: 2.0533772 float_data: 2.0533772 float_data: 2.0533772 float_data: 2.0533772 float_data: 2.0533772 float_data: 2.0533772 float_data: 2.0533772 float_data: 2.0533772 float_data: 2.0533772 float_data: 2.0533772 float_data: 2.0533772 float_data: 2.0533772 float_data: 2.0533772 float_data: 2.0533772 float_data: 2.0533772 float_data: 2.0533772 float_data: 2.0533772 float_data: 2.0335283 float_data: 2.0291698 float_data: 2.0291698 float_data: 2.0291698 float_data: 2.0291698 float_data: 2.0291698 float_data: 2.0291698 float_data: 2.0480523 float_data: 2.0533772 float_data: 2.0533772 float_data: 2.0533772 float_data: 2.0533772 float_data: 2.0533772 float_data: 2.0533772 float_data: 2.0354638 float_data: 2.0291698 float_data: 2.0291698 float_data: 2.0291698 float_data: 2.0291698 float_data: 2.0291698 float_data: 2.0291698 float_data: 2.0291698 float_data: 2.0291698 float_data: 2.0291698 float_data: 2.0291698 float_data: 2.0291698 float_data: 2.0291698 float_data: 2.0291698 float_data: 2.0291698 float_data: 2.0291698 float_data: 2.0291698 float_data: 2.0291698 float_data: 2.0291698 float_data: 2.0291698 float_data: 2.0291698 float_data: 2.0291698 float_data: 2.0291698 float_data: 2.0291698 float_data: 2.0291698 float_data: 2.0291698 float_data: 2.0291698 float_data: 2.0291698 float_data: 2.0291698 float_data: 2.0291698 float_data: 2.0291698 float_data: 2.0291698 float_data: 2.0291698 float_data: 2.0291698 float_data: 2.0291698 float_data: 2.0291698 float_data: 2.0291698 float_data: 2.0291698 
float_data: 2.0291698 float_data: 2.0291698 float_data: 2.0291698 float_data: 2.0291698 float_data: 2.0170662 float_data: 2.0049627 float_data: 2.0049627 float_data: 2.0049627 float_data: 2.0049627 float_data: 2.0049627 float_data: 2.0049627 float_data: 2.0049627 float_data: 2.0049627 float_data: 2.0049627 float_data: 2.0049627 float_data: 2.0049627 float_data: 2.0049627 float_data: 2.0291698 float_data: 2.0291698 float_data: 2.0291698 float_data: 2.0291698 float_data: 2.0291698 float_data: 2.0291698 float_data: 2.0291698 float_data: 2.0291698 float_data: 2.0291698 float_data: 2.0291698 float_data: 2.0291698 float_data: 2.0291698 float_data: 2.0170662 float_data: 2.0049627 float_data: 2.0049627 float_data: 2.0049627 float_data: 2.0049627 float_data: 2.0049627 float_data: 2.0049627 float_data: 2.0049627 float_data: 2.0049627 float_data: 2.0049627 float_data: 2.0049627 float_data: 2.0049627 float_data: 2.0049627 float_data: 2.0049627 float_data: 2.0049627 float_data: 2.0049627 float_data: 1.9337947 float_data: 1.9323409 float_data: 1.9323411 float_data: 1.932341 float_data: 1.9371812 float_data: 1.9715565 float_data: 1.9725258 float_data: 1.958002 float_data: 1.9565482 float_data: 1.9565482 float_data: 1.9565482 float_data: 1.9565482 float_data: 1.9565482 float_data: 1.9565482 float_data: 1.9565482 float_data: 1.9565482 float_data: 1.9565482 float_data: 1.9565482 float_data: 1.9565482 float_data: 1.9565482 float_data: 1.9565482 float_data: 1.9565482 float_data: 2.0291698 float_data: 2.0190017 float_data: 1.9880188 float_data: 1.9807553 float_data: 1.9502542 float_data: 1.9323411 float_data: 1.932341 float_data: 1.932341 float_data: 1.9323409 float_data: 1.9323411 float_data: 1.932341 float_data: 1.932341 float_data: 1.932341 float_data: 1.932341 float_data: 1.932341 float_data: 1.9323411 float_data: 1.9323409 float_data: 1.932341 float_data: 1.932341 float_data: 1.9323411 float_data: 1.9323409 float_data: 1.932341 float_data: 1.932341 float_data: 1.932341 float_data: 
1.9323409 float_data: 1.9323409 float_data: 1.932341 float_data: 1.932341 float_data: 1.9086185 float_data: 1.9081337 float_data: 1.9081337 float_data: 1.9081337 float_data: 1.9081337 float_data: 1.9173326 float_data: 1.9323411 float_data: 1.932341 float_data: 1.932341 float_data: 1.9202373 float_data: 1.9081337 float_data: 1.9081337 float_data: 1.9081337 float_data: 1.9081337 float_data: 1.9081337 float_data: 1.9081337 float_data: 1.9081337 float_data: 1.9081337 float_data: 1.9081337 float_data: 1.9081337 float_data: 1.9081337 float_data: 1.9081337 float_data: 1.9081337 float_data: 1.9081337 float_data: 1.9081337 float_data: 1.9081337 float_data: 1.9081337 float_data: 1.9081337 float_data: 1.8882852 float_data: 1.8839266 float_data: 1.8839266 float_data: 1.8839266 float_data: 1.8839266 float_data: 1.8839266 float_data: 1.8839266 float_data: 1.8839266 float_data: 1.8839266 float_data: 1.8839266 float_data: 1.8839266 float_data: 1.8839266 float_data: 1.8839266 float_data: 1.8839266 float_data: 1.8839266 float_data: 1.8839266 float_data: 1.8839266 float_data: 1.8839266 float_data: 1.8839266 float_data: 1.8839266 float_data: 1.8839266 float_data: 1.8839266 float_data: 1.8839266 float_data: 1.8839266 float_data: 1.8708538 float_data: 1.8597193 float_data: 1.8597193 float_data: 1.8597193 float_data: 1.8597193 float_data: 1.8597193 float_data: 1.8597195 float_data: 1.8597193 float_data: 1.8597195 float_data: 1.8597193 float_data: 1.8597193 float_data: 1.8597193 float_data: 1.8597193 float_data: 1.8597193 float_data: 1.8597193 float_data: 1.8597193 float_data: 1.8597193 float_data: 1.8597193 float_data: 1.8597193 float_data: 1.8597193 float_data: 1.8355122 float_data: 1.8355122 float_data: 1.8355122 float_data: 1.8355122 float_data: 1.8355122 float_data: 1.8355122 float_data: 1.8355122 float_data: 1.8355122 float_data: 1.8355122 float_data: 1.8355122 float_data: 1.8355122 float_data: 1.8355122 float_data: 1.8355122 float_data: 1.8355122 float_data: 1.8355122 float_data: 
1.8355122 float_data: 1.8355122 float_data: 1.8355122 float_data: 1.8355122 float_data: 1.8355122 float_data: 1.8355122 float_data: 1.8355122 float_data: 1.8355122 float_data: 1.8355122 float_data: 1.8355122 float_data: 1.8113048 float_data: 1.8113048 float_data: 1.8113048 float_data: 1.8113048 float_data: 1.811305 float_data: 1.8113048 float_data: 1.8113048 float_data: 1.8113048 float_data: 1.811305 float_data: 1.8113048 float_data: 1.8113048 float_data: 1.8113048 float_data: 1.7992013 float_data: 1.7870977 float_data: 1.7870977 float_data: 1.7870977 float_data: 1.7870977 float_data: 1.7870977 float_data: 1.7870977 float_data: 1.7870977 float_data: 1.7870977 float_data: 1.7870977 float_data: 1.7870977 float_data: 1.7870977 float_data: 1.7870977 float_data: 1.7628905 float_data: 1.7628905 float_data: 1.7628905 float_data: 1.7628905 float_data: 1.7628905 float_data: 1.7628905 float_data: 1.7628905 float_data: 1.7628905 float_data: 1.7628905 float_data: 1.7628905 float_data: 1.7628905 float_data: 1.7628905 float_data: 1.7386833 float_data: 1.7144761 float_data: 1.7144761 float_data: 1.7144761 float_data: 1.7144761 float_data: 1.7144761 float_data: 1.7144761 float_data: 1.7144761 float_data: 1.7144761 float_data: 1.7144761 float_data: 1.7144761 float_data: 1.7144761 float_data: 1.7144761 float_data: 1.7144761 float_data: 1.7144761 float_data: 1.7144761 float_data: 1.7144761 float_data: 1.7144761 float_data: 1.7144761 float_data: 1.7144761 float_data: 1.7144761 float_data: 1.7144761 float_data: 1.7144761 float_data: 1.7144761 float_data: 1.7144761 float_data: 1.7265795 float_data: 1.7386833 float_data: 1.7386833 float_data: 1.7386832 float_data: 1.7386833 float_data: 1.7386833 float_data: 1.7386833 float_data: 1.7386833 float_data: 1.7386833 float_data: 1.7386833 float_data: 1.7386833 float_data: 1.7386833 float_data: 1.7386833 float_data: 1.7144761 float_data: 1.7144761 float_data: 1.7144761 float_data: 1.7144761 float_data: 1.7144761 float_data: 1.7144761 float_data: 
1.6747787 float_data: 1.6660616 float_data: 1.6660616 float_data: 1.6660616 float_data: 1.6660616 float_data: 1.6660616 float_data: 1.6781652 float_data: 1.6713865 float_data: 1.6433082 float_data: 1.6660616 float_data: 1.6660616 float_data: 1.6660616 float_data: 1.661703 float_data: 1.6418544 float_data: 1.6597676 float_data: 1.6660616 float_data: 1.6660616 float_data: 1.6660616 float_data: 1.6660616 float_data: 1.6418544 float_data: 1.6418544 float_data: 1.6418544 float_data: 1.6418544 float_data: 1.6418544 float_data: 1.6418544 float_data: 1.6418544 float_data: 1.6418544 float_data: 1.6418544 float_data: 1.6418544 float_data: 1.6418544 float_data: 1.6418543 float_data: 1.6660616 float_data: 1.6902688 float_data: 1.6902688 float_data: 1.690269 float_data: 1.690269 float_data: 1.6902688 float_data: 1.6902688 float_data: 1.6902688 float_data: 1.6902688 float_data: 1.6902688 float_data: 1.6902688 float_data: 1.6902688 float_data: 1.6902688 float_data: 1.6176473 float_data: 1.6176473 float_data: 1.6176473 float_data: 1.6176473 float_data: 1.6176473 float_data: 1.6176473 float_data: 1.6176474 float_data: 1.6176473 float_data: 1.6084484 float_data: 1.59344 float_data: 1.5706867 float_data: 1.5692328 float_data: 1.5692328 float_data: 1.5692328 float_data: 1.5692328 float_data: 1.5692328 float_data: 1.5692328 float_data: 1.5692327 float_data: 1.5692328 float_data: 1.5692328 float_data: 1.5692328 float_data: 1.5692328 float_data: 1.5692329 float_data: 1.5692328 float_data: 1.5692328 float_data: 1.6418544 float_data: 1.6418544 float_data: 1.6418544 float_data: 1.6418544 float_data: 1.6355603 float_data: 1.6045744 float_data: 1.59344 float_data: 1.59344 float_data: 1.59344 float_data: 1.59344 float_data: 1.59344 float_data: 1.5692328 float_data: 1.5813365 float_data: 1.59344 float_data: 1.59344 float_data: 1.59344 float_data: 1.59344 float_data: 1.59344 float_data: 1.59344 float_data: 1.59344 float_data: 1.59344 float_data: 1.59344 float_data: 1.59344 float_data: 1.59344 
float_data: 1.59344 float_data: 1.5450256 float_data: 1.5450256 float_data: 1.5450256 float_data: 1.5450256 float_data: 1.5450256 float_data: 1.5450256 float_data: 1.5450256 float_data: 1.5450256 float_data: 1.5450256 float_data: 1.5450256 float_data: 1.5450256 float_data: 1.5450256 float_data: 1.532922 float_data: 1.5208182 float_data: 1.5208182 float_data: 1.5208182 float_data: 1.5208182 float_data: 1.5208182 float_data: 1.5208182 float_data: 1.5208182 float_data: 1.5208182 float_data: 1.5208182 float_data: 1.5208182 float_data: 1.5208182 float_data: 1.5208182 float_data: 1.5000005 float_data: 1.5208182 float_data: 1.5208182 float_data: 1.5208182 float_data: 1.5208182 float_data: 1.5208182 float_data: 1.5208182 float_data: 1.4966111 float_data: 1.4966111 float_data: 1.4966111 float_data: 1.4966111 float_data: 1.4966111 float_data: 1.4966111 float_data: 1.4966111 float_data: 1.4966111 float_data: 1.4966111 float_data: 1.4816027 float_data: 1.4724039 float_data: 1.4724039 float_data: 1.4724039 float_data: 1.4724038 float_data: 1.4724039 float_data: 1.4724039 float_data: 1.4724039 float_data: 1.4724039 float_data: 1.4724039 float_data: 1.4724039 float_data: 1.4724039 float_data: 1.4724039 float_data: 1.4724038 float_data: 1.4724039 float_data: 1.4724039 float_data: 1.4724039 float_data: 1.4724039 float_data: 1.4724039 float_data: 1.4724039 float_data: 1.4724039 float_data: 1.4481966

This output doesn't match what I get on AI Studio — something is wrong.
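For reference, the debugging workflow suggested above can be sketched as follows. This is a minimal sketch only: the model directory, port, and log path are taken from the commands quoted in this thread and may differ on your machine.

```shell
# Enable verbose glog output BEFORE launching the server process.
export GLOG_v=3

# Launch the GPU server (same flags as used in this thread).
python -m paddle_serving_server_gpu.serve \
    --model serving_server --port 9999 --gpu_id 0 --use_multilang

# After the crash, scan the verbose log for the first error/fatal lines
# (glog prefixes: I=info, W=warning, E=error, F=fatal).
grep -nE '^[EF]' log/serving.INFO | head -n 20
```

If `grep` shows nothing, the segfault likely happened before any error line was flushed, which points at a binary/CUDA incompatibility rather than a model problem.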

wangqianyi2017 commented 3 years ago

I'm hitting the same problem — has it been resolved?

TeslaZhao commented 3 years ago

Hello, this has been fixed in the latest release, v0.4.0.
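For anyone landing here later, a minimal upgrade sketch. The PyPI package names below are the standard ones for Paddle Serving; verify the exact wheel for your CUDA version against the 0.4.0 release notes before installing.

```shell
# Pin server and client to the release that contains the fix.
pip install -U paddle-serving-server-gpu==0.4.0 paddle-serving-client==0.4.0

# Relaunch with the same flags as before.
python -m paddle_serving_server_gpu.serve \
    --model serving_server --port 9999 --gpu_id 0 --use_multilang
```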