PaddlePaddle / FastDeploy

⚡️An easy-to-use and fast deep learning model deployment toolkit for ☁️Cloud, 📱Mobile, and 📹Edge. Covers 20+ mainstream scenarios across image, video, text, and audio, and 150+ SOTA models, with end-to-end optimization and multi-platform, multi-framework support.
https://www.paddlepaddle.org.cn/fastdeploy
Apache License 2.0

Deploying a PaddleSeg model on RK3588 gives very poor performance — is there a way to raise NPU utilization or invoke multiple cores? #2478

Closed Gary-zyt closed 3 months ago

Gary-zyt commented 3 months ago

I deployed a PaddleSeg model (e.g. PP-LiteSeg) on a Rockchip RK3588 and found that only NPU Core0 is used, with utilization at only about 5%-15%. Is there a way to raise NPU utilization or invoke multiple cores? I am using Python; the prediction code is as follows:

```python
import fastdeploy as fd
import cv2


def parse_arguments():
    import argparse

    parser = argparse.ArgumentParser()
    parser.add_argument(
        "--model_file", required=True, help="Path of PaddleSeg model.")
    parser.add_argument(
        "--config_file", required=True, help="Path of PaddleSeg config.")
    parser.add_argument(
        "--image", type=str, required=True, help="Path of test image file.")
    return parser.parse_args()


def build_option(args):
    option = fd.RuntimeOption()
    option.use_rknpu2()
    return option


args = parse_arguments()

# setup runtime
runtime_option = build_option(args)
model_file = args.model_file
params_file = ""
config_file = args.config_file
model = fd.vision.segmentation.PaddleSegModel(
    model_file,
    params_file,
    config_file,
    runtime_option=runtime_option,
    model_format=fd.ModelFormat.RKNN)

# normalize/permute are expected to be handled inside the RKNN model
model.preprocessor.disable_normalize()
model.preprocessor.disable_permute()

# predict
im = cv2.imread(args.image)
result = model.predict(im)
print(result)

# visualize
vis_im = fd.vision.vis_segmentation(im, result, weight=0.5)
cv2.imwrite("vis_img.png", vis_im)
```

Gary-zyt commented 3 months ago

Is it possible to select multiple NPU cores via runtime_option? In the official docs I only found how to choose between NPU and CPU, but with the NPU path only a single core (Core0) is ever used:

```python
model = fd.vision.segmentation.PaddleSegModel(
    model_file,
    params_file,
    config_file,
    runtime_option=runtime_option,
    model_format=fd.ModelFormat.RKNN)
```

Jiang-Jia-Jun commented 3 months ago

Not supported yet.
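Since multi-core selection is not exposed here, one common way to raise overall NPU utilization is to run several independent model instances in parallel worker threads, each serving part of the workload. The sketch below is a minimal, hypothetical illustration of that pattern only: `make_model` stands in for constructing a `PaddleSegModel`, and the returned callable stands in for `model.predict` (neither is a real FastDeploy API call here).

```python
# Sketch: parallel inference workers to raise overall accelerator utilization.
# `make_model` is a hypothetical stand-in for building one model instance per
# worker (e.g. fd.vision.segmentation.PaddleSegModel with RKNN format);
# the returned callable stands in for model.predict.
from concurrent.futures import ThreadPoolExecutor


def make_model(worker_id):
    # Placeholder for a real per-worker model instance.
    return lambda image: f"result-{worker_id}-{image}"


def run_parallel(images, num_workers=3):
    # One model instance per worker; instances must not be shared across threads.
    models = [make_model(i) for i in range(num_workers)]

    def infer(task):
        idx, image = task
        return models[idx % num_workers](image)

    with ThreadPoolExecutor(max_workers=num_workers) as pool:
        return list(pool.map(infer, enumerate(images)))
```

With a real backend, each worker would pay its own model-load cost, but concurrent requests could then overlap on the hardware instead of queuing behind a single instance.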

Gary-zyt commented 3 months ago

> Not supported yet.

OK, got it.