Vincent630 opened this issue 2 months ago
Try converting it with 1.5.
I'm not sure about v2; Depth Anything v1 works.
You mean rknn-toolkit 1.5? I'll give it a try, thanks for the pointer.
W init: rknn-toolkit2 version: 1.5.2+b642f30c
--> Loading model
E load_onnx: Catch exception when loading onnx model: weights/depth_anything_vitb14.onnx!
E load_onnx: Traceback (most recent call last):
E load_onnx:   File "rknn/api/rknn_base.py", line 1466, in rknn.api.rknn_base.RKNNBase.load_onnx
E load_onnx:   File "/anaconda3/envs/python36/lib/python3.6/site-packages/onnx/__init__.py", line 121, in load_model
E load_onnx:     model = load_model_from_string(s, format=format)
E load_onnx:   File "/anaconda3/envs/python36/lib/python3.6/site-packages/onnx/__init__.py", line 158, in load_model_from_string
E load_onnx:     return _deserialize(s, ModelProto())
E load_onnx:   File "/anaconda3/envs/python36/lib/python3.6/site-packages/onnx/__init__.py", line 99, in _deserialize
E load_onnx:     decoded = cast(Optional[int], proto.ParseFromString(s))
E load_onnx: google.protobuf.message.DecodeError: Error parsing message
W If you can't handle this error, please try updating to the latest version of the toolkit2 and runtime from: https://eyun.baidu.com/s/3eTDMk6Y (Pwd: rknn) Path: RK_NPU_SDK / RK_NPU_SDK_1.X.0 / develop / If the error still exists in the latest version, please collect the corresponding error logs and the model, convert script, and input data that can reproduce the problem, and then submit an issue on: https://redmine.rock-chips.com (Please consult our sales or FAE for the redmine account)
load model failed!
Frustrating. I tried converting the official V1 ONNX to rknn, and it failed right at the ONNX loading step.
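A note on the `google.protobuf.message.DecodeError` in the log above: that error typically means the `.onnx` file on disk is not a valid protobuf at all, commonly because a Git LFS pointer file or a truncated download was saved in place of the real weights. Before blaming the toolkit version, it may be worth checking the file itself. The helper below is a minimal, hypothetical sketch (these function names are not part of rknn-toolkit2), using only the standard library:

```python
# Sanity checks for a suspect .onnx file. A Git LFS pointer is a tiny text
# file starting with "version https://git-lfs.github.com", whereas the real
# depth_anything_vitb14 weights should be tens of megabytes.
from pathlib import Path


def looks_like_lfs_pointer(path: str) -> bool:
    """Return True if the file is a Git LFS pointer instead of real weights."""
    head = Path(path).read_bytes()[:64]
    return head.startswith(b"version https://git-lfs.github.com")


def plausible_onnx(path: str, min_bytes: int = 1_000_000) -> bool:
    """Heuristic: reject files that are tiny or that are LFS pointers."""
    p = Path(path)
    return p.stat().st_size >= min_bytes and not looks_like_lfs_pointer(str(p))
```

If `plausible_onnx` returns False, re-download the model (e.g. with `git lfs pull` or directly from the release page) before retrying the conversion.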
Has the Rockchip team tried deploying Depth Anything V2? I saw online that someone converted it successfully and ran it with the Python rknn runtime, but my own attempt with rknn_toolkit2-2.1.0+708089d1-cp311-cp311-linux_x86_64.whl on Python 3.11 failed. Any pointers would be greatly appreciated.
Do you still need it? If so, I can send it to you.
@yanshuangyingying could you share your converted Depth Anything V2 .rknn model? I have run into some problems during inference with my model. Thanks!
Wait a minute.

It seems I can't upload my rknn model here. Give me your email and I'll send it to you.
I have uploaded the rknn model and the inference code in my project "depthv2_rknn".
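For anyone wiring up their own inference around the shared model: Depth Anything-style networks output a relative depth map, which is usually min-max normalized to 0..255 before being rendered as a grayscale or colormapped image. The sketch below is a generic illustration of that post-processing step (it is not the code from the "depthv2_rknn" repo), written with plain Python lists so it stays dependency-free:

```python
# Generic post-processing for a relative depth map: min-max normalize the
# raw network output into the [lo, hi] range for visualization.

def normalize_depth(depth, lo=0, hi=255):
    """Min-max normalize a flat list of raw depth values to integers in [lo, hi]."""
    d_min, d_max = min(depth), max(depth)
    span = d_max - d_min
    if span == 0:  # constant depth map: avoid division by zero
        return [lo for _ in depth]
    return [int(round((v - d_min) / span * (hi - lo) + lo)) for v in depth]
```

In a real pipeline the same arithmetic would typically be done on the NumPy array returned by `rknn.inference()` and then reshaped back to the input resolution.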
@yanshuangyingying Thank you for the quick response and for sharing the .onnx model. Did you use the metric models? I couldn't find the .rknn model in your GitHub repo. Thanks!
Yes, please send me the V2 rknn model. Also, which version of the RK toolchain did you use to convert the ONNX to rknn?
I converted it with the latest rknn package. The model is on my profile page; you'll find it there.
May I ask which chip you tested the converted .rknn model on?
RK3566.
Have you tested it on the rk1266? Thanks.
I just saw your message. The rknn model is in the release; check there carefully. And no, I did not use the metric model.